Reputation: 15997
I have been using the technique of embedding DLLs (as embedded resources) into an EXE and using the following code to resolve the unknown assemblies at runtime.
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
{
    String resourceName = "Project.lib." + new AssemblyName(args.Name).Name + ".dll";
    using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resourceName))
    {
        if (stream == null)
            return null; // not one of our embedded assemblies
        Byte[] assemblyData = new Byte[stream.Length];
        stream.Read(assemblyData, 0, assemblyData.Length);
        return Assembly.Load(assemblyData);
    }
};
However, when I embed the Spark View Engine DLL (for example) it falls over, but only in one particular place. Spark itself dynamically generates classes on the fly. These classes then reference Spark (using Spark; etc.). It is at this point that I get the following error.
The type 'Spark.Class' is defined in an assembly that is not referenced. You must add a reference to the assembly 'Spark'
I'm pretty sure that this has nothing to do with the Spark view engine but to do with referencing an embedded assembly from within a dynamically generated class.
Update: stack trace
An Exception has occurred when running the Project Tasks
Message: Spark.Compiler.BatchCompilerException: Dynamic view compilation failed. c:\Users\Adam\AppData\Local\Temp\kdsjyhvu.0.cs(6,14): error CS0012: The type 'Spark.AbstractSparkView' is defined in an assembly that is not referenced. You must add a reference to assembly 'Spark, Version=1.5.0.0, Culture=neutral, PublicKeyToken=7f8549eed921a12c'
at Spark.Compiler.BatchCompiler.Compile(Boolean debug, String languageOrExtension, String[] sourceCode)
at Spark.Compiler.CSharp.CSharpViewCompiler.CompileView(IEnumerable`1 viewTemplates, IEnumerable`1 allResources)
at Spark.SparkViewEngine.CreateEntryInternal(SparkViewDescriptor descriptor, Boolean compile)
at Spark.SparkViewEngine.CreateEntry(SparkViewDescriptor descriptor)
at Spark.SparkViewEngine.CreateInstance(SparkViewDescriptor descriptor)
at ProjectTasks.Core.Templater.Populate(String templateFilePath, Object data) in \ProjectTasks\Core\Templater.cs:line 33
at ProjectTasks.Core.EmailTemplates.RenderImpl(String name, Object data) in \ProjectTasks\Core\EmailTemplates.cs:line 19
at ProjectTasks.Tasks.EmailUsersWithIncompleteModules.Run() in \ProjectTasks\Tasks\EmailUsersWithIncompleteModules.cs:line 41
at ProjectTasks.MaintenanceTaskRunner.Run(Boolean runNow, IMaintenanceTask[] tasks) in \ProjectTasks\MaintenanceTaskRunner.cs:line 25
at ProjectTasks.Initialiser.Init(String[] args) in \ProjectTasks\Initialiser.cs:line 30
Anyone have any ideas on a resolution, if indeed there is one at all?
Upvotes: 1
Views: 1084
Reputation: 8685
As others have said, the problem lies in the fact that CodeDom compiles out of process and produces artifacts on disk, so any assemblies it references must also exist as physical files at the time it renders the views.
Apart from the fact that embedding Spark is a potential memory hog anyway, I believe there's a potential solution to this problem. Since the problem is caused by dynamic view generation on the fly, why not take advantage of Spark's batch compilation option to generate the DLLs for your views as part of your build?
You can use code similar to the following to achieve this:
var factory = new SparkViewFactory(settings)
{
    ViewFolder = new FileSystemViewFolder(viewsLocation)
};

// Generate all of the known view/master templates into the target assembly
var batch = new SparkBatchDescriptor(targetPath);
factory.Precompile(batch);
In the end, you should have an output DLL containing the compiled views, and you can then embed that DLL the same way you are embedding the main Spark.dll.
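To wire that precompiled assembly up at runtime, something like the following sketch may work. It assumes your Spark version exposes LoadBatchCompilation on the view engine (present in Spark 1.x, as far as I recall); precompiledViewBytes is a hypothetical byte array read from your embedded resource, the same way the question's resolver reads its DLLs.

```csharp
// Hypothetical wiring-up: hand the precompiled views assembly to Spark
// instead of letting it compile views on the fly (which is what fails
// when Spark.dll only exists as an embedded resource).
var engine = new SparkViewEngine(settings);
engine.LoadBatchCompilation(Assembly.Load(precompiledViewBytes));
```

Because the views are compiled at build time, the runtime never shells out to csc.exe, which is the step that needs Spark.dll on disk.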
Hope that helps
Rob
Upvotes: 0
Reputation: 942197
The stack trace strongly suggests that Spark is using System.CodeDom to dynamically generate assemblies. That requires reference assemblies to be files on disk, because the C# compiler runs out-of-process. This is normally not a problem, because you'd have Spark.dll in the same directory as your EXE.
You cannot make this work.
Fwiw: this technique is horribly wasteful of system resources. You double the amount of memory required for assemblies. It is the expensive kind of memory as well: it cannot be shared between processes and is backed by the paging file instead of the assembly file. You can also buy yourself some serious type identity trouble. .NET already supports deployment in a single file. It is called setup.exe
Upvotes: 1
Reputation: 16782
I guess Spark uses CodeDom for dynamic code generation. CSharpCodeProvider internally generates source code and runs csc.exe to obtain new types. Since csc.exe needs physical files as references, the AssemblyResolve trick will not help in this case.
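A minimal sketch of why this is so: the references you hand to CSharpCodeProvider are collected as strings and passed to csc.exe as file paths on its command line. An assembly loaded from a byte array via AssemblyResolve has no file on disk, so there is no path you could put in that list. (The source string and class name here are made up for illustration.)

```csharp
using System;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

class CodeDomDemo
{
    static void Main()
    {
        const string source = @"
            public class Greeter
            {
                public string Greet() { return ""hello""; }
            }";

        using (var provider = new CSharpCodeProvider())
        {
            var parameters = new CompilerParameters();
            // These entries are passed to csc.exe as *file paths*.
            // An assembly that exists only as bytes in memory (loaded
            // through AssemblyResolve) has no path to add here.
            parameters.ReferencedAssemblies.Add("System.dll");
            parameters.GenerateInMemory = true;

            CompilerResults results =
                provider.CompileAssemblyFromSource(parameters, source);

            if (results.Errors.HasErrors)
                Console.WriteLine(results.Errors[0].ErrorText);
        }
    }
}
```

This is exactly the shape of call Spark makes when it compiles a view, with Spark.dll in the ReferencedAssemblies list, hence the CS0012 error when that file does not exist on disk.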
Upvotes: 2