MEF unable to overwrite DirectoryCatalog file. Access Denied

I'm having an issue with MEF where I have a DirectoryCatalog and at a later stage want to overwrite the assembly and "refresh" the catalog.
The problem I'm running into is that the file is simply "in use" and I can't overwrite it. Normally you are able to overwrite a .NET assembly.
I guess MEF has it in use, but how does this fit with recompilation?!
Here is my code example. Even with local variables the file is still in use.
I've also tried having the assembly in both the application and plugin folders, but then the version in the application folder is used, so overwriting makes no difference.
public RecompilationExample()
{
    DirectoryInfo dir = new DirectoryInfo(".\\plugin");
    if (!dir.Exists)
        dir.Create();

    DirectoryCatalog d;
    CompositionContainer c;

    d = new DirectoryCatalog(".\\plugin");
    d.Changed += new EventHandler<ComposablePartCatalogChangeEventArgs>(d_Changed);

    c = new CompositionContainer(d);
    c.ExportsChanged += new EventHandler<ExportsChangeEventArgs>(c_ExportsChanged);

    c.ComposeParts(this);
}

"Normally you are able to overwrite a .NET assembly."
As far as I know, no. A loaded .NET assembly cannot be overwritten. You also can't unload a loaded assembly (except by unloading the entire AppDomain it is hosted in).
What you can do instead is to use shadow copying, i.e. copying the assembly and then loading the copy. You can enable this with the AppDomainSetup.ShadowCopyFiles property. This is typically used in ASP.NET and allows you to overwrite the original file, but not in a way that influences the running process - until you restart it.
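Not the poster's code, just a minimal sketch of what enabling shadow copying for a plugin folder can look like (the folder names, cache path, and helper class are made up):
using System;
using System.IO;

class ShadowCopyHost
{
    // Creates a second AppDomain that shadow-copies assemblies from .\plugin,
    // so the originals can be overwritten while the copies stay loaded.
    static AppDomain CreatePluginDomain()
    {
        var setup = new AppDomainSetup
        {
            ApplicationBase = AppDomain.CurrentDomain.BaseDirectory,
            ShadowCopyFiles = "true", // note: a string, not a bool
            ShadowCopyDirectories = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "plugin"),
            CachePath = Path.Combine(Path.GetTempPath(), "PluginShadowCache")
        };

        // Build the MEF container inside this domain; to pick up replaced
        // assemblies, unload the domain and create a fresh one.
        return AppDomain.CreateDomain("PluginDomain", null, setup);
    }
}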
See also this other answer I wrote in response to a similar question. Long story short: You can use DirectoryCatalog.Refresh to add new assemblies on the fly, but not to replace or remove them. When you need to replace assemblies, the best solution is to restart your process.
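And a small, hedged sketch of the "add new assemblies on the fly" case: watch the plugin folder and call DirectoryCatalog.Refresh when a new DLL appears (the watcher wiring is illustrative, not from the linked answer, and it only helps for added files, not replaced or removed ones):
using System.ComponentModel.Composition.Hosting;
using System.IO;

class CatalogRefresher
{
    private readonly DirectoryCatalog _catalog;
    private readonly FileSystemWatcher _watcher;

    public CatalogRefresher(DirectoryCatalog catalog)
    {
        _catalog = catalog;
        _watcher = new FileSystemWatcher(catalog.FullPath, "*.dll");
        _watcher.Created += (s, e) => _catalog.Refresh(); // picks up newly added DLLs only
        _watcher.EnableRaisingEvents = true;
    }
}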

Related

Remove/Add References and Compile antique VB6 application using PowerShell

I've been given the task of researching whether one can use PowerShell to automate managing the references in a VB6 application and then compile its projects afterwards.
There are 3 projects. The requirement is to remove a specific reference from each project, then compile the projects from the bottom up (server > client > interface) and add each reference back in along the way (remove references, compile server.dll > add the server reference to the client, compile client.dll > add the client reference to the interface, compile interface.exe).
I'm thinking no, but I was still given the task of finding out for sure. Of course, where does one go to find this out? Why, here of course: Stack Overflow.
References are stored in the project .VBP files which are just text files. A given reference takes up exactly one line of the file.
For example, here is a reference to DAO database components:
Reference=*\G{00025E01-0000-0000-C000-000000000046}#5.0#0#C:\WINDOWS\SysWow64\dao360.dll#Microsoft DAO 3.6 Object Library
The most important info is everything to the left of the path, which is where the GUID lives (i.e., the unique identifier of the library, more or less). The filespec and description text are unimportant, as VB6 will update them to whatever it finds in the registry for the referenced DLL.
An alternate form of reference is for GUI controls, such as:
Object={BDC217C8-ED16-11CD-956C-0000C04E4C0A}#1.1#0; tabctl32.ocx
which for whatever reason never seem to have a path anyway. Most likely you will not need to modify this type of reference, because it would almost certainly break forms in the project which rely on them.
So in your PowerShell script, the key task would be to add or remove the individual reference lines mentioned in the question. As long as you are using some form of binary compatibility, the GUID will remain stable, so you could essentially hardcode the strings you need to add/remove.
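The question asks for PowerShell, but since the whole job is adding or removing one line of text, here is a hedged sketch of that edit in C# for illustration (the paths, GUID handling, and helper names are hypothetical; a PowerShell version is essentially Get-Content/Set-Content with the same filtering):
using System.IO;
using System.Linq;

class VbpReferenceEditor
{
    // Example reference line (the DAO one quoted above); VB6 refreshes the
    // path and description from the registry when it loads the project.
    const string DaoReference =
        @"Reference=*\G{00025E01-0000-0000-C000-000000000046}#5.0#0#C:\WINDOWS\SysWow64\dao360.dll#Microsoft DAO 3.6 Object Library";

    static void RemoveReference(string vbpPath, string guid)
    {
        // Drop any Reference= line containing the library's GUID.
        var kept = File.ReadAllLines(vbpPath)
            .Where(line => !(line.StartsWith("Reference=") && line.Contains(guid)))
            .ToArray();
        File.WriteAllLines(vbpPath, kept);
    }

    static void AddReference(string vbpPath, string referenceLine)
    {
        var lines = File.ReadAllLines(vbpPath).ToList();
        // Insert next to any existing Reference= lines near the top of the .vbp.
        int last = lines.FindLastIndex(l => l.StartsWith("Reference="));
        lines.Insert(last >= 0 ? last + 1 : 0, referenceLine);
        File.WriteAllLines(vbpPath, lines.ToArray());
    }
}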
Aside from all that, it's worth thinking through why you need to take this approach at all. Normally, to build a VB6 solution it is totally unnecessary to add/remove references along the way. Also, depending on your choice of deployment technique, you are probably using either project or binary compatibility, which tends to keep the references stable.
Lastly, I'll mention that there are existing tools, such as Kinook's Visual Build Pro, which already know how to build groups of VB6 projects; if using a third-party tool like that is an option, it could save you a lot of work.

PowerBuilder DataStore fails only when deployed as an EXE (but succeeds as a DataWindow)

I have an app which works great in the development environment but misbehaves when deployed as an EXE. When I deploy and make an EXE, all of my queries that run through DataStore objects succeed (SQLCode 0) but return zero rows. Out of frustration I changed to visible DataWindows and it magically worked again under an EXE. Then I made the DataWindows invisible and it continued to work. This is just bizarre. I have another PowerBuilder app which is much larger and uses lots of DataStore objects (on the same database), and those work great.
DataStore ds_wacn
ds_wacn = create datastore
ds_wacn.DataObject = 'd_plateaccessions'
ds_wacn.SetTransObject(SQLCA)
ds_wacn.Retrieve(sLoad, iPlate)
IF SQLCA.SQLCode < 0 then ...
// Succeeds in development, fetches zero rows under EXE
dw_wacn.SetTransObject(SQLCA)
dw_wacn.Retrieve(sLoad, iPlate)
IF SQLCA.SQLCode < 0 then ...
// Succeeds in development and in EXE
I was very careful to make sure that the app that works and the one that fails use the same settings to connect to the database (but there could still be a problem there). This is PowerBuilder 11.5.1.
Very likely your DataWindow object isn't being compiled into the EXE.
When you compile an EXE, PowerBuilder starts at the Application object and intelligently tries to determine which objects should be included. Since d_plateaccessions is only referenced in a string in a script, it isn't included.
There are two ways around this.
You can create a PBD for the PBL containing the DataWindow. PBD creation blindly includes all objects in the PBL. This method is quite popular, and many people just mark all their PBLs for PBD creation and deploy the PBDs.
You can alternatively create a PBR for the EXE, telling the compiler to force certain DataWindows and graphic files into the EXE. If you really want a single EXE, but don't want the effort of building an appropriate PBR, you can use PBL Peeper to generate PBRs and scripts to force all DataWindows and objects (and find all relevant graphics) into a compiled EXE, using the PBR Builder Plus report.
Good luck,
Terry.
I have a problem with your first four lines:
DataStore ds_wacn
ds_wacn.DataObject = 'd_plateaccessions'
ds_wacn.SetTransObject(SQLCA)
dw_wacn.Retrieve(sLoad, iPlate)
Do you really retrieve on dw_wacn instead of ds_wacn?
And there is no "create" for your local DataStore.
I don't use local DataStores frequently, but in this case the code in our programs looks like this:
DataStore ds_myDs
ds_myDs = create datastore
ds_myDs.DataObject = 'myDataObject'
ds_myDs.SetTransObject(SQLCA)
ds_myDs.Retrieve( /*arguments or not*/ )
/*
some code
*/
destroy ds_myDs

GWT: Get constants on the server side

I'm trying to get the constants (ConstantsWithLookup) stored on the client side from my server side, but I can't figure out how to do it. I have my constants interface and my constants properties file in the same folder.
I've tried tips from other similar threads with no success.
I tried Hermes, gwt-i18n-server, gwt-dmesg, GTWI18N, using a ResourceBundle, and trying to read the source properties file.
For the first two, the main reason seems to be outdated support for the newest GWT version. As for the ResourceBundle, it cannot find the properties file because at deployment there isn't a properties file, just a Constants.class.
I'm trying to avoid changing my properties file to another location (like /WEB-INF/constants).
I'm using Hermes with GWT 2.5.0.rc1, and it works fine. Usage:
put hermes-1.2.0.jar into war/WEB-INF/lib
Then on the server side write something like
MyConstantsWithLookup my = Hermes.get(MyConstantsWithLookup.class, "de");
String string = my.getString(key);
A properties file MyConstantsWithLookup.properties must exist in the same package as MyConstantsWithLookup.java, even if that properties file is empty (which might be the case if you're using @DefaultStringValue etc.)
Also add MyConstantsWithLookup_de.properties etc.
Make sure that these properties files are copied next to your classes when compiling. javac doesn't do that, so it must be done in an additional build step (Eclipse usually does this automatically, but it won't happen by itself when you build with e.g. Ant).
Many build setups will skip the java and properties files from the "client" package when compiling the server side. In that case, put your constants files in the "shared" package (if you have one).

MEF parts list sometimes empty

I'm currently using MEF and a DirectoryCatalog to load some parts from some extension DLLs. It works for me and for most of the people who use the program, but some users experience the parts not being loaded at all. Collecting some debug information, it seems that MEF does load the DLLs (catalog.LoadedFiles lists them), but no parts are listed in catalog.Parts.
One user is on XP sp3 and one is on Windows 7, so I don't think that the OS is the problem. Does anyone have some idea of why this would be happening?
The following is the code that actually creates the container, in case it would help with anything.
private static IEnumerable<Task> CreateTypes()
{
    CompositionContainer container = GetContainer();
    var exp = container.GetExports<Task>();
    return exp.Select(e => e.Value);
}

private static CompositionContainer container;

public static CompositionContainer GetContainer()
{
    if (container != null)
        return container;

    DirectoryCatalog catalog = new DirectoryCatalog(ExtensionDirectory, "*.dll");
    container = new CompositionContainer(catalog);
    return container;
}
(Yes, I'm answering my own question...more than a year later...)
http://mikehadlow.blogspot.com/2011/07/mef-directorycatalog-fails-to-load.html
Basically, because some people downloaded the program using IE and then unzipped it with Windows Explorer, the DLLs were marked as coming from the internet, so MEF refused to load their parts, even though the assemblies still showed up in the catalog.
The solution (for my situation at least) was simply to delete the alternate data streams that mark the DLLs as coming from the internet, as described in the above link.
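For completeness, a hedged sketch of what deleting that alternate data stream looks like in code (the directory name is an example; on PowerShell 3+ you could instead run Unblock-File over the folder):
using System.IO;
using System.Runtime.InteropServices;

static class Unblocker
{
    // The managed File APIs don't expose NTFS streams, so use Win32 DeleteFile.
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    private static extern bool DeleteFile(string name);

    public static void UnblockDirectory(string extensionDirectory)
    {
        foreach (var dll in Directory.GetFiles(extensionDirectory, "*.dll"))
            DeleteFile(dll + ":Zone.Identifier"); // no-op if the stream doesn't exist
    }
}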
but the other one is in "C:\Spiele" which sounds like a user-created folder
Reminds me of this:
Assembly Load Issues
In .NET, there are different contexts into which an assembly can be loaded. The default load context is generally the best one to use, but it cannot load assemblies that are not in the application base directory, subdirectories of the application base which are included in the probing path, or the GAC. When using a DirectoryCatalog or passing a path to the AssemblyCatalog constructor, MEF will attempt to load assemblies in the default load context. However, if the assemblies are not in the probing path or the GAC, this will not be possible, and MEF will load them in the load-from context instead.
The load-from context can lead to type identity issues that result in an InvalidCastException, MissingMethodException, or other errors. In order to avoid these issues in a MEF application, you can put any extension directories under the application base directory, and add them to the probing private path in your application configuration file. For other options and more information about assembly loading in .NET, see the Best Practices for Assembly Loading MSDN document.
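If you do keep extensions in a subdirectory, the "probing private path" mentioned above is just an app.config entry along these lines ("Extensions" is an example folder name under the application base):
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <probing privatePath="Extensions" />
    </assemblyBinding>
  </runtime>
</configuration>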
See the source "How to Debug and Diagnose MEF Failures" for more debugging information and tools.

Using OpenWrap with Scope

I may not fully understand the wiki article on scoping, so forgive me if this sounds dumb.
Intro:
I have a solution (ABC.sln) with over 40 projects and am trying to implement OpenWrap for package management.
So I did the following in the solution's root folder:
o init-wrap -all
That worked fine: I now have a file called SLN.wrapdesc in the solution's root folder. All of the .csproj files in the subfolders contain the OpenWrap targets line.
I then proceeded to add the different wraps to the solution with:
o add-wrap -Name xxx
Again, this worked fine: I have some wraps in the wraps folder, and the build doesn't break after removing the old references from the projects.
Problem:
All of the contents of the wraps are going to all of the projects, even those that don't need them. I would like to be able to specify which wraps go where, e.g. AjaxControlToolkit should only go into web projects.
What I tried
First, I removed the AjaxControlToolkit from the wrapdesc:
o remove-wrap AjaxControlToolkit
This causes the build to break (as expected). Then I tried the following:
1. Try to add the wrap back with a scope:
o add-wrap -Name AjaxControlToolkit -scope webproject
This simply puts the wrap back in the wraps folder. I then added <OpenWrap-Scope>customscope</OpenWrap-Scope> to the project file, but the build still broke.
2. Try and manually add a file called ABC.webproject.wrapdesc to the root folder. This causes the following error when I try to open the solution:
The "exists" function only accepts a scalar value, but its argument "#(_WrapFile->'%(FullPath)')" evaluates to "D:\Projects\ABC.webproject.wrapdesc;D:\Projects\ABC.wrapdesc" which is not a scalar value.
I guess it doesn't like 2 wrapdesc files. That is strange because the wiki says "...you can add a second descriptor alongside your default descriptor..."
So now I'm stuck. Anyone have any ideas?
Managing dependencies per MSBuild file is really not a recommended approach. Doing it per project is not quite the design philosophy behind OpenWrap, so the system is not really optimized for those scenarios.
If a project doesn't need anything from those assemblies, the easiest way to solve this is simply not to use any code from those packages: nothing will get loaded (or even need to be on disk) if no code references it.
That said, add-wrap -scope newscope will create an additional descriptor that adds the new dependency to the new scope, by creating a myProject.newscope.wrapdesc file independently of the original myProject.wrapdesc.
If you do want to do this per project, have you tried the convention-based scoping? Something like:
directory-structure: src\*{scope: Web=WebProjects}*
would take any project in a child folder of src containing Web in the name and assign it to the WebProjects scope.
I know that one has worked fine for my projects so far, although you do have to restart VS as it aggressively caches certain files and will not see the change.
Customizing the msbuild file itself is not fully tested (and the wiki entry was very much a design spec rather than final documentation, not all of it has been built that way) so it may or may not work. Happy to take a look if you can open a bug ticket on http://github.com/openrasta/openwrap/issues