nunit-console "could not load file or assembly" using MySolution.sln - nunit

I'm trying to use nunit-console to run all of the tests in my solution.
I did this:
c:\some\path>nunit-console-x86.exe MySolution.sln
NUnit-Console version 2.6.2.12296
Copyright (C) 2002-2012 Charlie Poole.
Copyright (C) 2002-2004 James W. Newkirk, Michael C. Two, Alexei A. Vorontsov.
Copyright (C) 2000-2002 Philip Craig.
All Rights Reserved.
Runtime Environment -
OS Version: Microsoft Windows NT 6.1.7601 Service Pack 1
CLR Version: 2.0.50727.5466 ( Net 3.5 )
ProcessModel: Default DomainUsage: Default
Execution Runtime: net-3.5
Could not load file or assembly 'MyNamespace.Administration, Version=0.0.0.1, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified.
So I decided to try nunit-x86.exe. I did File > Open Project > MySolution.sln and got this:
---------------------------
NUnit
---------------------------
Test load failed!
System.IO.FileNotFoundException : Could not load file or assembly
'MyNamespace.Administration, Version=0.0.0.1, Culture=neutral,
PublicKeyToken=null' or one of its dependencies. The system cannot
find the file specified.
For further information, use the Exception Details menu item.
---------------------------
OK
---------------------------
The exception can be found here
What is happening and how do I fix it? (without having to maintain a MySolution.nunit file)
More information
MyNamespace.Administration is not even one of the dlls that contain tests, which means that NUnit fails while trying to load it to look for tests to run. Knowing this, I edited the file created by nunit-x86.exe (MySolution.nunit) and removed all the dlls that did not have tests. Sure enough, the tests work (in both the GUI and the console). This is not acceptable for me because it would mean I have to maintain yet another configuration file. NUnit supporting .sln files was supposed to avoid this.
My tests run fine using TestDriven.Net (but I really need to run them using nunit-console)
I have looked at this answer but I cannot make sense of what the Fusion log viewer says. Would posting that log help? The Assembly Binding Log Viewer lists 3 log files being created:
nunit-agent-x86.exe - this one seems to be trying to find MyNamespace.Administration.dll/EXE inside the NUnit directories
Tests_24398275 (x2) - one looking for nunit.core in my project folders and another looking for nunit.core.interfaces in my project folders. I would pay little attention to these two, since they also appear with my manually edited .nunit project.
(Per andreister's comment) The problem seems to be with the project/assembly itself and not the creation method. If I create a .nunit project and try to add MyNamespace.Administration to it (using 'Add Assembly...' or 'Add VS project...') it fails.
Calling nunit-console-x86 somepath/bin/Debug/MyNamespace.Administration.dll directly works.

Reposting my reply on nunit-discuss:
The NUnit feature of loading VS solutions is really fairly limited and intended to work with simple projects or as a quick way to create an NUnit project file - i.e. load the solution and save as an NUnit project, then edit the xml file that is created. Since the solution file format doesn't indicate which files are tests, NUnit attempts to load each project to check if it contains any tests. (This is the same thing that Visual Studio 2012 and later does when using the test explorer window, btw.)
As you suggest, I think the particular assembly fails to load because of having a dependency that is one level up. When loading either a VS solution file or an NUnit project file, NUnit sets the application base to the directory containing the solution or project. That's why an NUnit project file one level up works.
The designers' intent in this sort of situation is that you would create an NUnit project file. I recognize that this is somewhat inconvenient, since it gives you another configuration file to maintain. I'm open to suggestions regarding the use of globs either on the command line or within the project file. Any such changes would probably go into the next major upgrade, NUnit 3.0.
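For reference, a hand-maintained NUnit project file is only a few lines of XML. A minimal sketch (the config name and assembly paths are illustrative, not taken from the question) that lists only the assemblies containing tests might look like this:

<NUnitProject>
  <Settings activeconfig="Debug" appbase="." />
  <Config name="Debug" binpathtype="Auto">
    <assembly path="Tests\MyNamespace.Tests\bin\Debug\MyNamespace.Tests.dll" />
    <assembly path="Tests\MyNamespace.Administration.Tests\bin\Debug\MyNamespace.Administration.Tests.dll" />
  </Config>
</NUnitProject>

Because the directory containing the .nunit file becomes the application base, placing such a file one level up lets the assemblies resolve dependencies that live a directory above their own bin folders.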

Unfortunately, even after posting on the nunit-discuss group I was unable to find a proper solution for this problem.
The nunit-discuss group confirmed that my tests are failing because they have a dependency that is one level up.
I did, however, find an acceptable workaround.
Since calling the .dlls directly didn't have the same issues, I could do this with globs. I'm on Windows, but I have Git Bash installed.
Taking advantage of my somewhat rigid project structure and naming convention, I managed to do this:
"C:\Program Files (x86)\Git\bin\bash.exe" -c 'nunit-console-x86.exe //framework=net-4.5 //xml:nunitresults.xml MysolutionFolder/Tests/*/bin/Debug/*.Tests.dll'
Please note that I took advantage of my naming convention; this matters, because it keeps the number of matched files (and therefore command-line arguments) down. The doubled slashes in //framework and //xml are there to stop Git Bash's path conversion from mangling the switches.
When I used nunit-console-x86 MysolutionFolder/*/*/bin/Debug/*.dll instead of MysolutionFolder/Tests/*/bin/Debug/*.Tests.dll, I got an error from nunit-console-x86 saying Bad file number.
Besides, it's faster if I just provide the right files.
If you have a more recent version of bash (4.0+, I think) you can instead use the following command (note the use of **):
"C:\Program Files (x86)\Git\bin\bash.exe" -c 'nunit-console-x86.exe //framework=net-4.5 //xml:nunitresults.xml MysolutionFolder/**/bin/Debug/*.Tests.dll'
This is shorter and more permissive about the project structure.

Related

Compilation of PostgreSQL using pycparser - header files not found

I have installed pycparser, which parses C code.
Using pycparser I want to parse an open source project, namely PostgreSQL (version 11.0). I built it using the Visual Studio Express 2017 compiler suite. However, during compilation it cannot find some header files, namely windows.h and winsock2.h.
While looking at the directory structure of the built PostgreSQL, I find that it does not have these header files. How do I fix this issue?
A strange error also occurred:
postgresql/src/include/c.h:363:2: error: #error must have a working
64-bit integer datatype
Note: I am using the Windows 10 64-bit platform and postgresql-11.0.
The steps are as follows:
I downloaded Visual Studio 2017, the Windows 10 SDK and ActivePerl, as described in the PostgreSQL instructions for building from source.
After this I opened the Visual Studio developer command prompt and navigated to the folder postgresql-11.0/src/tools/msvc.
I used the command "build" to build PostgreSQL. The build process was successful, but windows.h and winsock2.h were still not found in the PostgreSQL directory structure.
I don't know pycparser, but your problem probably has two aspects to it:
You didn't give pycparser the correct list of include directories. The header files you mention are not part of PostgreSQL.
Maybe you can get the list from the environment of the Visual Studio prompt; I don't have a Windows machine here to verify that.
The error message means that neither HAVE_LONG_INT_64 nor HAVE_LONG_LONG_INT_64 is defined.
Now pg_config.h.win32, which is copied to pg_config.h during the MSVC install process, has the following:
#if (_MSC_VER > 1200)
#define HAVE_LONG_LONG_INT_64 1
#endif
Since you are not using MSVC, you probably don't have _MSC_VER set, which causes the error.
You could define _MSC_VER and see if you get to build then.
Essentially you are in a tight spot here, because pycparser is not a supported build procedure, so you'll have to dig into the source and fix things as you go. Without an understanding of the PostgreSQL source and the build process, you probably won't get far.
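If it helps, a minimal sketch of pointing pycparser at extra include directories and supplying the _MSC_VER define is shown below; the source file, SDK paths and version numbers are illustrative guesses, not a supported recipe:

from pycparser import parse_file

# All paths are illustrative - substitute your PostgreSQL source tree and the
# include directories of your Windows SDK / MSVC installation.
ast = parse_file(
    r'postgresql-11.0/src/backend/tcop/postgres.c',
    use_cpp=True,
    cpp_path='cpp',  # any C preprocessor that pycparser can invoke
    cpp_args=[
        r'-Ipostgresql-11.0/src/include',
        r'-IC:/Program Files (x86)/Windows Kits/10/Include/10.0.17134.0/um',      # windows.h, winsock2.h
        r'-IC:/Program Files (x86)/Windows Kits/10/Include/10.0.17134.0/shared',
        r'-D_MSC_VER=1916',  # pretend to be MSVC so pg_config.h defines HAVE_LONG_LONG_INT_64
    ],
)

Even with the right include paths, real Windows headers use extensions that pycparser's C99 parser cannot handle, which is why the project ships simplified "fake" headers for standard includes; expect to patch things as you go.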

SSIS deployment using DeploymentFileCompilerTask

I've read a lot about TFS deployment of SSIS packages. I have a VS2012 SSIS project and have created a .proj file using "DeploymentFileCompilerTask" to build the project into a .ispac:
<UsingTask TaskName="DeploymentFileCompilerTask" AssemblyFile="..\tools\IntegrationServices.Build\Microsoft.SqlServer.IntegrationServices.Build.dll" />
...
<DeploymentFileCompilerTask
    InputProject="$(SSISProjPath)"
    Configuration="Release"
    RootOutputDirectory="$(OutDir)"
    ProtectionLevel="DontSaveSensitive" />
i.e. the technique as outlined here:
https://gist.github.com/kulmam92/6433329
However, the build is failing with:
"Could not load file or assembly 'Microsoft.SqlServer.DTSRuntimeWrap, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified."
The question I have is: what is the minimum I need on the server that executes the above? I have read that SSDT and/or SSIS needs to be deployed - there is also a technique relating to recompiling a Codeplex project (as outlined in http://sqlblog.com/blogs/jamie_thomson/archive/2010/09/14/ssis-msbuild-task-now-included-in-msbuild-extension-pack.aspx) or combining particular components of SQL Server (http://www.networksteve.com/enterprise/topic.php/SSIS_Package_Deployment_to_Server_only_running_SSIS_(and_not_SQL/?TopicId=29646&Posts=2).
I don't really want to get too granular; I just want a definitive answer to a fairly common problem - can anyone assist?
To anyone who has the same problem: the issue was 32-bit vs. 64-bit related, not that the deployment was lacking a binary (in the GAC, etc.).
Within the Build Definition (default build.xml file) go to Process | Advanced | MSBuild Platform and change it from 'Auto' to 'X86'.
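For anyone driving the build outside a TFS build definition, the equivalent is to invoke the 32-bit MSBuild.exe explicitly, so the task runs in a 32-bit process and can resolve the 32-bit Microsoft.SqlServer.DTSRuntimeWrap. A sketch (the framework version and project file name are illustrative):

"%windir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" SsisDeployment.proj /p:Configuration=Release

The 32-bit MSBuild lives under Framework; the 64-bit one lives under Framework64.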

oledb32.dll cannot import ado.net while creating setup file

I am creating a setup file for an ADO.NET project, but whenever I build the project it gives me the error: 'oledb32.dll' should be excluded because its source file 'C:\Program Files\Common Files\System\Ole DB\oledb32.dll' is under Windows System File Protection. So I downloaded the DLL from the internet and tried to import it into my project, but the file cannot be imported into the Detected Dependencies folder. The oledb32.dll file is important for showing patient details in Excel format, so can any of you experts give me a suggestion or advice?
I ran into this issue as well and couldn't find a concrete answer until now.
I'm using a development box, checking in code via SVN, and running CruiseControl.NET to execute devenv.exe to automatically build the project (I don't use MSBuild because Microsoft hasn't implemented a solution for building Setup projects yet, and I assume this is what you are also using). The setup project would build fine on the dev box but on the build server it kept coming up with that same error.
The MSDN explanation can be found here; it's not very descriptive, but that's basically what needs to be done. The more concrete answer can be found here. Basically you have to open up VS on your build server, go in and exclude oledb32.dll (and any other problem files), and voila - it finally builds and creates the MSI file! Hope this was helpful for you.

Test projects not reading app.config in TeamCity -> NUnit phase

Well, we are facing a strange problem with JetBrains TeamCity-run unit tests on our main project, where tests from a few library projects are failing regularly. Apparently, the runner is not reading the config file (coming from app.config and nicely stored in project -> bin -> debug -> projectName.dll.config).
Hints or tips on what could be the real issue would be highly appreciated.
I've got the same problem and wasted a couple of hours figuring out what the problem was.
In our case, the NUnit plugin was configured to run the tests from:
**\*Tests.dll
Though this sounds OK, it turned out that this pattern matches not only MyTests.dll in the bin\Debug folder but also obj\Debug\MyTests.dll. The obj folder is used internally for compilation and does not contain the config file.
Finally the solution was to change the plugin configuration to
**\bin\Debug\*Tests.dll
Actually, we use a system variable for the build configuration so we do not have "Debug" hard-coded. Using bin* might also be dangerous when the workspace is used for both Debug and Release builds and you don't have a full cleanup specified.
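For example (the parameter name is illustrative - use whatever configuration parameter your build defines):

**\bin\%BuildConfiguration%\*Tests.dll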
You might wonder why I did not notice the test count mismatch (it was actually doubled, because the tests were running once from bin and once from obj), but this is typical: while everything is green, you don't care about the count. When we introduced the first test depending on the config, we had only one failure (because the copy from bin was passing), so the duplication did not stand out.
In addition to Gaspar Nagy's accepted answer, check to see if your project has multiple test dlls and one of them is referencing another.
This causes the referenced dll to be run twice, and the copy that was in the other dll's folder to not have the proper app.config entries. The proper fix is to remove any and all references from the other test project.
TeamCity (v6.5.4) has its own NUnit test runner and there seems to be an inconsistency between it and the NUnit GUI test runner (2.5.10). The NUnit GUI test runner follows the long-standing convention of expecting the configuration file name to be of the format [project name].config. You can see this in NUnit by looking at Project -> Edit...
TeamCity, on the other hand, is looking for an app.config.
Your options are to either:
1. Set the NUnit GUI to point to app.config and include the resultant NUnit project in your source control.
2. Have both an app.config and a [project name].config, syncing both manually.
3. Add a step to your build process to copy [project name].config to app.config (or vice versa); a sketch follows below.
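A sketch of option 3 as a post-build event on the test project (the file names are illustrative; swap the source and destination to match whichever file your runner expects):

copy /Y "$(TargetDir)$(TargetFileName).config" "$(TargetDir)app.config"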
I had similar woes.
This may help; additionally, we had issues where this still would not work - we ended up copying the relevant config sections into the highest-level config file (i.e. if it was a web app, copying them into the Web.config). Fairly kludgy, but we had wasted a few days on the issue.
I learned recently that app.config files are not read for a class library... Maybe this link could help :)
app.config for a class library
If you need a config file for your "unit" tests then you are doing it wrong. Proper unit testing never needs configuration or access to the database, file system etc. You should change your testing strategy.
A good starting point is to mark the tests that need configuration with the [Category("Integration")] attribute and set the TeamCity test runner to ignore this category. Then you should focus on refactoring these tests.
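A minimal sketch of that approach (the fixture and test names are made up):

using NUnit.Framework;

[TestFixture]
public class ReportRepositoryTests
{
    [Test]
    [Category("Integration")]  // excluded via the TeamCity runner's category filter
    public void Loads_reports_from_the_database()
    {
        // a test that genuinely needs configuration / database access
    }
}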

Visual Studio 2010 Publish Web feature not including all DLLs

I have an ASP.NET MVC 2 application.
Web project contains a reference to SomeProject
SomeProject contains references to ExternalAssembly1 and ExternalAssembly2.
SomeProject explicitly calls into ExternalAssembly1, but NOT ExternalAssembly2.
ExternalAssembly1 calls into ExternalAssembly2
When I perform a local build everything is cool. All DLLs are included in the bin\debug folder. The problem is that when I use the Publish Web command in Visual Studio 2010, it deploys everything except ExternalAssembly2.
It appears to ignore assemblies that aren't directly used (remember, ExternalAssembly2 is only used by ExternalAssembly1).
Is there any way I can tell Visual Studio 2010 to include ExternalAssembly2?
I can write a dummy method that calls into ExternalAssembly2. This does work, but I really don't want to have dummy code for the sole purpose of causing VS2010 to publish the DLL.
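For illustration, the dummy code mentioned above can be as small as a single typeof reference placed in SomeProject; ExternalAssembly2.SomeExportedType is a placeholder for any public type in that assembly:

internal static class AssemblyAnchor
{
    // Touching a type at compile time turns ExternalAssembly2 into a "used"
    // reference, so Publish Web copies its DLL along with the rest.
    private static readonly System.Type _keepExternalAssembly2 =
        typeof(ExternalAssembly2.SomeExportedType);
}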
None of these answers are sufficient in my mind. This does seem to be a genuine bug. I will update this response if I ever find a non-hack solution, or Microsoft fixes the bug.
Update:
Doesn't seem promising.
https://connect.microsoft.com/VisualStudio/feedback/details/731303/publish-web-feature-not-including-all-dlls
I am having this same problem (different assemblies though). If I reference the assemblies in my web project, then they will get included in the publish output, but they should be included anyway because they are indirect dependencies:
Web Project ---> Assembly A ---> Assembly B
On build, assemblies A and B are output to the \bin folder. On publish, only assembly A is output to the publish folder.
I have tried changing the publish settings to include all files in the web project, but then I have files in my publish output that shouldn't be deployed.
This seems like a bug to me.
I had the same problem with VS2010 and a WCF Service Application.
It turns out that if your (directly or indirectly) referenced DLLs are deployed to the GAC, the VS publishing feature excludes them. Once I removed the assemblies from the GAC, the publishing feature started working as expected.
I guess VS is assuming that if your assemblies can be located in the GAC on the machine you build on, they will be located in the GAC on the target machine as well. At least in my case this assumption is false.
My tests show that the external assemblies get published when I have a reference to them in the web project. I do not have to write any dummy code to make it work. This seems acceptable to me.
I agree with Nicholas that this seems to be a bug in Visual Studio. At least, I cannot see what the reason for this behavior could be.
I have created this issue as a bug on Microsoft Connect. If anyone experiencing it could vote it up https://connect.microsoft.com/VisualStudio/feedback/details/637071/publish-web-feature-not-including-all-dlls then hopefully we'll get something done about it.
If you go into the ExternalAssembly2 reference properties and change "Copy Local" to "True", I think that might solve your issue.
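In .csproj terms, "Copy Local" corresponds to the Private metadata on the reference; a sketch (the HintPath is illustrative):

<Reference Include="ExternalAssembly2">
  <HintPath>..\lib\ExternalAssembly2.dll</HintPath>
  <Private>True</Private>
</Reference>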
I don't know if you are still watching this, but I found the solution (I had the exact same issue) via this MSDN article. Under "Build Action" for the file, choose "Content"; that should include it in the list of files publish brings over.
I have created a new Connect bug here https://connect.microsoft.com/VisualStudio/feedback/details/731303/publish-web-feature-not-including-all-dlls
I've also attached a solution and detailed steps to reproduce this issue. Let's hope this time they won't close it as Can't Reproduce.
Vote for this connect issue if you experience the missing dll problem.
Copy Local did the trick. I had an issue where the Newtonsoft.Json assembly did not get included in the deployment package; Copy Local was set to false.
I am experiencing the same type of issue with a web project. I have a web project that references assembly A which references assembly B. It worked fine for some time but today it was broken. I did a rebuild of the solution and this time it deployed everything correctly.
I had this same problem today. I published my web project and realized that not all of the reference DLL's were there. In particular, the indirect DLL references.
It turns out that the directory I was publishing to was out of disk space (a network share). I had just enough space to publish all the files except for a few indirectly referenced DLLs. The sad part is that VS08 didn't throw any errors; it just published the files as usual. I cleared out some disk space and everything worked fine.
I didn't find the disk space issue until I tried to manually move the DLLs over.
In my case it is quite tricky.
The reference to ExternalAssembly2 is not required to build the project, but it is vital at run time, since we use reflection to configure the Unity container.
So, if I delete the reference, the project builds successfully, but I get a run-time error.
If I keep the reference I can build and run the application, but I cannot publish it with ExternalAssembly2 - I get a run-time exception there as well.
This happens because of VS2010's internal assembly optimization.
So, what can we do here?
1. Put some otherwise unneeded piece of code in place that uses one of ExternalAssembly2's classes.
2. Move away from reflection and use static assembly references.
Hope this helps somebody.
I got the same problem, and this is a VS2010 bug when there's a reference chain like:
Web Project --> custom project --> assembly1 --> (indirectly) assembly2.
For now I find that if I reference assembly1 in the web project, then assembly2 is included in the bin folder.
So I had to add an additional reference link like:
Web project --> assembly1 --> (indirectly) assembly2.
Then VS recognizes assembly2 and includes its DLL in the publish action.