We currently carry out development on a mapped network drive. When I write NUnit tests against a test assembly, NUnit picks up the assembly but does not recognise any of the tests.
If I move the solution etc. to a local drive and reference it again, then everything works fine.
What I would really like to know is why this happens, and how I can carry on using a network drive for development.
Per http://geekswithblogs.net/TimH/archive/2007/08/02/114340.aspx, NUnit apparently does not have appropriate permissions to access the assembly when on a network drive. The suggested fix is to add a post-build event to copy the assembly to a local temp directory and run NUnit off that copied assembly:
Within VS, open the project properties.
Go to the Build Events tab and enter the following 'Post-build event command line':
del /q c:\temp\nunit*.*
copy "$(TargetDir)." c:\temp\nunit
A potential issue you may run into as a result of this change relates to the AppBase, as per "Unable to load <mytest> because it is not located under AppBase". The answer there is to update the Settings element within the .nunit file to include an appbase of C:\Temp\NUnit, then update the assembly element's path to remove any leading directory information.
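For illustration, the edited .nunit project file might end up looking something like this (the config name and MyTests.dll are placeholders):
<NUnitProject>
  <Settings activeconfig="Debug" appbase="C:\Temp\NUnit" />
  <Config name="Debug">
    <assembly path="MyTests.dll" />
  </Config>
</NUnitProject>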
I would like to use the Microsoft.Data.SqlClient namespace/objects in a PowerShell script.
Two GitHub posts (click1, click2) provide a way to load the correct DLLs, but the solutions don't seem to work anymore.
For example, this is the result of the first solution after copying the following files:
(Packages copied from .nuget/packages folder)
Microsoft.Data.SqlClient.dll
Microsoft.Data.SqlClient.SNI.x64.dll
Microsoft.Identity.Client.dll
Result: Could not load file or assembly 'System.Runtime, Version=6.0.0.0'
In addition, I've tried creating a dummy console app, adding the Microsoft.Data.SqlClient NuGet package, building the project and copying all the DLLs to the same folder as the PS script.
As soon as I start the script (using the 'Add-Type -Path' construction), it fails with errors such as 'could not load file or assembly - wrong version...' (which is strange, because the folder contains all the DLLs...).
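Roughly, the loading part of the script looks like this (the folder path and the exact set of DLLs are illustrative):
# Folder holding the DLLs copied from the .nuget/packages folder (illustrative path)
$libPath = 'C:\Scripts\SqlClientLibs'
# Load the dependencies first, then Microsoft.Data.SqlClient itself
Add-Type -Path (Join-Path $libPath 'Microsoft.Identity.Client.dll')
Add-Type -Path (Join-Path $libPath 'Microsoft.Data.SqlClient.dll')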
Could you provide an alternative solution/steps in order to use the described package in a PS script?
Is it possible to include arbitrary files (in this case a .csv) from a TwinCAT project directly in the Boot directory of a PLC?
By using PATH_BOOTPATH in the file open/read FBs, it is possible to load files from this directory in a convenient manner regardless of whether a CE or Windows deployment is used. However, deploying files to this location seems to be the sticking point.
I know that a copy of the project code is included within the CurrentConfig<Project>.tpzip file, but this file is not easily accessible from code, or updateable.
I've found the 'Additional Files' section within the system configuration, but it makes little sense.
Adding a file from inside the project as a 'Relative' path doesn't seem to do anything
Adding a file from inside the project as an external path includes the file (via symbolic links?) in the 'CurrentConfig.tszip' file, which has the same issues as the .tpzip
Adding an external file as an external path again includes the file inside of the .tszip.
I'm willing to accept that this might not be possible, but it just feels odd that the PATH_BOOTPRJ and PATH_BOOTPATH roots are there and not accessing useful paths.
Deployment
To quote Beckhoff:
Deployment is used to set up commands that are to be executed during the installation and startup of an application.
The event types essentially determine at what stage of the deployment process the command is performed, and the command can be either copying a file or executing a script/program.
I haven't performed extensive testing, but between absolute/relative pathing and script execution, this should solve nearly all issues with deployment configuration.
I am doing some modifications on OpenCover and NUnit to suit my needs.
Briefly, I want to get coverage information even when shadow copy is enabled in NUnit. However, OpenCover is unable to track an assembly when the pdb file is absent. When shadow copy is enabled in NUnit, the assembly under test is copied to a shadow directory and OpenCover fails to find the corresponding pdb file.
At first, I thought that the .NET runtime just didn't copy pdb files to the shadow directory. But after investigating the problem further, I found out that the runtime does copy pdb files, just not at the same time as the assemblies.
My understanding now is that the runtime first copies the assemblies to the shadow directories. Those assemblies are then loaded and OpenCover is notified. OpenCover finds that there are no pdb files, so these assemblies are ignored. Some time later, the pdb files are copied, but OpenCover fails to notice this and therefore fails to track these assemblies.
So my question is: when exactly does the .NET runtime copy pdb files to the shadow directories? And is it possible to modify OpenCover so that it can track these shadow-copied assemblies?
However, OpenCover is unable to track an assembly when the pdb file is absent.
This is by design: instrumenting every assembly that is loaded without a PDB would mean instrumenting every IL operation rather than each sequence point, and the sequence point information is held in the PDB.
Some time later, the pdb files are copied, but OpenCover fails to notice this and therefore fails to track these assemblies.
If the PDB is copied later, then it's too late for OpenCover: the runtime has already loaded the assembly, so OpenCover has already made its instrumentation decisions.
Now, OpenCover uses various locations to look for PDBs:
it looks in the same folder as the assembly was loaded from
it looks in the folder set by -workingdir
it looks in its current directory
However, to support /noshadow I usually find the -mergebyhash option resolves this; if not, I find the second location in the list above (the -workingdir folder) the easiest to use.
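For reference, an OpenCover command line using -mergebyhash might look something like this (the nunit-console location and MyTests.dll are placeholders):
OpenCover.Console.exe -register:user -target:"nunit-console.exe" -targetargs:"MyTests.dll" -mergebyhash -output:coverage.xml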
I have a class library in Visual Studio with a method that just checks whether a specified file exists. If I pass just the file name (without the full path) of a text file that exists in the bin directory, it correctly reports that the file exists.
Hence File.Exists("myfile.txt") works if myfile.txt is in the bin directory.
But when I load a test case from the NUnit GUI that executes this method, it fails to find the file, likely because the directory NUnit executes from is different from the original bin directory where the DLL and myfile.txt reside. How can I tackle this in my NUnit tests without resorting to a hardcoded full path?
In your tests, pass a relative path to the method of the class under test. This avoids resorting to a hardcoded full path, and as long as your test project is always in the same location relative to your source project it will work.
e.g. if you have your source set up like this:
\Solution\src\Project\bin\debug\myFile.txt
\Solution\test\TestProject\bin\debug\TestAssembly.dll
The relative path will be @"..\..\..\..\src\Project\bin\debug\myFile.txt"
Update
I'm not quite sure why your tests are running from a temporary folder. I either use a test runner such as Resharper or set up my test project as follows:
Open the project properties for the project containing your tests.
Go to the Debug tab and set the following values:
Start external program: Enter the location of nunit.exe, e.g. on my PC it's installed to C:\Program Files\NUnit 2.5.5\bin\net-2.0\nunit.exe.
Command line arguments: Enter the name of your assembly containing your tests followed by the run argument, e.g. TestProject.dll /run.
Set the project containing your tests as the StartUp Project.
Hit F5.
This way your tests will always run from bin\debug (depending on how your build is configured), so you can rely on projects always being in the same relative location.
When I add a file to my setup deployment project, Visual Studio won't allow me to edit the "SourcePath" to resolve an environment variable like $(DLL_PATH). It adds the file with the source path on my local machine and builds fine locally. When the same project is built on another machine, it won't work unless that machine also has the exact same path to files needed.
I want the SourcePath to resolve the $(DLL_PATH) so as long as a machine has it defined correctly the MSI package will build fine.
Your best bet is to use subst.exe or a junction point to create a virtual directory. See here for information on junction points. Subst.exe simply creates a virtual drive letter. Put all of the deployable files in some directory tree with well-defined, constant sub-paths, and make the root of that tree a junction point or virtual drive.
Not sure about the subst, since I have no control over what the other build machine looks like. If I try to assign a known directory to a virtual drive, it could possibly fail, right?
Actually, what I did was set up a script.cmd to run after my project output is built, copying the dependencies from the declared $(DLL_PATH) into a folder that is relative to the actual project folder. The setup project then uses paths relative to the project, not absolute ones, so this works no matter what the build machine looks like. Another script removes this folder at the end.
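A minimal sketch of what that could look like, assuming DLL_PATH is defined as an environment variable on each build machine, script.cmd sits in the project folder, and ExternalDlls is the project-relative folder the setup project points at:
'Post-build event command line':
call "$(ProjectDir)script.cmd" "$(ProjectDir)"
script.cmd:
rem %1 is the project directory passed in from the post-build event
rem Copy the dependencies from the machine-specific DLL_PATH into a project-relative folder
if not exist "%~1ExternalDlls" mkdir "%~1ExternalDlls"
xcopy /y "%DLL_PATH%\*.dll" "%~1ExternalDlls"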