I am creating a setup file for my ADO.NET project, but whenever I build it I get the error "'oledb32.dll' should be excluded because its source file 'C:\Program Files\Common Files\System\Ole DB\oledb32.dll' is under Windows System File Protection". Because of that I downloaded the DLL from the internet and tried to add it to the Detected Dependencies folder of my project, but the file cannot be imported. oledb32.dll is needed to show patient details in Excel format, so can any expert give me a suggestion or advice?
I ran into this issue as well and couldn't find a concrete answer until now.
I'm using a development box, checking in code via SVN, and running CruiseControl.NET to execute devenv.exe to automatically build the project (I don't use MSBuild because Microsoft hasn't implemented a solution for building Setup projects yet, and I assume this is what you are also using). The setup project would build fine on the dev box but on the build server it kept coming up with that same error.
The MSDN explanation can be found here; it's not very descriptive, but that's basically what needs to be done. The more concrete answer can be found here. Basically, you have to open the solution in Visual Studio on your build server and exclude oledb32.dll (and any other problem files), and voila, it finally builds and creates the MSI file! Hope this was helpful for you.
I've been playing around with Azure DevOps lately to host a NuGet package as an artifact, which I then use in another project of mine.
So far so good: I managed to get the package and use it as intended, but I'd also like to be able to debug it, so (as far as I've understood) I had to add symbols. I added a publish step to my pipeline for the symbols, which succeeds, and the .pdb file gets published. I point Visual Studio at my symbols feed by connecting to DevOps under Debug > Symbols in the settings.
When debugging, it correctly downloads the .pdb file to the temp location, and it stays there the whole time the code is running.
Under Debug > Windows > Modules it actually tells me that the symbols are loaded correctly while debugging, but as soon as I try to step into the code I get the error: ".cs not found".
I've tried multiple things, such as clearing the symbol cache and changing debug settings like "own code only" and "allow source server support", but to no avail.
Did I miss a step or am I doing something horribly wrong?
Debug NuGet package using Azure DevOps Symbol Server resulting in class not found
That's because you have not enabled Source Link, which lets Visual Studio know where it should look to download the source code while debugging.
To debug into the source code, we need the source code itself. The PDB (or /Z7 embedded debug information) contains a mapping between the executable code and your source code; with the PDB, the VS debugger knows where in the source files each instruction is located, but it still needs the source files themselves to show you the code.
So we have to enable Source Link. Edit the .csproj file and include the following in the first PropertyGroup element:
<PublishRepositoryUrl>true</PublishRepositoryUrl>
<EmbedUntrackedSources>true</EmbedUntrackedSources>
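For reference, here is a minimal sketch of what the relevant parts of the .csproj might look like once Source Link is wired up. The Microsoft.SourceLink.AzureRepos.Git package assumes the repository lives in Azure Repos (pick the provider package that matches your host), and the version number is only illustrative:
<PropertyGroup>
  <PublishRepositoryUrl>true</PublishRepositoryUrl>
  <EmbedUntrackedSources>true</EmbedUntrackedSources>
  <!-- Optional: ship the .pdb inside the main package instead of a separate symbols package -->
  <AllowedOutputExtensionsInPackageBuildOutputFiles>$(AllowedOutputExtensionsInPackageBuildOutputFiles);.pdb</AllowedOutputExtensionsInPackageBuildOutputFiles>
</PropertyGroup>
<ItemGroup>
  <!-- Source Link provider package -->
  <PackageReference Include="Microsoft.SourceLink.AzureRepos.Git" Version="1.1.1" PrivateAssets="All" />
</ItemGroup>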
You could check this similar thread for some more details.
Alternatively, as a lightweight solution, you could also include the source code in the NuGet package itself:
Check my previous thread for details.
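As a rough illustration of that lightweight approach (a sketch only; the package id and file paths are placeholders), the .nuspec can ship the source files next to the binaries in a src folder:
<package>
  <metadata>
    <id>MyCompany.MyLibrary</id>
    <version>1.0.0</version>
    <authors>me</authors>
    <description>Library packaged together with its sources.</description>
  </metadata>
  <files>
    <!-- The binaries, as usual -->
    <file src="bin\Release\MyCompany.MyLibrary.dll" target="lib\net45" />
    <!-- The source files, following the symbol-package convention -->
    <file src="**\*.cs" target="src" />
  </files>
</package>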
Hope this helps.
I have migrated my MSBuild-integrated solution to Automatic Package Restore. It works in Visual Studio, but when I try running the command
nuget restore Path/To/MySolution.sln
(I try doing that in my Package Manager console as well as in my Jenkins "Windows batch command" build step)
in both cases I get the error: The solution file has two projects named "1_2".
I cannot find these projects in my solution. Any ideas?
Sorry for not answering earlier, but I ended up finding out that the solution had two websites created from the local IIS, and both ended with the same version number. Visual Studio named the projects using only that ending, which left the solution with two 'projects' sharing the same name. I could see it by looking at the .sln file. Looks like VS does not handle this edge case :-/
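For illustration, the duplicated entries in the .sln file looked roughly like this (the site names, URLs and GUIDs here are made up):
Project("{E24C65DC-7377-472B-9ABA-BC803B73C61A}") = "1_2", "http://localhost/SiteA_1_2", "{11111111-1111-1111-1111-111111111111}"
EndProject
Project("{E24C65DC-7377-472B-9ABA-BC803B73C61A}") = "1_2", "http://localhost/SiteB_1_2", "{22222222-2222-2222-2222-222222222222}"
EndProject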
I'm trying to use nunit-console to run all of the tests in my solution.
I did this:
c:\some\path>nunit-console-x86.exe MySolution.sln
NUnit-Console version 2.6.2.12296
Copyright (C) 2002-2012 Charlie Poole.
Copyright (C) 2002-2004 James W. Newkirk, Michael C. Two, Alexei A. Vorontsov.
Copyright (C) 2000-2002 Philip Craig.
All Rights Reserved.
Runtime Environment -
OS Version: Microsoft Windows NT 6.1.7601 Service Pack 1
CLR Version: 2.0.50727.5466 ( Net 3.5 )
ProcessModel: Default DomainUsage: Default
Execution Runtime: net-3.5
Could not load file or assembly 'MyNamespace.Administration, Version=0.0.0.1, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified.
So, I decided to try nunit-x86.exe. I did File > Open Project > MySolution.sln and got this:
---------------------------
NUnit
---------------------------
Test load failed!
System.IO.FileNotFoundException : Could not load file or assembly
'MyNamespace.Administration, Version=0.0.0.1, Culture=neutral,
PublicKeyToken=null' or one of its dependencies. The system cannot
find the file specified.
For further information, use the Exception Details menu item.
---------------------------
OK
---------------------------
The exception can be found here
What is happening and how do I fix it? (without having to maintain a MySolution.nunit file)
More information
MyNamespace.Administration is not even one of the DLLs that contains tests, which means that NUnit fails while trying to load it to look for tests to run. Knowing this, I edited the file created by nunit-x86.exe (MySolution.nunit) and removed all the DLLs that did not have tests. Sure enough, the tests work (in both GUI and console). This is not acceptable for me because it would mean I have to maintain yet another configuration file; NUnit's support for .sln files was supposed to avoid this.
My tests run fine using TestDriven.Net (but I really need to run them using nunit-console)
I have looked at this answer but I cannot make sense of what the Fusion log viewer says. Would posting that log help? The Assembly Binding Log Viewer lists 3 files being created:
nunit-agent-x86.exe: this one seems to be trying to find MyNamespace.Administration.dll/EXE inside the NUnit directories
Tests_24398275 (x2): one looking for nunit.core in my project folders and another looking for nunit.core.interfaces in my project folders. (I would pay little attention to these two since they also appear with my manually edited .nunit project.)
(per andreister comment) The problem seems to be with the project/assembly itself and not the creation method. If I create a .nunit project and try to add MyNamespace.Administration to it (using 'Add Assembly...' or 'Add VS project...') it fails.
Calling nunit-console-x86 somepath/bin/Debug/MyNamespace.Administration.dll directly works.
Reposting my reply on nunit-discuss:
The NUnit feature of loading VS solutions is really fairly limited and intended to work with simple projects or as a quick way to create an NUnit project file - i.e. load the solution and save as an NUnit project, then edit the xml file that is created. Since the solution file format doesn't indicate which files are tests, NUnit attempts to load each project to check if it contains any tests. (This is the same thing that Visual Studio 2012 and later does when using the test explorer window, btw.)
As you suggest, I think the particular assembly fails to load because of having a dependency that is one level up. When loading either a VS solution file or an NUnit project file, NUnit sets the application base to the directory containing the solution or project. That's why an NUnit project file one level up works.
The designers' intent in this sort of situation is that you would create an NUnit project file. I recognize that this is somewhat inconvenient, since it gives you another configuration file to maintain. I'm open to suggestions regarding the use of globs either on the command line or within the project file. Any such changes would probably go into the next major upgrade, NUnit 3.0.
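For anyone who hasn't used one, an NUnit project file is just a small XML file. A sketch of what one saved a level up (so that the application base covers the dependency) might look like, with illustrative paths:
<NUnitProject>
  <Settings activeconfig="Debug" appbase="." />
  <Config name="Debug">
    <!-- List only the assemblies that actually contain tests -->
    <assembly path="Tests\MyNamespace.Tests\bin\Debug\MyNamespace.Tests.dll" />
    <assembly path="Tests\MyNamespace.Other.Tests\bin\Debug\MyNamespace.Other.Tests.dll" />
  </Config>
</NUnitProject>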
Unfortunately, even after posting on the nunit-discuss group, I was unable to find a proper solution to this problem.
The nunit-discuss group confirmed that my tests fail to load because of a dependency that is one level up.
I did, however, find an acceptable workaround.
Since calling the DLLs directly didn't have the same issue, I could do this with globs. I'm on Windows, but I have Git Bash installed.
Taking advantage of my somewhat rigid project structure and naming convention, I managed to do this:
"C:\Program Files (x86)\Git\bin\bash.exe" -c 'nunit-console-x86.exe //framework=net-4.5 //xml:nunitresults.xml MysolutionFolder/Tests/*/bin/Debug/*.Tests.dll'
Please note that I took advantage of my naming convention. This is very important to do in order to reduce the number of arguments.
When I did nunit-console-x86 MysolutionFolder/*/*/bin/Debug/*.dll instead of MysolutionFolder/Tests/*/bin/Debug/*.Tests.dll I got an error from nunit-console-x86 saying Bad file number.
Besides, it's faster if I just provide the right files.
If you have a more recent version of bash (4.0+, I think) you can instead use the following command (note the use of **):
"C:\Program Files (x86)\Git\bin\bash.exe" -c 'nunit-console-x86.exe //framework=net-4.5 //xml:nunitresults.xml MysolutionFolder/**/bin/Debug/*.Tests.dll'
This is shorter and more permissive about the project structure.
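One caveat: in Bash, ** only recurses into subdirectories when the globstar option is enabled, and it is off by default in non-interactive shells, so you may need to switch it on inside the -c command, for example:
"C:\Program Files (x86)\Git\bin\bash.exe" -c 'shopt -s globstar; nunit-console-x86.exe //framework=net-4.5 //xml:nunitresults.xml MysolutionFolder/**/bin/Debug/*.Tests.dll'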
My aim is to have package restore working on a build server so that I don't have to check in binaries. At the moment, I'm simply trying to get it to work on my own machine using Visual Studio.
Here's what I've done so far:
Followed the instructions here http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages, including both setting the Tools-Options flag and the environment variable (belt and braces)
Installed the NuGetEnablePackageRestore package as suggested here NuGet package restore consent without NuGet
Checked everything in (the .nuget solution folder and its contents), but not the binaries I want to reference, because that's the whole point of the exercise
Here's what I'm doing:
Check out solution
Verify that nunit.framework.dll and moq.dll are not present in the checked out solution
Build the solution
Visual Studio complains that Moq is missing. I search for the dlls in the solution directory and find that:
nunit.framework.dll is present in the appropriate bin folders
Moq.dll is nowhere to be found
But there's more. This is truly mysterious, but if I do a fresh checkout, disconnect from the internet and build, I get precisely the same results - nunit.framework.dll is there, but moq.dll is not. The build process has conjured nunit.framework.dll literally from nowhere.
So it's something of an understatement to say that I am completely baffled. Can anyone suggest answers to the following questions:
Why is package restore not downloading Moq?
Where on earth is the build process getting nunit.framework.dll, if not the internet?
In VS, under Options > Package Manager, there's a section called "Package Cache"; if you click the "Browse" button it will take you to the location of the NuGet cache on your machine.
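If you prefer the command line, more recent versions of nuget.exe can list the cache locations directly (output paths vary by machine); any of these folders can satisfy a restore without touching the network:
nuget locals all -list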
Okay, I noticed in the documentation that enabling package restore was supposed to modify project files in order to add a new target. My project files did not have this change. Right-clicking the solution title in VS and selecting 'Manage NuGet packages...' then added the required changes and everything built as it should.
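For anyone checking their own project files, the change added by the old MSBuild-integrated restore looks roughly like this (treat it as an illustrative sketch rather than the exact markup your NuGet version emits):
<PropertyGroup>
  <RestorePackages>true</RestorePackages>
</PropertyGroup>
<Import Project="$(SolutionDir)\.nuget\NuGet.targets" Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />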
I checked, and package restore still appears to work when I have no internet access, so I'm still mystified about that. Does NuGet maintain some kind of cache of binaries outside the solution?
I have an ASP.NET MVC 2 application.
Web project contains a reference to SomeProject
SomeProject contains references to ExternalAssembly1 and ExternalAssembly2.
SomeProject explicitly calls into ExternalAssembly1, but NOT ExternalAssembly2.
ExternalAssembly1 calls into ExternalAssembly2
When I perform a local build everything is cool. All DLLs are included in the bin\debug folder. The problem is that when I use the Publish Web command in Visual Studio 2010, it deploys everything except ExternalAssembly2.
It appears to ignore assemblies that aren't directly used (remember, ExternalAssembly2 is only used by ExternalAssembly1).
Is there any way I can tell Visual Studio 2010 to include ExternalAssembly2?
I can write a dummy method that calls into ExternalAssembly2. This does work, but I really don't want to have dummy code for the sole purpose of causing VS2010 to publish the DLL.
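For the record, the kind of dummy code I mean is something like the following, where SomeTypeInExternalAssembly2 is a placeholder for any public type that assembly exposes. It is never called; it exists purely to give the compiler a hard reference so the publish step copies the DLL:
using System;

internal static class ExternalAssembly2Anchor
{
    // Never invoked at runtime; the typeof() forces a compile-time reference to ExternalAssembly2.
    internal static Type Touch()
    {
        return typeof(SomeTypeInExternalAssembly2);
    }
}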
None of these answers are sufficient in my mind. This does seem to be a genuine bug. I will update this response if I ever find a non-hack solution, or Microsoft fixes the bug.
Update:
Doesn't seem promising.
https://connect.microsoft.com/VisualStudio/feedback/details/731303/publish-web-feature-not-including-all-dlls
I am having this same problem (different assemblies though). If I reference the assemblies in my web project, then they will get included in the publish output, but they should be included anyway because they are indirect dependencies:
Web Project ---> Assembly A ---> Assembly B
On build, assemblies A and B are output to the \bin folder. On publish, only assembly A is output to the publish folder.
I have tried changing the publish settings to include all files in the web project, but then I have files in my publish output that shouldn't be deployed.
This seems like a bug to me.
I had the same problem with VS2010 and a WCF Service Application.
It turns out that if your (directly or indirectly) referenced DLLs are deployed to the GAC, the VS publishing feature excludes them. Once I removed the assemblies from the GAC, the publishing feature started working as expected.
I guess VS assumes that if your assemblies can be located in the GAC on the machine you build on, they will be in the GAC on the target machine as well. At least in my case this assumption is false.
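If it helps, checking for and removing an assembly from the GAC can be done with gacutil from a Visual Studio command prompt (the assembly name here is just an example):
rem List any GAC entries matching the assembly name
gacutil /l ExternalAssembly2
rem Uninstall it from the GAC
gacutil /u ExternalAssembly2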
My tests show that the external assemblies get published when I have a reference to them in the web project. I do not have to write any dummy code to make it work. This seems acceptable to me.
I agree with Nicholas that this seems to be a bug in Visual Studio. At least, what the reason for this behavior could be escapes me.
I have created this issue as a bug on Microsoft Connect. If anyone experiencing it could vote it up https://connect.microsoft.com/VisualStudio/feedback/details/637071/publish-web-feature-not-including-all-dlls then hopefully we'll get something done about it.
If you go into the ExternalAssembly2 reference's property list and change "Copy Local" to "True", I think that might solve your issue.
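In the project file, the Copy Local setting corresponds to the Private metadata on the reference; a sketch of what the edited reference might look like (the HintPath is a placeholder):
<Reference Include="ExternalAssembly2">
  <HintPath>..\libs\ExternalAssembly2.dll</HintPath>
  <!-- Copy Local = True -->
  <Private>True</Private>
</Reference>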
I don't know if you are still watching this, but I found the solution (I had the exact same issue) via this MSDN article. Under "Build Action" for the file, choose "Content"; that should include it in the list of files publish brings over.
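If you edit the project file by hand, that setting shows up as a Content item for the DLL, roughly like this (the path is wherever the file was added to the project):
<ItemGroup>
  <!-- The DLL added to the project as a plain file with Build Action = Content -->
  <Content Include="libs\ExternalAssembly2.dll" />
</ItemGroup>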
I have created a new Connect bug here https://connect.microsoft.com/VisualStudio/feedback/details/731303/publish-web-feature-not-including-all-dlls
I've also attached a solution and detailed steps to reproduce this issue. Let's hope this time they won't close it as Can't Reproduce.
Vote for this connect issue if you experience the missing dll problem.
Copy Local did the trick. I had an issue where the Newtonsoft.Json assembly was not getting included in the deployment package; Copy Local was set to false.
I am experiencing the same type of issue with a web project. I have a web project that references assembly A which references assembly B. It worked fine for some time but today it was broken. I did a rebuild of the solution and this time it deployed everything correctly.
I had this same problem today. I published my web project and realized that not all of the referenced DLLs were there, in particular the indirect DLL references.
It turns out that the directory I was publishing to (a network share) was out of disk space. I had just enough space to publish all the files except for a few indirect reference DLLs. The sad part is that VS08 didn't throw any errors; it just published the files as usual. I cleared out some disk space and everything worked fine.
I didn't find the disk space issue until I tried to manually move the DLLs over.
In my case it is quite tricky.
The reference to ExternalAssembly2 is not required to build the project, but it is vital at run time, since we use reflection to configure the Unity container.
So, if I delete the reference, the project builds successfully, but I get a run-time error.
If I keep the reference, I can build and run the application, but I cannot publish it with ExternalAssembly2, so I get a run-time exception as well.
This happens because of VS2010's internal assembly optimization.
So, what can we do here?
1. Add an otherwise unneeded piece of code that uses one of ExternalAssembly2's classes.
2. Move away from reflection and use static assembly references instead (see the sketch below).
Hope this helps somebody.
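A minimal sketch of option 2, assuming Unity 2.x; IReportService and ReportService are hypothetical names for an interface and implementation living in ExternalAssembly2 that were previously wired up via reflection:
using Microsoft.Practices.Unity;
using ExternalAssembly2;   // hypothetical namespace holding IReportService/ReportService

public static class ContainerSetup
{
    public static IUnityContainer Build()
    {
        var container = new UnityContainer();
        // Static registration instead of scanning assemblies via reflection:
        // the compile-time reference also forces ExternalAssembly2 to be published.
        container.RegisterType<IReportService, ReportService>();
        return container;
    }
}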
I got the same problem, and this is a VS2010 bug if there's a reference chain like:
Web Project --> custom project --> assembly1 -->(indirectly) assembly2.
For now, I find that if I reference assembly1 in the web project, then assembly2 is included in the bin folder.
So I had to add an additional reference chain like:
Web project --> assembly1 -->(indirectly) assembly2.
Then VS recognizes assembly2 and includes its DLL file in the publish action.