My aim is to have package restore working on a build server so that I don't have to check in binaries. At the moment, I'm simply trying to get it to work on my own machine using Visual Studio.
Here's what I've done so far:
Followed the instructions here: http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages, including both setting the Tools > Options flag and the environment variable (belt and braces; the variable is shown after this list)
Installed the NuGetEnablePackageRestore package, as suggested in "NuGet package restore consent without NuGet"
Checked everything in (the .nuget solution folder and its contents), but not the binaries I want to reference, because that's the whole point of the exercise
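For the environment-variable half of that, the consent flag from the linked docs can be set per user like this (a PowerShell sketch; restart Visual Studio afterwards so it picks the value up):

    # Package restore consent flag described in the NuGet docs
    [Environment]::SetEnvironmentVariable("EnableNuGetPackageRestore", "true", "User")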
Here's what I'm doing:
Check out solution
Verify that nunit.framework.dll and moq.dll are not present in the checked out solution
Build the solution
Visual Studio complains that Moq is missing. I search for the dlls in the solution directory and find that:
nunit.framework.dll is present in the appropriate bin folders
Moq.dll is nowhere to be found
But there's more. This is truly mysterious, but if I do a fresh checkout, disconnect from the internet and build, I get precisely the same results - nunit.framework.dll is there, but moq.dll is not. The build process has conjured nunit.framework.dll literally from nowhere.
So it's something of an understatement to say that I am completely baffled. Can anyone suggest answers to the following questions:
Why is package restore not downloading Moq?
Where on earth is the build process getting nunit.framework.dll, if not the internet?
In Visual Studio, under Tools > Options > Package Manager, there is a "Package Cache" section; clicking the "Browse" button takes you to the location of the NuGet cache on your machine.
Okay, I noticed in the documentation that enabling package restore was supposed to modify project files in order to add a new target. My project files did not have this change. Right-clicking the solution title in VS and selecting 'Manage NuGet packages...' then added the required changes and everything built as it should.
I checked, and package restore still appears to work when I have no internet access, so I'm still mystified about that. Does NuGet maintain some kind of cache of binaries outside the solution?
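For anyone else hitting this, the change that enabling package restore makes to each project file looks roughly like the following. This is a sketch from memory of the MSBuild-integrated restore of that era, so the exact markup may vary by NuGet version; if the import is missing, the restore step simply never runs, which matches the behaviour described above.

    <PropertyGroup>
      <RestorePackages>true</RestorePackages>
    </PropertyGroup>
    <!-- ...near the bottom of the .csproj... -->
    <Import Project="$(SolutionDir)\.nuget\NuGet.targets"
            Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />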
Related
When I clone a project it looks like it comes down fine, until I look at the references: they all have the yellow warning triangle, and my Error List shows every reference as a warning. My application is in Visual Studio 2017; it is C# MVC using .NET Framework 4.7.2. In the original application the references are perfectly fine, there are no errors, and it builds and works great. It is only once I bring down the clone that the references are lost. I've done a build on the cloned version and it still shows all the references as warnings.
I've double-clicked a reference and received an error popup of "This project cannot be viewed in the object browser because it is unavailable or not yet built. Please ensure that the project is available and built". It feels like Azure DevOps is simply losing my references and their locations. I am the only one working on this, so there should be no conflicts. I've checked in and cloned right after checking in, with the same result. My code and web.config files look just fine. I have also cloned on other people's machines, and the same problem occurs with this application. FYI, many of my other applications work fine when cloned; just this one application is having issues. My references are mixed, in that some are Copy Local = True while others are False. I've also removed a reference and added it back, and it comes back with a warning as if it were never added.
Warning message for individual reference:
The referenced component 'EntityFramework' could not be found.
I am also seeing errors for NuGet packages not being there, but when I look, I see the package folder with all its components in it.
It says that for all my Windows core references, and there are just warning symbols in my References folder for the other references.
I am expecting no errors when I bring down the clone. I uploaded from my machine, and the clone should be able to use the same reference locations. It just feels like Azure DevOps is stripping my reference links out and then saying it can't find them.
Azure-DevOps clone shows references as warnings
To resolve this issue, you should make sure of the following:
Make sure you have checked the two options Allow NuGet to download missing packages and Automatically check for missing packages during build in Visual Studio.
Make sure you do not check the \packages folder in to source control.
When you clone the project from the Azure DevOps server and get missing-reference errors, run the NuGet command Update-Package -reinstall in the Package Manager Console to force the package references to be reinstalled into the project (see the example below). Check this thread for some more info.
Note: Pay particular attention to the third point.
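For example, from Tools > NuGet Package Manager > Package Manager Console (the package ID and project name below are only placeholders):

    # Reinstall every package in every project of the solution
    Update-Package -reinstall

    # Or reinstall a single package into a single project
    Update-Package EntityFramework -reinstall -ProjectName MyWebApp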
Update:
Error: Microsoft.CodeDom.Providers.DotNetCompilerPlatform.2.0.0
According to the error message, it seems you have not added your packages to your package sources in Visual Studio.
You should publish your custom packages to a NuGet feed, or create a local NuGet feed, and then add the feed URL or local feed path to the package sources:
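A local feed can be as simple as a folder or network share of .nupkg files registered as a package source, either through Tools > Options or with a NuGet.Config entry along these lines (the key and path are placeholders):

    <configuration>
      <packageSources>
        <add key="LocalFeed" value="\\yourserver\NuGetPackages" />
      </packageSources>
    </configuration>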
Check this document for some more details.
Hope this helps.
I figured it out. I went to my original and ran Update-Package -reinstall. It came back saying Microsoft.CodeDom.Providers.DotNetCompilerPlatform 2.0.0 wasn't there, but it added a version, just not 2.0.0. I then uninstalled Microsoft.CodeDom.Providers.DotNetCompilerPlatform, removing it from my system, and checked my original in to Azure DevOps. When my clone came down, not all of the references were messed up (some still were), but a rebuild cleared that up. Thank you for the responses; they pointed me in the right direction.
I have an ASP.NET MVC 4 application. I used NuGet to update all of the NuGet packages that were installed when I created the application. One of the packages was Microsoft.Bcl.Build.
After updating these, NuGet displayed the following message at the bottom of its window:
I have since restarted Visual Studio several times, but the message still exists. When I checked the installed packages, it did appear that the updated version (1.0.8) of the package was present.
How can I fix this?
Instead of deleting all of ~/packages, see if there are any *.deleteme files in ~/packages and delete them. Then restart Visual Studio.
I believe this problem is caused by the packages being read-only or otherwise inaccessible at the file system level.
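If there are a lot of them, something like this clears them out in one go (a PowerShell sketch, run from the solution directory and assuming the default packages folder location):

    Get-ChildItem .\packages -Filter *.deleteme -Recurse | Remove-Item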
Packages under source control
Temporary work-around (untested)
Check out the entire packages folder prior to telling NuGet to restart Visual Studio to delete the packages.
Permanent work-around
I found that this could be permanently resolved by removing the packages from source control and instead using NuGet Package Restore.
Packages not under source control
Temporary Work-Around
I worked around this by deleting from the solution's packages folder all of the files that referenced the package in question. Specifically, these were:
Folder: Microsoft.Bcl.Build.1.0.7
File: Microsoft.Bcl.Build.1.0.7.deleteme
In my case, the relevant package folders remained in ~\packages, although they were empty. I deleted the folders and restarted Visual Studio, and this warning went away.
I just deleted the folder of each package that had an error from the packages folder in my solution directory, also deleted the .deleteme files, and everything works fine!
1) Delete the entire ~\packages folder.
2) Restart VS.
3) Go to Manage NuGet Packages and restore (or restore from the command line, as shown below).
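If you prefer the command line, step 3 can also be done with nuget.exe 2.7 or later; the solution file name here is a placeholder:

    nuget restore MySolution.sln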
I'll agree that this can happen when your packages folder is under source control. If you like to have it there, instead of removing the bindings you can check it all out, remove the package with the NuGet Package Manager, and then check it in afterwards.
I found my answer in this thread, but by combining a couple of the different answers above, so I thought I would share what I found.
I had the exact same issue with "Microsoft.Bcl.Build" as the original poster. I had been trying to update references for other functionality using NuGet and had issues with some of the updates (compatibility then rollbacks). After this NuGet failure, I started getting this error.
I initially used the selected answer and Jedidja's answer and was able to get this to work, but it only partially solved my problem. It did fix the VS restart error, but it caused a downstream issue with TFS as I could no longer check in the project as it was expecting that "*.deleteme" file. This got me thinking, so I did some testing. When I restored the file from recycle bin, I started getting the restart error again.
Here is where I deviated from the posted answers and got my full resolution to my version of the problem.
When I checked in to TFS this time, everything was checked in (after I had updated all the projects using NuGet while the "*.deleteme" file was deleted). Once everything was in, I noticed the file was still pending check-in, so I checked the solution in again and TFS accepted the file, but as a deletion; I assume it was checked in the first time and then VS auto-deleted it, which required the second check-in. In any case, after that last pending-change check-in the file was gone and VS no longer complained about needing to be restarted. I can't say for sure because the problem is gone, but I get the feeling that if I had checked the code in before deleting the file in the first place, it might have solved the problem without any manual file manipulation.
Hi, everybody.
I resolved this problem this way:
If you have source control, run VS as administrator (it is important).
In the solution's packages folder, delete everything for the problem packages.
For example, I deleted all the Entity Framework version folders.
Restart VS.
Open the solution, right-click the solution, and choose Manage NuGet Packages for Solution.
You will see a Restore button :) Click Restore.
That is all.
If you are using Entity Framework 6, then you can install the NuGet package "EntityFramework.SqlServerCompact".
This enabled me to use the standard ASP.NET Identity tooling that comes with the project templates for 2013 and MVC5.
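For example, from the Package Manager Console:

    Install-Package EntityFramework.SqlServerCompact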
I want to create a NuGet package from a machine that is on the office intranet, but blocks all connections to the internet.
Both NuGetPackageExplorer.application and NuGet.exe will show the exception that "No connection could be made because the target machine actively refused it".
Installing packages works fine as we have a local network folder with the .nupkg packages we use.
Is there a tool I can use to create a NuGet package on that machine?
Update:
I created an issue on CodePlex for this: https://nuget.codeplex.com/workitem/3196
What I ended up doing is downloading the source code from CodePlex, going into the CommandLine project, deleting UpdateCommand.cs, and rebuilding the project. I then grabbed the exe, renamed it NuGetOffline.exe, and put it, along with NuGet.Core.dll, somewhere on the Path.
Update
The download page for NuGet does not have the current version of NuGet.exe. As of writing this, none of the three downloads on the page work offline, and the Other Downloads section has several versions of NuGet.Tools, but not the current version of NuGet.exe. Go here instead for nuget.exe, and use that instead of the custom build described above.
The Package Explorer link on the download page is just the ClickOnce installer which does work offline. You need to find the local executable here.
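With a recent nuget.exe, packing itself is a purely local operation, so something along these lines should work with no Internet access at all (the file names are placeholders):

    nuget pack MyLibrary.nuspec -OutputDirectory .\dist
    nuget pack MyProject.csproj -Properties Configuration=Release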
I haven't been able to get "Enable NuGet Package Restore" to work on the intranet. This closed work item describes the problem. The last comment says that "2.0 should no longer run into this issue", but I am using NuGet Package Manager 2.2.400116.9051.
Is there a way to load a package from an alternative server when Visual Studio Package Manager (NuGet) is responding with a "The remote server returned an error: (503) Server Unavailable" message?
This is an obscure condition that will likely only occur in an "enterprisy" network environment. If these conditions apply to you:
you are required to access the Internet via an HTTP proxy server
the HTTP proxy server requires a valid user ID & password (or AD authentication) to allow requests to proceed
you've been messing with cool developer tools that were ported to Windows from a Linux/Unix environment
the new cool tool(s) work after adding the HTTP_PROXY (or possibly HTTPS_PROXY or both) environment variable(s)
you can access the NuGet servers from a browser without getting a 503 error
Then it's likely you broke NuGet by inadvertently invoking this configuration feature. I'm not sure exactly how the environment variable breaks NuGet but I suspect NuGet is detecting & using the http_proxy URL but sending an empty user ID & password which causes the HTTP proxy to reject the request.
Fix: remove the environment variable(s) you added and see if the cool tool can be configured to use an HTTP proxy without them.
Update: Ran into a version of this issue with the NuGet config file referenced in the "this configuration feature" link above. Open this file:
%appdata%\nuget\nuget.config
in your favorite editor. If it contains elements with http_proxy or https_proxy then removing these elements may fix the issue too.
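For reference, the offending entries look something like this (the values here are placeholders):

    <configuration>
      <config>
        <add key="http_proxy" value="http://proxy.example.com:8080" />
        <add key="http_proxy.user" value="DOMAIN\username" />
        <add key="http_proxy.password" value="..." />
      </config>
    </configuration>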
PS: Hopefully I'll get an up vote from Colonel Panic :-)
If you have used the package in the past it is probably in your cache. You can add the local cache as an available package source by going into the Library Package Manager Settings under the Tools menu in Visual Studio. For Visual Studio 2012, choose Tools, Library Package Manager, Package Manager Settings, and then click on Package Sources.
In the Available package sources section, type a name like "Cache" and then in for the source, browse to %LocalAppData%\NuGet\Cache. You may need to use Windows Explorer to translate %LocalAppData%\NuGet\Cache into the full path (usually C:\Users\YourAccountName\AppData\Local\NuGet\Cache).
Once you have the Cache as an available source, you can now use the Package Manager Console (found under the View menu under Other Windows or also under the Tools menu under Library Package Manager).
From the Console (which is a PowerShell window with commandlets for NuGet) you can type "get-help NuGet" to see available commands.
Then, using Get-Package, you can get a list of package IDs. Make sure the "Package source" is set to "Cache" (or whatever you called it) and the Default project is set to the project you need to manipulate; both of these are dropdowns located at the top of the Package Manager Console. You can also use Get-Project to verify you are working against the correct project in your solution.
Finally, you can type Install-Package and when prompted enter the Package ID from the output of the Get-Package commandlet.
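A typical session with the NuGet client of that era looks something like this (the package ID is just an example):

    # List what is available from the "Cache" source
    Get-Package -ListAvailable -Source Cache

    # Install one of the cached packages into the default project
    Install-Package Newtonsoft.Json -Source Cache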
I also had this problem; it was because of my network.
If there is any blocking on your Internet connection (like a company network, etc.),
you may not be allowed to download the NuGet package.
Try to download the package on another network; maybe that will help you!
Talbott's answer did not work for me, as my cache was empty. However, if you have used the package in another solution, you can copy the items you want from the "packages" folder in the other solution to a packages folder in your target solution.
If you have no packages installed in the target solution, you may need to add the following to a repositories.xml file in the packages folder:
<?xml version="1.0" encoding="utf-8"?>
<repositories>
</repositories>
After doing that, the packages appeared to be installed in my solution and I was able to add them to projects.
Additional Note: I had to use the "Manage NuGet Packages for Solution" option at the solution level to add the package to individual projects. Using Install-Package from the console still returns a 503 even though the package is already installed in the solution.
You can also get this error if you are using a VPN client (e.g. Cisco AnyConnect) and you have recently renewed your VPN certificate. The issue can occur after you have updated your certificate, but before you have rebooted. A reboot resolves the issue.
It is a pretty old question, but I have just encountered the same problem. In my case it occurred because I had more than one nuget package source configured in the Visual Studio Package Manager. In my company we use NuGet to get mainstream packages and MyGet for our own stuff.
When I attempted to pull a pretty big package, it failed with a 503 code, and the error link looked pretty odd: it had MyGet in it instead of NuGet. It turns out the Visual Studio package manager tried to pull it from another source despite NuGet being chosen as the current source. Disabling the other sources and then proceeding with the download fixed it.
Hopefully it will help somebody who stumbled upon this thread just like I did.
Another possible reason for receiving a 503: if you're using an Azure DevOps feed, NuGet packages are limited to 500 MB.
I have an ASP.NET MVC 2 application.
Web project contains a reference to SomeProject
SomeProject contains references to ExternalAssembly1 and ExternalAssembly2.
SomeProject explicitly calls into ExternalAssembly1, but NOT ExternalAssembly2.
ExternalAssembly1 calls into ExternalAssembly2
When I perform a local build everything is cool. All DLLs are included in the bin\debug folder. The problem is that when I use the Publish Web command in Visual Studio 2010, it deploys everything except ExternalAssembly2.
It appears to ignore assemblies that aren't directly used (remember, ExternalAssembly2 is only used by ExternalAssembly1).
Is there any way I can tell Visual Studio 2010 to include ExternalAssembly2?
I can write a dummy method that calls into ExternalAssembly2. This does work, but I really don't want to have dummy code for the sole purpose of causing VS2010 to publish the DLL.
None of these answers are sufficient in my mind. This does seem to be a genuine bug. I will update this response if I ever find a non-hack solution, or Microsoft fixes the bug.
Update:
Doesn't seem promising.
https://connect.microsoft.com/VisualStudio/feedback/details/731303/publish-web-feature-not-including-all-dlls
I am having this same problem (different assemblies though). If I reference the assemblies in my web project, then they will get included in the publish output, but they should be included anyway because they are indirect dependencies:
Web Project ---> Assembly A ---> Assembly B
On build, assemblies A and B are output to the \bin folder. On publish, only assembly A is output to the publish folder.
I have tried changing the publish settings to include all files in the web project, but then I have files in my publish output that shouldn't be deployed.
This seems like a bug to me.
I had the same problem with VS2010 and a WCF Service Application.
It turns out that if your (directly or indirectly) referenced DLLs are deployed to the GAC, the VS publishing feature excludes them. Once I removed the assemblies from the GAC, the publishing feature started working as expected.
I guess VS is assuming that if your assemblies can be located in the GAC on the build machine, they will be located in the GAC on the target machine as well. At least in my case this assumption is false.
My tests show that the external assemblies get published when I have a reference to them in the web project. I do not have to write any dummy code to make it work. This seems acceptable to me.
I agree with Nicholas that this seems to be a bug in visual studio. At least it escapes me what the reason for the behavior could be.
I have created this issue as a bug on Microsoft Connect. If anyone experiencing it could vote it up https://connect.microsoft.com/VisualStudio/feedback/details/637071/publish-web-feature-not-including-all-dlls then hopefully we'll get something done about it.
If you go into the ExternalAssembly2 reference's property list and change "Copy Local" to "True", I think that might solve your issue.
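In the project file, Copy Local corresponds to the Private metadata on the reference, roughly like this (the hint path is a placeholder):

    <Reference Include="ExternalAssembly2">
      <HintPath>..\libs\ExternalAssembly2.dll</HintPath>
      <Private>True</Private>
    </Reference>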
I don't know if you are still watching this, but I found the solution (I had the exact same issue) via this MSDN article. Under "Build Action" for the file, choose "Content"; that should include it in the list of files that publish brings over.
I have created a new Connect bug here https://connect.microsoft.com/VisualStudio/feedback/details/731303/publish-web-feature-not-including-all-dlls
I've also attached a solution and detailed steps to reproduce this issue. Let's hope this time they won't close it as Can't Reproduce.
Vote for this connect issue if you experience the missing dll problem.
Copy Local did the trick. I had an issue where the Newtonsoft.Json assembly was not getting included in the deployment package; Copy Local was set to false.
I am experiencing the same type of issue with a web project. I have a web project that references assembly A which references assembly B. It worked fine for some time but today it was broken. I did a rebuild of the solution and this time it deployed everything correctly.
I had this same problem today. I published my web project and realized that not all of the reference DLL's were there. In particular, the indirect DLL references.
It turns out that the directory I was publishing to was out of disk space (it was a network share). I had just enough space to publish all the files except for a few of the indirect-reference DLLs. The sad part is that VS08 didn't throw any errors; it just published the files as usual. I cleared out some disk space and everything worked fine.
I didn't find the disk-space issue until I tried to manually move the DLLs over.
In my case it was quite tricky.
The reference to ExternalAssembly2 is not required to build the project, but it is vital at run time, since we use reflection to configure the Unity container.
So if I delete the reference, the project builds successfully, but I get a run-time error.
If I keep the reference, I can build and run the application, but I cannot publish it with ExternalAssembly2 - I get a run-time exception there as well.
This happens because of VS2010's internal assembly optimization.
So, what can we do here?
1. Add some otherwise-unneeded piece of code that uses one of ExternalAssembly2's classes.
2. Move away from reflection and use static assembly references.
Hope this helps somebody.
I got the same problem, and this is a VS2010 bug when there's a reference chain like:
Web Project --> custom project --> assembly1 --> (indirectly) assembly2.
For now, I find that if I reference assembly1 in the web project, then assembly2 is included in the bin folder.
So I had to add an additional reference chain like:
Web project --> assembly1 --> (indirectly) assembly2.
Then VS recognizes assembly2 and includes its DLL in the publish output.