NuGet - store packages in source control, or not?

We currently don't use NuGet for our dependencies, preferring to go the old-skool way and stick them all in a libs folder and reference them from there. I know. So 1990s.
Anyway, NuGet has always made me feel a bit queasy... you know, reliance on the cloud and all that. As such, I find myself in the main agreeing with Mark Seemann (see here: http://blog.ploeh.dk/2014/01/29/nuget-package-restore-considered-harmful/), who says:
Personally, I always disable the feature and instead check in all packages in my repositories. This never gives me any problems.
Trouble is, this has changed in version 3: you can't store packages alongside the solution any more, as outlined here: https://oren.codes/2016/02/08/project-json-all-the-things/. Which sorta screws up checking them into source control.
So, am I worrying about nothing here? Should I drink from the NuGet well, or side with Mr Seemann and err on the side of caution?

Storing NuGet packages in source control is a really, really bad idea.
I accidentally did it once and ended up bloating my repository considerably, and that was before .NET Core...
Drink deep from the NuGet well. Most software components are packaged in a similar way these days (npm, Bower, etc.). The referenced blog post is two years old, and package management is changing rapidly in the .NET world, so here's some of my recent experience.
NuGet packages can't be deleted from nuget.org. They can be hidden, but if your application requests a hidden package it will download it as normal. It'll never disappear into the void.
'Enable Package Restore' is no longer glitchy because it's now a default option in NuGet 2.7+. You have no choice anymore.
Packages are no longer stored per solution but per machine, which will save a ton of bandwidth and will decrease the initial fetch period when building.
If you build a new project using .NET Core, you will have dozens more packages, as the entire BCL is available as NuGet packages. Do you really want to check all the System.* packages into source control?
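If you do keep packages out of source control, the usual companion step is to ignore the restore output; for Git that's typically a one-line entry (the folder name assumes the classic per-solution layout):

    packages/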

There is a very simple reason why you might want to store NuGet packages in source control: your organization doesn't want your build server to have internet access.
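In that setup, restore can be pointed at a folder inside the repository itself. A minimal NuGet.Config sketch along those lines (the source name and relative path are illustrative, not a prescribed layout):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <!-- No internet on the build server: resolve everything from a
             folder that is checked in alongside the solution -->
        <clear />
        <add key="CheckedInPackages" value=".\packages-mirror" />
      </packageSources>
    </configuration>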

How do I uninstall a NuGet package without deleting its folder?

I'm storing all my packages in D:\Dev\Packages, using the repositoryPath attribute value as documented here. However, when I uninstall a package from a VS 2015 project, the package folder is deleted as well. I need to retain the folder, as other projects in other solutions are using it.
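For context, that setting lives in a NuGet.Config next to the solution and looks roughly like this (the path is the one from my setup; treat the exact element placement as a sketch):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <config>
        <!-- Expand all packages into one shared folder instead of a
             per-solution packages directory -->
        <add key="repositoryPath" value="D:\Dev\Packages" />
      </config>
    </configuration>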
This behavior has changed since VS 2013. As far as I can recall, package folders weren't deleted during uninstall. If they were, I surely would have noticed it before now.
So: how can I make sure that the package folder isn't deleted during a package uninstall?
OK, it seems I'm discussing a non-issue.
Per a very helpful email thread with Yishai on the NuGet team, I've realized that the likely reason I'm only noticing this now in VS 2015 is simply that this time I'm watching the Uninstall-Package output, where before I wasn't.
The bottom line is that even when a package is deleted during uninstall, Package Restore pulls it out of cache the next time a solution needs it.
You may be interested in a blog Yishai pointed out to me:
https://oren.codes/2016/02/08/project-json-all-the-things/
I was briefly considering going down that path until all of the dots connected on this. But it's a great reference nonetheless—definitely one to keep handy.
All that said, I've posted over at GitHub/NuGet/Issues in case you'd like to follow it.

Autofac AggregateService exists in NuGet?

I found Autofac AggregateService awesome but what is the right way to include it in my project: clone it from code.google.com or use NuGet?
I'm used to using NuGet, but I can't find anything about AggregateService there. Any help?
It seems that AggregateService and the other Extras are currently "in limbo". There's been a recent change: the contributions are now being made part of the same solution as Autofac core, whereas they were previously a separate solution. From the current build file you can see that the Extras will be made available as a separate Autofac.Extras package and a separate download from the Autofac page.
Meanwhile, you can use AutofacContrib 2.6.1 or grab the source and compile a DLL yourself.
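If it's the NuGet route you take, the Package Manager Console command would be along these lines (version as published at the time of writing):

    Install-Package AutofacContrib -Version 2.6.1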
Btw, thanks for finding AggregateService awesome ;)
Update: actually, reading the build file properly (and looking at the current source structure), the Extras parts will be distributed as individual packages. So expect to find Autofac.Extras.AggregateService on NuGet in the future.

Is using GACUtil in your coding/svn/development workflow considered Bad Practice?

There's plenty of information/blogs/MSDN articles around on NOT using GACUtil in your deployment/release scenarios, and on MSI or another Windows installer technology being a far better option.
However, is it still appropriate to use GACUtil in your development workflow?
We have a number of DLLs that are strong-named and referenced from the GAC. In order to keep the development team in sync, once a new version of a GAC-able DLL is generated, it's automatically added to all the other developers' GACs as part of their daily trunk checkout. The workflow goes something like this (a PowerShell sketch of the daily step follows the list):
A developer makes a change to one of our GAC-able assemblies, tests it locally, and once it's signed off, compiles a release version of the DLL
Release version is copied from \Project_DIR\bin\Release\*.dll -> \COMPANY_GAC\Current\*.dll
Other devs run our Source Control check out batch scripts which:
Check out newest versions of COMPANY_GAC\Current*.dll
Run GacUtil.exe on each DLL
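That daily step, sketched in PowerShell (folder layout as in the list above; assumes gacutil.exe is on the PATH):

    # Pull the latest strong-named DLLs and install each into the GAC
    svn update COMPANY_GAC
    Get-ChildItem "COMPANY_GAC\Current\*.dll" | ForEach-Object {
        & gacutil.exe /i $_.FullName
    }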
This has worked for us up until now, but it's getting a little more complex with:
- Larger Team, more stringent management of GAC Changes.
- CLR2.0 & CLR4.0 compiled Company_Gac assemblies requiring different versions of GACUtil.exe
- Managing assemblies on Build/Integration Servers which have multiple feature branches (and hence having to hot-swap different GAC Dlls)
Should we be looking at something more robust than GACUtil and scripts to manage this?
One consideration was to roll something ourselves in PowerShell to check each assembly's target runtime and add the assemblies to the correct GAC. Has anyone done this?
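To make that idea concrete, here is a rough sketch of what we had in mind (the SDK paths are examples from a typical Windows SDK install, so treat them as assumptions):

    # Pick the right gacutil based on each DLL's target runtime
    $gacutil2 = "C:\Program Files\Microsoft SDKs\Windows\v7.0A\Bin\gacutil.exe"
    $gacutil4 = "C:\Program Files\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools\gacutil.exe"

    Get-ChildItem "COMPANY_GAC\Current\*.dll" | ForEach-Object {
        # ReflectionOnlyLoadFrom inspects the DLL without executing it
        $asm = [System.Reflection.Assembly]::ReflectionOnlyLoadFrom($_.FullName)
        $tool = if ($asm.ImageRuntimeVersion -like "v4*") { $gacutil4 } else { $gacutil2 }
        & $tool /i $_.FullName
    }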
Any other suggestions on how developers manage their GAC workflow would be welcome.
Not using gacutil.exe during deployment is an easy one: it isn't available on the target machine, since it's a Windows SDK utility and not a redistributable component.
Using it during development certainly isn't popular. Most typically you'd use a solution that includes the dependent projects, so that you automatically get the latest build with local deployment and no need for the GAC. That goes well up to a point; build times can start to require the distribution of swords when the solution gets too massive.
No magic solutions past that point; the GAC certainly helps to get build times down again. In general, churn in the foundation assemblies should start with minus 1000 points, as it can cause a lot of pain. Save those changes up for only, say, weekly release updates. Offhand, there's also the core need to get all this stuff properly installed on the client machines. If nobody has focused on that yet, maybe now is a good time to get that solid; the install process automatically gets debugged when everybody uses it to get the assemblies they need on their machine.

Using NuGet for Internal & External Dependencies in TFS

I'm currently looking at NuGet to solve my dependency problems in TFS, and what I want to do is host my own NuGet server that would take care of internal dependencies. I also want to use NuGet to handle my 3rd-party dependencies. I'm trying to set up automated builds for our company, and this is one roadblock I'm trying to overcome with NuGet.
So my question is how do I handle this scenario in which I have to retrieve my dependencies from different servers?
Is there a better way to handle internal dependencies? How is everyone else doing this?
Also, just as a note, I intend to use NuGet without committing packages to TFS. I planned on using the method outlined in this article:
http://blog.davidebbo.com/2011/08/easy-way-to-set-up-nuget-to-restore.html
Glad you're looking into the no-commit scenario for NuGet packages on TFS. You can take a look at my blog post on this topic, where I explain the concept.
EDIT (2012/06/13): NuGetPowerTools is replaced by NuGet's built-in package restore functionality. However, same concept of changing the PackageSources element in nuget.targets still applies.
You definitely should take a look at David Fowler's NuGetPowerTools.
After installing this package, you can run Enable-PackageRestore (a newly installed command in the Package Manager Console). Enabling package restore will add MSBuild targets to your project files; these targets trigger nuget.exe in a pre-build step and fetch any packages required by your project.
No need to check NuGet packages into source control; all you need is packages.config and these MSBuild targets.
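Concretely, enabling package restore adds something like the following to each project file (element names as I recall them from the NuGet 1.x/2.x tooling, so treat this as a sketch):

    <!-- Added to each .csproj by Enable-PackageRestore -->
    <PropertyGroup>
      <RestorePackages>true</RestorePackages>
    </PropertyGroup>
    <Import Project="$(SolutionDir)\.nuget\NuGet.targets" />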
To configure multiple, different package sources, you need to set some settings to be used by these MSBuild targets. One of them is PackageSources. You can set it by editing the NuGet.targets file, which you'll find in the .nuget folder once you've enabled package restore.
Regarding those package sources, you could set up different internal NuGet galleries or simply use different network shares; that's a matter of requirements and preference, so you can choose. All you need to do is tell your MSBuild targets to use these package sources, as in the sketch below. The order in which you define them is also the order in which packages are looked up.
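For illustration, the relevant piece of NuGet.targets looks something like this; the internal gallery URL and share below are placeholders, not real endpoints:

    <!-- In .nuget\NuGet.targets: sources consulted by restore, in order -->
    <ItemGroup Condition=" '$(PackageSources)' == '' ">
      <PackageSource Include="http://internal-nuget.example.com/api/v2/" />
      <PackageSource Include="\\fileshare\nuget-packages" />
      <PackageSource Include="https://nuget.org/api/v2/" />
    </ItemGroup>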
Good luck!
Xavier
A little update on the accepted answer and question:
When using TFS as a build machine without Visual Studio installed on it, you can do the following so that the build machine automatically uses your custom package sources (more than one in the same solution) without any further package-source configuration in your solution.
Create a machine default config by placing a NuGet.Config in the root (C:\NuGet.Config), using the sample from: http://docs.nuget.org/docs/reference/nuget-config-file
Comment out the line with: <add key="repositorypath" value="$\External\Packages" />
Otherwise your packages get expanded into C:\$\External\Packages\. When the line is commented out, the config gets chained and the right directory is used.
Configure your needed package source(s); a sketch of the resulting file follows below.
For more info about other options (e.g. user-specific ones), see: http://docs.nuget.org/docs/reference/nuget-config-file (bottom of the page).
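Putting the pieces together, a machine-level C:\NuGet.Config might look roughly like this (feed names and URLs are placeholders for your own gallery or share):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <add key="InternalGallery" value="http://internal-nuget.example.com/api/v2/" />
        <add key="nuget.org" value="https://nuget.org/api/v2/" />
      </packageSources>
      <!-- No repositorypath key here: the chained config then decides
           where packages get expanded. -->
    </configuration>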

Fetching DLLs from NuGet if Deleted

I've done a fair bit of reading on NuGet, and I can't seem to find what I want. Essentially, I'm hoping that it will work like Apache Ivy, where you can just check in your config file (without any binaries) and tell NuGet to fetch all the DLLs -- thus saving you from versioning tons of DLLs.
Hence: is there a command in NuGet to fetch and configure all dependencies mentioned in packages.config?
Again, the case for this is that I only checked packages.config into source control, not the actual DLLs, and I need to re-fetch everything. (Preferably without fetching packages one by one by name).
This has been covered recently in blog posts:
Inbuilt functionality for this is coming in a future version of NuGet: http://feeds.haacked.com/~r/haacked/~3/x8g_kFzD4eA/feedback-request-for-using-nuget-without-committing-packages.aspx
(Linked from above) How to do this today using command line NuGet.exe (available from the NuGet pages on CodePlex): http://blog.davidebbo.com/2011/03/using-nuget-without-committing-packages.html
EDIT: Now also covered on NuGet's Documentation Pages
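For reference, the command-line approach from those posts boils down to running nuget.exe against your packages.config; the paths here are illustrative:

    nuget.exe install MyProject\packages.config -OutputDirectory packages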