We use NuGet for our internal development to allow us to share code across teams. We run into issues, however, when one person is working on code that will be deployed across multiple NuGet packages at the same time. For instance:
A depends on B which depends on C.
A, B and C have their artifacts pushed to Nuget and that's how we manage the dependencies between A, B and C. The trouble we find is that if a developer wants to make changes in C and quickly see those changes reflected in A, they have to go through the following process.
Make change in C.
Push change up to git
CI picks up change to C and builds and deploys new nuget package.
Go into B and update reference to C using a nuget update package command.
Push the change to the packages.config file up to git
CI picks up change to B and builds and deploys new nuget package for B
Now open A, change the reference to B and run nuget update package
Make changes in A to go along with the changes in B(and transitively C)
This seems extremely painful and is causing some of our developers to question the choice of NuGet for our internally developed code. Everyone still likes it for consuming external packages.
Is there any better workflow for using Nuget internally?
In our company we have solved the cascading-updates problem as follows. First, this is how our NuGet repositories and build server are set up.
There is an internal NuGet repository that holds all the published packages for the company. This repository is just a shared directory on one of our servers.
Each developer can have (but doesn't need to have) one or more directories on their own machine that serve as local NuGet package repositories. By using a user-specific NuGet configuration the developer can control the order in which NuGet searches through the package repositories to find packages. For example:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageRestore>
    <add key="enabled" value="True" />
  </packageRestore>
  <packageSources>
    <add key="Dev" value="D:\dev\testpackages" />
    <add key="Company" value="<UNC_ADDRESS_COMPANY_REPOSITORY>" />
    <add key="NuGet official package source" value="https://nuget.org/api/v2/" />
  </packageSources>
  <disabledPackageSources />
  <activePackageSource>
    <add key="All" value="(Aggregate source)" />
  </activePackageSource>
</configuration>
All solutions have automatic package restore turned on, so that we don't have to commit the packages to our version control system.
Developers only control 3 out of the 4 version numbers, e.g. if the version is <MAJOR>.<MINOR>.<BUILD>.<REVISION> then developers can only change the major, minor and build numbers, the revision number is set to 0 except in builds done by the build server where it is the build number of the build. This is important because it means that for a given version consisting of a major, minor and build number the build server will always produce the higher version number. This again means that NuGet will prefer to take the package version coming from the company package repository (which only gets packages through the build server).
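For example, if the packages are created with nuget pack (illustrative version numbers and nuspec name):

:: developer machine - revision is always 0
nuget pack ProjectA.nuspec -Version 2.5.11.0

:: build server - revision is the build counter, so it always outranks locally built packages
nuget pack ProjectA.nuspec -Version 2.5.11.473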
In order to make a change to one of the base libraries there are two possible processes being used. The first process is:
Make the changes to the base library (A). Update the version of (A) if needed.
Run the MsBuild script to build the binaries and create the NuGet packages of (A)
Copy the new NuGet packages over to the package repository on the local machine
In the dependent project (B) upgrade to the new packages of (A) that were just placed in the local machine package repository (which should be of a higher version than the ones available on the company-wide repository, or NuGet.org); see the sketch after these steps
Make the changes to (B).
If more changes are required to (A) then repeat steps 1,2 and 3 and then delete the package of (A) from the working directory of (B). Next time the build runs NuGet will go looking for the specific version of (A), find it in the local machine repository and pull it back in. Note that the NuGet cache may thwart this process some of the time, although it looks like NuGet may not cache packages that come from the same machine(?).
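As a concrete sketch of this local loop (paths and version numbers are illustrative; D:\dev\testpackages is the local repository from the configuration above):

:: pack (A) locally and drop it into the local machine repository
nuget pack ProjectA\ProjectA.nuspec -Version 1.2.3.0
copy ProjectA.1.2.3.0.nupkg D:\dev\testpackages\

:: in (B), pull the new package version in from the local repository
nuget update ProjectB\packages.config -Id ProjectA -Source D:\dev\testpackages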
Once the changes are complete, then we:
Commit the changes to (A). The build server will run the integration build to verify everything works.
Tell the build server to run the release build, which builds the binaries and pushes the NuGet packages to the company-wide NuGet repository.
In (B), upgrade to the latest version of (A), which should have a higher version number than the test package: the test package has version a.b.c.0 while the newly built version in the company-wide repository is a.b.c.d with d > 0.
Commit the changes to (B). Wait for the build server to finish the integration tests
Tell the build server to run the release build for (B).
Another way of doing the development work is by taking the following steps
Make the changes to the base library (A). Update the version of (A) if required.
Build the binaries
Copy the binaries over to the location where NuGet unpacks the package of (A) for project (B) (e.g. c:\mysource\projectB\packages\ProjectA.1.2.3.4); see the example after these steps
Make the required changes to project (B)
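For example (hypothetical paths, package version and target framework):

:: overwrite the unpacked package content of (A) inside (B)'s packages folder
copy /Y ProjectA\bin\Release\ProjectA.dll c:\mysource\projectB\packages\ProjectA.1.2.3.4\lib\net45\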
The commit process is still the same, project (A) needs to be committed first, and in project (B) the NuGet reference to (A) needs to be upgraded.
The first approach is slightly neater because it also warns if there are faults in the NuGet package of (A) (e.g. a new assembly that was forgotten), while in the second process the developer won't know until after the package for (A) has been published.
You have two choices here:
Run an instance of NuGet Gallery within your organisation. This is the code which runs nuget.org
Get a license for Artifactory Pro, which has in-built Nuget support and acts as a Nuget repository.
I have used both, and #1 is a reasonable choice to start with, but NuGet Gallery is optimised and designed for nuget.org, not on-premise/enterprise use, so things like deleting packages are a pain (hand-rolled SQL required).
I'd say that you should pay the (low) license fee for Artifactory Pro - it's an excellent product, and the JFrog team are really keen and switched on.
You should not be using nuget.org for internal/enterprise packages; nuget.org is designed for 3rd party/open source libraries, not internal build dependencies.
EDIT: in terms of workflow, why are you putting shared code into multiple packages? If the code needs to be shared, it needs to go in its own separate package.
EDIT 2: To speed up the code change workflow for the developer, you can use nuget.exe (the command-line client) and use command-line accessible builds, so you can target a "developer" build run. Then in your "developer" build (as opposed to the CI build) you specify -Source as a local path (e.g. nuget install B -Source C:\Code\B) when you want to pull the newly-updated B as a dependency and build against that; likewise for C or other local, newly-updated packages. Then when A, B, and C all build fine, you can git push all of them (in reverse dependency order), and let CI do its thing.
However, you also should question whether your package separation is really appropriate if you have to do this build 'dance' often, as this suggests that all the code should be in a single package, or possibly split along different lines in separate packages. A key feature of a well-defined package is that it should not cause ripple effects on other packages, certainly not if you are using Semantic Versioning effectively.
Edit 3 Some clarifications requested by marcelo-oliveira: "command-line accessible builds" are builds which can take place entirely from the command-line, without using Visual Studio, usually via batch files. A "developer build" is a build which a developer runs from her workstation, as opposed to the CI build which runs on the CI server (both builds should essentially be the same).
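As a rough illustration of such a developer build (a sketch; paths, project names and the script itself are hypothetical):

:: developer-build.cmd - pull locally built packages instead of the CI feed, then build A
nuget install B -Source C:\Code\B -OutputDirectory packages
nuget install C -Source C:\Code\C -OutputDirectory packages
msbuild A.sln /p:Configuration=Release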
If A, B and C are under the same solution, you can create NuGet packages for them in a single build.
Just make sure that the using package has the new version number (assuming your build doesn't randomly change it) of the package it depends on.
If A, B and C are intentionally under different solutions, e.g. A is under an infrastructure solution and B is under a product solution, then the only suggestion I have for you is to have your CI builds run on check-in rather than periodically.
Edit:
Another option is to create and push pre-release packages (e.g. version 1.0.0-alpha) to your NuGet server during local builds; this way developers can experiment with the package prior to creating a release version.
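For instance, a local build could pack and push a pre-release version (a sketch; the nuspec name, feed URL and API key are placeholders):

nuget pack ProjectA.nuspec -Version 1.0.0-alpha
nuget push ProjectA.1.0.0-alpha.nupkg -Source <your internal feed URL> -ApiKey <key>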
NuGet was designed for sharing third party libraries. We plan to use NuGet internally for all common code that can be shared between projects: things like a common data layer, string and other regular expression functions, email components and other artifacts like that.
The only way I see NuGet being of any help is when the components, code or artifacts you are sharing across teams/projects are stable and have their own release cycle that is different from your project's release cycle. For your needs NuGet is overkill. You might have better productivity linking all such projects within the same solution. Your solution structure should include the three projects A, B and C as dependent projects referenced internally wherever required.
Related
If I have one repo that holds libraries (that are published to Nuget) and a separate repo that holds the application code (that consumes the Nuget packages), is there an easy way to test changes to the library code within the application repo without publishing to the official Nuget feed?
Your build script could be something like this
Step 1: Build your package and copy the .nupkg files to %buildroot%\newPackages\
Step 2: Create a nuget.config file in your application code's root that adds %buildroot%\newPackages\ as a packageSource (see the sketch after these steps). If your application code is a functional test, then you can probably check in the nuget.config, so it doesn't need to be recreated by the build machine every build.
Step 3: Have a shell script or small program that updates your application code's references to the newly built package, to match the version that was just built.
Step 4: Build/test your application code.
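A minimal sketch of the nuget.config from step 2 (the local path stands in for %buildroot%\newPackages):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="LocalBuildOutput" value="C:\buildroot\newPackages" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>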
I have a situation where we have multiple C# projects that use similar set of Nuget packages (ex. Newton Json, Microsoft Compilers and CodeDom, Owin, log4net, jQuery, EntityFramework etc.)
I am trying to see how we can have a shared location for all NuGet packages to reduce the footprint of those binaries in Git by centralizing them in one place.
One option I found is to use Nuget.config in each project with repositoryPath set to point at the shared location. This works great for adding/upgrading/restoring Nuget packages in the project but it is not very clean when a package gets removed from one project but is still required in a different one. Basically the package will get removed from the shared location and the change is committed to Git, then when the other project requires it, it would get restored and added back to Git. Not a perfect solution in my mind.
I have a two part question:
1. Is there a way to improve the above workflow when packages get removed?
2. What is the industry standard for handling third party libraries delivered via Nuget? Or if there is none, can you share your experience handling Nuget packages across multiple projects.
If the concern lies with the footprint/organization of the Git repository, maybe you can add a .gitignore entry for the dependency folders to prevent Git from committing them into the repositories. When you need to build the projects from source, just do a dotnet restore (or nuget restore) to get the dependencies from the source you configured in the Nuget.config.
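A minimal sketch of the .gitignore entry, assuming the default "packages" folder name for restored packages:

# keep restored NuGet packages out of Git
packages/

The build then runs dotnet restore (or nuget restore MySolution.sln, where the solution name is a placeholder) to pull the packages back from the configured sources.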
Not sure if it is the industry standard, but we host our own Nuget server to control the libraries that the different teams can use. Microsoft has an article on Hosting your own NuGet feeds
Hope it helps.
I have a private NuGet feed where I publish package A.
A has a version like 4.0.0.X (where X is the build number).
When I change code, the build number is incremented and the package is published.
In the csproj, I have referenced the package like this:
<PackageReference Include="A" Version="4.0.*" />
I want to get the newest version of A, which has no major changes (which would result in a bump the minor section...).
Unfortunately, if NuGet has downloaded a version of A, it never attempts to check if there is a newer version.
I can check manually, but then Nuget automatically pins the version in the csproj, which I have to re-edit.
How can I fix this?
Fix means: I want a smooth dev experience for my CI workflow. Ideally, I have the newest package version on my dev computer without lots of manual work.
NuGet has caching strategies to avoid re-downloading packages/hitting the network all the time.
It dumps a little cache file in your obj that tells it whether you've changed your dependencies at all.
You will need to force NuGet to reevaluate the available packages from remote sources by using the force option: /p:RestoreForce=true with MSBuild, or --force with the dotnet CLI.
In Visual Studio, currently a rebuild will do a force restore.
NuGet also has an HTTP caching strategy that avoids hitting the remote feeds for 30 minutes. To override that, use -NoCache from the command line. Currently there's no option to override that in Visual Studio.
tl;dr;
NuGet caches a lot of things to improve performance and avoid unnecessary remote calls.
Override that by calling restore with --force, or by rebuilding in Visual Studio.
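For reference, a sketch of the force/no-cache variants (the solution name is a placeholder):

:: dotnet CLI - re-evaluate all dependencies and bypass the HTTP cache
dotnet restore --force --no-cache

:: MSBuild
msbuild /t:Restore /p:RestoreForce=true

:: nuget.exe
nuget restore MySolution.sln -NoCache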
I have turned on TeamCity's NuGet Server and I want to push in common packages (i.e. from public sources such as NuGet.org) because the build server cannot see outside our company, so restoring packages on the build server from NuGet.org is not possible.
I cannot see how to push these packages on to our TeamCity server. I've seen various answers suggesting to use a package build step or some other means of publishing from within a build, but this is not appropriate for my use case.
If I try to publish from a command line it complains that it cannot find an API key (where do I get that from?) and it won't allow me to enter my credentials (I assume my TeamCity login would be it), as it tells me "Cannot prompt for input in non-interactive mode." (I didn't set that mode and I can't see how to turn it off).
So, how do I push/publish an adhoc package that I obtained elsewhere into team city?
I believe that the NuGet functionality provided by TeamCity is an API added on top of TeamCity's built-in artifact functionality.
There are a number of consequences of that:
When a build configuration is executed that produces .nupkg files marked as artifacts, they will be available on the TeamCity NuGet feed.
As with all other artifacts, .nupkg files published in TeamCity are subject to TeamCity's general artifact retention rules.
Access rules for NuGet packages are the same as access to the TeamCity projects.
There is, however, as far as I know, no implementation in the TeamCity NuGet API for pushing packages to it. The general practice for storing original or generated packages is to use a stand-alone NuGet server or service, like a normal file share, a NuGet.Core-based server, ProGet or MyGet.org.
Update:
If you end up with many packages of your own, I've heard people report that TeamCity becomes quite slow when clients are resolving packages.
Update 2:
In recent years I've adopted the notion of separating build artifact packages into two categories: library packages and deployment packages. A separate package repository can be used for both types, but a repository such as the one built into, for instance, Octopus Deploy should only be used for deployment packages.
Update 3:
Microsoft has a page listing a number of NuGet server options.
I have a solution which produces several NuGet packages, and I pack the packages during build. I want my nightly builds to be marked as pre-release, so I version my packages accordingly: 1.2.3-PreRelease0001. However, once a nightly build passed testing, I want to publish the same build, with the same packages, but using a non-PreRelease version: 1.2.3.
My question: How can I re-package a NuGet package with a different version? I guess I could hack some unzip/edit/nuget pack script, but is there a better way?
Alternatives:
Don't package during build - package in a separate process, which I can re-run later.
Con: When I package during build, I get access to <Content> files directly from the sources; a separate packaging step would lose that.
Run another build, this time packaging with the non-PreRelease version.
Con: Want to distribute the exact same bits I tested...
Don't mark nightly builds as PreRelease, and instead publish them to a separate repository.
Con: PreRelease packages are not marked as such, and could get mistaken as released.
Package during build twice: Once with PreRelease and once without.
Con: People might be tempted to ref the non-PreRelease versions. Maybe I could put them inside some GeneratedDoNotTouch folder...
There is no public API to change a package’s metadata in NuGet. I would say the last solution, i.e., produce both the prerelease and non-prerelease packages during build, is the best.
To prevent people from accidentally using the non-prerelease package, you can create it in a private directory first. Then, publish it only after the build passes testing.
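A sketch of that approach (versions, paths and the nuspec name are illustrative):

:: build once, pack twice
msbuild MySolution.sln /p:Configuration=Release
nuget pack ProjectA.nuspec -Version 1.2.3-PreRelease0001 -OutputDirectory artifacts\prerelease
nuget pack ProjectA.nuspec -Version 1.2.3 -OutputDirectory artifacts\staging

:: only after the nightly build passes testing, push the non-prerelease package
nuget push artifacts\staging\ProjectA.1.2.3.nupkg -Source <your internal feed URL> -ApiKey <key>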
Yes, you can extract the nuspec file from the package, make the necessary changes and then save the file back into the package. The problem is that this might stop working if the nuspec format is changed.