If I have one repo that holds libraries (that are published to Nuget) and a separate repo that holds the application code (that consumes the Nuget packages), is there an easy way to test changes to the library code within the application repo without publishing to the official Nuget feed?
Your build script could be something like this:
Step 1. Build your package and copy the .nupkg files to %buildroot%\newPackages\.
Step 2. Create a nuget.config file in your application code's root that adds %buildroot%\newPackages\ as a packageSource. If your application code is a functional test, you can probably check in the nuget.config so it doesn't need to be recreated by the build machine on every build.
Step 3. Have a shell script or small program that updates your application code's references to the newly built package so they match the version that was just built.
Step 4. Build and test your application code.
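A minimal sketch of steps 1-3 in shell. The package id MyLib, the versions, and all paths are illustrative, and the nuget pack call is left as a comment since it needs a real project:

```shell
# Sketch of steps 1-3 (package id, versions, and paths are illustrative).
BUILD_ROOT="${TMPDIR:-/tmp}/buildroot"
mkdir -p "$BUILD_ROOT/newPackages"

# Step 1 would be something like:
#   nuget pack MyLib/MyLib.csproj -OutputDirectory "$BUILD_ROOT/newPackages"

# Step 2: a nuget.config in the application repo pointing at the local feed.
cat > nuget.config <<EOF
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="newPackages" value="$BUILD_ROOT/newPackages" />
  </packageSources>
</configuration>
EOF

# Step 3: update the application's reference to the freshly built version.
# (In the real repo packages.config already exists; it is created here so
# the sketch is self-contained.)
cat > packages.config <<'EOF'
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="MyLib" version="1.2.3" targetFramework="net472" />
</packages>
EOF
NEW_VERSION=1.2.4
sed -i.bak "s/version=\"[^\"]*\"/version=\"$NEW_VERSION\"/" packages.config
```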
I have a single .NET solution with multiple class library projects, each one published to a NuGet feed using Azure DevOps.
Then I have an Azure DevOps build pipeline with steps to Restore, Build, Publish, Pack and Push.
The first 4 steps are set up for **/*.csproj and the last is $(Build.ArtifactStagingDirectory)/*.nupkg with the target feed.
I have everything set up and working, except if you make a change to just one project, it builds ALL projects because of the **/*.csproj.
This is no good for nuget packages, as it increments every project's version number and they all appear as having an update available in the nuget package manager.
My Question: Is there a way to do this so that only the project(s) with changes go through the process?
Is there a way to do this so that only the project(s) with changes go through the process?
The answer is yes.
The solution is to use a private agent to build your solution instead of the hosted agent.
That is because every hosted agent assigned to us is a clean machine, so VS/MSBuild will build all the projects matched by the **/*.csproj setting. To achieve incremental builds, we must preserve the results of the last build.
So, to resolve this issue, set up a private agent to build those projects and do not clean the working directory of your private agent before the build runs:
Set the Clean option to false on the Get sources:
Note: Since you also set **/*.csproj for the NuGet push task, if a project was not modified this command will push the same version to the feed and throw a conflict error, so you need to enable Allow duplicates to be skipped on the NuGet push task:
Hope this helps.
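In a YAML pipeline the two settings above would look roughly like this (the pool name and feed name are illustrative; allowPackageConflicts on NuGetCommand@2 corresponds to "Allow duplicates to be skipped"):

```yaml
pool:
  name: MyPrivateAgentPool     # private agent, so build outputs survive between runs

steps:
- checkout: self
  clean: false                 # equivalent of Clean = false on "Get sources"

- task: NuGetCommand@2
  inputs:
    command: push
    packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'
    publishVstsFeed: 'MyFeed'
    allowPackageConflicts: true   # "Allow duplicates to be skipped"
```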
I have a .NET Core project that is auto-built in Appveyor and deployed to Nuget. By default, every successful build causes a new Nuget release.
However, there are many cases when a new release is meaningless because the library's actual code has not changed:
Readme updated
Unit tests added
Appveyor configuration changed
Other cases
Is it possible to configure the build so that NuGet publishing only runs if there are changes in the actual code (for example, in folder X)?
There are a few options.
Commit filtering. Note that with it the whole build, not just the deployment, will be skipped if nothing in folder X changed. You may still need a build without deployment, at least when unit tests are added. As a workaround, consider adding a separate AppVeyor project which builds and deploys only if folder X changed, and keep the current project building every time but not deploying.
Inspect changed files with a script. Please check this sample on how to check those files if you use GitHub. If you see that files in folder X changed, you can set a custom environment variable (let's say you call it deploy_nuget) to true and use it with a conditional deployment.
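A rough sketch of that second option in shell. The folder name src and the variable name deploy_nuget are just examples; in a real build the file list would come from something like git diff --name-only:

```shell
# Decide whether to deploy based on which files changed (sketch).
# In a real build the list would come from, e.g.:
#   git diff --name-only HEAD~1 HEAD
should_deploy() {
  while read -r f; do
    case "$f" in
      src/*) return 0 ;;   # something under the actual code folder changed
    esac
  done
  return 1
}

if printf 'README.md\nsrc/Lib/Class1.cs\n' | should_deploy; then
  echo "deploy_nuget=true"
else
  echo "deploy_nuget=false"
fi
```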
I need to execute a command line utility from a package that is downloaded as part of nuget package restore in the TFS build process.
On my local computer that is stored in c:\users\me\.nuget\*
I've tried every permutation of that on TFS without success. I've also tried \mydir\packages with no success as well.
The biggest problem is that I have to run the package restore step before being able to see any sort of feedback from the log. That's some slow debugging.
Any ideas? Thanks ahead.
With the latest NuGet/MSBuild the packages folder is held under the active user's profile directory, so an appropriate PowerShell command is
Get-ChildItem $(UserProfile)\.nuget\packages
This currently evaluates on the VSTS 2017 Hosted build agent to C:\Users\VssAdministrator\.nuget\packages but by using the variable you are insulated from any changes made.
Just an addition to @Paul Hatcher's answer:
I also faced the same problem in Azure DevOps build pipeline where a specific package and nuget packages directory could not be found.
It is a Xamarin.Forms app based on a .NET Standard library, where no packages folder exists. I later noticed in the build logs that the packages are restored to the .nuget folder under the user's profile. However, this particular case is not documented on https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=vsts#agent-variables.
That means @Paul Hatcher's answer is also valid if you reference the NuGet package folder directly from your build pipeline. This ($(UserProfile)\.nuget\packages) should actually be a (standard) predefined build variable.
The NuGet package cache folder is C:\Users\buildguest\.nuget\packages, but it will be cleaned after the build if you are using a hosted build server.
The simple way to verify:
Add a NuGet restore or .NET Core Restore build step to restore packages
Add a PowerShell build step to list the files in C:\Users\buildguest\.nuget\packages
Code:
Get-ChildItem -Path C:\Users\buildguest\.nuget\packages
Queue a build and check the PowerShell step log (the packages will be listed in the log)
Remove/disable the NuGet restore or .NET Core Restore build step > Save the build definition
Queue a build
The build will fail, because the path does not exist.
So, the packages need to be restored before building the solution/project if they don't already exist. If restoring packages takes too long, you can add the packages to source control and map them to the build agent instead.
I've got a build running in VSTS which is restoring NuGet packages from both nuget.org and a custom feed in VSTS. The custom feed is in the solution's NuGet.config as a <packageSource>, along with the user name and password in <packageSourceCredentials>.
The build, including the restore, is working OK, but there is a warning:
2016-10-12T16:18:57.6589001Z ##[warning]To connect to NuGet feeds
hosted in your Team Services account/TFS project collection with
NuGet 3.1 or below, edit your build definition to specify a path
to a NuGet.config containing the package sources you wish to use.
How can I remove this?
Based on my test, that warning remains even when using a higher version of NuGet (e.g. 3.3) or not restoring packages from the VSTS feed (the hosted build agent gives the same result).
You can't remove it unless you create a custom build task that restores packages through the command line.
I submitted an issue here.
Update:
The issue has been updated.
I see the issue in the code coming from our transition from depending
on assets coming with the agent to being deployed with the task. You
can get around this for now until we get an official change out by
either (1) choosing to use the Nuget 3.5 version radio button in the
task config or (2) supplying a path to your nuget.config.
So, you can use Nuget 3.5 version or specify nuget.config file.
We use NuGet for our internal development to allow us to share code across teams. We run into issues, however, when one person is working on code that will be deployed across multiple NuGet packages at the same time. For instance:
A depends on B which depends on C.
A, B and C have their artifacts pushed to NuGet, and that's how we manage the dependencies between A, B and C. The trouble we find is that if a developer wants to make changes in C and quickly see those changes reflected in A, they have to go through the following process:
Make change in C.
Push change up to git
CI picks up change to C and builds and deploys new nuget package.
Go into B and update reference to C using a nuget update package command.
Push the change to the packages.config file up to git
CI picks up change to B and builds and deploys new nuget package for B
Now open A and change reference to B and nuget update package
Make changes in A to go along with the changes in B(and transitively C)
This seems extremely painful and is causing some of our developers to question the choice of NuGet for our internally developed code. Everyone still likes it for consuming external packages.
Is there any better workflow for using Nuget internally?
In our company we have solved the cascading updates problem with the following setup. First, here is how our NuGet repositories and build server are arranged.
There is an internal NuGet repository that holds all the published packages for the company. This repository is just a shared directory on one of our servers.
Each developer can have (but doesn't need to have) one or more directories on their own machine that serves as a local NuGet package repository. By using a user specific NuGet configuration the developer can control in which order NuGet searches through the package repositories to find packages.
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<packageRestore>
<add key="enabled" value="True" />
</packageRestore>
<packageSources>
<add key="Dev" value="D:\dev\testpackages" />
<add key="Company" value="<UNC_ADDRESS_COMPANY_REPOSITORY>" />
<add key="NuGet official package source" value="https://nuget.org/api/v2/" />
</packageSources>
<disabledPackageSources />
<activePackageSource>
<add key="All" value="(Aggregate source)" />
</activePackageSource>
</configuration>
All solutions have automatic package restore turned on, so that we don't have to commit the packages to our version control system.
Developers only control 3 out of the 4 version numbers: if the version is <MAJOR>.<MINOR>.<BUILD>.<REVISION>, developers can only change the major, minor and build numbers; the revision number is 0, except in builds done by the build server, where it is the build number of that build. This is important because it means that for a given major.minor.build version the build server will always produce the higher version number. That in turn means NuGet will prefer the package version coming from the company package repository (which only gets packages through the build server).
In order to make a change to one of the base libraries there are two possible processes being used. The first process is:
Make the changes to the base library (A). Update the version of (A) if needed.
Run the MsBuild script to build the binaries and create the NuGet packages of (A)
Copy the new NuGet packages over to the package repository on the local machine
In the dependent project (B) upgrade to the new packages of (A) that were just placed in the local machine package repository (which should be of a higher version than the ones available on the company wide repository, or NuGet.org)
Make the changes to (B).
If more changes are required to (A), repeat steps 1, 2 and 3, and then delete the package of (A) from the working directory of (B). The next time the build runs, NuGet will go looking for the specific version of (A), find it in the local machine repository and pull it back in. Note that the NuGet cache may thwart this process some of the time, although it looks like NuGet may not cache packages that come from the same machine(?).
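Steps 2 and 3 of this workflow can be sketched as follows; the package id, version, and repository path are illustrative, and the nuget pack output is simulated so the sketch runs end to end:

```shell
# Drop a freshly packed ProjectA into the local machine repository (sketch).
LOCAL_REPO="$HOME/dev/testpackages"    # the "Dev" source from the nuget.config
mkdir -p "$LOCAL_REPO" out

# Step 2 would be something like:
#   nuget pack ProjectA/ProjectA.csproj -Version 1.2.3.0 -OutputDirectory out
# Simulated here so the sketch is self-contained:
touch out/ProjectA.1.2.3.0.nupkg

# Step 3: copy the package into the local repository; project (B) will now
# resolve ProjectA 1.2.3.0 from there before the company-wide feed.
cp out/ProjectA.1.2.3.0.nupkg "$LOCAL_REPO/"
```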
Once the changes are complete, then we:
Commit the changes to (A). The build server will run the integration build to verify everything works.
Tell the build server to run the release build, which builds the binaries and pushes the NuGet packages to the company-wide NuGet repository.
In (B), upgrade to the latest version of (A), which should have a higher version number than the test package: the test package should have version a.b.c.0, while the newly built version in the company-wide repository should be a.b.c.X where X > 0.
Commit the changes to (B). Wait for the build server to finish the integration tests
Tell the build server to run the release build for (B).
Another way of doing the development work is by taking the following steps
Make the changes to the base library (A). Update the version of (A) if required.
Build the binaries
Copy the binaries over to the location where NuGet unpacks the package of (A) for project (B) (e.g. c:\mysource\projectB\packages\ProjectA.1.2.3.4)
Make the required changes to project (B)
The commit process is still the same, project (A) needs to be committed first, and in project (B) the NuGet reference to (A) needs to be upgraded.
The first approach is slightly neater because this process also warns if there are faults in the NuGet package of (A) (e.g. forgotten to add a new assembly) while in the second process the developer won't know until after the package for (A) has been published.
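The copy step in the second workflow might look like this; all paths are illustrative, and the build output of (A) is simulated so the sketch is self-contained:

```shell
# Overwrite the unpacked package of A inside B's packages folder (sketch).
SRC=ProjectA/bin/Release
DEST=ProjectB/packages/ProjectA.1.2.3.4/lib/net472

# Simulate A's build output so the sketch runs:
mkdir -p "$SRC" "$DEST"
touch "$SRC/ProjectA.dll" "$SRC/ProjectA.pdb"

# Copy the freshly built binaries over the restored package contents.
cp "$SRC"/ProjectA.* "$DEST"/
```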
You have two choices here:
Run an instance of NuGet Gallery within your organisation. This is the code which runs nuget.org
Get a license for Artifactory Pro, which has in-built Nuget support and acts as a Nuget repository.
I have used both, and #1 is a reasonable choice to start with, but NuGet Gallery is optimised and designed for nuget.org, not on-premise/enterprise use, so things like deleting packages are a pain (hand-rolled SQL required).
I'd say that you should pay the (low) license fee for Artifactory Pro - it's an excellent product, and the JFrog team are really keen and switched on.
You should not be using nuget.org for internal/enterprise packages; nuget.org is designed for 3rd party/open source libraries, not internal build dependencies.
EDIT: in terms of workflow, why are you putting shared code into multiple packages? If the code needs to be shared, it needs to go in its own separate package.
EDIT 2: To speed up the code change workflow for the developer, you can use nuget.exe (the command-line client) and use command-line accessible builds, so you can target a "developer" build run. Then in your "developer" build (as opposed to the CI build) you specify -Source as a local path (e.g. nuget install B -Source C:\Code\B) when you want to pull the newly-updated B as a dependency and build against that; likewise for C or other local, newly-updated packages. Then when A, B, and C all build fine, you can git push all of them (in reverse dependency order), and let CI do its thing.
However, you also should question whether your package separation is really appropriate if you have to do this build 'dance' often, as this suggests that all the code should be in a single package, or possibly split along different lines in separate packages. A key feature of a well-defined package is that it should not cause ripple effects on other packages, certainly not if you are using Semantic Versioning effectively.
Edit 3: Some clarifications requested by marcelo-oliveira: "command-line accessible builds" are builds which can take place entirely from the command line, without using Visual Studio, usually via batch files. A "developer build" is a build which a developer runs from her workstation, as opposed to the CI build which runs on the CI server (both builds should essentially be the same).
If A, B and C are under the same solution, you can create NuGet packages for them in a single build.
Just make sure that the consuming package references the new version number of the package it depends on (assuming your build doesn't randomly change it).
If A, B and C are intentionally under different solutions, e.g. A is under an infrastructure solution and B is under a product solution, then the only suggestion I have for you is to define your CI builds to run on check-in and not periodically.
Edit:
Another option is to create and push pre-release packages (e.g. version 1.0.0-alpha) to your NuGet server during local builds; this way developers can experiment with the package prior to creating a release version.
NuGet was designed for sharing third-party libraries. We plan to use NuGet internally for all common code that can be shared between projects: things like a common data layer, string and other regular-expression functions, email components and other artifacts like that.
The only way I see NuGet being of any help is when the components, code or artifacts you are sharing across teams/projects are stable and have their own release cycle, different from your project's release cycle. For your needs NuGet is overkill. You might be more productive linking all such projects within the same solution: your solution structure should include the three projects A, B and C as dependent projects, referenced internally wherever required.