NuGet intermediary for faster updates

Let's say my company creates:
a commercial product "Product"
a free opensource library "Library"
"Library" is published on Nuget for other people.
"Library" is also used by "Product", via a NuGet reference.
Sometimes we find an urgent bug in "Library", that we need to fix and release updates to both "Library" and "Product".
We fix the bug and publish the fix to NuGet... and then we have to wait a couple of hours until NuGet indexes the fixed version.
How do teams usually solve this problem elegantly?

Basically there are several ways to solve such problems, but the approach we used in our project is to not use NuGet for such dependencies. We place such a "Library" in a separate project that is part of the "Product" repository, or the "Product" repository refers to the "Library" via git submodules. The build process of the product in this case would be:
Build "Library" -> after this, the publishing process to NuGet starts automatically in a separate job (speaking of automated builds on Jenkins)
Build "Product" using the built "Library"
Publish "Product"
In this pipeline, additional build options like "build and publish only the library" will be helpful in your build scripts.
If you refer to "Library" as a git submodule, everything connected with building and publishing the library can live there, so I recommend using a git submodule for this. With a git submodule it is also easier to manage the version of "Library" that you want to use in your "Product".
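The submodule wiring can be sketched as below. The repositories here are throwaway local examples created just for the demo; in a real setup the submodule URL would be your Library's remote:

```shell
# Sketch: wire "Library" into "Product" as a git submodule.
# All paths below are throwaway examples.
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com
tmp=$(mktemp -d)

# Stand-in for the "Library" repository (normally a remote URL)
git init -q "$tmp/Library"
git -C "$tmp/Library" commit -q --allow-empty -m "Library: initial commit"

# The "Product" repository that consumes it
git init -q "$tmp/Product"
git -C "$tmp/Product" commit -q --allow-empty -m "Product: initial commit"

# Newer git requires explicitly allowing file:// submodule URLs;
# this is only needed because the example uses a local path.
git -C "$tmp/Product" -c protocol.file.allow=always \
    submodule add "$tmp/Library" lib/Library

# Shows the Library commit that Product is pinned to
git -C "$tmp/Product" submodule status
```

Updating the Library version used by Product is then just checking out a different commit inside `lib/Library` and committing the new submodule pointer in Product.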

You can create your own NuGet package source somewhere in the cloud or on your company network. Then, when building your "Library", publish it to both sources: nuget.org and your own. Your internal projects will use your own package source and be able to receive updates right after publishing, while other users of the "Library" can still include it in their projects from regular nuget.org.
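The dual-source setup can be expressed in a NuGet.config along these lines (the internal feed URL is a hypothetical example):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <!-- example URL; substitute your company's feed -->
    <add key="company-internal" value="https://nuget.example.com/v3/index.json" />
  </packageSources>
</configuration>
```

With both sources configured, internal projects pick up the freshly pushed version from the internal feed without waiting for nuget.org indexing.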

Related

Azure Dev Ops, Private Nuget feed, options to develop / test nuget packages?

I am looking for practical options to develop and test private nuget packages.
We have a set of "core" code that is delivered securely through an Azure Artifact Feed. We have various "consuming" applications that use the core nuget packages.
As a small-medium team, one person may be developing the core nuget as well as consuming it.
Today we check-in / merge the code for the nuget package. Make sure the Pull request is approved / passes. Then the build updates the Azure Artifact feed.
Then we come back to the "consuming" app and can update the package. This works great if you fix or add the feature right the first time; however, it slows down productivity when treated as an iterative development approach.
Looking for simple options for a small team. Random thoughts on options:
Push nuget "alpha" package straight from developer's machine to Azure Artifact feed. Symbol server too?
Do something with an Azure build to allow "feature" branches to publish to Azure Artifact feed somehow?
Push to local nuget feed. Include pdbs so it can be debugged?
Temporarily break the nuget reference directly for local copy of dll(s)?
Re-think using nuget packages as a whole?
Push nuget "alpha" package straight from developer's machine to Azure Artifact feed. Symbol server too?
It depends on whether you need to debug it. If you need to debug this "alpha" package, you have to push the symbol package to the symbol server.
Note: You do not need to push the "alpha" package to the symbol server, just the symbol package.
Do something with an Azure build to allow "feature" branches to
publish to Azure Artifact feed somehow?
There is a task, Push NuGet packages, that we can use to publish to the Azure Artifacts feed during the build, no matter which branch it is on. It depends on whether you have enough permissions on the Azure Artifacts feed; you can check them under Artifacts -> Settings -> Feed settings -> Permissions.
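As a sketch, a YAML pipeline step using the NuGetCommand task to push to an internal feed might look like this (the feed name is an example):

```yaml
steps:
- task: NuGetCommand@2
  inputs:
    command: 'push'
    packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'
    nuGetFeedType: 'internal'
    publishVstsFeed: 'MyFeed'   # example feed name; use your own
```

Running this step on a feature branch is what lets prerelease builds land in the feed before the branch is merged.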
Push to local nuget feed. Include pdbs so it can be debugged?
No, you also have to include the source code. Check this thread for some more details.
And there is a lightweight solution for debugging a NuGet package from a local feed on a network share.
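For the local-feed option, a NuGet.config entry pointing at a folder or network share (the path is an example) is all that is needed:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- example path; any folder of .nupkg files works as a source -->
    <add key="LocalShare" value="\\fileserver\nuget-feed" />
  </packageSources>
</configuration>
```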
Temporarily break the nuget reference directly for local copy of
dll(s)?
Re-think using nuget packages as a whole?
The answer is yes: when developing the project locally, using a project reference is better than a NuGet reference. Check my other post for some more details: Project reference VS NuGet.
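For illustration, the local-development swap in the consuming .csproj might look like this (the package name and path are hypothetical):

```xml
<ItemGroup>
  <!-- published build: consume the package from the feed -->
  <!-- <PackageReference Include="My.Core.Package" Version="1.2.3" /> -->
  <!-- local iteration: reference the core project's source directly -->
  <ProjectReference Include="..\..\core\src\My.Core\My.Core.csproj" />
</ItemGroup>
```

The PackageReference is restored before merging so the committed project still builds against the feed.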
Hope this helps.

Industry standard for managing Nuget packages within the enterprise

I have a situation where we have multiple C# projects that use similar set of Nuget packages (ex. Newton Json, Microsoft Compilers and CodeDom, Owin, log4net, jQuery, EntityFramework etc.)
I am trying to see how we can have a shared location for all Nuget packages to reduce the footprint of those binaries in Git, having a single repo for them by centralizing them in one place.
One option I found is to use Nuget.config in each project with repositoryPath set to point at the shared location. This works great for adding/upgrading/restoring Nuget packages in the project but it is not very clean when a package gets removed from one project but is still required in a different one. Basically the package will get removed from the shared location and the change is committed to Git, then when the other project requires it, it would get restored and added back to Git. Not a perfect solution in my mind.
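For reference, the repositoryPath setup described above is a NuGet.config like the following (the shared path is an example):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- relative to this NuGet.config; the shared location is an example -->
    <add key="repositoryPath" value="..\..\SharedPackages" />
  </config>
</configuration>
```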
I have a two part question:
1. Is there a way to improve the above workflow when packages get removed?
2. What is the industry standard for handling third party libraries delivered via Nuget? Or if there is none, can you share your experience handling Nuget packages across multiple projects.
If the concern lies with the footprint/organization of the Git repository, maybe you can add the dependency folders to .gitignore to prevent Git from committing them into the repositories. When you need to build the projects from source, just do a dotnet restore / nuget restore to get the dependencies from the sources you configured in NuGet.config.
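A minimal .gitignore entry for the classic packages folder would be (adjust to your repository layout):

```gitignore
# keep restored NuGet packages out of Git; restore them at build time
packages/
```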
Not sure if it is the industry standard, but we host our own Nuget server to control the libraries that the different teams can use. Microsoft has an article on Hosting your own NuGet feeds
Hope it helps.

Command to update packages repo in PackageManager console

I need to update the package repo before building a solution in a TFS build definition. I want to implement this using a Command Line build task.
Could someone tell me how to write a command to update the package repo using a path?
According to your prior question, you are just missing some external packages during your TFS build pipeline.
Usually TFS uses Package Management, which hosts NuGet, npm, and Maven packages alongside all your other TFS assets: source code, builds, releases, etc. It can also handle external packages.
You can directly add external packages to a TFS Package Management feed. When you restore the packages, select the feed, and all needed packages will be restored. To achieve this, just use Push NuGet packages to specify the packages you want to publish and the target feed location.
For more details, please refer to Get started with NuGet Package Management in TFS.

Can a VIPM package repository be set up on GitHub?

VIPM stands for Virtual Instrument Package Manager. It is a manager of install-able packages for NI LabVIEW. It is published by JKI Software and a free version of it is distributed with LabVIEW.
Registered (paying) users can set up public or private VI Package repositories. I would like to set one up on GitHub.
I attempted to do so by first creating a VI Repository on my local hard drive, publishing some packages to it, then making a remote clone on GitHub. Using the VIPM Repository Manager, I added the repository by browsing to the index.vipr file on my remote GitHub clone. However, VIPM gives me an error saying that the repository was not found.
Has anyone managed to set up and subscribe to a VI package repository on GitHub?
The short answer is that GitHub and a VIPM repository are fundamentally different, and unless VIPM adds support for git repositories and GitHub, I doubt it will be possible.
If you are considering managing dependencies of any project using GitHub as a source for your shared libraries then you might want to consider a package manager like yarn.
Yarn (and others like npm and bower) are capable of fetching (cloning) from GitHub and follow the common practice from the web-developer world (and others) of keeping all of a project's dependencies contained within the project. This is a departure from the VIPM view, where you update your development environment (LabVIEW) by installing the packages 'globally'.
A list of the project's installed libraries and the library versions are stored in a human-readable file called package.json which provides a portable way of getting the project setup on another machine.
As new releases of the libraries happen, you can choose when to update the library in your project by selectively updating the libraries.
This approach works well with LabVIEW packed libraries (.lvlibp), as opposed to VIPM packages, since there is no install-into-LabVIEW-IDE step with packed libraries. If you have a hierarchy of packed libraries, they can also specify their dependent libraries using a package.json, and yarn can then install all the libraries recursively.
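A package.json along these lines (the project name, organization, and tag are hypothetical) pins a shared library fetched straight from GitHub:

```json
{
  "name": "my-labview-project",
  "version": "1.0.0",
  "dependencies": {
    "shared-lvlibp": "github:example-org/shared-lvlibp#v1.2.0"
  }
}
```

Bumping the `#v1.2.0` tag is how you selectively update a library for this project without touching others.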
It is possible to configure Yarn to place libraries into a folder of your own choosing instead of the default node_modules (as used by Node.js).
The advantages of this are:
You can choose which versions of libraries to use per project
Package managers integrate nicely with automatic testing and building setups
You can use GitHub or other git-providers for publishing your libraries
The disadvantages are:
More setup
It is not a common approach in the LabVIEW development world
Your VIs won't be installed into the LabVIEW palettes unless you explicitly install them

Is it possible to use a local project instead of a nuget dependency with DNX?

Let's say that have 2 DNX projects (A and B). Both of these projects are stored in separate repositories and are published as nuget packages.
Project A depends on project B. Under normal circumstances Project A would just pull project B's nuget package.
There is an instance where I'd like to work on both Project A and Project B at the same time. I'd like to be able to make changes to Project B and use those changes in Project A without having to build a package, deploy it, and then pull it.
I know that in the Ruby world, it's possible to do this with bundler. You can tell it to use a local directory instead of a dependency. I have also heard that it's also possible to do something similar with bower.
Is such a thing possible with DNX? If so, how would I go about doing it?
You can specify additional folders to search for using the projects property in global.json.
something like...

```json
{
  "sources": ["src", "tests", "../relative/path/to/other/project/src/dir/"]
}
```
DNX will then load those projects from source instead of trying to load them as NuGet packages.