Private NuGet feed - package path folders and indexing woes

We used NuGet.Server 2.8 to create a private feed for hosting NuGet packages (mostly Chocolatey packages) in our organization. I would like to improve/expand its indexing capability, but I can't figure out how to do that.
I know that in a typical NuGet server feed, all the .nupkg files would be in the root of the package path specified in the config. Long story short, we have a requirement for a folder structure in that package feed, as different groups within the organization will be using SVN to commit data which ends up here. To manage this easily, we need a more complex folder structure.
However, what I have found is that .nupkg files in the root of the package path, or one folder deep, are indexed and available via the feed. Once you go two folders deep, the .nupkg files aren't indexed and aren't available via the feed. Is there a relatively easy way I can change that? Is that a setting specified somewhere? I can't seem to find where this limitation is coming from. Any direction would be outstanding.

We've had a few users request such a feature for ProGet, but we ultimately decided against implementing it because of the problem of not only dealing with duplicate packages, but also communicating that problem to the user.
Remember that a valid NuGet package must have a file name that matches its id+version (e.g. MyPackage.1.2.nupkg can only be MyPackage v1.2). Thus, if you have folderA\MyPackage.1.2.nupkg and folderB\MyPackage.1.2.nupkg, which is the valid one? Do you invalidate both? Etc.
That said, it's trivial to implement, so you could simply use the ProGet SDK to build your own package store that inherits from the default one but iterates subdirectories as well.
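To illustrate the idea only (this is not the real ProGet SDK API; the class below is a hypothetical stand-in for whatever extension point the SDK exposes, and only the System.IO calls are real), the heart of such a store is a one-line recursive scan:

    // Hypothetical store: RecursivePackageStore is NOT a ProGet SDK type.
    // The point is the recursive enumeration of .nupkg files.
    using System.Collections.Generic;
    using System.IO;

    public class RecursivePackageStore
    {
        public IEnumerable<string> FindPackages(string feedRoot)
        {
            // SearchOption.AllDirectories (instead of TopDirectoryOnly) is the
            // entire difference between a root-only scan and a recursive one.
            return Directory.EnumerateFiles(feedRoot, "*.nupkg", SearchOption.AllDirectories);
        }
    }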
As a side note, if you're serious about maintaining a private repository, you really should get something other than NuGet.Server. There are several alternatives available that can manage Chocolatey packages.

Symlinks are your best bet. You will just want to re-create the symlinks for those files on a regular basis with a scheduled task.
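For instance, a small console program along these lines could be the scheduled task (a sketch assuming .NET 6+ for File.CreateSymbolicLink; the feed path is purely a placeholder, and symlink creation on Windows requires the right privilege):

    // Sketch: surface every nested .nupkg in the feed root via a symlink,
    // so NuGet.Server's root-level indexing picks it up.
    using System.IO;

    const string feedRoot = @"D:\NuGetFeed\Packages"; // placeholder path

    foreach (var nupkg in Directory.EnumerateFiles(feedRoot, "*.nupkg", SearchOption.AllDirectories))
    {
        var linkPath = Path.Combine(feedRoot, Path.GetFileName(nupkg));
        if (!File.Exists(linkPath)) // files already in the root are skipped
            File.CreateSymbolicLink(linkPath, nupkg);
    }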
I have to second Karl's answer on using something better than NuGet.Server. Depending on your growth, it can quickly become unusable once you have 100+ packages in the repository. Note: I haven't checked this myself since 2012; it's possible it handles larger numbers of packages better now.

Related

Industry standard for managing NuGet packages within the enterprise

I have a situation where we have multiple C# projects that use a similar set of NuGet packages (e.g. Newtonsoft.Json, the Microsoft compilers and CodeDOM, OWIN, log4net, jQuery, Entity Framework, etc.).
I am trying to see how we can have a shared location for all NuGet packages to reduce the footprint of those binaries in Git, keeping a single copy of them by centralizing them in one place.
One option I found is to use a Nuget.config in each project with repositoryPath set to point at the shared location. This works great for adding/upgrading/restoring NuGet packages in a project, but it is not very clean when a package gets removed from one project while still being required by a different one. Basically, the package gets removed from the shared location and the change is committed to Git; then, when the other project requires it, it gets restored and added back to Git. Not a perfect solution in my mind.
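For reference, that option is a single setting in Nuget.config (repositoryPath is a real key; the relative path here is just an example):

    <configuration>
      <config>
        <!-- all projects sharing this Nuget.config restore into one folder -->
        <add key="repositoryPath" value="..\SharedPackages" />
      </config>
    </configuration>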
I have a two-part question:
1. Is there a way to improve the above workflow when packages get removed?
2. What is the industry standard for handling third-party libraries delivered via NuGet? Or, if there is none, can you share your experience handling NuGet packages across multiple projects?
If the concern lies with the footprint/organization of the Git repository, maybe you can add the dependency folders to .gitignore to prevent Git from committing them to the repositories. When you need to build the projects from source, just run dotnet restore (or nuget restore) to get the dependencies from the source you configured in Nuget.config.
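A minimal sketch of that arrangement (the folder and solution names are placeholders):

    # .gitignore - keep restored packages out of the repository
    SharedPackages/

    # after a clean checkout, pull dependencies from the configured feed
    dotnet restore MySolution.sln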
Not sure if it is the industry standard, but we host our own Nuget server to control the libraries that the different teams can use. Microsoft has an article on Hosting your own NuGet feeds
Hope it helps.

NuGet: Good idea to check in the packages folder?

I'm currently weighing the pros and cons of using NuGet. In our current software we store each external reference in a common reference folder (which is committed to our version control system). Over time this approach becomes more and more painful because we have to store different versions of the same library.
Since our devs are sometimes at customer sites (and not all customers offer internet connectivity...), we won't use NuGet directly, because the NuGet packages couldn't be restored there.
Based on that, I'm thinking about using NuGet and storing the packages folder in our version control system.
Does anybody know of any disadvantages of this solution? Does anybody have a better proposal?
Thx.
I would argue against storing external NuGet packages in your version control system.
It's not your application's responsibility to archive third-party packages. If you need to cover that risk, build a solution intended for it (for example: a private NuGet repository that is properly backed up).
Avoid duplication in the code base - provided you use properly released packages, the contents of the packages.config file are sufficient to reliably reproduce the exact dependencies your application needs (a sample packages.config follows this list).
Synchronization is an effort - keeping packages.config and the packages folder in sync: once you start including packages in source control, every developer working with them has to monitor and add or remove packages in source control.
If devs ever forget to add a package, the build from a clean checkout fails.
If they forget to remove a package that is no longer necessary, your checked-in set accumulates junk.
VCS dataset size - storing packages needlessly enlarges your version control storage. Quite often the packages contain dlls for N different platforms, tools and whatnot, which adds up fast. If you keep your dependencies constantly up to date, then after 10 years your VCS history will contain a huge amount of irrelevant junk. Storage is cheap, but still...
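For illustration, this is the entire piece of information restore needs; the package ids and versions below are arbitrary examples:

    <?xml version="1.0" encoding="utf-8"?>
    <packages>
      <package id="Newtonsoft.Json" version="13.0.3" targetFramework="net48" />
      <package id="log4net" version="2.0.15" targetFramework="net48" />
    </packages>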
Instead, consider having a private NuGet repository whose purpose is to serve and archive the packages your application needs, and set up your project to check that project NuGet repository first. If your developers need offline compile support, they can set up mirrors of the project repository on their build boxes and configure the following fallback structure for repositories:
Developer local project repository (ex: folder)
Shared project repository (ex: Nuget.Server)
(nuget.org)
A guide on how to configure multiple repositories can be found here: How to configure local Nuget Repository.
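A sketch of that three-level setup in Nuget.config (the key names, local folder, and server URL are placeholders; packageSources itself is the real mechanism, and NuGet will consult every source listed):

    <configuration>
      <packageSources>
        <add key="dev-local" value="C:\Dev\LocalPackages" />
        <add key="project-shared" value="https://nuget.example.com/nuget" />
        <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
      </packageSources>
    </configuration>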

Nuget Gallery with multiple feeds

I recently installed NuGet Gallery (https://github.com/NuGet/NuGetGallery) as a repository. Ideally I would like to create multiple feeds so that I could differentiate the NuGet packages that will be reused in other projects (dlls, contracts, etc.) from the packages we use to deploy our projects to the production environment.
I know I can achieve this by creating multiple instances of NuGet Gallery, but this seems a bit of an overkill to me: it would mean two websites and two databases. I am also aware that MyGet provides this functionality, but I will not be able to get approval for the purchase. I am also aware that TeamCity contains its own feed server, but it doesn't allow this multiple-feed scenario, nor does it perform well enough to be used at large scale.
In a nutshell, the ideal deployment scenario would be as follows:
teamcity generates a deployment package or a dll/contract package, depending on the build scenario.
teamcity publishes deployment packages to a nuget gallery deploy feed
(say: nugetgallery.server.com/deploy/api/v2).
teamcity publishes dll/contract packages to a nuget gallery dev feed
(say: nugetgallery.server.com/dev/api/v2).
octopus always searches for packages in
nugetgallery.server.com/deploy/api/v2
devs / teamcity searches for packages in
nugetgallery.server.com/dev/api/v2
This way I keep things clean, and I can even go as far as creating a third type of feed that only contains release packages, so that I can be sure nothing will ever be deployed to production if it wasn't on that feed.
I might have missed some fundamental approach, so alternatives to the one I picked are welcome.
As I couldn't find anything relevant, I ultimately gave up and went with the two-server solution. I struggled a lot to find any documentation on what functionality NuGet Gallery really has.
Right now we have something like deploy-nuget.server.com and dev-nuget.server.com: separate URLs, IIS instances, SQL instances and folder locations.
For someone who might look into this in the future, one solution that could work is to make repositories private per user. Unfortunately, in my case that would not be enough, as I would also want the packages to be stored in different locations so we could enforce different backup policies based on the type of package. Another option would be to fork the project, but from my previous experience that never ends well: sooner rather than later you will want to upgrade, and your custom changes will have to be ported somehow.
I understand this is not the idea behind NuGet Gallery, as you are not supposed to delete packages. But we do have some space constraints, so eventually we will remove certain deployment packages that were created for QA environments and that we obviously don't care about anymore.
You can try ProGet. Using this server you can easily manage multiple NuGet feeds.
It also provides a free edition which supports all features.

Managing non-NuGet dlls along with NuGet packages

Are there any guidelines or recommendations for managing libraries that aren't on NuGet alongside the packages that are?
Most of these are 3rd-party libraries that may never go on NuGet unless we specifically put them there.
Is it best to keep these dlls out of the folder that NuGet uses to store its downloaded packages, or is it better to keep them together?
We would be looking at moving to a DVCS once we sort this out, and would probably add an ignore file to ignore the whole packages directory (and possibly add exclusions for these non-NuGet dlls, or just force them to be checked in if they ever do change).
My Personal Preferences (In Order)
Create a package and add it to NuGet (if licensing allows)
Create a package and put it in a private repository
Create a folder in the solution, store them all in there, and add them to source control. Use a different folder than the NuGet packages folder: it's clearer what it is, and I don't add NuGet dlls to source control.
For 1 & 2 I would recommend using Package Restore on build rather than storing the packages in your source control.
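For classic (non-SDK-style) projects, that is driven by the packageRestore section of NuGet.config; both keys below are real settings:

    <configuration>
      <packageRestore>
        <!-- allow restore at all / run it automatically during builds in VS -->
        <add key="enabled" value="True" />
        <add key="automatic" value="True" />
      </packageRestore>
    </configuration>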
I'd also highly recommend against referencing any 3rd party controls from install folders or the GAC.

How to manage shared libraries between applications?

We develop enterprise software and we wish to promote more code reuse between our developers (to keep this problem simple, let's assume all .NET). We are about to move to a new VCS (most likely Mercurial) and I want to have a strategy in place for how we will share libraries.
What is the best process for managing shared libraries that meets the following use cases:
Black box - only the public API of the library is known, and there is no assumption that consuming developers will be able to "step into" the library or set breakpoints in it. The library is a black box. Often a dev does not care about the details: just give me the version of the lib that has always "worked".
Debug - the developer should be able to at least "step into" the library during development. Setting breakpoints would be a bonus too.
Parallel development - while most likely the minority case, there are seemingly valid use cases for developing a library in parallel with the consuming application. Often the authors of the library and the consuming application are the same developer. For better or worse, the applications and libraries can often be tightly coupled. Being able to make changes to, and debug into, both can be a very productive way for us to develop.
It should be noted that solving 3 may implicitly solve 2.
Solutions may involve additional tools (such as NuGet, etc.).
When sharing libraries, you must distinguish between:
source dependencies (you share the sources, implying recompilation within your project)
binary dependencies (you share the delivery, compiled from common sources, and link to it from your project).
Regarding both, NuGet (2.0) finally introduced "Package Restore During Build", in order not to commit to source control whatever ends up in a Lib or ExternalDependencies folder.
NuGet (especially with its new hierarchical config in NuGet 2.1) is well suited for module management within a C# project, and will interface with both Git and Mercurial.
Combine it with Mercurial subrepos, and you should be able to isolate the common code base you want to reuse in its own repo.
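A minimal sketch of that layout: a .hgsub file at the root of the shell repo maps a working-directory path to the shared library's repo (both the path and the URL below are placeholders):

    # .hgsub in the shell repo root (placeholder path and URL)
    lib/common = https://hg.example.com/common-lib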
I have two possible solutions to this problem, neither of which seems ideal (and that is why I posted the question).
Use the VCS to manage the dependencies. Specifically, use Mercurial subrepos and always share by source.
Advantages:
All 3 use cases are solved.
Only one tool is required for source control and dependency management
Disadvantages:
The subrepos feature is considered a feature of last resort by the Mercurial developers, and from experimentation and reading it has the following issues:
Tags cannot be easily or atomically applied to multiple repos.
Root/shell repos are inherently fragile (they can break if the pathing to the subrepos changes). The Mercurial developers suggest mitigating this issue by including no content in the shell repo and only using it to define (and track the revisions of) the subrepos, therefore allowing a dev to manually recreate a moment in time even if the subrepo pathing is broken.
Branching cannot cross repo boundaries (most likely not a big issue, as one could argue that branches should only occur in a given subrepo).
Use Ivy or NuGet to manage the dependencies. There are two ways this could work.
Dependencies/packages can simply contain official binaries. A build server can be configured to publish a new dependency/package into the company repository when a developer submits a build for a new version. This solves case 1. NuGet seems to support symbol packages, which may solve case 2 (see the packaging sketch after these two options). Case 3 is not solved, and leaves developers in that situation out to dry, having to come up with their own solution (there is basically no way to commit applications to the VCS that include dependencies by source). This seems to be the traditional way that dependency-management tools are used.
Dependencies/packages can contain a script that gets the source from Mercurial. The script could be executed automatically when the dependency/package is installed. Some magic has to be performed to have the .NET solution include the reference by project (rather than by browsing the file system), but in theory this could happen in the NuGet install script and be reversed in the uninstall script.
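As a sketch of the symbol-package route mentioned in the first option (the .nuspec file name is a placeholder), classic nuget.exe can produce a companion symbols package alongside the regular one:

    # emits MyLibrary.1.0.0.nupkg plus MyLibrary.1.0.0.symbols.nupkg
    nuget pack MyLibrary.nuspec -Symbols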
Switching between "source" and "binary" dependencies seems to be a manual step. I would argue devs should switch to binary dependencies for releases, and perhaps this could be enforced on the build server when creating a release. This is further complicated by the fact that the VS solution needs to be modified to reference a project vs. a binary.
How many source packages exist? Does every binary package contain the script to fetch the source that it was built with? Or do we create separate source packages that use the install-script magic to get the source? This leads to the question: is there a source package for every tag in Mercurial? Every changeset? Or simply one source package that just clones and updates to the tip, leaving the dev to update to a previous revision (but this creates the problem of knowing which revision to update to)?
If the dev then uses Mercurial to change the revision of the source, how can this be reflected in the consuming application? The dependency/package that was used to fetch the source has not changed, but the source itself has...