We have a solution with a lot of projects and a more or less complex dependency graph between them. Now each of those projects should become its own NuGet package, and the dependency graph of the NuGet packages should mirror that of the projects.
I have two questions:
Is it possible to achieve this while keeping all projects within the same solution? If so, how?
Is it advisable to keep all projects in the same solution? What would be a common / "best practice" approach to this?
The situation in our project is the same and we took the following approach:
The first step is to create the nuspec files defining your packages. We have placed all these files in a folder named ".nuspec" in the solution's root directory. The nuspec files are also added to the solution, in a solution folder likewise named ".nuspec".
The solution itself has a global AssemblyInfo file that contains the versioning information as well as some copyright details - in short, all information that is common to our projects. Each project then has its own AssemblyInfo file adding the information specific to that project.
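For illustration, here is a minimal sketch of how each project could link that shared file in an old-style csproj; the file name GlobalAssemblyInfo.cs and its location at the solution root are assumptions, not details from the answer:

<ItemGroup>
  <!-- Link the solution-wide AssemblyInfo into this project instead of copying it -->
  <Compile Include="..\GlobalAssemblyInfo.cs">
    <Link>Properties\GlobalAssemblyInfo.cs</Link>
  </Compile>
</ItemGroup>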
The nuspec files do not contain a version. Instead we use $(Version) as a placeholder there:
<?xml version="1.0" encoding="utf-16"?>
<package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
  <metadata>
    <id>MyCompany.MyProduct.Server.DataAccess</id>
    <version>$(Version)</version>
    <authors>MyCompany</authors>
    <projectUrl>http://example.com/myProduct.html</projectUrl>
    <iconUrl>http://example.com/myProduct.icon.png</iconUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Some description goes here.</description>
    <summary>The summary goes here</summary>
    <copyright>Copyright © MyCompany 2015</copyright>
    <language>en-US</language>
    <dependencies>
      <dependency id="MyCompany.MyProduct.Common" version="$(Version)" />
      <dependency id="MyCompany.MyProduct.Server" version="$(Version)" />
    </dependencies>
  </metadata>
  <files>
    <file src="path\to\MyCompany.MyProduct.Server.DataAccess.dll" target="lib\net45\MyCompany.MyProduct.Server.DataAccess.dll" />
  </files>
</package>
(Of course the dependencies might have dependencies themselves. The server component might reference a logging component for example.)
Initially we created a console application that read the solution's version from the global AssemblyInfo file and wrote it into all of the nuspec files before creating and publishing the packages.
The console application worked well, but was a bit tedious to maintain in a TFS environment with continuous integration enabled. So we defined a custom TFS build template doing this work. All we need to do now to create a set of nuget packages for all of our projects is to trigger a TFS build.
This approach has the advantage that all packages have the same version and thus work well together.
This approach has the disadvantage that all packages have the same version and cannot be released independently.
We chose that approach because it prevents us from producing a conglomerate of badly integrated components. Our projects provide a small framework that is used to develop small LOB applications that are all quite similar. Because we deliver the framework as a set of different packages, developers can choose which of the packages they actually need and install only those. Should a developer decide to add a missing piece of functionality later on, he just installs the relevant packages that have the same version as those already installed. Thus there's no need to worry about compatibility.
Currently, in VS 2017, you can have several library projects in a solution which are built into separate packages and also reference each other via <ProjectReference>. Surprisingly, VS is smart enough to use <ProjectReference> when building the solution AND to produce correct package <dependencies> for the referenced projects in the nuspec. In other words, you can conveniently work with many projects simultaneously in one solution and have them all published as a set of packages depending on each other.
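As a minimal sketch of such a setup (the SDK-style project below and all names in it are illustrative, not taken from the answer):

<!-- MyCompany.MyProduct.Server.DataAccess.csproj (illustrative) -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <!-- Produce a .nupkg on every build -->
    <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
    <Version>1.2.3</Version>
  </PropertyGroup>
  <ItemGroup>
    <!-- Built as a project reference, but emitted as a package <dependency> when packed -->
    <ProjectReference Include="..\Common\MyCompany.MyProduct.Common.csproj" />
  </ItemGroup>
</Project>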
Yes, you can "probably" make this work. I say probably because I haven't tried it, but I would approach it with something like CreateNewNuGetPackageFromProjectAfterEachBuild (https://www.nuget.org/packages/CreateNewNuGetPackageFromProjectAfterEachBuild/) together with a manual nuspec defining your references. If you wanted to get really fancy, you could write some post-build Roslyn code to parse your project dependencies and build up the NuGet dependency tree. That said, don't do this; in a non-trivial solution, it's almost guaranteed to become manual and brittle pretty fast.
Ultimately, it's much preferable to just break your solution up -- create a solution per NuGet package, and pull in your dependencies using NuGet itself. Assuming you have a build/CI server, this should be fairly trivial. Just run your own NuGet repo and publish the build artifacts as they get built -- that way your dependent projects will pull the latest package you JUST built. You'll want to ensure your build process restores NuGet packages each time, and you can use the standard nuget pack command as a post-build step. As a nice bonus, it'll force everyone working on the code to really think through the dependencies when making changes.
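If you go this route, each consuming solution only needs the internal feed registered in its NuGet.config - a minimal sketch, with a made-up feed URL:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- The internal feed URL is illustrative -->
    <add key="Internal" value="https://nuget.example.com/nuget" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>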
We have a bunch of legacy DLLs (basically could be anything; some are old Fortran, some .NET) and we want to move them to Azure Artifacts. Can I create NuGet packages from these legacy DLLs that are not based on .NET (like the Fortran ones) by themselves?
I've already tried creating the NuGet packages, but I get warnings on my dependencies because it looks like they are trying to load the packages on a .NET framework. Is the only real workaround here to build, say, a .NET class library, reference the DLLs through it, and create a NuGet package with that library, adding the legacy DLLs as references?
Can I create a NuGet package (or other package) with legacy DLLs that do not target the .NET Framework?
The answer is yes.
You could put those legacy DLLs in the tools folder instead of the lib folder, like:
<files>
  <file src="legacy\*.dll" target="tools" />
  <!-- Other files -->
</files>
Then pack this .nuspec file in your build pipeline. The legacy DLLs will be located in the tools folder and will not be added as references.
Check the section From a convention-based working directory in the NuGet docs for some details.
Hope this helps.
I assume what you mean is that your managed code is using DllImport attributes to call native code, which in the .NET ecosystem is called Platform Invoke, or P/Invoke. I mention this because if you had googled terms around P/Invoke and NuGet, you probably would have had better luck finding existing Stack Overflow questions, blog posts and so on. For this reason it's useful to try to find out the official or commonly used names of the features you use, so you know what to search for. Unfortunately, I don't think the NuGet team has any docs on this scenario at the moment.
SDK-style projects support the runtimes\ directory in the package, although I think that's somewhat undocumented as well. I don't know about traditional projects using PackageReference (PR), but packages.config (PC) definitely does not support runtimes\. For PC projects, the package author typically (always?) includes build targets that copy the native assemblies after build. In the package, the native DLLs live elsewhere; often the author puts them in the build directory next to the targets file, but I think I've also seen targets copy from the runtimes directory so that the package supports both PC and SDK-style projects.
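Here is a minimal sketch of that targets-based approach; the package id MyCompany.Native and the DLL name are hypothetical:

<!-- build\MyCompany.Native.targets inside the package (names hypothetical) -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- Copy the native DLL next to the managed output so P/Invoke can resolve it at run time -->
    <None Include="$(MSBuildThisFileDirectory)mynative.dll">
      <Link>mynative.dll</Link>
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
  </ItemGroup>
</Project>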
I suggest you try to think of some commonly used native libraries that have .NET bindings and see how the package works (nupkg is just a zip renamed). My guesses would be sqlite or curl, or how asp.net core's kestrel web server bundles libuv (or did in earlier versions if it no longer does). I'm on vacation at the moment, so don't have the motivation to dig deep myself.
We are building all the solutions to a shared bin directory. Having different projects reference different versions of the same dependency is not healthy for our build.
So, we consolidated the dependencies - great. But now the versions start to drift again. We do not want to consolidate them manually every now and then - we want to prevent the drift completely.
Why do we not want to use Paket? The main reason is that it seems we would lose the ability to migrate the NuGet package dependencies to the new PackageReference items in the projects. Currently we have packages.config files, but we plan to replace them with the respective PackageReference items. That means we will use the NuGet support built into MSBuild, which seems to leave no place for Paket.
Now, I assume we are not unique in this world and others have the same problem we do. How do you solve it?
EDIT 1
We have our internal NuGet repo, but we use it for dependencies which do not have organic representation in Nuget.org and for sharing our own internal packages.
One approach is to consume only from the internal NuGet repository. This has challenges, like:
Who uploads the dependencies there? Developers? But then how to make sure they do not upload different versions of the same dependency? Dedicated people? Then they become a bottleneck.
A small thing, but we would need to block commits to the central NuGet.config.
Uploading a dependency to the internal NuGet repo is not immediate. You cannot just download it from NuGet.org and upload it to the internal repo, because that would miss any transitive dependencies. So, a process would have to be built around it.
It is all possible, but I am reluctant to go down that route ... Must be a better way.
EDIT 2
While we do plan to migrate to PackageReference, it will take time. And unfortunately, as long as we have Silverlight (another year, at least), a whole bunch of projects in the dedicated Silverlight solution (80+) cannot be migrated to PackageReference, because doing so makes it impossible to debug the code with VS 2015.
Next, suppose we do migrate ALL the projects and then externalize all the PackageReference items to a single targets file imported by all the projects. This is feasible when using a shared bin directory, as we plan to do. But when inspected in VS 2017, this setup paints the misleading picture that every single project depends on the entire set of NuGet dependencies.
I would rather avoid this.
Once you move to PackageReference, you can take advantage of MSBuild. For example, you can have an MSBuild file that contains all your dependency versions. It could be a file that you need to <Import ... /> in all your csproj files, or you could use Directory.Build.props. Then, in each of your projects, change the version number in every <PackageReference> to an MSBuild property you previously defined. Most of Microsoft's open source repositories use this technique, with minor variations in file names and in whether it's imported automatically via Directory.Build.props or with an explicit <Import ... />.
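A minimal sketch of the technique, with illustrative file, property and package names:

<!-- Directory.Build.props at the repository root -->
<Project>
  <PropertyGroup>
    <!-- The single place where this dependency's version is defined -->
    <NewtonsoftJsonVersion>12.0.3</NewtonsoftJsonVersion>
  </PropertyGroup>
</Project>

<!-- In each csproj, the literal version is replaced by the shared property -->
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="$(NewtonsoftJsonVersion)" />
</ItemGroup>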
While you can still use the Package Manager UI in Visual Studio to check for updates, you won't be able to update the package versions with it (at least, it won't preserve how and where the versions are defined). However, just make sure your MSBuild file that defines the versions is in your solution, so you can trivially open the file in Solution Explorer and then type the new version number in. Adding new package references is slightly more effort, but it's generally not done often, and it's still very easy with SDK-style projects, since Visual Studio lets you edit the csproj while the project is still loaded.
Since you didn't accept the other solution, maybe you could take a look at Paket.
It's a package manager for .NET that (among other features) maintains a solution-wide dependency lock file. It is very customizable, and while it solves LOTS of problems, like any tool it creates some new ones. In my experience, the new ones are far less infuriating :)
My company is moving to using NuGet for our internal dependencies for desktop applications. This works fine for versioned imports, but in some cases (like during pre-Beta on a product) we'd like to grab the latest version of the dependency on our build servers and have the csproj files find it without issue.
We'd like to use automatic package restore, but that seems to be constrained by a specific version (as noted in this question). Using nuget restore followed by nuget update is also a possibility, but it doesn't seem to work solution-wide the way that restore does (and we have a couple dozen projects that have to share the same version of the same dependency).
Our best solution so far has been to add a hint path to the dependency binary in a non-versioned manner, i.e.,
<Reference Include="Dependency">
  <SpecificVersion>False</SpecificVersion>
  <HintPath>..\..\packages\Dependency\lib\net40\Dependency.dll</HintPath>
</Reference>
And use the pre-build event to run
nuget.exe install Dependency -NoCache -ExcludeVersion
Is there a better way to do this? It would be nice to do it the most standard way possible so that we can get tooling support and new developers to the project can more easily know how to add their own dependencies via NuGet.
As of NuGet.exe v2.8.3, there isn't any way to do a solution-wide restore and update (at least when not all the projects in a given folder hierarchy are part of the solution). We ended up using the workflow described in How do I update a single nuget package in a project from the command line?.
I have a NuGet package that can be applied to any type of C# project.
It has a file that is added to the project as part of the package. The nuspec looks like this:
<files>
  <file src="Content\App_Start\StartUpCode.cs.pp" target="content\App_Start" />
</files>
I am using WebActivator to run the code in the file at application start.
I run into a problem when the NuGet package is applied to several projects in the same solution: I get several copies of StartUpCode.cs added, and as a result WebActivator runs the code several times.
How can I stop this code from being added to a project that is not web related? I.e. it's cool to add it to a WebAPI project, or a WebForms project, but not a class library.
I don't think there's anything in the NuGet spec that would allow you to do that easily. Maybe use a PowerShell install script and detect the type of project the package is being installed into, and/or whether the assembly has been referenced previously?
Personally, I'd split it into two NuGet packages: one with the business logic, and another with the WebActivator dependency.
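A rough sketch of how the web-specific package's nuspec could express that split (ids and versions are illustrative):

<metadata>
  <id>MyLib.Web</id>
  <version>1.0.0</version>
  <dependencies>
    <!-- The core package carries the business logic; only this package pulls in WebActivator -->
    <dependency id="MyLib.Core" version="1.0.0" />
    <dependency id="WebActivatorEx" version="2.0.0" />
  </dependencies>
</metadata>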
Say I have the following solution with multiple versions of the same code, each targeting a different framework, and I would like to generate a NuGet package from it.
SharedLib.sln
  SharedLib.Net35.csproj
    packages.config
  SharedLib.Net40.csproj
    packages.config
  SharedLib.Phone.csproj
    packages.config
  SharedLib.SL4.csproj
    packages.config
The expected nupkg has the following structure:
SharedLib.1.0.nupkg
  lib/net35/SharedLib.dll
  lib/net40/SharedLib.dll
  lib/sl4-wp/SharedLib.dll
  lib/sl4/SharedLib.dll
nuget.exe pack SharedLib.SL4.csproj will automatically determine that the target framework is Silverlight 4 and place the binaries in lib/sl4.
I know I can add a SharedLib.SL4.nuspec file with a <files> section to include the binaries from the other projects, but is there a way to make NuGet automatically place the combined solution output into the proper structure (and also detect dependencies in the packages.config files of all projects)?
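For reference, a sketch of what that manual <files> section might look like (the bin paths are illustrative):

<files>
  <file src="SharedLib.Net35\bin\Release\SharedLib.dll" target="lib\net35" />
  <file src="SharedLib.Net40\bin\Release\SharedLib.dll" target="lib\net40" />
  <file src="SharedLib.SL4\bin\Release\SharedLib.dll" target="lib\sl4" />
  <file src="SharedLib.Phone\bin\Release\SharedLib.dll" target="lib\sl4-wp" />
</files>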
No, there's currently no way to do this other than to write a custom build script that puts the files in the right place and then runs NuGet pack on them, or to take the .nuspec approach you described.
This is a feature we'd like to have, but haven't thought of a good way to do it. However, your post just gave me an idea.
Today, you can point nuget pack at a .csproj file.
We could consider an approach that allowed you to point it at a .sln file and, if the project names followed some convention, we'd package all the projects into a single package.
If you really want this feature, consider logging an issue in the NuGet issue tracker. http://nuget.codeplex.com/workitem/list/basic