I am in the process of setting up a whole series of NuGet packages for our framework. Besides the simple binary packages (.dlls for modules of the framework), there are also packages that deliver source code into the projects that use them, done with the \content directory in the NuGet package.
To develop this source code I have a test/sandbox project. I develop, debug, and fix the code in this project, and once it is final for the next release, I copy it over to the package's content folder, where I replace things like $rootnamespace$ etc. This needs to be done for each and every version of the package.
Another way is to keep only the final source with the $rootnamespace$ tokens in it and maintain that directly. But then testing/debugging would mean re-adding the package to a test project and debugging it there, going back to the package content, modifying it, rebuilding, re-adding, and testing again.
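(For context, this is what such a token-bearing content file looks like: NuGet transforms files whose names end in .pp at install time, replacing $rootnamespace$ with the consuming project's root namespace. The class below is just an illustration.)

```csharp
// StringHelpers.cs.pp - shipped under content\ in the package.
// On install, NuGet strips the .pp suffix and substitutes the token,
// so the file is not valid C# until it lands in the consuming project.
namespace $rootnamespace$.Helpers
{
    internal static class StringHelpers
    {
        public static bool IsEmpty(string value)
        {
            return string.IsNullOrEmpty(value);
        }
    }
}
```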
So I see two ways to maintain the content source code (neither of them is really good):
Keep the source code in \content as small as possible and deliver as much as possible as binaries.
Generate the \content using some transformation engine (e.g. T4) from the sandbox/dev project. Which engine would be best to use for this?
In short: I haven't found a good workflow yet to maintain the "content" source code of NuGet packages. How are you doing this? Any ideas for such a workflow?
Check the http://github.com/maartenba/MvcSiteMapProvider build. It is customized quite a bit, but it basically does a find/replace on several namespaces, replacing them with replacement tokens right before packaging.
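A minimal sketch of that idea in PowerShell; the paths and the MyFramework.Sandbox namespace are made up, and the actual MvcSiteMapProvider build is more involved:

```powershell
# Copy every sandbox source file into the package staging folder as a .pp
# transform, swapping the development namespace for the replacement token,
# then pack. ($$ escapes a literal $ in the regex replacement string.)
Get-ChildItem .\Sandbox -Filter *.cs | ForEach-Object {
    $target = Join-Path .\Package\content ($_.Name + '.pp')
    (Get-Content $_.FullName -Raw) -replace 'MyFramework\.Sandbox', '$$rootnamespace$$' |
        Set-Content $target
}
.\nuget.exe pack .\Package\MyFramework.nuspec
```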
We are building all the solutions to a shared bin directory. Having different projects reference different versions of the same dependency is not healthy for our build.
So, we consolidated the dependencies - great. But now the versions start to drift again. We do not want to consolidate them manually every now and then - we want to prevent the drift completely.
Why do we not want to use Paket? The main reason is that it seems we would lose the ability to migrate the NuGet package dependencies to the new PackageReference items in the projects. So, currently we have packages.config files, but we plan to replace them with the respective PackageReferences. That means we will use the internal NuGet support in MSBuild, which seems to leave no place for Paket.
Now, I assume we are not unique in this world and others have the same problem we do. How do you solve it?
EDIT 1
We have our internal NuGet repo, but we use it for dependencies that have no organic representation on NuGet.org and for sharing our own internal packages.
One approach is to consume only from the internal NuGet repository. This has challenges, like:
Who uploads the dependencies there? Developers? But then how to make sure they do not upload different versions of the same dependency? Dedicated people? Then they become a bottleneck.
Small thing, but we need to block commits to the central NuGet.config
Uploading a dependency to the internal NuGet repo is not immediate. You cannot just download it from NuGet.org and upload it to the internal repo, because that would miss any transitive dependencies. So a process would have to be built around it (see the sketch below).
It is all possible, but I am reluctant to go down that route ... Must be a better way.
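To be concrete, that process would have to look something like the following, since nuget install pulls a package plus all its transitive dependencies into a folder, after which each .nupkg can be pushed. The package ID, feed URL, and API key here are placeholders:

```powershell
# Pull the package and all transitive dependencies into a staging folder,
# then push each downloaded .nupkg to the internal feed.
nuget install Newtonsoft.Json -OutputDirectory .\staging
Get-ChildItem .\staging -Recurse -Filter *.nupkg | ForEach-Object {
    nuget push $_.FullName -Source https://nuget.example.local/nuget -ApiKey $env:INTERNAL_FEED_KEY
}
```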
EDIT 2
While we do plan to migrate to PackageReference, it will take time. And unfortunately, as long as we have Silverlight (another year, at least), a whole bunch of projects in the dedicated Silverlight solution (80+) cannot be migrated to PackageReference, because doing so would make it impossible to debug the code with VS 2015.
Next, suppose we do migrate ALL the projects and then externalize all the PackageReference items into a single targets file imported by all the projects. This is feasible when using a shared bin directory, as we plan to do. But when inspected in VS 2017, this setup communicates the misleading picture that every single project depends on the entire set of NuGet dependencies.
I would rather avoid this.
Once you move to PackageReference, you can take advantage of MSBuild. For example, you can have an MSBuild file that contains all your dependency versions. It could be a file that you need to <Import ... /> in all your csproj files, or you could use Directory.Build.props. Finally, in each of your projects, change the version number in any <PackageReference> to an MSBuild property that you previously defined. Most of Microsoft's open source repositories use this technique, with minor variations in file names and in whether it's imported automatically via Directory.Build.props or with an explicit <Import ... />.
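A minimal sketch of that setup (the property name and package are just examples):

```xml
<!-- Directory.Build.props at the repository root; MSBuild 15+ / VS 2017
     imports it automatically into every project underneath it. -->
<Project>
  <PropertyGroup>
    <NewtonsoftJsonVersion>11.0.2</NewtonsoftJsonVersion>
  </PropertyGroup>
</Project>
```

Each project then refers to the shared property instead of a literal version:

```xml
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="$(NewtonsoftJsonVersion)" />
</ItemGroup>
```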
While you can still use the Package Manager UI in Visual Studio to check for updates, you won't be able to update the package versions with it (at least, it won't preserve how and where the versions are defined). However, if you make sure the MSBuild file that defines the versions is part of your solution, you can trivially open it from Solution Explorer and type in the new version number. Adding new package references takes slightly more effort, but it's generally not done often, and it's still very easy with SDK-style projects, since Visual Studio lets you edit the csproj while the project is still loaded.
Since you didn't accept the other solution, maybe you could take a look at Paket.
It's a package manager for .NET that (among other features) holds a solution-wide dependency lock file. It is very customizable, and while it solves lots of problems, like any tool it creates some new ones. In my experience, the new ones are far less infuriating :)
I am reviewing our TFS access code after we upgraded to VS 2017 and VSTS Online.
I found out from another question on this site that the recommended way to access the TFS libraries is via a NuGet package.
Great, that's surely better than referencing from the Team Explorer installation folder.
However, the NuGet package in question added over 45 references to my project.
I believe I am only using 4-6 of them.
I found this question which discussed the fact that the package files do not have to go into source control.
That's good to know.
However, the references have been added as "Copy Local" and so they are all currently being copied to my output directory. This has caused my application to more than treble in size. It just doesn't seem like good practice.
Do people usually just ignore this and trade off against the fact they are getting great dependency management?
Or manually remove the non-required references...? Do future updates put the references back?
Or have I incorrectly consumed the package in some way...?
There are a lot of NuGet questions on this site. I did search but please accept my apologies if this is a duplicate.
Do people usually just ignore this and trade off against the fact they are getting great dependency management?
Adding all dependencies to the project is the default behavior of NuGet. At the moment, there is no option to pick only some of those dependencies.
Although all dependencies are added to the project as "Copy Local", when we publish the application we can exclude the unneeded ones by changing their Publish Status from Include (Auto) to Exclude in the project's Publish settings (Application Files dialog).
In this case, those non-required references are not included in the published application.
Or manually remove the non-required references...? Do future updates put the references back?
Yes, you can manually remove those non-required references, but the next time you update the package, the removed references will be re-added.
Besides, as you said, you are only using 4-6 of them. You could try building a custom NuGet package that includes only those 4-6 references.
Create nuget package from dlls
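A rough sketch of such a package definition; the ID, version, and assembly names below are placeholders (the two TFS client assemblies are just examples of ones you might keep):

```xml
<?xml version="1.0"?>
<!-- Minimal .nuspec that repackages only the handful of assemblies
     actually used, built with: nuget pack Internal.Tfs.Client.Slim.nuspec -->
<package>
  <metadata>
    <id>Internal.Tfs.Client.Slim</id>
    <version>1.0.0</version>
    <authors>YourTeam</authors>
    <description>Only the TFS client assemblies we actually use.</description>
  </metadata>
  <files>
    <file src="libs\Microsoft.TeamFoundation.Client.dll" target="lib\net45" />
    <file src="libs\Microsoft.TeamFoundation.WorkItemTracking.Client.dll" target="lib\net45" />
  </files>
</package>
```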
Hope this helps.
I'm trying to use nuget.exe outside of Visual Studio as part of our build infrastructure. The idea is that the various build tools are fetched by a bootstrapper script that initializes a working copy. The bootstrapper does this using a file that specifies the required tools and their versions.
Broken approach 1 - use manually edited packages.config
At first, it seemed like a good idea to keep a manually edited packages.config file and use nuget restore to install the packages during bootstrapping. However, this does not work for tools that have dependencies, unless I also list every single dependency in the packages.config, much too arduous an approach to be feasible, because I found no easy way to recursively find all dependencies of a package.
See also: using nuget.exe commandline to install dependency.
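For reference, this is the kind of hand-maintained file the approach implies (IDs and versions are just examples); the problem is that every transitive dependency would have to be listed here as well:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hand-maintained tool list: nuget restore installs exactly what is
     listed here and nothing more, so transitive dependencies would have
     to be enumerated manually. -->
<packages>
  <package id="NUnit.Runners" version="2.6.4" />
  <package id="OpenCover" version="4.6.519" />
</packages>
```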
Broken approach 2 - use nuget install to update packages.config
The second idea was then to use nuget install to install the packages and let that command update the packages.config, very similar to the Install-Package cmdlet in the Package Manager Console. But, surprisingly, nuget install does not support this! It takes either a packages.config or a package ID as a parameter, but I found no way to update the packages.config with the new package and its dependencies.
This problem is also described in another (two-year-old) SO question, see nuget.exe install not updating packages.config (or .csproj)?.
Is there a working (and non-hacky) approach at all?
This must be a problem that many people face when using nuget outside of VS, so what is the best approach in that case?
Of course, I could just parse the packages.config and emit a nuget install for each package, but I really don't want to re-invent the dependency management part of NuGet; that is what I'm using NuGet for in the first place. So I'm left with the feeling that either a -WithDependencies switch on nuget restore or an -UpdatePackagesConfig switch on nuget install is missing...
Note that there are other SO questions regarding the broken approaches described above. What I'd like to know is what the best approach is to the root problem, i.e. managing packages with dependencies outside of VS.
nuget install does not currently make changes to the project file.
nuget install can be used either to restore the NuGet packages listed in a packages.config file or to download and extract a single package by ID.
If you do not need the project to be modified, then your solution of reading the packages in the packages.config file and calling nuget install seems like a reasonable approach.
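A minimal sketch of that loop, assuming nuget.exe is on the path and a conventional packages folder:

```powershell
# Read each package entry from packages.config and install it (nuget
# install also pulls in its transitive dependencies).
[xml]$config = Get-Content .\packages.config
foreach ($pkg in $config.packages.package) {
    nuget install $pkg.id -Version $pkg.version -OutputDirectory .\packages
}
```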
If you need the project to be modified then you could look at one of the following:
1. Ripple - a command line tool that adds extra features to NuGet. It has a ripple install command which is similar to nuget install but also updates the project file. It has a lot of other features for supporting build servers, so this might be a good fit.
2. NuGet packages outside of Visual Studio with SharpDevelop - this was an experiment I put together to see whether full NuGet support, including PowerShell scripts, could be achieved from the command line without using Visual Studio. It uses PowerShell and quite a bit of SharpDevelop.
3. Customise NuGet.exe to do what you need. nuget update, for example, does modify the project file, at least for file references, but will not run PowerShell scripts. So you could take the NuGet.exe source code and extend it.
Of the above, only 3) would give you exactly what you need. The other two would require a bit of work to read the packages from the packages.config file, or some other list, and then install them.
See my answer to Why does the nuget command line tool not follow dependencies?
nuget install My.Package.Id will follow dependencies. However, if you want to install multiple packages, you will need to create a batch file with a separate nuget install command for each package (see the example below). These are your top-level packages; you don't need to "install" the dependencies, as they will be downloaded automatically.
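For example (the package IDs are placeholders for your actual top-level tools):

```
nuget install NUnit.Runners -OutputDirectory packages
nuget install OpenCover -OutputDirectory packages
```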
If you ultimately want a packages.config file, I imagine you can generate one by enumerating all the packages that were downloaded. However, you would have to take care not to include multiple versions of the same package.
I believe the way NuGet 3 works with project.json files may do what you are looking for. Essentially, my understanding is that the unit of dependency becomes the package and not necessarily the individual assemblies. From what I recall, the idea is to have only one place to manage these package references, which the project (IDE/editor), package manager, and other command line tools can all use.
What I don't understand, and feel somewhat frustrated about, is that the project.json concept appears to have been cancelled. I don't know if there are plans to reintroduce it at any point in the future. In the meantime I keep hearing updated information about tooling, such as NuGet, that takes advantage of project.json, so where you can actually rely on it is unclear to me at this point.
I would like to use NuGet for building packages for core helper libraries, which I would like to add as source files to other projects. I want to use source files instead of libraries for several reasons, the main one being that I need them in SharePoint projects, where source files are on the one hand much easier to deploy than additional libraries and on the other hand help to reduce version conflicts.
I know that I can add the source files as content to NuGet Packages, which would install them with the package. But this won't work together with package restore, and I don't want to have these files checked into source control in all projects.
Is it somehow possible to make a NuGet package which doesn't copy the files to the project, but instead adds file links, which point back to the file in the package folder, to the project? I think this approach would solve my use case.
Thanks
pascal
It is possible to add linked files using PowerShell scripts, for example with this NuGet package: http://www.nuget.org/packages/Baseclass.Contrib.Nuget.Linked/
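For a sense of the mechanism, a rough sketch of an install.ps1 doing this by hand; the "linkedContent" folder name and the .cs filter are assumptions, and for C#/VB projects ProjectItems.AddFromFile typically adds the item as a link (AddFromFileCopy would copy it instead):

```powershell
param($installPath, $toolsPath, $package, $project)

# Add every .cs file shipped in the package's linkedContent folder to the
# consuming project as a link back into the packages folder.
Get-ChildItem (Join-Path $installPath 'linkedContent') -Filter *.cs |
    ForEach-Object { $null = $project.ProjectItems.AddFromFile($_.FullName) }

$project.Save()
```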
Version control best practices.
When developing a program, I use third-party libraries, NUnit and others.
I want to share the sources of this program hosted on http://www.codeplex.com/ or http://code.google.com/hosting/.
What are good practices regarding third-party libraries?
Should I add the dlls of my third-party libraries to version control?
Thank you,
With the introduction of NuGet you have a different way to do this.
See this post by David Ebbo: Using NuGet without committing packages.
Basically you use NuGet to download and add package references to the libraries you want (assuming there are NuGet packages for the libraries you need), but do not add the packages folder to your repository.
Instead you modify the pre-build step of the projects that require packages so that they automatically download the required packages if they're not present.
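One way to wire that up is a pre-build event along these lines (the nuget.exe location is an assumption; later NuGet versions expose the same thing as nuget restore):

```
"$(SolutionDir)Tools\nuget.exe" install "$(ProjectDir)packages.config" -OutputDirectory "$(SolutionDir)packages"
```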
Testing has shown that this adds a minor delay to the build process when checking if the libraries are present, so this may or may not be good enough for you.
We always do, especially if we are linking against a specific version. We have an NUnit folder, for example, and then a version folder within it.