What's the idea of Microsoft's "....Abstractions" NuGet packages?

Can someone explain the general idea behind providing ... Abstraction packages?
As an example, when I search for the word "hosting" in VS NuGet package manager, in the list of findings there are:
Microsoft.Extensions.Hosting
Microsoft.Extensions.Hosting.Abstractions
Microsoft.AspNetCore.Hosting.Server.Abstractions
Microsoft.AspNetCore.Hosting.Abstractions
Microsoft.AspNetCore.Hosting
Microsoft.Extensions.Hosting.WindowsServices
Are these all related? Some seem to be platform-dependent (AspNetCore) while others are not. Is there a general rule that tells me when to choose which?
Suppose I want to implement BackgroundService in a .NET 5 class library; which of these NuGet packages should I install? (It seems that Microsoft.Extensions.Hosting.Abstractions works fine for me, but I had to try that out.)
Thanks

The idea is that a library or package you provide should reference only the Abstractions packages, so it stays compatible with whatever implementation the consuming application chooses.
E.g. if you ship a company-wide NuGet package with some business logic or a custom client in it, you may want to use ILogger / ILogger<T> for logging but not actually depend on any logging implementation (neither the built-in loggers nor Serilog, etc.), so you reference just the logging abstractions package for those interfaces.
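To make that concrete, here is a minimal sketch (the worker class, namespace, and behaviour are made up for illustration): a class library can ship a BackgroundService like this while referencing only Microsoft.Extensions.Hosting.Abstractions and Microsoft.Extensions.Logging.Abstractions, leaving the choice of host and logging provider to the consuming application.
// Class library: references only the *.Abstractions packages.
// Microsoft.Extensions.Hosting.Abstractions supplies BackgroundService,
// Microsoft.Extensions.Logging.Abstractions supplies ILogger<T>.
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

namespace MyCompany.Workers // hypothetical namespace
{
    public class PollingWorker : BackgroundService // hypothetical example class
    {
        private readonly ILogger<PollingWorker> _logger;

        public PollingWorker(ILogger<PollingWorker> logger) => _logger = logger;

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            while (!stoppingToken.IsCancellationRequested)
            {
                // The concrete logger (console, Serilog, ...) is chosen by the host app.
                _logger.LogInformation("Doing a unit of work");
                await Task.Delay(5000, stoppingToken);
            }
        }
    }
}
The application that consumes the package (an ASP.NET Core site, a Windows service built on Microsoft.Extensions.Hosting.WindowsServices, etc.) references the full Microsoft.Extensions.Hosting package and registers both the worker and a concrete logging provider.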

Related

Service Fabric: Plugins vs. Application Types

I'm developing a Service Fabric-based trading platform that will host hundreds of different long-running trading algorithms, all of which conform to a common interface and share a good deal of common code but can be vastly different in their internal specifics. I could model each of the different algos as an application type (which I'd dynamically load), but given the large number of different algos I have to wonder if it makes more sense to create a single Plugin Runner application type and then implement the algos as plugins.
In a related question, I understand how to implement a plugin architecture, in general, but I'm not quite sure where one would place the actual plugins in order to be discoverable by an instance running on Service Fabric.
Anyway, thanks for your help....
Both approaches can work I think. Using lots of Application Types adds the (significant) overhead of running lots of processes, but allows you to use and upgrade multiple versions of the same algorithm running simultaneously.
Using the plugin approach requires you to deal with versioning yourself.
Using the Application approach probably requires some kind of request router, while the plugin service could make its own decisions (if it's stateless).
You can create a Stateful service that acts as the plugin repository, or mount a file share, or use a database, no restrictions from the platform here. You can use naming conventions to locate the proper plugin.
The following approach could work if an application upgrade is acceptable to you when changing the set of plugins needed for a given application instance.
Recall that Service Fabric apps must be packaged before deployment or upgrade. Using either MSBuild tasks or PowerShell, you could copy your plugin DLLs to the plugin runner service's code package as a post-packaging step prior to the app upgrade. Then your plugin DLLs would be available to the service at startup using Assembly.Load and the code package's path, available in your service implementation's Context.CodePackageActivationContext.GetCodePackageObject("Your-Code-Package-Name").Path property. The code package's name is defined in ServiceManifest.xml, and is named Code by default.
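A rough sketch of that loading step is below; the plugin-runner service type, the code package name "Code", and the *.Plugin.dll naming convention are illustrative assumptions, not anything prescribed by Service Fabric.
// Plugin-runner service: load plugin assemblies that were copied into its code package.
using System.Fabric;
using System.IO;
using System.Reflection;
using Microsoft.ServiceFabric.Services.Runtime;

internal sealed class PluginRunnerService : StatelessService
{
    public PluginRunnerService(StatelessServiceContext context)
        : base(context)
    {
    }

    private void LoadPlugins()
    {
        // "Code" is the default code package name from ServiceManifest.xml.
        string codePath = Context.CodePackageActivationContext
            .GetCodePackageObject("Code").Path;

        // Naming convention (assumed): plugin assemblies end in ".Plugin.dll".
        foreach (string dllPath in Directory.GetFiles(codePath, "*.Plugin.dll"))
        {
            // Loading by AssemblyName avoids loading the same assembly twice
            // from different file paths.
            Assembly pluginAssembly = Assembly.Load(AssemblyName.GetAssemblyName(dllPath));

            // ...reflect over pluginAssembly to find types implementing the
            // shared algorithm interface and instantiate them here.
        }
    }
}
LoadPlugins could be called from the service constructor or at the start of RunAsync, before any algorithms are scheduled.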

What are the application implications of a netstandard library depending on a metapackage?

Suppose I have a class library which I want to target netstandard1.3, but also use BigInteger. Here's a trivial example - the sole source file is Adder.cs:
using System;
using System.Numerics;

namespace Calculator
{
    public class Adder
    {
        public static BigInteger Add(int x, int y)
            => new BigInteger(x) + new BigInteger(y);
    }
}
Back in the world of project.json, I would target netstandard1.3 in the frameworks section, and have an explicit dependency on System.Runtime.Numerics, e.g. version 4.0.1. The NuGet package I create will list just that dependency.
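For reference, the project.json shape being described would have looked roughly like this (a from-memory sketch; the dependency could equally sit in a top-level dependencies section):
{
  "frameworks": {
    "netstandard1.3": {
      "dependencies": {
        "System.Runtime.Numerics": "4.0.1"
      }
    }
  }
}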
In the brave new world of csproj-based dotnet tooling (I'm using v1.0.1 of the command-line tools) there's an implicit metapackage package reference to NETStandard.Library 1.6.1 when targeting netstandard1.3. This means that my project file is really small, because it doesn't need the explicit dependency:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.3</TargetFramework>
  </PropertyGroup>
</Project>
... but the NuGet package produced has a dependency on NETStandard.Library, which suggests that in order to use my small library, you need everything there.
It turns out I can disable that functionality using DisableImplicitFrameworkReferences, then add in the dependency manually again:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.3</TargetFramework>
    <DisableImplicitFrameworkReferences>true</DisableImplicitFrameworkReferences>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="System.Runtime.Numerics" Version="4.0.1" />
  </ItemGroup>
</Project>
Now my NuGet package says exactly what it depends on. Intuitively, this feels like a "leaner" package.
So what's the exact difference for a consumer of my library? If someone tries to use it in a UWP application, does the second, "trimmed" form of dependencies mean that the resulting application will be smaller?
By not documenting DisableImplicitFrameworkReferences clearly (as far as I've seen; I read about it in an issue) and by making the implicit dependency the default when creating a project, Microsoft are encouraging users to just depend on the metapackage - but how can I be sure that doesn't have disadvantages when I'm producing a class library package?
In the past, we've given developers the recommendation to not reference the meta package (NETStandard.Library) from NuGet packages but instead reference individual packages, like System.Runtime and System.Collections. The rationale was that we thought of the meta package as a shorthand for a bunch of packages that were the actual atomic building blocks of the .NET platform. The assumption was: we might end up creating another .NET platform that only supports some of these atomic blocks but not all of them. Hence, the fewer packages you reference, the more portable you'd be. There were also concerns regarding how our tooling deals with large package graphs.
Moving forward, we'll simplify this:
.NET Standard is an atomic building block. In other words, new platforms aren't allowed to subset .NET Standard -- they have to implement all of it.
We're moving away from using packages to describe our platforms, including .NET Standard.
This means you'll no longer have to reference any NuGet packages for .NET Standard. You express your dependency with the lib folder, which is exactly how it has worked for all other .NET platforms, in particular .NET Framework.
However, right now our tooling will still burn in the reference to NETStandard.Library. There is no harm in that either; it will just become redundant moving forward.
I'll update the FAQ on the .NET Standard repo to include this question.
Update: This question is now part of the FAQ.
The team used to recommend figuring out what the slimmest package set was. They no longer do this, and recommend people just bring in NETStandard.Library instead (in the case of an SDK-style project, this will be done automatically for you).
I've never gotten a totally straightforward answer as to why that was, so allow me to make some educated guesses.
The primary reason is likely that it allows them to hide the differences in versions of the dependent libraries that you would otherwise be required to track yourself when changing target frameworks. It's also a much more user-friendly system with the SDK-based project files, because you frankly don't need any references to get a decent chunk of the platform (just like you used to with the default references in Desktop-land, especially mscorlib).
By pushing the meta-definition of what it means to be a netstandard library, or a netcoreapp application, into the appropriate NuGet package, they don't have to build any special knowledge into the definition of those things as Visual Studio (or dotnet new) sees them.
Static analysis could be used during publishing to limit the shipped DLLs, which is something they do today when doing native compilation for UWP (albeit with some caveats). They don't do that today for .NET Core, but I presume it's an optimization they've considered (as well as supporting native code).
There's nothing stopping you from being very selective, if you so choose. I believe you'll find that you're nearly the only one doing it, which also defeats the purpose (since it'll be assumed everybody is bringing in NETStandard.Library or Microsoft.NETCore.App).
You shouldn't need to disable the implicit reference. All platforms that the library will be able to run on will already have the assemblies that the NETStandard.Library dependency would require.
The .NET Standard Library is a specification: a set of reference assemblies that you compile against, providing a set of APIs that are guaranteed to exist on a known set of platforms and versions of platforms, such as .NET Core or the .NET Framework. It is not an implementation of these assemblies, just enough of the API shape to allow the compiler to successfully build your code.
The implementation of these APIs is provided by a target platform, such as .NET Core, Mono or .NET Framework. They ship with the platform, because they are an essential part of the platform. So there is no need to specify a smaller dependency set - everything's already there, and you won't change that.
The NETStandard.Library package provides these reference assemblies. One point of confusion is the version number - the package is version 1.6.1, but this does not mean ".NET Standard 1.6". It's just the version of the package.
The version of the .NET Standard you're targeting comes from the target framework you specify in your project.
If you're creating a library and want it to run on .NET Standard 1.3, you'd reference the NETStandard.Library package, currently at version 1.6.1. But more importantly, your project file would target netstandard1.3.
The NETStandard.Library package will give you a different set of reference assemblies depending on your target framework moniker (I'm simplifying for brevity, but think lib\netstandard1.0, lib\netstandard1.1 and dependency groups). So if your project targets netstandard1.3, you'll get the 1.3 reference assemblies. If you target netstandard1.6, you'll get the 1.6 reference assemblies.
If you're creating an application, you can't target the .NET Standard. It doesn't make sense - you can't run on a specification. Instead, you target concrete platforms, such as net452 or netcoreapp1.1. NuGet knows the mapping between these platforms and the netstandard target framework monikers, so knows which lib\netstandardX.X folders are compatible with your target platform. It also knows that the dependencies of NETStandard.Library are satisfied by the target platform, so won't pull in any other assemblies.
Similarly, when creating a standalone .NET Core app, the .NET Standard implementation assemblies are copied with your app. The reference to NETStandard.Library does not bring in any other new assemblies.
Note that dotnet publish will create a standalone application, but it doesn't currently do trimming, and will publish all assemblies. This will be handled automatically by tooling, so again, trimming dependencies in your library won't help here.
The only place I can imagine where it might help to remove the NETStandard.Library reference is if you are targeting a platform that doesn't support .NET Standard, and you can find a package from the .NET Standard packages whose transitive dependencies can all run on your target platform. I suspect there aren't many packages that would fit that bill.

NuGet Server With Caching

I have a build server that pulls NuGet packages on every build, and currently have a NuGet Gallery deployed internally for custom packages. Right now that eats bandwidth like no tomorrow (not a huge deal, but I want to be kind and make things faster for us).
So I want to auto-mirror repos and cache them.
So far I have a few options: MyGet, which is a cloud-only offering (so no), and ProGet (which I'm leaning towards). Are there any other options for auto-mirroring I'm missing?
Klondike is an open-source NuGet package manager that you can deploy privately to your own servers.
Inedo's ProGet is by far the most popular choice for on-prem NuGet servers, but both JFrog and Sonatype have options as well.
You didn't mention whether you have multiple locations to support in your setup. If you do, I'd like to point you to this new feature developed by Inedo (ProGet) and MyGet in partnership:
http://blog.myget.org/post/2015/01/28/Introducing-MyGet-Feed-Sync.aspx
BaGet is the best tool if you're looking for something similar to Verdaccio or devpi. It is open source and capable of caching packages by enabling read-through caching.

All-in-one installers or Core + Plugin sub-installers?

Build everything into one installable unit
Pros
One package to test and deploy to all environments
All plugins installed but possibly not registered for use in the config
Cons
Plugins may be in various states of the pipeline; how do you deploy only the good ones?
How to handle registering which plugins to deploy to which environment
Hard to change your mind; there might be a month between the dev build and the prod push
Build Core Installer (no plugins) + Sub installers (only the plugin)
Pros
Smaller footprint to prod; less room for errors
Cons
Possibility of integration errors between plugins since they might be installed in various orders
How to roll back a deployment when the previous deployment could be a strange assortment of core and sub-installers; you need a way to track what the specific installation contains
How to reproduce errors in QA when QA has all plugins and prod may have a smaller, possibly older, subset.
These are my thoughts on the issue. I have been struggling to have my cake and eat it too, but I seem to be stuck with these two choices. Has anybody else struggled with this issue, and how did you resolve it? Any other pros and cons that I missed? So far I have chosen the all-or-nothing approach, but I am open to ideas.
Thanks in advance.
Build everything is easier to test and to deploy. At build time and through testing, you guarantee that all the plugins are compatible with each other. Depending on the nature of the product, you can create bundles of plugins, which can be selected during installation.
Of course, there should be an option to remove the plugins that are not yet production-ready from the installation package. But ensure QA gets what goes to customers or stakeholders.
With the separate-packages approach, you have to implement dependency tracking and so forth. It is more flexible, which results in lots of possible combinations.
I'd choose the first option: one single package with everything and ability to fine-tune the selected features/plugins.
There's also one more option: a combination of the two approaches above. Consider the Eclipse project: it has a common platform and plugins. One can download a package with the set of plugins that are usually used in a particular environment. Other plugins can be installed later, if needed. So you combine your core with several logically connected plugins; other plugins could be added to the installation later.

How do I add Platform Update 1 to my bootstrapper?

I have been playing around with the new StateMachine workflow that has been added to Windows Workflow as part of Platform Update 1 (see also). I now want to look at installing what I've created and therefore need to make sure my bootstrapper is up-to-date. In the future, I will be moving to WIX but right now, for the purposes of prototyping, I'm just using a regular Setup and Deployment project and its bootstrap support.
The list of standard prerequisites does not include PU1 as an option. Therefore, how can I add support for it?
Update
I found this answer on Stack Overflow regarding custom prerequisites, which led me to this article on MSDN, which led me to create my own prerequisite. However, I got a new error about mismatched framework requirements. I suspect I need to pick apart the multi-targeting support and the existing .NET Framework prerequisite package to see how to make a new prerequisite that will work correctly.
I've had a stab at creating my own bootstrapper packages for this. The results are here to download. Note that these are entirely untested and provided as-is - use at your own risk. However, feedback is welcome. Hopefully Microsoft will provide an official solution.
See How to detect if the .NET Framework Platform Update 1 is installed
Is the Microsoft .NET Framework 4 Platform Update 1 - Runtime Update (KB2478063) what you are looking for? See here for the download.