Deploying an application that utilises .NET Reactive Extensions - System.Reactive

Our application utilises Reactive Extensions (Rx). These are normally installed via the downloadable package from Microsoft. However, when we ship our application we supply copies of the DLLs (namely System.CoreEx.dll and System.Reactive.dll). There appear to be two versions in the GAC, v1.0.2787.0 and v1.0.2856.0. We reference a specific one and ship the appropriate version of the DLLs.
However, when the application launches it shows an error dialog stating that the Rx DLLs must be installed in the GAC. It also requests the DLLs for the other version of Rx, e.g. if you are referencing 1.0.2787.0 it will request 1.0.2856.0 and vice versa.
Has anyone got around this problem?

The Rx assemblies don't need to be installed into the GAC unless your application's assemblies are installed in the GAC. Does your application need to be installed in the GAC or can it run from the installation directory?
It also requests the DLLs for the other version of Rx, e.g. if you are referencing 1.0.2787.0 it will request 1.0.2856.0 and vice versa
It's unlikely that it's actually requesting across versions of Rx. You might want to double- and triple-check that all projects in your solution reference the correct (and same) version of the Rx assemblies.
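If, after checking, one project really does end up compiled against the other build, you can usually force everything onto the single version you ship with a binding redirect in the application's app.config. Something along these lines (the publicKeyToken is a placeholder, take the real token from the DLL you actually deploy, and add a second dependentAssembly entry for System.CoreEx):

<?xml version="1.0"?>
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Placeholder token: read the real one from the shipped System.Reactive.dll. -->
        <assemblyIdentity name="System.Reactive" publicKeyToken="0000000000000000" culture="neutral" />
        <!-- Redirect any reference to the other Rx build onto the version you deploy. -->
        <bindingRedirect oldVersion="1.0.2787.0-1.0.2856.0" newVersion="1.0.2787.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

If the redirect makes the error go away, that confirms one of the projects was still built against the other Rx drop.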

Related

What are the application implications of a netstandard library depending on a metapackage?

Suppose I have a class library which I want to target netstandard1.3, but also use BigInteger. Here's a trivial example - the sole source file is Adder.cs:
using System;
using System.Numerics;

namespace Calculator
{
    public class Adder
    {
        public static BigInteger Add(int x, int y)
            => new BigInteger(x) + new BigInteger(y);
    }
}
Back in the world of project.json, I would target netstandard1.3 in the frameworks section, and have an explicit dependency on System.Runtime.Numerics, e.g. version 4.0.1. The NuGet package I create will list just that dependency.
In the brave new world of csproj-based dotnet tooling (I'm using v1.0.1 of the command-line tools) there's an implicit package reference to the NETStandard.Library 1.6.1 metapackage when targeting netstandard1.3. This means that my project file is really small, because it doesn't need the explicit dependency:
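Roughly, that project.json would have looked something like this (from memory of the retired format, so treat it as a sketch):

{
  "frameworks": {
    "netstandard1.3": {}
  },
  "dependencies": {
    "System.Runtime.Numerics": "4.0.1"
  }
}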
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.3</TargetFramework>
  </PropertyGroup>
</Project>
... but the NuGet package produced has a dependency on NETStandard.Library, which suggests that in order to use my small library, you need everything there.
It turns out I can disable that functionality using DisableImplicitFrameworkReferences, then add in the dependency manually again:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.3</TargetFramework>
    <DisableImplicitFrameworkReferences>true</DisableImplicitFrameworkReferences>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="System.Runtime.Numerics" Version="4.0.1" />
  </ItemGroup>
</Project>
Now my NuGet package says exactly what it depends on. Intuitively, this feels like a "leaner" package.
So what's the exact difference for a consumer of my library? If someone tries to use it in a UWP application, does the second, "trimmed" form of dependencies mean that the resulting application will be smaller?
By not documenting DisableImplicitFrameworkReferences clearly (as far as I've seen; I read about it in an issue) and by making the implicit dependency the default when creating a project, Microsoft are encouraging users to just depend on the metapackage - but how can I be sure that doesn't have disadvantages when I'm producing a class library package?
In the past, we've given developers the recommendation to not reference the meta package (NETStandard.Library) from NuGet packages but instead reference individual packages, like System.Runtime and System.Collections. The rationale was that we thought of the meta package as a shorthand for a bunch of packages that were the actual atomic building blocks of the .NET platform. The assumption was: we might end up creating another .NET platform that only supports some of these atomic blocks but not all of them. Hence, the fewer packages you reference, the more portable you'd be. There were also concerns regarding how our tooling deals with large package graphs.
Moving forward, we'll simplify this:
.NET Standard is an atomic building block. In other words, new platforms aren't allowed to subset .NET Standard -- they have to implement all of it.
We're moving away from using packages to describe our platforms, including .NET Standard.
This means you'll not have to reference any NuGet packages for .NET Standard anymore. You expressed your dependency with the lib folder, which is exactly how it has worked for all other .NET platforms, in particular .NET Framework. However, right now our tooling will still burn in the reference to NETStandard.Library. There is no harm in that either; it will just become redundant moving forward.
I'll update the FAQ on the .NET Standard repo to include this question.
Update: This question is now part of the FAQ.
The team used to recommend figuring out what the slimmest package set was. They no longer do this, and recommend people just bring in NETStandard.Library instead (in the case of an SDK-style project, this will be done automatically for you).
I've never gotten a totally straightforward answer as to why that was, so allow me to make some educated guesses.
The primary reason is likely that it allows them to hide the differences in versions of the dependent libraries that you would otherwise be required to track yourself when changing target frameworks. It's also a much more user-friendly system with the SDK-based project files, because you frankly don't need any references to get a decent chunk of the platform (just like you used to with the default references in Desktop-land, especially mscorlib).
By pushing the meta-definition of what it means to be a netstandard library, or a netcoreapp application into the appropriate NuGet package, they don't have to build any special knowledge into the definition of those things as Visual Studio (or dotnet new) sees them.
Static analysis could be used during publishing to limit the shipped DLLs, which is something they do today when doing native compilation for UWP (albeit with some caveats). They don't do that today for .NET Core, but I presume it's an optimization they've considered (as well as supporting native code).
There's nothing stopping you from being very selective, if you so choose. I believe you'll find that you're nearly the only one doing it, which also defeats the purpose (since it'll be assumed everybody is bringing in NETStandard.Library or Microsoft.NETCore.App).
You shouldn't need to disable the implicit reference. All platforms that the library will be able to run on will already have the assemblies that the NETStandard.Library dependency would require.
The .NET Standard Library is a specification, a set of reference assemblies that you compile against that provides a set of APIs that are guaranteed to exist on a known set of platforms and versions of platforms, such as .NET Core or the .NET Framework. It is not an implementation of these assemblies, just enough of the API shape to allow the compiler to successfully build your code.
The implementations of these APIs are provided by a target platform, such as .NET Core, Mono or .NET Framework. They ship with the platform, because they are an essential part of the platform. So there is no need to specify a smaller dependency set - everything's already there, you won't change that.
The NETStandard.Library package provides these reference assemblies. One point of confusion is the version number - the package is version 1.6.1, but this does not mean ".NET Standard 1.6". It's just the version of the package.
The version of the .NET Standard you're targeting comes from the target framework you specify in your project.
If you're creating a library and want it to run on .NET Standard 1.3, you'd reference the NETStandard.Library package, currently at version 1.6.1. But more importantly, your project file would target netstandard1.3.
The NETStandard.Library package will give you a different set of reference assemblies depending on your target framework moniker (I'm simplifying for brevity, but think lib\netstandard1.0, lib\netstandard1.1 and dependency groups). So if your project targets netstandard1.3, you'll get the 1.3 reference assemblies. If you target netstandard1.6, you'll get the 1.6 reference assemblies.
If you're creating an application, you can't target the .NET Standard. It doesn't make sense - you can't run on a specification. Instead, you target concrete platforms, such as net452 or netcoreapp1.1. NuGet knows the mapping between these platforms and the netstandard target framework monikers, so knows which lib\netstandardX.X folders are compatible with your target platform. It also knows that the dependencies of NETStandard.Library are satisfied by the target platform, so won't pull in any other assemblies.
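To make that concrete, a consuming application's project file might look something like this (the package name Calculator and the versions are only illustrative):

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- Applications target a concrete platform, not netstandard. -->
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- NuGet resolves the library's netstandard1.3 assets as compatible with netcoreapp1.1. -->
    <PackageReference Include="Calculator" Version="1.0.0" />
  </ItemGroup>
</Project>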
Similarly, when creating a standalone .NET Core app, the .NET Standard implementation assemblies are copied with your app. The reference to NETStandard.Library does not bring in any other new assemblies.
Note that dotnet publish will create a standalone application, but it doesn't currently do trimming and will publish all assemblies. This will be handled automatically by tooling, so again, trimming dependencies in your library won't help here.
The only place I can imagine where it might help to remove the NETStandard.Library reference is if you are targeting a platform that doesn't support the .NET Standard, and you can find an individual .NET Standard package whose transitive dependencies can all run on your target platform. I suspect there aren't many packages that would fit that bill.

How are NuGet components intended to be distributed to end users?

I'm still learning about NuGet and have a (hopefully not stupid) question: how are the components/assemblies in NuGet packages intended to be distributed to end users? The docs I've found are all about how NuGet packages get installed on developer systems, but nothing addresses how the resulting software is shipped/installed.
My best guess is that the developer using the package must decide how to get the components or assemblies to the end users, presumably by including them inside whatever installer the developer builds. The developer is also presumably responsible for any required registration (if side-by-side installation is not sufficient). Basically I'm guessing that NuGet and the concept of packages are out of the loop at this stage. Is that right?
They aren't intended for end users. NuGet's purpose is to help developers manage their software dependencies. The developers still package and deploy their application and its dependencies the same way they always have.
This can be via an MSI installer, exe installer, ClickOnce or Windows Store installation.
NuGet adds the assemblies as references to your application's project. When you compile your application, you'll see all the assemblies you pulled in via NuGet in the output directory (typically bin\debug or bin\release).
The idea here, as Andy says, is that these assemblies are now part of your application just as if you had found the assembly and referenced it manually.
Basically I'm guessing that NuGet and the concept of packages are out of the loop at this stage. Is that right?
Yep. However you plan to ship your application binary is the same way you distribute the assemblies that were "installed" by the NuGet packages. You don't need to distribute the packages, just the assemblies that were referenced by your application.

Building customer-based installation packages with install4j

I am developing an application which has customer-specific configuration (two text and two binary files). The use case supposes that the customer downloads an installation package (I am going to use install4j) and installs it on the target platform (Mac or Windows). So the installation packages have to be different for different customers.
I am considering 2 possible scenarios for implementation:
Generate a new installation package per customer request on the server side (con: I need to have install4j for Linux, which is the server platform)
Have a half-generated installation package and inject the customer data into it on request (con: I am not sure this is possible at all)
I have never used install4j before and don't know how to implement option 1 or 2. The documentation is far from ideal; it doesn't have examples or consider cases like this, so any suggestion is much appreciated.
You cannot modify an installer after it has been built. The main reason is that it would break code signing. So you would need to generate a new installer for each configuration. If you deploy on Mac OS X and Windows, you need install4j Multi-Platform Edition which also works on Linux.
Alternatively, you could ask the user to provide credentials in the installer, then you could download the appropriate files on demand with "Download file" actions.

Is it possible to install an assembly into the GAC as some sort of 'linked assembly'?

I'm trying to deploy some sort of framework and therefore need to register some assemblies in the GAC.
The interesting part is:
These GAC assemblies should only be used by the framework developer; the client apps should not use these GAC assemblies but rather the ones in their local directories (the GAC assemblies could be of a different version, most likely higher).
I've already found and tried the app.config setting but it seems to be ignored by the client app (latest .NET runtime installed is 3.5).
Because you will be loading two different versions of a given DLL into memory, you need to isolate them. Common methods are:
Creating an AppDomain. You create two DLLs. The first DLL creates a new AppDomain from your code and then loads the second DLL, which is linked to those dependencies (a rough sketch follows after this list).
Use a service process. You create one DLL and a service application. The DLL starts the service application in a new process and connects to it using, for example, remoting. The service application is linked to the components you need to bind to.
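A rough, self-contained sketch of the AppDomain approach (the path, assembly and type names are purely illustrative):

using System;

// The type created in the isolated domain must derive from MarshalByRefObject
// so that calls from the default domain go through a cross-domain proxy.
public class Worker : MarshalByRefObject
{
    public string Describe()
    {
        // Shows which copy of a dependency was actually loaded in this domain.
        return typeof(Uri).Assembly.Location;
    }
}

public static class Program
{
    public static void Main()
    {
        // In a real setup, ApplicationBase (and optionally ConfigurationFile) point at
        // the framework's private copy of the assemblies; the path here is illustrative.
        var setup = new AppDomainSetup { ApplicationBase = @"C:\MyFramework\bin" };
        AppDomain isolated = AppDomain.CreateDomain("FrameworkDomain", null, setup);
        try
        {
            var worker = (Worker)isolated.CreateInstanceAndUnwrap(
                typeof(Worker).Assembly.FullName, // assembly that contains Worker
                typeof(Worker).FullName);         // type to instantiate inside the isolated domain
            Console.WriteLine(worker.Describe());
        }
        finally
        {
            AppDomain.Unload(isolated);
        }
    }
}

The key point is that the Worker proxy lives in the isolated domain, so its dependencies resolve against that domain's ApplicationBase rather than the client app's directory.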

How do I publish a Dynamics CRM4 plug-in with multiple assemblies?

My plugin DLL is really simple but references fifteen or so other DLLs. How do I register this?
Ways I know about are:
a) Put the other assemblies in the GAC (I think this is the SDK-preferred method). You will have to install them on each client if the plug-in needs to work offline.
b) Use ILMerge to merge all of your assemblies into one assembly (see the example below). You can deploy this to the database and have it used by your offline clients without a separate install.
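For option (b), a rough ILMerge command line might look like this, assuming the plug-in targets .NET 2.0/3.5 and is re-signed with your key file (all file names are illustrative):

ilmerge /targetplatform:v2 /keyfile:MyKey.snk /out:MyPlugin.Merged.dll MyPlugin.dll Helper1.dll Helper2.dll

The merged MyPlugin.Merged.dll is then registered with CRM in place of the original plug-in assembly.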