Now that JUnit 5 is out, how long will JUnit 4 be supported? Are there any plans for deprecation, and if so, when? We are trying to gauge whether we need to migrate our existing JUnit 4 test cases to JUnit 5 now or at some later point. The user guide says the following, but I wanted to know more concretely how long that means (a year, two, or more from now). I appreciate your response.
"since the JUnit team will continue to provide maintenance and bug fix releases for the JUnit 4.x baseline, developers have plenty of time to migrate to JUnit Jupiter on their own schedule."
JUnit 4 hasn't seen any functional upgrade in a decade. What the JUnit team does is maintain the JUnit 4 engine, called Vintage, which runs JUnit 4 tests on the JUnit 5 platform. As long as the platform's engine API stays backwards compatible, this is very likely to keep working. The two events that could break JUnit 4 are:
The platform changes in a highly incompatible way and the Vintage engine will no longer be supported. This is not to be expected in the near future.
Java changes in an incompatible way so that the original JUnit 4 code no longer works. I don’t see that looming either.
That said, sticking with just JUnit 4 decouples you from all innovation in the field of Java test automation. Many extensions already no longer support JUnit 4. My recommendation: start using the platform right away via the Vintage engine, write all new tests with Jupiter and whatever other test engines you need, and migrate old JUnit 4 tests when you have to touch them anyway.
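In practice (a minimal sketch, assuming a Maven build; the version number is just an example, use whichever 5.x release you standardize on), that only means having both engines on the test classpath:

<!-- New tests are written against the Jupiter API -->
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter</artifactId>
  <version>5.7.0</version>
  <scope>test</scope>
</dependency>
<!-- Vintage runs the existing JUnit 4 tests on the JUnit Platform -->
<dependency>
  <groupId>org.junit.vintage</groupId>
  <artifactId>junit-vintage-engine</artifactId>
  <version>5.7.0</version>
  <scope>test</scope>
</dependency>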
I'm developing data-driven tests using NUnit 3 and .NET Core 3.1, and I have many tests with several different data sources, which sometimes have complex logic inside. When I want to run only one test, I want only that test's data provider to run, but instead they all run. PreFilter was released in version 3.15.1 of the NUnit framework, and it solves this problem.
But, as I understand the docs, this feature is only available via a .runsettings file.
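Something like this is what I mean (a sketch based on my reading of the adapter docs; PreFilter sits in the NUnit-specific section of the run settings):

<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
  <NUnit>
    <!-- Only build the fixtures/test cases matching the current filter,
         so unrelated data sources are not evaluated during discovery -->
    <PreFilter>true</PreFilter>
  </NUnit>
</RunSettings>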
In this question, Charlie Poole says that .runsettings is only for the VS adapter. But the VS adapter takes a long time to run my tests.
I found some info about a configuration file, but I don't understand what I can configure in it.
Can I run my tests with the NUnit Console Runner 3.12.0-beta1 and PreFilter?
I'm afraid not, no.
There's an open issue to implement it here: https://github.com/nunit/nunit-console/issues/438. You'll see from the VS adapter docs that there are several edge-case bugs around this, which will be more visible in the adapter than in the console. At this point in time, nobody has yet taken on the task of implementing this feature in the console.
Suppose I have a class library which I want to target netstandard1.3, but also use BigInteger. Here's a trivial example - the sole source file is Adder.cs:
using System;
using System.Numerics;

namespace Calculator
{
    public class Adder
    {
        public static BigInteger Add(int x, int y)
            => new BigInteger(x) + new BigInteger(y);
    }
}
Back in the world of project.json, I would target netstandard1.3 in the frameworks section and have an explicit dependency on System.Runtime.Numerics, e.g. version 4.0.1. The NuGet package I create would then list just that dependency.
In the brave new world of csproj-based dotnet tooling (I'm using v1.0.1 of the command-line tools) there's an implicit package reference to the NETStandard.Library 1.6.1 metapackage when targeting netstandard1.3. This means that my project file is really small, because it doesn't need the explicit dependency:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.3</TargetFramework>
  </PropertyGroup>
</Project>
... but the nuget package produced has a dependency on NETStandard.Library, which suggests that in order to use my small library, you need everything there.
It turns out I can disable that functionality using DisableImplicitFrameworkReferences, then add in the dependency manually again:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.3</TargetFramework>
    <DisableImplicitFrameworkReferences>true</DisableImplicitFrameworkReferences>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="System.Runtime.Numerics" Version="4.0.1" />
  </ItemGroup>
</Project>
Now my NuGet package says exactly what it depends on. Intuitively, this feels like a "leaner" package.
So what's the exact difference for a consumer of my library? If someone tries to use it in a UWP application, does the second, "trimmed" form of dependencies mean that the resulting application will be smaller?
By not documenting DisableImplicitFrameworkReferences clearly (as far as I've seen; I read about it in an issue) and by making the implicit dependency the default when creating a project, Microsoft are encouraging users to just depend on the metapackage - but how can I be sure that doesn't have disadvantages when I'm producing a class library package?
In the past, we've given developers the recommendation to not reference the meta package (NETStandard.Library) from NuGet packages but instead reference individual packages, like System.Runtime and System.Collections. The rationale was that we thought of the meta package as a shorthand for a bunch of packages that were the actual atomic building blocks of the .NET platform. The assumption was: we might end up creating another .NET platform that only supports some of these atomic blocks but not all of them. Hence, the fewer packages you reference, the more portable you'd be. There were also concerns regarding how our tooling deals with large package graphs.
Moving forward, we'll simplify this:
.NET Standard is an atomic building block. In other words, new platforms aren't allowed to subset .NET Standard -- they have to implement all of it.
We're moving away from using packages to describe our platforms, including .NET Standard.
This means you won't have to reference any NuGet packages for .NET Standard anymore. You express your dependency with the lib folder, which is exactly how it has worked for all other .NET platforms, in particular the .NET Framework.
However, right now our tooling will still burn in the reference to NETStandard.Library. There is no harm in that either; it will just become redundant moving forward.
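To illustrate the lib folder point (a hypothetical, hand-written .nuspec for the Calculator library from the question, not something the current tooling emits), the .NET Standard support is expressed purely by the folder the assembly ships in, with no NETStandard.Library dependency listed:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>Calculator</id>
    <version>1.0.0</version>
    <authors>example</authors>
    <description>Example library targeting .NET Standard 1.3.</description>
  </metadata>
  <files>
    <!-- lib/netstandard1.3 is what declares the supported platform -->
    <file src="bin\Release\netstandard1.3\Calculator.dll" target="lib\netstandard1.3" />
  </files>
</package>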
I'll update the FAQ on the .NET Standard repo to include this question.
Update: This question is now part of the FAQ.
The team used to recommend figuring out what the slimmest package set was. They no longer do this, and recommend people just bring in NETStandard.Library instead (in the case of an SDK-style project, this will be done automatically for you).
I've never gotten a totally straightforward answer as to why that was, so allow me to make some educated guesses.
The primary reason is likely to be that it allows them to hide the differences in versions of the dependent libraries that you would otherwise be required to track yourself when changing target frameworks. It's also a much more user friendly system with the SDK-based project files, because you frankly don't need any references to get a decent chunk of the platform (just like you used to with the default references in Desktop-land, especially mscorlib).
By pushing the meta-definition of what it means to be a netstandard library, or a netcoreapp application into the appropriate NuGet package, they don't have to build any special knowledge into the definition of those things as Visual Studio (or dotnet new) sees them.
Static analysis could be used during publishing to limit the shipped DLLs, which is something they do today when doing native compilation for UWP (albeit with some caveats). They don't do that today for .NET Core, but I presume it's an optimization they've considered (as well as supporting native code).
There's nothing stopping you from being very selective, if you so choose. I believe you'll find that you're nearly the only one doing it, which also defeats the purpose (since it'll be assumed everybody is bringing in NETStandard.Library or Microsoft.NETCore.App).
You shouldn't need to disable the implicit reference. All platforms that the library will be able to run on will already have the assemblies that the NETStandard.Library dependency would require.
The .NET Standard Library is a specification: a set of reference assemblies that you compile against, providing a set of APIs that are guaranteed to exist on a known set of platforms and versions of platforms, such as .NET Core or the .NET Framework. It is not an implementation of these assemblies, just enough of the API shape to allow the compiler to successfully build your code.
The implementations of these APIs are provided by a target platform, such as .NET Core, Mono or the .NET Framework. They ship with the platform, because they are an essential part of the platform. So there is no need to specify a smaller dependency set - everything's already there, and you won't change that.
The NETStandard.Library package provides these reference assemblies. One point of confusion is the version number - the package is version 1.6.1, but this does not mean ".NET Standard 1.6". It's just the version of the package.
The version of the .NET Standard you're targeting comes from the target framework you specify in your project.
If you're creating a library and want it to run on .NET Standard 1.3, you'd reference the NETStandard.Library package, currently at version 1.6.1. But more importantly, your project file would target netstandard1.3.
The NETStandard.Library package will give you a different set of reference assemblies depending on your target framework moniker (I'm simplifying for brevity, but think lib\netstandard1.0, lib\netstandard1.1 and dependency groups). So if your project targets netstandard1.3, you'll get the 1.3 reference assemblies. If you target netstandard1.6, you'll get the 1.6 reference assemblies.
If you're creating an application, you can't target the .NET Standard. It doesn't make sense - you can't run on a specification. Instead, you target concrete platforms, such as net452 or netcoreapp1.1. NuGet knows the mapping between these platforms and the netstandard target framework monikers, so knows which lib\netstandardX.X folders are compatible with your target platform. It also knows that the dependencies of NETStandard.Library are satisfied by the target platform, so won't pull in any other assemblies.
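As a sketch (using the hypothetical Calculator package from the question; the target framework is only an example), an application project simply targets a concrete platform and references the library, and NuGet picks the compatible lib\netstandardX.X assets:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- A concrete platform, not a netstandard target -->
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- NuGet resolves this against the package's lib\netstandard1.3 assets -->
    <PackageReference Include="Calculator" Version="1.0.0" />
  </ItemGroup>
</Project>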
Similarly, when creating a standalone .NET Core app, the .NET Standard implementation assemblies are copied with your app. The reference to NETStandard.Library does not bring in any other new assemblies.
Note that dotnet publish will create a standalone application, but it doesn't currently do trimming, and will publish all assemblies. This will be handled automatically by tooling, so again, trimming dependencies in your library won't help here.
The only place I can imagine where it might help to remove the NETStandard.Library reference is if you are targeting a platform that doesn't support .NET Standard, and you can find a .NET Standard-based package whose transitive dependencies can all run on your target platform. I suspect there aren't many packages that would fit that bill.
Build everything into one installable unit
Pros
One package to test and deploy to all environments
All plugins installed but possibly not registered for use in the config
Cons
Plugins may be in various states of the pipeline; how do you deploy only the good ones?
How to handle registering which plugins to deploy to which environment
Hard to change your mind; there might be a month between the dev build and the prod push
Build Core Installer (no plugins) + Sub installers (only the plugin)
Pros
Smaller footprint in prod, so less room for errors
Cons
Possibility of integration errors between plugins since they might be installed in various orders
How to roll back a deployment when the previous deployment could be a strange assortment of core and sub-installers; you need a way to track what a specific installation contains
How to reproduce errors in QA when QA has all plugins and prod may have a smaller, possibly older, subset
These are my thoughts on the issue. I have been struggling to have my cake and eat it too, but I seem to be stuck with these two choices. Has anybody else struggled with this issue, and how did you resolve it? Any other pros and cons that I missed? So far I have chosen the all-or-nothing approach, but I am open to ideas.
Thanks in advance.
Building everything is easier to test and to deploy. At build time, and through testing, you guarantee that all the plugins are compatible with each other. Depending on the nature of the product, you can create bundles of plugins which can be selected during installation.
Of course, there should be an option to remove plugins that are not production-ready yet from the installation package. But ensure QA gets the same thing that goes out to customers or stakeholders.
With the separate-packages approach, you have to implement dependency tracking and so forth. It is more flexible, which results in lots of possible combinations.
I'd choose the first option: one single package with everything and ability to fine-tune the selected features/plugins.
There's also one more option: a combination of the two approaches above. Consider the Eclipse project: it has a common platform and plugins. One can download a package with the set of plugins that are usually used in a particular environment; other plugins can be installed later, if needed. So you combine your core with several logically connected plugins, and other plugins can be added to the installation later.
I have been playing around with the new StateMachine workflow that has been added to Windows Workflow as part of Platform Update 1 (see also). I now want to look at installing what I've created and therefore need to make sure my bootstrapper is up-to-date. In the future, I will be moving to WIX but right now, for the purposes of prototyping, I'm just using a regular Setup and Deployment project and its bootstrap support.
The list of standard prerequisites does not include PU1 as an option, so how can I add support for it?
Update
I found this answer on Stack Overflow regarding custom prerequisites, which led me to this article on MSDN, which in turn led me to create my own prerequisite. However, I then got a new error about mismatched framework requirements. I suspect I need to pick apart the multi-targeting support and the existing .NET Framework prerequisite package to see how to make a new prerequisite that works correctly.
I've had a stab at creating my own bootstrapper packages for this. The results are here to download. Note that these are entirely untested and provided as-is - use at your own risk. However, feedback is welcome. Hopefully Microsoft will provide an official solution.
See How to detect if the .NET Framework Platform Update 1 is installed
Is the Microsoft .NET Framework 4 Platform Update 1 - Runtime Update (KB2478063) what you are looking for? See here for the download.