Service Fabric running .NET 6.0 application - azure-service-fabric

We are migrating our application from .NET Framework 4.6.1 to .NET 6.0.
After deploying the migrated application, it fails to find framework DLLs. After logging into a cluster VM, it looks like .NET 6.0 is not installed on the machines. Do we have to install .NET 6 separately on the cluster?
The existing cluster is configured for automatic fabric upgrade, and the current fabric version is 9.0.1028.9590.
Thanks

You definitely don't need to install the .NET runtime on the Service Fabric cluster itself - it should be bundled as part of your Service Fabric application packages. We recently went through the exact same process of upgrading our applications, and I can confirm that our cluster does not have any recent .NET runtimes installed.
How do you create your deployment packages? I suspect this may be the source of your problem...
For example, if using Azure DevOps, you basically need the following build tasks:
Use .NET Core => to install the 6.0.x SDK
Restore NuGet packages
Build your .sln file via a Visual Studio Build task
Build your .sfproj file via another Visual Studio Build task with the parameters /t:Package /p:PackageLocation=$(build.artifactstagingdirectory)\applicationpackage
Publish the artifact using the source $(build.artifactstagingdirectory)
Finally, use the default PowerShell script Deploy-FabricApplication.ps1 that Visual Studio generates to register and deploy your application to the cluster (a rough command-line equivalent of these steps is sketched below).
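A minimal sketch of those steps run by hand, assuming a solution called MyApp.sln and a Service Fabric project called MyApp.sfproj (all names, paths, and the publish profile are placeholders, not taken from the question):

```powershell
# Restore packages and build the solution.
dotnet restore .\MyApp.sln
msbuild .\MyApp.sln /p:Configuration=Release

# Package the Service Fabric application project.
msbuild .\MyApp\MyApp.sfproj /t:Package /p:Configuration=Release `
    /p:PackageLocation="$PWD\drop\applicationpackage"

# Register and deploy with the script Visual Studio adds under Scripts\ in the .sfproj project.
# Parameter names can vary slightly with the project template version.
.\MyApp\Scripts\Deploy-FabricApplication.ps1 `
    -ApplicationPackagePath "$PWD\drop\applicationpackage" `
    -PublishProfileFile .\MyApp\PublishProfiles\Cloud.xml `
    -OverwriteBehavior Always
```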

Related

Azure DevOps: What do I need to build .NET 6 solutions?

We have an on-premises Azure DevOps 2019 server, with build pipelines for numerous .NET 4.x solutions that our small team maintains using VS2019.
The team is about to upgrade to VS2022, and at some point I would like to migrate some solutions to .NET 6. Can DevOps 2019 build .NET 6 solutions, and if so, what changes are needed to support this (such as, presumably, installing VS2022 on the server)?
Will those solutions' build pipelines require any changes, or should they continue to work as-is? They don't contain anything too clever, with steps such as: NuGet restore, build solution, run unit tests, NuGet pack & push. (The build pipelines are managed via the web GUI, not YAML, if that makes a difference.)
Some solutions will remain .NET 4.x, so the server will still need to support (build) these.
You should just need two things:
First, install the corresponding SDK for building the apps on your build agents (see the SDK download page - Build apps - SDK); a script-based sketch follows below.
(Optional) Add or modify the "Use .NET Core" SDK task in your pipelines (see the Use dotnet core task).
One more hint: you don't need to install a whole Visual Studio on the server. The Build Tools are enough (VS2022 Build Tools preview).
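If you prefer to script the SDK install on the agent rather than run the installer by hand, something like the following should work (a sketch, assuming Microsoft's public dotnet-install script; the install directory is just an example and writing to it needs an elevated prompt):

```powershell
# Download Microsoft's install script and install the .NET 6 SDK channel.
Invoke-WebRequest -Uri "https://dot.net/v1/dotnet-install.ps1" -OutFile ".\dotnet-install.ps1"
.\dotnet-install.ps1 -Channel 6.0 -InstallDir "C:\Program Files\dotnet"

# Verify which SDKs the build agent will see.
dotnet --list-sdks
```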

Azure Data Factory self-hosted Integration Runtime auto update issues

I have a problem with the self-hosted Integration Runtime in Azure Data Factory V2.
I have a few VMs running the 4.X.X IR software. Some of them had auto update enabled in DFv2.
There was an update from 4.X.X to 5.X. After this, the IR is unavailable from DFv2.
It looks like the IR services running on the VMs are pointing to the wrong executable path - still using the 4.0 folder. I can fix it manually with sc config or by reinstalling the IR, but after a reboot it breaks again.
Is that a bug? Can I somehow fix it without removing the VMs?
Update:
What I did: I went to Data Factory V2 > Integration Runtimes, picked my self-hosted IR, went to Auto update, and enabled it. The virtual machine hosting this IR was running an older version of the IR software (4.X.X), and there was an update to 5.X.X. Everything was working fine until I rebooted the VM. After that, Data Factory V2 > Integration Runtimes showed an error saying that my self-hosted IR is unavailable.
I logged into the hosting VM, and it turned out that the IR software cannot start its service, dmgsvc.exe. If you go to services.msc and check the Integration Runtime service pointing to dmgsvc.exe, the path is incorrect: it contains the 4.0 folder instead of 5.0. The IR software cannot start correctly because of that, and the error is "Error 2: The system cannot find the file specified." I fixed the path manually and it worked, but after the first reboot of the VM it was again pointing to the 4.0 folder. I reinstalled the software and the effect was the same.
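For reference, the manual repair described above looks roughly like this (a sketch only: the service name "DIAHostService" and the install path are assumptions, not taken from the question, and as noted the change does not survive a reboot, so treat it as a diagnostic step rather than a fix):

```powershell
# Find the Windows service that runs the Integration Runtime and its current binary path.
Get-CimInstance Win32_Service |
    Where-Object { $_.PathName -like '*Integration Runtime*' } |
    Select-Object Name, PathName

# Repoint the service from the 4.0 folder to the 5.0 folder
# (the space after binPath= is required by sc.exe), then restart it.
sc.exe config "DIAHostService" binPath= '"C:\Program Files\Microsoft Integration Runtime\5.0\Shared\dmgsvc.exe"'
Restart-Service -Name "DIAHostService"
```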
For the upgrade to version 5.x of the Azure Data Factory self-hosted integration runtime, we require .NET Framework Runtime 4.7.2 or later. On the download page, you'll find download links for the latest 4.x version and the latest two 5.x versions.
If automatic update is on and you've already upgraded your .NET Framework Runtime to 4.7.2 or later, the self-hosted integration runtime will be automatically upgraded to the latest 5.x version.
If automatic update is on and you haven't upgraded your .NET Framework Runtime to 4.7.2 or later, the self-hosted integration runtime won't be automatically upgraded to the latest 5.x version; it will stay on the current 4.x version. You'll see a warning about the .NET Framework Runtime upgrade in the portal and in the self-hosted integration runtime client.
See: Troubleshoot self-hosted integration runtime
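A quick way to verify that prerequisite on the IR machine is to check the .NET Framework Release value in the registry (the key and the 461808 threshold for 4.7.2 follow Microsoft's documented mapping; this snippet is a sketch, not part of the original answer):

```powershell
# Read the .NET Framework 4.x Release number; 461808 or higher means 4.7.2+.
$release = (Get-ItemProperty `
    'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -Name Release).Release
if ($release -ge 461808) {
    "OK: .NET Framework 4.7.2 or later is installed (Release = $release)."
} else {
    "Upgrade needed: the runtime is older than 4.7.2 (Release = $release)."
}
```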

Is the NuGetToolInstaller task supposed to require .NET Framework 4.7.2, and should using the task add .NET Framework 4.7.2 as a demand?

We have a build VM that's on an older version of Windows 10 because we have a 3rd-party component that can't be installed on newer versions. That version of Windows 10 doesn't support installing .NET Framework 4.7.2, which appears to be required for the NuGetToolInstaller to work. Is there any way to get NuGet working in a build that will work with all Windows 10 builds (or even Windows 7)?
I can force it to build only on a VM with a later build of Windows 10 by manually adding a demand for .NET Framework 4.7.2, but shouldn't the NuGetToolInstaller task already include that demand (in the same way that the Visual Studio Build task does)?
"and this appears to be required for the NuGetToolInstaller to work."
Check the task's JSON file; it lists all the versions supported by the NuGetToolInstaller task. Versions as old as 2.8.6 can be installed - I tried installing version 2.8.6 and it works.
According to your description, it seems you are using a self-hosted agent, which checks the configuration of the local machine. If another NuGet version is already installed on the agent machine, you can use that NuGet version as well (see the check sketched below).
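To see which nuget.exe a self-hosted agent would find locally and what version it is, something like this should do (a sketch; the version line's exact wording may vary between NuGet releases):

```powershell
# Locate any nuget.exe already on the agent's PATH.
Get-Command nuget.exe -ErrorAction SilentlyContinue | Select-Object Source

# The first line of 'nuget help' reports the NuGet version.
nuget.exe help | Select-Object -First 1
```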

Is Service Fabric compatible with .NET Standard 2.0 or .NET Core 2.0?

I have a set of service projects that I built about 12 months ago. I've tried updating NuGet packages as often as possible, and with .NET Standard and .NET Core 2.0 I assumed they would be ready to port to the new SDK and packages without too much trouble.
I've managed to update most of my libraries to .NET Standard 2.0, but none of the services using Service Fabric packages will work.
I've had to keep my libraries targeting .NET Framework 4.6.2.
I've tried updating the .csproj files to use the new-style layout. I've also tried creating a new Service Fabric project, selecting the .NET Core template, and copying the settings from there, but that doesn't work either.
Service Fabric is compatible with .NET Standard 2.0 and .NET Core 2.0 (where the project actually targets netcoreapp2.0 and not net462), but with these two conditions:
Visual Studio 2017 15.5 (only in preview as of now)
Service Fabric Tools 2.0 (only in preview as of now)
The good news is that you can create the project/solution with the preview version and then reopen it in the regular version, and it will work.
Here's an example of such a solution

How do I deploy .NET Framework 4 using Active Directory deployment?

I know it's possible to deploy earlier versions of the .NET Framework using AD deployment, for example: http://msdn.microsoft.com/en-us/library/cc160717.aspx.
How do I do this for .NET 4? I tried unpacking the standalone .NET 4 installer and deploying the netfx_Extended_x86.msi package. This didn't work: after a reboot, the event log shows that it tried but failed to install, with a message saying to run setup.exe.
I didn't test it, but look at this: How to deploy .NET 4.0 Framework