How do I deploy a Service Fabric application from a VSTS release pipeline?

I have configured a CI build for a Service Fabric application in Visual Studio Team Services, according to this documentation: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-set-up-continuous-integration
But instead of having my CI build do the publishing, I only perform the Build and Package tasks and include all Service Fabric-related output (the pkg folder, scripts, publish profiles and application parameters) in the drop. This way I can pass it along to the new release pipeline (agent-based releases) to do the actual deployment of my Service Fabric application.
In my release definition I have a single Azure PowerShell task that uses an ARM endpoint (with proper service principals configured).
When I deploy my app to an existing Service Fabric cluster, I use the default Deploy-FabricApplication cmdlet, passing along the pkg folder and a publish profile configured with a connection to the existing cluster.
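Roughly, the task invokes the script like this (a sketch; paths are placeholders for the artifacts in my drop):
# Invoke the VS-generated deployment script against the packaged output.
.\Scripts\Deploy-FabricApplication.ps1 `
    -ApplicationPackagePath '.\MyApp\pkg\Release' `
    -PublishProfileFile '.\MyApp\PublishProfiles\Cloud.xml'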
The release fails with the error message "Cluster connection instance is null", and I cannot understand why.
Doing some debugging I have found that:
The Deploy-FabricApplication cmdlet executes the Connect-ServiceFabricCluster cmdlet just fine, but as soon as the Publish-NewServiceFabricApplication cmdlet takes over execution, the cluster connection is lost.
I would expect this scenario to be possible using the Service Fabric cmdlets, but I cannot figure out how to keep the cluster connection open during deployment.
UPDATE: The link to the documentation no longer refers to the Service Fabric PowerShell scripts, so the precondition for this question is no longer documented. The article now refers to the VSTS build and release tasks, which may be preferred over the PowerShell cmdlets I tried to use.

When the Connect-ServiceFabricCluster function is called (from Deploy-FabricApplication.ps1), a local $clusterConnection variable is set after the call; you can see it using Get-Variable.
Unfortunately there is logic in some of the SDK scripts that expects that variable to be set, but because those scripts run in a different scope, the local variable isn't available to them.
It works in Visual Studio because the Deploy-FabricApplication.ps1 script is called using dot-source notation, which puts the $clusterConnection variable in the current scope.
I'm not sure if there is a way to use dot sourcing when running a script through the release pipeline, but as a workaround you can make the $clusterConnection variable global right after it's been set by the Connect-ServiceFabricCluster call. Edit your Deploy-FabricApplication.ps1 script and add the following after the connection logic (~line 169):
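# Make the connection object visible to the SDK module functions, which run in a different scope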
$global:clusterConnection = $clusterConnection
By the way, you might want to consider setting up custom build/release tasks that deploy a Service Fabric application, rather than using the various Deploy-FabricApplication.ps1 scripts.

There is now a built-in VSTS task for deploying a Service Fabric app, so you no longer need to execute the PowerShell script on your own. The task documentation page is at https://www.visualstudio.com/docs/build/steps/deploy/service-fabric-deploy. The original CI article has also been updated with details on how to set everything up: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-set-up-continuous-integration/.

Try using the "PowerShell" task instead of the "Azure PowerShell" task.

I hit the same bug today and opened a GitHub issue here.
On a side note, the VS-generated Deploy-FabricApplication.ps1 script uses the module
"$((Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Service Fabric SDK" -Name "FabricSDKPSModulePath").FabricSDKPSModulePath)\ServiceFabricSDK.psm1"
That's where Publish-NewServiceFabricApplication comes from. You can check the deployment logic and rewrite it in a saner way using the lower-level Service Fabric SDK cmdlets (potentially getting the connection via Get-ServiceFabricClusterConnection instead of making it global).
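For illustration, a rough sketch of that lower-level flow (endpoint, names and paths are placeholders, not values from the generated script):
# Connect once; SDK cmdlets in the same session reuse this connection.
Connect-ServiceFabricCluster -ConnectionEndpoint 'mycluster.westeurope.cloudapp.azure.com:19000'
Get-ServiceFabricClusterConnection   # verify the connection is visible in this scope

# Copy, register and create the application with the lower-level cmdlets.
Copy-ServiceFabricApplicationPackage -ApplicationPackagePath '.\pkg\Release' `
    -ImageStoreConnectionString 'fabric:ImageStore' `
    -ApplicationPackagePathInImageStore 'MyAppV1'
Register-ServiceFabricApplicationType -ApplicationPathInImageStore 'MyAppV1'
New-ServiceFabricApplication -ApplicationName 'fabric:/MyApp' `
    -ApplicationTypeName 'MyAppType' -ApplicationTypeVersion '1.0.0'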

Related

How do I automatically load powershell profiles with Jenkins pipeline when running Jenkins as a service?

First off, I didn't have this issue until setting up my agent to run as a Windows service.
My company has custom cmdlets we have built that are part of the default profile loaded when running PowerShell. I am using Jenkins to execute a batch file that iterates a command over a series of machines. After setting up Jenkins as a service, it no longer has access to those cmdlets, leading me to believe the profile isn't being loaded. If I load the profile manually by running the profile script, it only seems to work on the first machine.
When setting up Jenkins as a service, I configured it to run as the same user I would use if I logged in and ran these scripts manually. I have verified it is using the proper user with $env:UserName.
I am at a loss as to why setting Jenkins up as a Windows service broke this. I could revert to using the command line to connect to Jenkins, but that doesn't always connect after server maintenance or a power outage.
Did I configure something wrong, or is there a way to load profiles instead of Jenkins always running with -NoProfile?
Update: I noticed that $PROFILE was set to a default profile location that did not exist. When opening PowerShell manually on the machine, it loads the AllUsersCurrentHost profile, but this doesn't happen when Jenkins runs PowerShell as a service. I created the location $PROFILE pointed at, copied the default profile there, and it works. I am still not sure why the behavior differs, but at least I found a solution.
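For anyone diagnosing the same thing, the check and fix look roughly like this (the source profile path is a placeholder):
# List all the profile locations PowerShell knows about.
$PROFILE | Format-List * -Force

# Create the missing AllUsersCurrentHost profile file and copy the working profile into it.
New-Item -ItemType File -Path $PROFILE.AllUsersCurrentHost -Force
Copy-Item 'C:\path\to\working\profile.ps1' $PROFILE.AllUsersCurrentHost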

Is there an easy way to run Azure DevOps PowerShell scripts on my local machine?

I tried to find anything on this but didn't succeed; maybe I am using the wrong words for the search.
What I am trying to achieve is a script that can run in an Azure DevOps environment as well as on my local machine for debugging purposes. As far as I can see, to execute it locally I would need some kind of wrapper that behaves the way the Azure DevOps task does. Does anything like that exist?
If you want more control over building your code and want to see intermediate results, you need to install a self-hosted agent on your machine. Here you have more info about this.
Most of the tasks are simply wrappers around console tools that add some authorization handling or make the tools visually accessible. It may also be useful to enable the System.Debug flag on the Microsoft-hosted agent to see more details about what a particular task does, and thus better understand what is happening behind the scenes.
For instance, if you use variables like $(someVariable) in your script, setting System.Debug lets you see your final script in the log with the values substituted.
Be aware also that secret variables are masked, so you may find *** in the logs instead of the real value.
However, there is no easy way to just extract and wrap what a task does so you can repeat it on your machine without involving an Azure DevOps agent.
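That said, pipeline variables are exposed to scripts as environment variables (dots become underscores), so a small hypothetical wrapper can approximate the task environment before dot-sourcing your script:
# Hypothetical local harness: set the variables the agent would provide.
$env:SYSTEM_DEBUG           = 'true'
$env:BUILD_SOURCESDIRECTORY = 'C:\src\MyRepo'   # placeholder checkout path
$env:BUILD_BUILDNUMBER      = 'local-0.1'
. .\Scripts\MyPipelineScript.ps1                # placeholder script name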

How to validate a PowerShell Desired State Configuration template before executing?

As we provision certain resources in Azure, the management portal validates the generated template; however, when we do it using PowerShell, we only learn about issues once the template is actually executed.
There must be some parameter or switch which could help to just validate the template and not actually execute it. Does anybody know?
I assume you are talking about deploying ARM templates, and I also assume you are using the AzureRm PowerShell module. In that case you can use the Test-AzureRmResourceGroupDeployment command, which (per the command's help) 'Validates a resource group deployment'.
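A minimal sketch (resource group and file names are placeholders):
# Validate the deployment without executing it.
$result = Test-AzureRmResourceGroupDeployment `
    -ResourceGroupName 'MyResourceGroup' `
    -TemplateFile '.\azuredeploy.json' `
    -TemplateParameterFile '.\azuredeploy.parameters.json'

# An empty result means the template validated; otherwise inspect the errors.
if ($result) { $result | Format-List Code, Message }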

Update Web.config configuration with PowerShell in an Azure release script

I'm running my deployments with the Release Management (currently preview) tool in VSO.
When you configure a new release (with the new Release Management tool on VSO), you can add a task named "Azure PowerShell" (Run a PowerShell script within an Azure environment) to the flow.
What I'm trying to do is make some changes to the web.config using Get-WebApplication and then Set-WebConfigurationProperty.
The error I get from the log is:
Process should have elevated status to access IIS configuration data.
##[error]Cannot find a provider with the name 'WebAdministration'.
Is it even possible to run those kinds of commands there, or do I need to use another kind of command to update my web.config?
There is no Azure API to make arbitrary transforms to your web.config.
Instead, the way this is typically done is to use the deployment time transform engine (e.g. via Web.Debug.config or using Chained Config transforms).
If you're trying to set the web.config of an Azure Web App, then you need to use the Set-AzureWebsite cmdlet or the Set-AzureRmWebApp cmdlet.
Which one you need depends on which Azure cmdlets are installed on the machine running the script. The hosted servers for RM may still have the 0.9.x cmdlets (which use Set-AzureWebsite); Set-AzureRmWebApp is in the 1.x cmdlets. Either will work to set the config, you just need to use the appropriate cmdlet for what's installed.
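For example, a minimal sketch with the 1.x cmdlets (resource group and app name are placeholders); settings applied this way override the matching appSettings keys in web.config at runtime:
Login-AzureRmAccount
Set-AzureRmWebApp -ResourceGroupName 'MyResourceGroup' -Name 'my-web-app' `
    -AppSettings @{ 'MySetting' = 'ReleaseValue' }
Note that -AppSettings replaces the whole collection, so include any existing keys you want to keep.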

Deployment not in a domain - psexec.exe or PowerShell remoting

I am working on an automated deployment process for a web application. The deployment will need to:
Deploy DB changes to database using sqlpackage.exe
Deploy reporting services reports to the reports server using the web service
Deploy web app to web server(s)
Deploy fonts for reports
among other things
The first two are reasonably straightforward to run from the web server, as the web service and db are contactable, and the tools to deploy run over the network.
From reading, it appears that PowerShell remoting should be the way to go, and internally this would not be a problem. However, when deploying to production, this will be carried out in a datacentre where the machines (2 web, 1 DB) are not on a domain at all. I'd like to come up with a generic process that can run both internally and externally with the appropriate configuration. PowerShell remoting with machines not on a domain appears to require a fair bit of configuration (using HTTPS etc.), as NT credentials can't be validated.
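For concreteness, this is the kind of setup I understand it to require (host names and the account are placeholders):
# On each target machine, enable remoting.
Enable-PSRemoting -Force

# On the deploying machine, trust the targets explicitly
# (or configure an HTTPS listener with a certificate instead).
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'web01,web02,db01' -Force

# Then connect with explicit local credentials.
$cred = Get-Credential 'web01\deployuser'
Invoke-Command -ComputerName 'web01' -Credential $cred -ScriptBlock { hostname }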
Should I battle through configuring PowerShell remoting, or would the best way to go be to use psexec to execute a PowerShell script directly on the remote machine, copying the deployment artifacts to a drop folder on the remote machine first?
psexec seems to "just work"; PowerShell remoting appears to come with a lot more pain.
Why not use psexec then? You can restrict its role to just getting you onto the remote machine, and not let it infect your scripts. I have not attempted to get PS remoting working outside a domain, but in general I have found it fairly high effort to get going. Psexec, as you say, can often be simpler.
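For illustration, that approach typically looks something like this (machine, account and paths are placeholders):
# Copy the artifacts to a drop folder on the target, then run the
# deployment script there under a local account.
robocopy .\artifacts \\web01\c$\Drop /MIR
psexec \\web01 -u web01\deployuser -p $password powershell.exe -ExecutionPolicy Bypass -File C:\Drop\Deploy.ps1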
Excuse the peddling, but the open-source framework I helped build, called PowerUp, essentially does all this for you. It uses a model in which the PowerShell (well, psake) scripts can move execution to another machine by calling a specific function. This can be done either with PowerShell remoting or with psexec; you wouldn't need to change the script, as it just requires a setting per environment to say which you would like to use.
Check out the sample at https://github.com/AffinityID/PowerUpSamples/tree/master/SimpleWebsite.
Hopefully that shows you enough, but if not, let me know and we can go into more detail.