How to validate a PowerShell Desired State Configuration template before executing? - powershell

When we provision resources in Azure, the management portal validates the generated template; however, when we do it using PowerShell we only find out about issues once the template is actually executed.
There must be some parameter or switch that just validates the template without actually executing it. Does anybody know?

I assume you are talking about deploying ARM templates and that you are using the AzureRm PowerShell module. In that case you can use the Test-AzureRmResourceGroupDeployment command, which 'Validates a resource group deployment' (from the command's help).
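For example, a minimal sketch; the resource group name, template file and parameter file below are placeholders:

# Validate the template against an existing resource group without deploying it.
# Resource group and file names are placeholders/assumptions.
$result = Test-AzureRmResourceGroupDeployment `
    -ResourceGroupName "my-rg" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json"

if ($result) {
    # A non-empty result means validation failed; each entry describes a problem.
    $result | ForEach-Object { Write-Warning $_.Message }
} else {
    Write-Output "Template validated successfully."
}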

Related

How can I send values to a Jenkins script?

I have a Jenkins job, and the first thing that runs in that job is a PowerShell script. I want it to capture user input values and set them as global variables that are used throughout the Jenkins job.
I want the user to be able to enter these values from their machine and then run the job with them.
How can I do this?
EDIT: In case anybody else finds this answer, please see the comments below. This should not be used for credentials! Even though the communication can be secured by TLS, the credentials will still be visible in build logs etc.
You need to check the This project is parameterized checkbox in the settings of your job in Jenkins. Then define the name, type etc.
The given name is already accessible via standard syntax.
In a shell script, use ${nameOfParam}, or %nameOfParam% in a Windows batch script (depending on your shell / OS).
In pipelines they are also accessible via params.nameOfParam.
You can set these variables via the GUI using Build with parameters or via an API call such as http://<JENKINS_URL>/job/<JOB_NAME>/buildWithParameters?nameOfParam=foo
See also: https://www.baeldung.com/ops/jenkins-parameterized-builds
The only thing I don't quite get from your question is what exactly you want to do with the PowerShell script. A pipeline script in Jenkins is executed on a node, so once the job starts it should run without any user interaction. To set values from the user input as global variables in a PowerShell script, you already need to have them available on the Jenkins node, so it makes little sense to set them in the PowerShell script because they are already available.
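To illustrate, a small sketch of how the PowerShell script in the job could pick up such a parameter. nameOfParam is the placeholder parameter name from above, and the assumption is that Jenkins exposes build parameters to the build as environment variables (as it does for freestyle jobs):

# Jenkins injects build parameters as environment variables, so a
# parameter named "nameOfParam" is visible to PowerShell as $env:nameOfParam.
$paramValue = $env:nameOfParam

if ([string]::IsNullOrEmpty($paramValue)) {
    throw "Build parameter 'nameOfParam' was not supplied."
}

Write-Output "Running with nameOfParam = $paramValue"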

Is there an easy way to run Azure DevOps PowerShell scripts on my local machine?

I tried to find anything on this but I didn't succeed. Maybe I am using the wrong words for the search.
What I am trying to achieve is that I have a script that can run in an Azure DevOps environment as well as on my local machine for debug purposes. As far as I can see, to execute it locally I would need some kind of wrapper for the script that behaves like the Azure DevOps task does. Does anything like that exist out there?
If you want to have more control over building your code and be able to see intermediate results, you need to install a self-hosted agent on your machine. Here you have more info about this.
Most of the tasks are simply wrappers around console tools, adding authorization or making the tools visually accessible. It may also be useful to enable the System.Debug flag on the Microsoft-hosted agent to see more details about what a particular task does. You will see more details and thus be able to better understand what is happening behind the scenes.
For instance, if you use variables in your script like $(someVariable), then with System.Debug set you will see your final script in the log with the values substituted.
Be aware also that secret variables are masked, so you may find *** in the logs instead of the real value.
However, there is no easy way just to extract and wrap what a task does in order to repeat it on your machine without involving an Azure DevOps agent.
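If the goal is mainly to debug the script itself locally, one common pattern is to fall back to local defaults when the agent-provided values are missing. A sketch, assuming the pipeline maps its variables to environment variables (which Azure DevOps does) and using a hypothetical MY_SETTING variable:

# On an Azure DevOps agent, pipeline variables are exposed as environment
# variables (e.g. a variable "my.setting" becomes MY_SETTING).
# Locally, fall back to a hard-coded debug value instead.
$mySetting = if ($env:MY_SETTING) { $env:MY_SETTING } else { "local-debug-value" }

# TF_BUILD is set by the agent, so it can be used to detect where we run.
$runningInPipeline = $env:TF_BUILD -eq "True"

Write-Output "Running in pipeline: $runningInPipeline"
Write-Output "Using setting: $mySetting"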

Cannot remove file from data lake store using runbook

I am trying to run a runbook on Azure that contains the following command:
Remove-AzureRmDataLakeStoreItem
When the runbook is run, the following error comes out:
"Remove-AzureRmDataLakeStoreItem : The term 'Remove-AzureRmDataLakeStoreItem' is not recognized as the name of a cmdlet,..."
What should I do?
This issue typically happens when there is a version mismatch between the PS modules your runbook uses and those installed in the Azure Automation account. To resolve it, you will need to update the Azure PS modules within the Azure Automation account. The "update" steps are published HERE.
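As a quick sanity check before (and after) updating, a small sketch you could run inside the runbook to confirm whether the module that ships this cmdlet is actually available to the Automation sandbox:

# Check whether the module that contains Remove-AzureRmDataLakeStoreItem is present.
$module = Get-Module -ListAvailable -Name "AzureRM.DataLakeStore"
if (-not $module) {
    Write-Output "AzureRM.DataLakeStore is not available in this Automation account."
} else {
    Write-Output "Found AzureRM.DataLakeStore version $($module.Version)"
}

# Check whether the cmdlet itself resolves.
if (Get-Command Remove-AzureRmDataLakeStoreItem -ErrorAction SilentlyContinue) {
    Write-Output "Remove-AzureRmDataLakeStoreItem is available."
} else {
    Write-Output "Remove-AzureRmDataLakeStoreItem is not recognized - update or import the module."
}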
Important note:
"Because modules are updated regularly by the product group, changes can occur with the included cmdlets, which may negatively impact your runbooks depending on the type of change, such as renaming a parameter or deprecating a cmdlet entirely. To avoid impacting your runbooks and the processes they automate, it is recommended that you test and validate before proceeding. If you do not have a dedicated Automation account intended for this purpose, consider creating one so that you can test many different scenarios and permutations during the development of your runbooks, in addition to iterative changes such as updating the PowerShell modules. After the results are validated and you have applied any changes required, proceed with coordinating the migration of any runbooks that required modification and perform the following update as described in production."

How do I deploy service fabric application from VSTS release pipeline?

I have configured a CI build for a Service Fabric application, in Visual Studio Team Services, according to this documentation: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-set-up-continuous-integration
But instead of having my CI build do the publishing, I only perform the Build and Package tasks and include all Service Fabric related output, such as the pkg folder, scripts, publish profiles and application parameters, in the drop. This way I can pass it along to the new Release pipeline (agent-based releases) to do the actual deployment of my Service Fabric application.
In my release definition I have a single Azure Powershell task, that uses an ARM endpoint (with proper service principals configured).
When I deploy my app to an existing service fabric cluster, I use the default Deploy-FabricApplication cmdlet passing along the pkg folder and a publish profile that is configured with a connection to the existing cluster.
The release fails with the error message "Cluster connection instance is null", and I cannot understand why.
Doing some debugging I have found that:
The Deploy-FabricApplication cmdlet executes the Connect-ServiceFabricCluster cmdlet just fine, but as soon as the Publish-NewServiceFabricApplication cmdlet takes over execution, then the cluster connection is lost.
I would expect this scenario to be possible using the Service Fabric cmdlets, but I cannot figure out how to keep the cluster connection open during deployment.
UPDATE: The link to the documentation no longer refers to the Service Fabric PowerShell scripts, so the precondition for this question is no longer documented. The article now refers to the VSTS build and release tasks, which are preferred over the PowerShell cmdlets I tried to use.
When the Connect-ServiceFabricCluster function is called (from Deploy-FabricApplication.ps1) a local $clusterConnection variable is set after the call to Connect-ServiceFabricCluster. You can see that using Get-Variable.
Unfortunately there is logic in some of the SDK scripts that expect that variable to be set but because they run in a different scope, that local variable isn't available.
It works in Visual Studio because the Deploy-FabricApplication.ps1 script is called using dot source notation, which puts the $clusterConnection variable in the current scope.
I'm not sure if there is a way to use dot sourcing when running a script through the release pipeline, but you could, as a workaround, make the $clusterConnection variable global right after it's been set via the Connect-ServiceFabricCluster call. Edit your Deploy-FabricApplication.ps1 script and add the following line after the connection logic (~line 169):
$global:clusterConnection = $clusterConnection
By the way, you might want to consider setting up custom build/release tasks that deploy a Service Fabric application, rather than using the various Deploy-FabricApplication.ps1 scripts.
There now exists a built-in VSTS task for deploying a Service Fabric app so you no longer need to bother with executing the PowerShell script on your own. Task documentation page is at https://www.visualstudio.com/docs/build/steps/deploy/service-fabric-deploy. The original CI article has also been updated which provides details on how to set everything up: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-set-up-continuous-integration/.
Try to use "PowerShell" task instead of "Azure PowerShell" task.
I hit the same bug today and opened a GitHub issue here
On a side note, the VS-generated script Deploy-FabricApplication.ps1 imports the module
"$((Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Service Fabric SDK" -Name "FabricSDKPSModulePath").FabricSDKPSModulePath)\ServiceFabricSDK.psm1"
That's where Publish-NewServiceFabricApplication comes from. You can check the deployment logic and rewrite it in a saner way using lower-level Service Fabric SDK cmdlets (potentially getting the connection using Get-ServiceFabricClusterConnection instead of making it global).
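Putting the pieces together, a rough sketch of a release-pipeline script that imports that module, connects, exposes the connection in global scope (the workaround described above) and then publishes. The cluster endpoint and the drop paths are placeholders, and the Publish-NewServiceFabricApplication parameter names are assumptions based on the VS-generated script, not a verified reference:

# Resolve and import the ServiceFabricSDK module installed with the SF SDK.
$sdkModule = "$((Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Service Fabric SDK' -Name 'FabricSDKPSModulePath').FabricSDKPSModulePath)\ServiceFabricSDK.psm1"
Import-Module $sdkModule

# Connect to the cluster; the endpoint is a placeholder.
Connect-ServiceFabricCluster -ConnectionEndpoint "mycluster.westeurope.cloudapp.azure.com:19000"

# Workaround for the scoping issue: the SDK scripts expect a $clusterConnection
# variable, so expose the current connection in the global scope.
$global:clusterConnection = Get-ServiceFabricClusterConnection

# Publish the packaged application from the drop; parameter names assumed from the SDK module.
Publish-NewServiceFabricApplication `
    -ApplicationPackagePath "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\drop\pkg" `
    -ApplicationParameterFilePath "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\drop\ApplicationParameters\Cloud.xml" `
    -Action RegisterAndCreate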

Update Web.config configuration with powershell on Azure Release Script

I'm running my deployments on the Release Management (currently Preview) tool in VSO.
When you configure a new release (with the new release management tool on VSO) you can add to the flow a task named Azure PowerShell (Run a PowerShell script within an Azure environment).
What I'm trying to do is make some changes to the web.config using Get-WebApplication and then Set-WebConfigurationProperty.
The error I get from the log is:
Process should have elevated status to access IIS configuration data.
##[error]Cannot find a provider with the name 'WebAdministration'.
Is it even possible to run those kinds of commands there, or do I need to use another kind of command to update my web.config?
There is no Azure API to make arbitrary transforms to your web.config.
Instead, the way this is typically done is to use the deployment time transform engine (e.g. via Web.Debug.config or using Chained Config transforms).
If you're trying to set the web.config of an Azure WebApp then you need to use the Set-AzureWebSite cmdlet or the Set-AzureRMWebApp cmdlet.
Which one you need to use depends on which Azure cmdlets are installed on the machine running the script. The hosted servers for RM may still have the 0.9.x cmdlets (which use Set-AzureWebSite). The Set-AzureRMWebApp cmdlet is in the 1.x cmdlets. Either will work to set the config; you just need to use the appropriate cmdlet for what's installed.
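For example, a rough sketch using the 1.x cmdlets to set app settings, which Azure applies as appSettings overrides for the web app at runtime; the resource group, web app name and setting values are placeholders:

# Read the existing app settings so none are dropped, then add/override one.
# Resource group and web app names below are placeholders.
$app = Get-AzureRmWebApp -ResourceGroupName "my-rg" -Name "my-webapp"

$settings = @{}
foreach ($pair in $app.SiteConfig.AppSettings) {
    $settings[$pair.Name] = $pair.Value
}
$settings["MySetting"] = "new-value"

# Azure applies these as appSettings overrides without editing web.config directly.
Set-AzureRmWebApp -ResourceGroupName "my-rg" -Name "my-webapp" -AppSettings $settings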