How to run a PowerShell script during Azure VM deployment with ARM template?

I want to deploy a VM in Azure using Azure Resource Manager (ARM) and then run a PowerShell script inside the VM after deployment to configure it.
I can do this fine with something like this: https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-vsts-agent
However, that template grabs the PowerShell script from GitHub. As part of my deployment I want to upload the script to Azure Storage and then have the VM fetch the script from Azure Storage and run it. How do I handle the dependency on the PowerShell script, given that it has to exist in Azure Storage somewhere before it can be executed?
I currently have the following to install a VSTS agent as part of a deployment, but the script is downloaded from GitHub. I don't want that; I want the VSTS agent installation script to be part of my ARM project.
{
  "name": "vsts-build-agents",
  "type": "extensions",
  "location": "[parameters('location')]",
  "apiVersion": "2017-12-01",
  "dependsOn": [
    "vsts-build-vm"
  ],
  "tags": {
    "displayName": "VstsInstallScript"
  },
  "properties": {
    "publisher": "Microsoft.Compute",
    "type": "CustomScriptExtension",
    "typeHandlerVersion": "1.9",
    "settings": {
      "fileUris": [
        "[concat(parameters('_artifactsLocation'), '/', variables('powerShell').folder, '/', variables('powerShell').script, parameters('_artifactsLocationSasToken'))]"
      ]
    },
    "protectedSettings": {
      "commandToExecute": "[concat('powershell.exe -ExecutionPolicy Unrestricted -Command \"& {', './', variables('powerShell').script, ' ', variables('powerShell').buildParameters, '}\"')]"
    }
  }
}
I guess my question is really about how to set _azurestoragelocation to an Azure Storage location where the script has just been uploaded as part of the deployment.

Chicken-and-egg problem: you cannot upload to Azure Storage with an ARM template. You need a script to do the upload, but if you already have that script on the VM to run the upload, you don't really need to upload it.
That being said, why don't you use the VSTS agent extension?
{
  "name": "xxx",
  "apiVersion": "2015-01-01",
  "type": "Microsoft.Resources/deployments",
  "properties": {
    "mode": "Incremental",
    "templateLink": {
      "uri": "https://gallery.azure.com/artifact/20161101/microsoft.vsts-agent-windows-arm.1.0.0/Artifacts/MainTemplate.json"
    },
    "parameters": {
      "vmName": { "value": "xxx" },
      "location": { "value": "xxx" },
      "VSTSAccountName": { "value": "xxx" },
      "TeamProject": { "value": "xxx" },
      "DeploymentGroup": { "value": "Default" },
      "AgentName": { "value": "xxx" },
      "PATToken": { "value": "xxx" }
    }
  }
}

Do you mean how to set _artifactsLocation as in the quickstart sample? If so, you have two options (or three, depending):
1) Use the script in the QS repo; the defaultValue for the _artifactsLocation param will set that for you.
2) If you want to customize, from your local copy of the sample, just use the Deploy-AzureResourceGroup.ps1 in the repo; it will stage the artifacts and set the value for you accordingly (when you use the -UploadArtifacts switch).
3) Stage the PS1 somewhere yourself and manually set the values of _artifactsLocation and _artifactsLocationSasToken; see the sketch below.
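For option 3, a minimal staging sketch with the AzureRM-era cmdlets might look like the following; the resource group, account, container, and script names are placeholders for illustration, not values from the question:

# Assumes the resource group, storage account, and container already exist;
# all names here are illustrative placeholders.
$rg        = 'my-rg'
$account   = 'mystagingaccount'
$container = 'scripts'

$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $rg -Name $account)[0].Value
$ctx = New-AzureStorageContext -StorageAccountName $account -StorageAccountKey $key

# Upload the script that the CustomScriptExtension will download.
Set-AzureStorageBlobContent -File '.\InstallVstsAgent.ps1' -Container $container `
    -Blob 'InstallVstsAgent.ps1' -Context $ctx -Force

# Generate a short-lived, read-only SAS token for the container.
$sas = New-AzureStorageContainerSASToken -Name $container -Permission r `
    -ExpiryTime (Get-Date).AddHours(4) -Context $ctx

# Hand both values to the template deployment.
New-AzureRmResourceGroupDeployment -ResourceGroupName $rg `
    -TemplateFile '.\azuredeploy.json' `
    -TemplateParameterObject @{
        _artifactsLocation         = "https://$account.blob.core.windows.net/$container"
        _artifactsLocationSasToken = $sas
    }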
You can also deploy from gallery.azure.com, but that will force you to use the script that is stored in the gallery (same as using the defaults in GitHub).
That help?

Related

Execute SQL script with Azure ARM template

I'm deploying a PostgreSQL server with a database and trying to seed this database with a SQL script. I've learned that the best way to execute a SQL script from an ARM template is to use a deployment script resource. Here is part of the template:
{
  "type": "Microsoft.DBforPostgreSQL/flexibleServers/databases",
  "apiVersion": "2021-06-01",
  "name": "[concat(parameters('psqlServerName'), '/', parameters('psqlDatabaseName'))]",
  "dependsOn": [
    "[resourceId('Microsoft.DBforPostgreSQL/flexibleServers', parameters('psqlServerName'))]"
  ],
  "properties": {
    "charset": "[parameters('psqlDatabaseCharset')]",
    "collation": "[parameters('psqlDatabaseCollation')]"
  },
  "resources": [
    {
      "type": "Microsoft.Resources/deploymentScripts",
      "apiVersion": "2020-10-01",
      "name": "deploySQL",
      "location": "[parameters('location')]",
      "kind": "AzureCLI",
      "dependsOn": [
        "[resourceId('Microsoft.DBforPostgreSQL/flexibleServers/databases', parameters('psqlServerName'), parameters('psqlDatabaseName'))]"
      ],
      "properties": {
        "azCliVersion": "2.34.1",
        "storageAccountSettings": {
          "storageAccountKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName')), '2019-06-01').keys[0].value]",
          "storageAccountName": "[parameters('storageAccountName')]"
        },
        "cleanupPreference": "Always",
        "environmentVariables": [
          { "name": "psqlFqdn", "value": "[reference(resourceId('Microsoft.DBforPostgreSQL/flexibleServers', parameters('psqlServerName')), '2021-06-01').fullyQualifiedDomainName]" },
          { "name": "psqlDatabaseName", "value": "[parameters('psqlDatabaseName')]" },
          { "name": "psqlAdminLogin", "value": "[parameters('psqlAdminLogin')]" },
          { "name": "psqlServerName", "value": "[parameters('psqlServerName')]" },
          { "name": "psqlAdminPassword", "secureValue": "[parameters('psqlAdminPassword')]" }
        ],
        "retentionInterval": "P1D",
        "scriptContent": "az config set extension.use_dynamic_install=yes_without_prompt\r\naz postgres flexible-server execute --name $env:psqlServerName --admin-user $env:psqlAdminLogin --admin-password $env:psqlAdminPassword --database-name $env:psqlDatabaseName --file-path test.sql --debug"
      }
    }
  ]
}
Azure does not show any errors regarding the syntax and starts the deployment. However, the deploySQL deployment gets stuck and then fails after one hour due to the agent execution timeout. The PostgreSQL server itself, the database, and a firewall rule (not shown in the code above) are deployed without any issues, but the SQL script is not executed. I've tried adding the --debug option to the Azure CLI commands but got nothing new from the pipeline output. I've also tried executing these commands in an Azure CLI pipeline task, and they worked perfectly. What am I missing here?

ARM template deployment fails with 409 error for one specific storage account

I use an ARM template to deploy a storage account. However, I got an error saying: StorageAccountAlreadyExists: The storage account named xxx already exists.
My release pipeline is set to incremental mode, so it shouldn't really show this error.
I changed the storage account name to a new one; not only did it work the first time, but I can keep deploying the same pipeline and no error is ever thrown.
It looks like something specific to this account; however, I can't see anything special about it. The ARM template we use is also quite normal (something we got from official examples):
{
  "$schema": "http://schema.management.azure.com/schemas/2019-06-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "StorageDescriptor": {
      "type": "string",
      "defaultValue": "StorageAccount",
      "metadata": {}
    },
    "StorageAccountName": {
      "type": "string",
      "defaultValue": "[toLower(concat(parameters('StorageDescriptor'), resourceGroup().name))]",
      "metadata": { "Description": "Override name for the storage account" }
    },
    "StorageType": {
      "type": "string",
      "defaultValue": "Standard_LRS",
      "allowedValues": [
        "Standard_LRS",
        "Standard_ZRS",
        "Standard_GRS",
        "Standard_RAGRS",
        "Premium_LRS"
      ]
    },
    "Environment": {
      "type": "string",
      "defaultValue": "PreProd",
      "metadata": { "description": "PreProd or Prod" }
    }
  },
  "variables": {},
  "resources": [
    {
      "name": "[parameters('StorageAccountName')]",
      "type": "Microsoft.Storage/storageAccounts",
      "location": "[resourceGroup().location]",
      "apiVersion": "2019-06-01",
      "dependsOn": [],
      "tags": {
        "displayName": "Web Job Storage Account"
      },
      "properties": {
        "accountType": "[parameters('StorageType')]"
      }
    }
  ],
  "outputs": {}
}
Even though your release pipeline is set to incremental mode, storage account names must be globally unique across all of Azure, so a name already taken anywhere (in any subscription) will return this error.
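If you want to verify the name collision yourself, one way (a sketch using the AzureRM storage cmdlets; the account name is a placeholder) is to query global name availability:

# Storage account names are unique across ALL of Azure, not just your
# subscription or resource group, so another tenant may own the name.
$check = Get-AzureRmStorageAccountNameAvailability -Name 'mystorageaccountname'
$check.NameAvailable   # $false when the 409 would occur
$check.Reason          # e.g. AlreadyExists
$check.Message         # explanation returned by the service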
You need to check whether the storage account attributes have been changed by somebody else through the Azure portal or PowerShell, so that they differ from the ones specified in the ARM template.
To resolve this issue, try exporting the template and updating it in the Azure DevOps repo.
Then we can update this newly exported template file as needed and deploy with it.
As a test, I could keep deploying the same pipeline with no error ever thrown.

Packer - PowerShell pass variables

Currently we are deploying images with Packer (in a build pipeline located in Azure DevOps) within our AWS domain with success. Now we want to take this a step further: we're trying to configure a couple of users for future Ansible maintenance. We've written a script and also tried it as an inline PowerShell script, but neither option seems to pick up the variable that is set in the variable group in Azure DevOps, while all the other variables are used with success. My code is as follows:
{
  "variables": {
    "build_version": "{{isotime \"2006.01.02.150405\"}}",
    "aws_access_key": "$(aws_access_key)",
    "aws_secret_key": "$(aws_secret_key)",
    "region": "$(region)",
    "vpc_id": "$(vpc_id)",
    "subnet_id": "$(subnet_id)",
    "security_group_id": "$(security_group_id)",
    "VagrantUserpassword": "$(VagrantUserPassword)"
  },
  "builders": [
    {
      "type": "amazon-ebs",
      "access_key": "{{user `aws_access_key`}}",
      "secret_key": "{{user `aws_secret_key`}}",
      "region": "{{user `region`}}",
      "vpc_id": "{{user `vpc_id`}}",
      "subnet_id": "{{user `subnet_id`}}",
      "security_group_id": "{{user `security_group_id`}}",
      "source_ami_filter": {
        "filters": {
          "name": "Windows_Server-2016-English-Full-Base-*",
          "root-device-type": "ebs",
          "virtualization-type": "hvm"
        },
        "most_recent": true,
        "owners": ["801119661308"]
      },
      "ami_name": "WIN2016-CUSTOM-{{user `build_version`}}",
      "instance_type": "t3.xlarge",
      "user_data_file": "userdata.ps1",
      "associate_public_ip_address": true,
      "communicator": "winrm",
      "winrm_username": "Administrator",
      "winrm_timeout": "15m",
      "winrm_use_ssl": true,
      "winrm_insecure": true,
      "ssh_interface": "private_ip"
    }
  ],
  "provisioners": [
    {
      "type": "powershell",
      "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"],
      "inline": [
        "Install-WindowsFeature web-server,web-webserver,web-http-logging,web-stat-compression,web-dyn-compression,web-asp-net,web-mgmt-console,web-asp-net45",
        "New-LocalUser -UserName 'Vagrant' -Description 'User is responsible for Ansible connection.' -Password '$(VagrantUserPassword)'"
      ]
    },
    {
      "type": "powershell",
      "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"],
      "scripts": [
        "scripts/DisableUAC.ps1",
        "scripts/iiscompression.ps1",
        "scripts/ChocoPackages.ps1",
        "scripts/PrepareAnsibleUser.ps1"
      ]
    },
    {
      "type": "windows-restart",
      "restart_check_command": "powershell -command \"& {Write-Output 'Machine restarted.'}\""
    },
    {
      "type": "powershell",
      "inline": [
        "C:\\ProgramData\\Amazon\\EC2-Windows\\Launch\\Scripts\\InitializeInstance.ps1 -Schedule",
        "C:\\ProgramData\\Amazon\\EC2-Windows\\Launch\\Scripts\\SysprepInstance.ps1 -NoShutdown"
      ]
    }
  ]
}
The "VagrantUserpassword": "$(VagrantUserPassword)" is what is not working, we've tried multiple options but none of them seem to be working.
Any idea's?
Kind regards,
Rick.
Based on my test, the pipeline variables indeed could not be passed to the PowerShell environment variables.
Workaround:
You could try using the Replace Tokens task to pass the pipeline value into the JSON file.
Here are the steps:
1. Set the token value in the JSON file:
{
  "variables": {
    ....
    "VagrantUserpassword": "#{VagrantUserPassword}#"
  },
2. Use the Replace Tokens task before the script task.
3. Set the value in the pipeline variables.
Then the value will be set successfully.
On the other hand, I also found some issues in your sample file:
In "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"], the user variable name needs to be replaced with VagrantUserpassword (["VagrantUserPassword={{user `VagrantUserpassword`}}"]) to match the key in the "variables" block.
Note: this is case-sensitive.
You also need to use $Env:VagrantUserPassword in place of $(VagrantUserPassword) inside the PowerShell scripts.
For example:
"inline": [
"Write-Host \"Automatically generated aws password is: $Env:VagrantUserPassword\"",
"Write-Host \"Automatically generated aws password is: $Env:VAR5\""
]
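One more caution, separate from the variable-passing issue: New-LocalUser's account-name parameter is -Name rather than -UserName, and its -Password parameter expects a SecureString, so the first inline provisioner line would still fail even once the variable arrives. A hedged sketch of the corrected line:

# Convert the environment variable to a SecureString before creating the user.
$secure = ConvertTo-SecureString $Env:VagrantUserPassword -AsPlainText -Force
New-LocalUser -Name 'Vagrant' -Description 'User is responsible for Ansible connection.' -Password $secure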

How to execute a PowerShell Command from within Azure Data Factory custom activity?

I have a custom activity in Azure Data Factory, which attempts to execute the following command:
PowerShell.exe -Command "Write-Host 'Hello, world!'"
However, when I debug (run) this command from within Azure Data Factory, it runs for a long time and finally fails.
I guess it fails because it could not locate PowerShell.exe. How can I ensure that the ADF custom activity has access to PowerShell.exe?
Some sites talk about specifying a package (a .zip file) that contains everything needed for the exe to execute. However, since PowerShell is from Microsoft, I think it would be inappropriate to zip the PowerShell directory and specify it as a package for the custom activity.
Please suggest how I can execute a PowerShell command from a custom activity in Azure Data Factory. Thanks!
Whenever I search for "Execute PowerShell from Custom Activity in Azure Data Factory", the results mostly discuss which Az PowerShell command to use to trigger the start of an ADF pipeline.
I saw two threads on Stack Overflow where the answer just says to use a custom activity, without being specific about calling a PowerShell command from ADF.
Here is the JSON for the task:
{
  "name": "ExecutePs1CustomActivity",
  "properties": {
    "activities": [
      {
        "name": "ExecutePSScriptCustomActivity",
        "type": "Custom",
        "dependsOn": [],
        "policy": {
          "timeout": "7.00:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [],
        "typeProperties": {
          "command": "PowerShell.exe -Command \"Write-Host 'Hello, world!'\"",
          "referenceObjects": {
            "linkedServices": [],
            "datasets": []
          }
        },
        "linkedServiceName": {
          "referenceName": "Ps1CustomActivityAzureBatch",
          "type": "LinkedServiceReference"
        }
      }
    ],
    "annotations": []
  }
}
I see "In Progress" for 3 minutes (180 seconds), and then it shows as "Failed."
I would suggest you move all your scripting tasks into a PowerShell file and copy it to a storage account linked with your custom activity. Once done, try to call it like below:
powershell .\script.ps1
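For the copy step, a minimal sketch (assuming the storage account behind the activity's resourceLinkedService already exists; the account name and key are placeholders, and the container/blob path mirrors the folderPath convention in the JSON below):

# Upload script.ps1 to the storage account referenced by resourceLinkedService.
$ctx = New-AzureStorageContext -StorageAccountName 'mybatchstorage' `
    -StorageAccountKey '<storage-key>'
Set-AzureStorageBlobContent -File '.\script.ps1' -Container 'customactv2' `
    -Blob 'helloworld/script.ps1' -Context $ctx -Force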
You can also provide the path of the script in the JSON, like below:
{
  "name": "MyCustomActivityPipeline",
  "properties": {
    "description": "Custom activity sample",
    "activities": [
      {
        "type": "Custom",
        "name": "MyCustomActivity",
        "linkedServiceName": {
          "referenceName": "AzureBatchLinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "command": "helloworld.exe",
          "folderPath": "customactv2/helloworld",
          "resourceLinkedService": {
            "referenceName": "StorageLinkedService",
            "type": "LinkedServiceReference"
          }
        }
      }
    ]
  }
}
Please try it and see if it helps. I would also suggest troubleshooting the pipeline steps to look for the detailed error.
As to your second point, "Some sites say about specifying a package (.zip file) that contains everything needed for the exe to execute": that is required when you are building a custom activity using .NET; then you must copy all the DLLs and EXEs needed for execution.
Hope it helps.

Why is the password type of AzureKeyVaultSecret dropped when creating a LinkedService via PowerShell?

I'm attempting to create a LinkedService via the PowerShell command
New-AzureRmDataFactoryV2LinkedService -ResourceGroupName rg -DataFactoryName df -Name n -DefinitionFile n.json
The result is that the LinkedService is created; however, the reference to the password of type AzureKeyVaultSecret is removed, rendering it non-operational.
The config file n.json was extracted from the Data Factory code tab and has the syntax below:
{
  "name": "<name>",
  "type": "Microsoft.DataFactory/factories/linkedservices",
  "properties": {
    "type": "Oracle",
    "typeProperties": {
      "connectionString": "host=<host>;port=<port>;serviceName=<serviceName>;user id=<user_id>",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "Prod_KeyVault",
          "type": "LinkedServiceReference"
        },
        "secretName": "<secretname>"
      }
    },
    "connectVia": {
      "referenceName": "<runtimename>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
When the new LinkedService is created, the code looks exactly the same, except that properties -> typeProperties -> password is removed and requires manual configuration, which I'm trying to avoid if possible.
Any thoughts?
If you have tried using "Update-Module -Name AzureRm.DataFactoryV2" to update your PowerShell module to the latest version and the behavior is still the same, then the likely root cause is that a password stored as an Azure Key Vault secret is not yet supported in PowerShell. As far as I know, it is a feature added recently, so it may take some time to roll it out to PowerShell.
In that case, the workaround is to use the UI to create the linked service for now.
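For completeness, the update-and-retry sequence suggested above would look roughly like this (a sketch reusing the placeholder names rg, df, n, and n.json from the question):

# Bring the Data Factory cmdlets up to date, reload the module in this
# session, then re-run the creation to see whether the Key Vault password
# reference now survives.
Update-Module -Name AzureRM.DataFactoryV2
Import-Module AzureRM.DataFactoryV2 -Force
New-AzureRmDataFactoryV2LinkedService -ResourceGroupName rg -DataFactoryName df -Name n -DefinitionFile n.json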