Automated way to convert Azure PowerShell scripts to ARM templates - powershell

Let's say I have (way too) many *.ps1 scripts that I'd like to convert to ARM templates.
Is there a way (a script, command, whatever) to automatically convert an Azure PowerShell *.ps1 to an ARM template, without having to actually do the deployment to Azure?
I'm not looking for a bullet-proof solution. If there's an automated way to do the conversion that fails when the ps1 script isn't correct, I'm OK with that.

No, there's no way to do that (unless you can automate deployment + export, which would create flawed templates anyway).
The closest you can get is to run all the cmdlets with the -Debug switch, capture the HTTP requests they make, and convert those to ARM templates (shouldn't be too hard: copy/paste and a bunch of editing).
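A rough capture might look like the sketch below (hypothetical resource names; setting $DebugPreference = 'Continue' streams the debug messages without the per-message prompt that the bare -Debug switch triggers, though the cmdlet then really executes; with the interactive -Debug prompt you can halt the command once the request body has been printed):

# Sketch: capture the HTTP requests Azure PowerShell cmdlets emit on the
# debug stream (stream 5); resource names are placeholders.
$DebugPreference = 'Continue'
New-AzureRmResourceGroup -Name 'demo-rg' -Location 'westus' 5>&1 |
    Out-File -FilePath .\debug-capture.txt

# Pull out the request sections to adapt into a template by hand:
Select-String -Path .\debug-capture.txt -Pattern 'HTTP REQUEST' -Context 0,30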

You can run any of the Azure PowerShell cmdlets that create resources, such as New-AzureRmVM, with the "-Debug" flag, and the output will show you the JSON request body for the resource it's going to create:
Be wary of relying on this, as I have found that the PowerShell cmdlets are NOT the way you should be automating your deployments. You should be strictly using ARM templates, as the PowerShell cmdlets sometimes do not output the correct parameters needed for a successful deployment, since they give you no way to specify which version of the ARM API to use.
Example output using "New-AzureRmVM" with the "-Debug" flag:
New-AzureRmVM -ResourceGroupName $RGName -Location $Location -VM $VM -LicenseType "Windows_Server" -Debug
DEBUG: ============================ HTTP REQUEST ============================
HTTP Method:
PUT
Absolute Uri:
https://management.azure.com/subscriptions/<subscription>/resourceGroups/LL_SQL_Test/providers/Microsoft.Compute/virtualMachines/LLSQL3?api-version=2018-04-01
Headers:
x-ms-client-request-id : 5920b683-e8fe-455e-969a-63f4c6e246d7
accept-language : en-US
Body:
{
  "properties": {
    "hardwareProfile": {
      "vmSize": "Standard_DS2_v2"
    },
    "storageProfile": {
      "osDisk": {
        "osType": "Windows",
        "image": {
          "uri": "https://<storageaccount>.blob.core.windows.net/vhds/<VM>.vhd"
        },
        "caching": "ReadWrite",
        "writeAcceleratorEnabled": false,
        "createOption": "FromImage",
        "diskSizeGB": 127,
        "managedDisk": {
          "storageAccountType": "Standard_LRS"
        }
      }
    },
    "osProfile": {
      "computerName": "<computername>",
      "adminUsername": "<username>",
      "adminPassword": "<Password>",
      "windowsConfiguration": {
        "provisionVMAgent": true,
        "enableAutomaticUpdates": true
      }
    },
    "networkProfile": {
      "networkInterfaces": [
        {
          "id": "/subscriptions/<subscription>/resourceGroups/<resourcegroup>/providers/Microsoft.Network/networkInterfaces/<NIC>"
        }
      ]
    },
    "diagnosticsProfile": {
      "bootDiagnostics": {
        "enabled": false
      }
    },
    "availabilitySet": {
      "id": "/subscriptions/<subscription>/resourceGroups/<resourcegroup>/providers/Microsoft.Compute/availabilitySets/SQL_Availability_Set_Test"
    },
    "licenseType": "Windows_Server"
  },
  "location": "West US"
}
The above is a perfect example of why not to use PowerShell, as this request currently returns an error:
Body:
{
  "error": {
    "code": "InvalidParameter",
    "target": "osDisk.image",
    "message": "Parameter 'osDisk.image' is not allowed."
  }
}
The API version (2018-04-01) that the PowerShell cmdlet uses to convert its input into the JSON request body doesn't allow the parameter 'osDisk.image'; it expects the profile to be formatted as:
"storageProfile": {
"imageReference": {
"id": "[resourceId('Microsoft.Compute/images', parameters('images_LL_SQL_IMG_name'))]"
},
"osDisk": {
"osType": "Windows",
Instead it's using
"storageProfile": {
"osDisk": {
"osType": "Windows",
"image": {
"uri": "https://<storageaccount>.blob.core.windows.net/vhds/LLSQL220180621090257.vhd"
},

As others have commented, there is no way to automatically convert a PowerShell script to ARM templates. However, if you already have these resources deployed, you may consider using the ARM export feature to retrieve the ARM templates:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-export-template
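For example, with the AzureRm module of this era (the resource group name is a placeholder):

# Export the current state of an existing resource group as an ARM template.
Export-AzureRmResourceGroup -ResourceGroupName 'my-rg' -Path .\my-rg-template.json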

Related

Packer - Powershell pass variables

Currently we are successfully deploying images with Packer (in a build pipeline located in Azure DevOps) within our AWS domain. Now we want to take this a step further, and we're trying to configure a couple of users for future Ansible maintenance. So we've written a script and tried it as an inline PowerShell script, but neither option seems to pick up the variable set in the variable group in Azure DevOps; all the other variables are being used with success. My code is as follows:
{
  "variables": {
    "build_version": "{{isotime \"2006.01.02.150405\"}}",
    "aws_access_key": "$(aws_access_key)",
    "aws_secret_key": "$(aws_secret_key)",
    "region": "$(region)",
    "vpc_id": "$(vpc_id)",
    "subnet_id": "$(subnet_id)",
    "security_group_id": "$(security_group_id)",
    "VagrantUserpassword": "$(VagrantUserPassword)"
  },
  "builders": [
    {
      "type": "amazon-ebs",
      "access_key": "{{user `aws_access_key`}}",
      "secret_key": "{{user `aws_secret_key`}}",
      "region": "{{user `region`}}",
      "vpc_id": "{{user `vpc_id`}}",
      "subnet_id": "{{user `subnet_id`}}",
      "security_group_id": "{{user `security_group_id`}}",
      "source_ami_filter": {
        "filters": {
          "name": "Windows_Server-2016-English-Full-Base-*",
          "root-device-type": "ebs",
          "virtualization-type": "hvm"
        },
        "most_recent": true,
        "owners": [
          "801119661308"
        ]
      },
      "ami_name": "WIN2016-CUSTOM-{{user `build_version`}}",
      "instance_type": "t3.xlarge",
      "user_data_file": "userdata.ps1",
      "associate_public_ip_address": true,
      "communicator": "winrm",
      "winrm_username": "Administrator",
      "winrm_timeout": "15m",
      "winrm_use_ssl": true,
      "winrm_insecure": true,
      "ssh_interface": "private_ip"
    }
  ],
  "provisioners": [
    {
      "type": "powershell",
      "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"],
      "inline": [
        "Install-WindowsFeature web-server,web-webserver,web-http-logging,web-stat-compression,web-dyn-compression,web-asp-net,web-mgmt-console,web-asp-net45",
        "New-LocalUser -UserName 'Vagrant' -Description 'User is responsible for Ansible connection.' -Password '$(VagrantUserPassword)'"
      ]
    },
    {
      "type": "powershell",
      "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"],
      "scripts": [
        "scripts/DisableUAC.ps1",
        "scripts/iiscompression.ps1",
        "scripts/ChocoPackages.ps1",
        "scripts/PrepareAnsibleUser.ps1"
      ]
    },
    {
      "type": "windows-restart",
      "restart_check_command": "powershell -command \"& {Write-Output 'Machine restarted.'}\""
    },
    {
      "type": "powershell",
      "inline": [
        "C:\\ProgramData\\Amazon\\EC2-Windows\\Launch\\Scripts\\InitializeInstance.ps1 -Schedule",
        "C:\\ProgramData\\Amazon\\EC2-Windows\\Launch\\Scripts\\SysprepInstance.ps1 -NoShutdown"
      ]
    }
  ]
}
The "VagrantUserpassword": "$(VagrantUserPassword)" is what is not working, we've tried multiple options but none of them seem to be working.
Any idea's?
Kind regards,
Rick.
Based on my test, pipeline variables indeed cannot be passed into the PowerShell environment variables directly.
Workaround:
You could try using the Replace Tokens task to pass the pipeline value into the JSON file.
Here are the steps:
1. Set the value in the JSON file:
{
  "variables": {
    ....
    "VagrantUserpassword": "#{VagrantUserPassword}#"
  },
2. Add the Replace Tokens task before the script task.
3. Set the value in the pipeline variables.
Then the value is picked up successfully.
On the other hand, I also found some issues in your sample file.
In "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"], the user variable name needs to match the one declared in the variables block, i.e. VagrantUserpassword (so: ["VagrantUserPassword={{user `VagrantUserpassword`}}"]).
Note: Packer user variable names are case sensitive.
Finally, inside the scripts you need to use $Env:VagrantUserPassword instead of $(VagrantUserPassword).
For example:
"inline": [
"Write-Host \"Automatically generated aws password is: $Env:VagrantUserPassword\"",
"Write-Host \"Automatically generated aws password is: $Env:VAR5\""
]
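The user-creation line itself also needs the same treatment: the value should come from the environment variable, and New-LocalUser expects a SecureString. A minimal sketch (the account details follow the question; note the built-in cmdlet's parameter is -Name, not -UserName):

# Sketch: read the password from the environment variable populated via
# environment_vars, convert it, and create the local user.
$secure = ConvertTo-SecureString $Env:VagrantUserPassword -AsPlainText -Force
New-LocalUser -Name 'Vagrant' -Description 'User is responsible for Ansible connection.' -Password $secure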

Deploying an Azure VM and Users with an ARM template and DSC

I am taking my first look at creating a DSC (Desired State Configuration) script to go with an ARM (Azure Resource Manager) template to deploy a Windows Server 2016 VM and additional local user accounts. So far the ARM template works fine, and for the DSC file I am using a simple example to test functionality. The deployment works fine until I try to pass a username/password so I can create a local Windows user account. I can't seem to make this function work at all (see the error messages below).
My question is: how do I use the ARM template to pass the credentials (password) to the DSC (mof) file so that the user can be created without having to explicitly allow plain-text passwords (which is not a good practice)?
This is what I have tried:
DSC file
Configuration xUser_CreateUserConfig {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory = $true)]
        [string]
        $nodeName,

        [Parameter(Mandatory = $true)]
        [System.Management.Automation.PSCredential]
        [System.Management.Automation.Credential()]
        $Credential
    )

    Import-DscResource -ModuleName xPSDesiredStateConfiguration

    Node $nodeName {
        xUser 'CreateUserAccount' {
            Ensure   = 'Present'
            UserName = Split-Path -Path $Credential.UserName -Leaf
            Password = $Credential
        }
    }
}
Azure ARM Template Snippet 1st Method
"resources": [
{
"apiVersion": "2016-03-30",
"type": "extensions",
"name": "Microsoft.Powershell.DSC",
"location": "[parameters('location')]",
"tags": {
"DisplayName": "DSC",
"Dept": "[resourceGroup().tags['Dept']]",
"Created By": "[parameters('createdBy')]"
},
"dependsOn": [
"[resourceId('Microsoft.Compute/virtualMachines', concat(variables('vmNamePrefix'), copyIndex(1)))]"
],
"properties": {
"publisher": "Microsoft.Powershell",
"type": "DSC",
"typeHandlerVersion": "2.19",
"autoUpgradeMinorVersion": true,
"settings": {
"wmfVersion": "latest",
"modulesUrl": "[concat(variables('_artifactslocation'), '/', variables('dscArchiveFolder'), '/', variables('dscArchiveFileName'))]",
"configurationFunction": "xCreateUserDsc.ps1\\xUser_CreateUserConfig",
"properties": {
"nodeName": "[concat(variables('vmNamePrefix'), copyIndex(1))]",
"Credential": {
"UserName": "[parameters('noneAdminUsername')]",
"Password": "PrivateSettingsRef:UserPassword"
}
}
},
"protectedSettings": {
"Items": {
"UserPassword": "[parameters('noneAdminUserPassword')]"
}
}
}
}
]
Error message
The resource operation completed with terminal provisioning state 'Failed'. VM has reported a failure when processing extension 'Microsoft.Powershell.DSC'. Error message: "The DSC Extension received an incorrect input: Compilation errors occurred while processing configuration 'xUser_CreateUserConfig'. Please review the errors reported in error stream and modify your configuration code appropriately. System.InvalidOperationException error processing property 'Password' OF TYPE 'xUser': Converting and storing encrypted passwords as plain text is not recommended. For more information on securing credentials in MOF file, please refer to MSDN blog: http://go.microsoft.com/fwlink/?LinkId=393729
This error message does not help
Azure ARM Template Snippet 2nd Method
"resources": [
  {
    "apiVersion": "2018-10-01",
    "type": "extensions",
    "name": "Microsoft.Powershell.DSC",
    "location": "[parameters('location')]",
    "tags": {
      "DisplayName": "DSC",
      "Dept": "[resourceGroup().tags['Dept']]",
      "Created By": "[parameters('createdBy')]"
    },
    "dependsOn": [
      "[resourceId('Microsoft.Compute/virtualMachines', concat(variables('vmNamePrefix'), copyIndex(1)))]"
    ],
    "properties": {
      "publisher": "Microsoft.Powershell",
      "type": "DSC",
      "typeHandlerVersion": "2.9",
      "autoUpgradeMinorVersion": true,
      "settings": {
        "wmfVersion": "latest",
        "configuration": {
          "url": "[concat(variables('_artifactslocation'), '/', variables('dscArchiveFolder'), '/', variables('dscArchiveFileName'))]",
          "script": "xCreateUserDsc.ps1",
          "function": "xUser_CreateUserConfig"
        },
        "configurationArguments": {
          "nodeName": "[concat(variables('vmNamePrefix'), copyIndex(1))]"
        },
        "privacy": {
          "dataCollection": "Disable"
        }
      },
      "protectedSettings": {
        "configurationArguments": {
          "Credential": {
            "UserName": "[parameters('noneAdminUsername')]",
            "Password": "[parameters('noneAdminUserPassword')]"
          }
        }
      }
    }
  }
]
Error Message
VM has reported a failure when processing extension 'Microsoft.Powershell.DSC'. Error message: "The DSC Extension received an incorrect input: A parameter cannot be found that matches parameter name '$credential.Password'. Another common error is to specify parameters of type PSCredential without an explicit type. Please be sure to use a typed parameter in DSC Configuration, for example: configuration Example param([PSCredential] $UserAccount). Please correct the input and retry executing the extension. More information on troubleshooting is available at https://aka.ms/VMExtensionDSCWindowsTroubleshoot
This does not help!
I have been trying to solve this error for a couple of days. I have Googled for other examples but can only find examples of people deploying a web server, and Microsoft's documentation is no help because it tells you to use both of the above methods, even though method 1 is the old way (according to Microsoft). So, any help will be much appreciated.
This is how I was setting up the parameter in the configuration:
# Credentials
[Parameter(Mandatory)]
[System.Management.Automation.PSCredential]$Admincreds,
and then in the template:
"properties": {
  "publisher": "Microsoft.Powershell",
  "type": "DSC",
  "typeHandlerVersion": "2.19",
  "autoUpgradeMinorVersion": true,
  "settings": {
    "configuration": xxx, // doesn't matter for this question
    "configurationArguments": yyy // doesn't matter for this question
  },
  "protectedSettings": {
    "configurationArguments": {
      "adminCreds": {
        "userName": "someValue",
        "password": "someOtherValue"
      }
    }
  }
}
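The second error in the question ("A parameter cannot be found that matches parameter name '$credential.Password'") is consistent with this: the keys under protectedSettings.configurationArguments are bound to the configuration's parameters by name. A minimal sketch of the pairing (illustrative names):

# The template passes "adminCreds", so the configuration must declare a
# parameter with that name (PowerShell parameter binding is case-insensitive).
Configuration Example {
    param(
        [Parameter(Mandatory)]
        [System.Management.Automation.PSCredential]$Admincreds
    )
    # resources elided
}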
Links to working stuff:
https://github.com/Cloudneeti/PCI_Reference_Architecture/blob/master/templates/resources/AD/azuredeploy.json#L261
https://github.com/Cloudneeti/PCI_Reference_Architecture/blob/master/artifacts/configurationscripts/ad-domain.ps1#L11
PS: you might also need to do this. Honestly, I don't remember ;)
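As a side note, the configuration from the question can be compile-tested locally before wiring it into the extension. Plain-text compilation has to be allowed for local testing only; the DSC extension encrypts credentials itself when deploying. A sketch with hypothetical values (requires the xPSDesiredStateConfiguration module):

# Local compile test only; never allow plain-text passwords in production.
$cred = Get-Credential -UserName 'testuser' -Message 'Test account password'
$cd = @{ AllNodes = @(@{ NodeName = 'localhost'; PSDscAllowPlainTextPassword = $true }) }
xUser_CreateUserConfig -nodeName 'localhost' -Credential $cred -ConfigurationData $cd -OutputPath .\mof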

How to execute a PowerShell Command from within Azure Data Factory custom activity?

I have a custom activity in Azure Data Factory which attempts to execute the following command:
PowerShell.exe -Command "Write-Host 'Hello, world!'"
However, when I debug (run) this command from within Azure Data Factory, it runs for a long time and finally fails.
I guess it fails because it could not locate PowerShell.exe. How can I ensure that the ADF custom activity has access to PowerShell.exe?
Some sites mention specifying a package (.zip file) that contains everything needed for the exe to execute. However, since PowerShell is from Microsoft, I think it would be inappropriate to zip the PowerShell directory and specify it as a package to the custom activity.
Please suggest how I can execute a PowerShell command from a custom activity of an Azure Data Factory. Thanks!
Whenever I search for "Execute PowerShell from Custom Activity in Azure Data Factory", the results are mostly about which Az PowerShell command to use to trigger an ADF pipeline.
I saw two threads on Stack Overflow where the answer just says to use a custom activity, without being specific about calling a PowerShell command from ADF.
Here is the JSON for the task:
{
  "name": "ExecutePs1CustomActivity",
  "properties": {
    "activities": [
      {
        "name": "ExecutePSScriptCustomActivity",
        "type": "Custom",
        "dependsOn": [],
        "policy": {
          "timeout": "7.00:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [],
        "typeProperties": {
          "command": "PowerShell.exe -Command \"Write-Host 'Hello, world!'\"",
          "referenceObjects": {
            "linkedServices": [],
            "datasets": []
          }
        },
        "linkedServiceName": {
          "referenceName": "Ps1CustomActivityAzureBatch",
          "type": "LinkedServiceReference"
        }
      }
    ],
    "annotations": []
  }
}
I see "In Progress" for 3 minutes (180 seconds), and then it shows as "Failed."
I would suggest you move all your scripting tasks into a PowerShell file and copy it to a storage account linked with your custom activity. Once done, try to call it like below:
powershell .\script.ps1
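The script has to be uploaded to that storage account first. A sketch with the AzureRm-era storage cmdlets (all names are placeholders, reusing the container/folder layout from the sample below):

# Sketch: upload the script to the blob container that the custom activity's
# resourceLinkedService points at.
$ctx = (Get-AzureRmStorageAccount -ResourceGroupName 'rg' -Name 'storageacct').Context
Set-AzureStorageBlobContent -File .\script.ps1 -Container 'customactv2' -Blob 'helloworld/script.ps1' -Context $ctx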
You can also provide the path of the script in the JSON, like below:
{
  "name": "MyCustomActivityPipeline",
  "properties": {
    "description": "Custom activity sample",
    "activities": [{
      "type": "Custom",
      "name": "MyCustomActivity",
      "linkedServiceName": {
        "referenceName": "AzureBatchLinkedService",
        "type": "LinkedServiceReference"
      },
      "typeProperties": {
        "command": "helloworld.exe",
        "folderPath": "customactv2/helloworld",
        "resourceLinkedService": {
          "referenceName": "StorageLinkedService",
          "type": "LinkedServiceReference"
        }
      }
    }]
  }
}
Please try it and see if it helps. I would also suggest troubleshooting the pipeline steps to look for a detailed error.
As for your second point about "specifying a package (.zip file) that contains everything needed for the exe to execute": that is required when you are building a custom activity using .NET, in which case you must copy all the DLLs and EXEs needed for execution.
Hope it helps.

Why is the password type of AzureKeyVaultSecret dropped when creating a LinkedService via PowerShell?

I'm attempting to create a LinkedService via the PowerShell command
New-AzureRmDataFactoryV2LinkedService -ResourceGroupName rg -DataFactoryName df -Name n -DefinitionFile n.json
The result is that the LinkedService is created; however, the reference to the password of type AzureKeyVaultSecret is removed, rendering it non-operational.
The config file n.json was extracted from the Data Factory code tab and has the syntax below:
{
  "name": "<name>",
  "type": "Microsoft.DataFactory/factories/linkedservices",
  "properties": {
    "type": "Oracle",
    "typeProperties": {
      "connectionString": "host=<host>;port=<port>;serviceName=<serviceName>;user id=<user_id>",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "Prod_KeyVault",
          "type": "LinkedServiceReference"
        },
        "secretName": "<secretname>"
      }
    },
    "connectVia": {
      "referenceName": "<runtimename>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
When the new LinkedService is created, the code looks exactly the same except that properties -> typeProperties -> password is removed and requires manual configuration, which I'm trying to avoid if possible.
Any thoughts?
If you have tried using "Update-Module -Name AzureRm.DataFactoryV2" to update your PowerShell module to the latest version and you still see the same behavior, then the likely root cause is that a password as an Azure Key Vault reference is not yet supported in PowerShell. As far as I know, it is a recently added feature, so it may take some time to roll it out to PowerShell.
In that case, the workaround is to use the UI to create the linked service for now.
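To check whether a given module version preserves the reference, one can round-trip the definition and inspect what came back (a sketch, reusing the names from the question):

# Create the linked service from the definition file, then read it back and
# inspect typeProperties to see whether the password reference survived.
New-AzureRmDataFactoryV2LinkedService -ResourceGroupName rg -DataFactoryName df -Name n -DefinitionFile n.json
(Get-AzureRmDataFactoryV2LinkedService -ResourceGroupName rg -DataFactoryName df -Name n).Properties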

How to run a PowerShell script during Azure VM deployment with ARM template?

I want to deploy a VM in Azure using Azure Resource Manager (ARM), and then run a PowerShell script inside the VM post-deployment to configure it.
I can do this fine with something like this: https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-vsts-agent
However, that template grabs the PowerShell script from GitHub. As part of my deployment I want to upload the script to Azure Storage and then have the VM get the script from Azure Storage and run it. How can I handle the dependency on the PowerShell script, given that it has to exist in Azure Storage somewhere before being executed?
I currently have the following to install a VSTS agent as part of a deployment, but the script is downloaded from GitHub. I don't want to do that; I want the installation script of the VSTS agent to be part of my ARM project.
{
  "name": "vsts-build-agents",
  "type": "extensions",
  "location": "[parameters('location')]",
  "apiVersion": "2017-12-01",
  "dependsOn": [
    "vsts-build-vm"
  ],
  "tags": {
    "displayName": "VstsInstallScript"
  },
  "properties": {
    "publisher": "Microsoft.Compute",
    "type": "CustomScriptExtension",
    "typeHandlerVersion": "1.9",
    "settings": {
      "fileUris": [
        "[concat(parameters('_artifactsLocation'), '/', variables('powerShell').folder, '/', variables('powerShell').script, parameters('_artifactsLocationSasToken'))]"
      ]
    },
    "protectedSettings": {
      "commandToExecute": "[concat('powershell.exe -ExecutionPolicy Unrestricted -Command \"& {', './', variables('powerShell').script, ' ', variables('powerShell').buildParameters, '}\"')]"
    }
  }
}
I guess my question is really about how to set _azurestoragelocation to an Azure Storage location where the script has just been uploaded as part of the deployment.
Chicken/egg problem: you cannot upload to Azure Storage with an ARM template; you need a script to upload to Azure Storage, but if you already have that script on the VM to do the upload, you don't really need to upload it.
That being said, why don't you use the VSTS agent extension?
{
  "name": "xxx",
  "apiVersion": "2015-01-01",
  "type": "Microsoft.Resources/deployments",
  "properties": {
    "mode": "Incremental",
    "templateLink": {
      "uri": "https://gallery.azure.com/artifact/20161101/microsoft.vsts-agent-windows-arm.1.0.0/Artifacts/MainTemplate.json"
    },
    "parameters": {
      "vmName": {
        "value": "xxx"
      },
      "location": {
        "value": "xxx"
      },
      "VSTSAccountName": {
        "value": "xxx"
      },
      "TeamProject": {
        "value": "xxx"
      },
      "DeploymentGroup": {
        "value": "Default"
      },
      "AgentName": {
        "value": "xxx"
      },
      "PATToken": {
        "value": "xxx"
      }
    }
  }
}
Do you mean how to set _artifactsLocation as in the quickstart sample? If so, you have 2 options (or 3, depending):
1) Use the script in the QS repo; the defaultValue for the _artifactsLocation param will set that for you.
2) If you want to customize from your local copy of the sample, just use the Deploy-AzureResourceGroup.ps1 in the repo, and it will stage and set the value for you accordingly (when you use the -UploadArtifacts switch).
3) Stage the PS1 somewhere yourself and manually set the values of _artifactsLocation and _artifactsLocationSasToken (a sketch of this follows below).
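A minimal sketch of option 3 with the AzureRm-era storage cmdlets (all names are placeholders):

# Stage the script and produce a short-lived, read-only SAS token to pass
# as the _artifactsLocationSasToken parameter.
$ctx = (Get-AzureRmStorageAccount -ResourceGroupName 'rg' -Name 'artifactsacct').Context
Set-AzureStorageBlobContent -File .\InstallAgent.ps1 -Container 'artifacts' -Blob 'InstallAgent.ps1' -Context $ctx
$sas = New-AzureStorageContainerSASToken -Name 'artifacts' -Permission r -ExpiryTime (Get-Date).AddHours(4) -Context $ctx
# _artifactsLocation         = https://artifactsacct.blob.core.windows.net/artifacts
# _artifactsLocationSasToken = $sas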
You can also deploy from gallery.azure.com, but that will force you to use the script that is stored in the gallery (same as using the defaults in GitHub).
Does that help?