I'm working on an Azure project where deployments can only be made using ARM templates from Visual Studio CI, and we have only read access to the Azure Portal.
Currently I'm getting the error below and can make no releases. I cannot delete deployments from the Portal either, since I only have permission to configure build and release phases. I was wondering if there is any phase I can create in which the previous deployments are deleted.
So far I have tried a couple of things, like using the inline PowerShell command Remove-AzureRmResourceGroupDeployment and deleting resources of type Microsoft.Resources/deployments before the resource deployment phase, but none of them worked.
[error]Creating the deployment 'azuredeploy-2017721-715' would exceed the quota of '800'. The current deployment count is '800', please delete some deployments before creating a new one. Please see https://aka.ms/arm-deploy for usage details.
Here is a script that allows you to delete such deployments in parallel. You can also use it in an Azure PowerShell task in Azure DevOps. If you have issues with authentication, have a look here: Azure credentials have not been set up or have expired, please run Connect-AzAccount
Param(
    [string]
    [Parameter(Mandatory = $true)]
    $subscriptionId,
    [string]
    [Parameter(Mandatory = $true)]
    $tenantId,
    [string]
    [Parameter(Mandatory = $true)]
    $resourceGroupName,
    [int]
    [Parameter(Mandatory = $true)]
    $numberOfDeploymentsToKeep,
    [int]
    [Parameter(Mandatory = $true)]
    $batchSize
)
try {
    $c = Get-AzContext
}
catch {
    $c = $null
}
if (!$c -or !$c.Account) {
    Connect-AzAccount -Subscription $subscriptionId -Tenant $tenantId
} else {
    Select-AzSubscription -Subscription $subscriptionId -Tenant $tenantId
}
# ----------------------------------
# Get Deployments
# ----------------------------------
#$dateBeforeDeleteDeployments = Get-Date -Year 2018 -Month 06 -Day 30
#$deploymentsToDelete = Get-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName | Where-Object { $_.Timestamp -le $dateBeforeDeleteDeployments }
$currentDeployments = Get-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName
$currentNumberOfDeployments = ($currentDeployments | Measure-Object).Count
$numberOfDeploymentsToRemove = $currentNumberOfDeployments - $numberOfDeploymentsToKeep
if ($numberOfDeploymentsToRemove -lt 0) {
    throw "Number of deployments to remove is < 0..."
}
if ($numberOfDeploymentsToRemove -eq 0) {
    Write-Host "Number of deployments to remove is 0..."
    return
}
Write-Host "Number of Deployments to remove: '$numberOfDeploymentsToRemove'..."
$deploymentsToDelete = $currentDeployments | Sort-Object -Property Timestamp | Select-Object -First $numberOfDeploymentsToRemove
$deploymentsToDelete | ForEach-Object { $i = 0; $j = 0; $deploymentsToDeleteBatched = @{} } {
    if ($i -ne $batchSize -and $deploymentsToDeleteBatched["Batch $j"]) {
        $deploymentsToDeleteBatched["Batch $j"] += $_
        $i += 1
    }
    else {
        $i = 1
        $j += 1
        $deploymentsToDeleteBatched["Batch $j"] = @($_)
    }
}
Write-Host "Created $($deploymentsToDeleteBatched.Count) batches..."
# ----------------------------------
# Execute deletion in parallel
# ----------------------------------
$jobNames = @()
foreach ($batchkey in $deploymentsToDeleteBatched.Keys) {
    $deploymentsToDeleteBatch = $deploymentsToDeleteBatched.$batchkey
    $logic = {
        Param(
            [object]
            [Parameter(Mandatory = $true)]
            $ctx,
            [object]
            [Parameter(Mandatory = $true)]
            $deploymentsToDeleteBatch,
            [string]
            [Parameter(Mandatory = $true)]
            $resourceGroupName
        )
        foreach ($deploymentToDelete in $deploymentsToDeleteBatch) {
            $deploymentName = $deploymentToDelete.DeploymentName
            Remove-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName -Name $deploymentName -DefaultProfile $ctx -ErrorAction Stop
            Write-Host "Deleted Deployment '$deploymentName' from '$($deploymentToDelete.Timestamp)'..."
        }
    }
    $jobName = ([System.Guid]::NewGuid()).Guid
    $jobNames += $jobName
    $jobObject = Start-Job $logic -Name $jobName -ArgumentList (Get-AzContext), $deploymentsToDeleteBatch, $resourceGroupName
}
while (Get-Job -State "Running") {
    Write-Host "---------------------------------------------------------------"
    Write-Host "Jobs still running..."
    Get-Job | Format-Table
    Write-Host "---------------------------------------------------------------"
    Start-Sleep -Seconds 10
}
Write-Host "Jobs completed, getting output..."
Write-Host "---------------------------------------------------------------"
foreach ($jobName in $jobNames) {
    Write-Host "Output of Job '$jobName'..."
    Receive-Job -Name $jobName
    Write-Host "---------------------------------------------------------------"
}
Write-Host "Done..."
Use the Azure PowerShell task. It takes care of authentication for you -- no need to call Login-AzureRmAccount.
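For reference, such a task could be wired up in a YAML pipeline roughly like this (a sketch only: the service connection name, script path, and variable names are placeholders, not from the question):

```yaml
# Sketch of an Azure PowerShell task in azure-pipelines.yml.
# The task handles Azure authentication via the service connection,
# so the script itself needs no Connect-AzAccount call.
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'my-service-connection'    # placeholder name
    ScriptType: 'FilePath'
    ScriptPath: '$(Build.SourcesDirectory)/scripts/Remove-OldDeployments.ps1'
    ScriptArguments: >-
      -subscriptionId $(subscriptionId)
      -tenantId $(tenantId)
      -resourceGroupName $(resourceGroupName)
      -numberOfDeploymentsToKeep 100
      -batchSize 20
    azurePowerShellVersion: 'LatestVersion'
```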
Related
I have an Azure Data Factory CI/CD pipeline. My ADF has a few global params, so I am following the Microsoft documentation for their CI/CD. On the same documentation page there is the 'Update global parameters' PowerShell script below. The issue is that whenever this script runs, it resets my ADF network access from 'Private endpoint' to 'Public endpoint'.
param
(
    [parameter(Mandatory = $true)] [String] $globalParametersFilePath,
    [parameter(Mandatory = $true)] [String] $resourceGroupName,
    [parameter(Mandatory = $true)] [String] $dataFactoryName
)
Import-Module Az.DataFactory
$newGlobalParameters = New-Object 'system.collections.generic.dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'
Write-Host "Getting global parameters JSON from: " $globalParametersFilePath
$globalParametersJson = Get-Content $globalParametersFilePath
Write-Host "Parsing JSON..."
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)
# foreach ($gp in $factoryFileObject.properties.globalParameters.GetEnumerator())
# may be used in case you use a non-standard location for global parameters. It is not recommended.
foreach ($gp in $globalParametersObject.GetEnumerator()) {
    Write-Host "Adding global parameter:" $gp.Key
    $globalParameterValue = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification])
    $newGlobalParameters.Add($gp.Key, $globalParameterValue)
}
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$dataFactory.GlobalParameters = $newGlobalParameters
Write-Host "Updating" $newGlobalParameters.Count "global parameters."
Set-AzDataFactoryV2 -InputObject $dataFactory -Force
I want network access to be via 'Private endpoint' ALWAYS. Has anyone faced this issue?
Just change the last line of your global parameters script as follows:
Set-AzDataFactoryV2 -InputObject $dataFactory -PublicNetworkAccess "Disabled" -Force
Now your ADF network access won't be reset to public.
We have a number of Azure function apps to deploy, using the same code. In our Azure DevOps pipeline we have an Azure PowerShell (4.*) script to deploy the code and start the function app:
Param (
    [string] $resourceGroupName,
    [string[]] $funcapps,
    [string] $filePath
)
Write-Output "Deploying the following function apps: $funcapps";
foreach ($app in $funcapps)
{
    Write-Output "Deploying function app: $app";
    $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupName -Name $app -ArchivePath $filePath -Force;
    Write-Output "Starting function app: $app";
    $webapp = Start-AzWebApp -ResourceGroupName $resourceGroupName -Name $app;
    Write-Output "Started function app: $app = $($webapp.State)";
}
This works fine (both from local PowerShell and from Azure DevOps), but with the number of apps we're deploying it can take a while. To try to make it perform better, we tried to run the publish/start statements in parallel:
Param (
    [string] $resourceGroupName,
    [string[]] $funcapps,
    [string] $filePath
)
Workflow Parallel-Deploy {
    Param (
        [string] $resourceGroupName,
        [string[]] $funcapps,
        [string] $filePath
    )
    Write-Output "Deploying the following function apps: $funcapps";
    foreach -parallel ($app in $funcapps)
    {
        Write-Output "Deploying function app: $app";
        $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupName -Name $app -ArchivePath $filePath -Force;
        Write-Output "Starting function app: $app";
        $webapp = Start-AzWebApp -ResourceGroupName $resourceGroupName -Name $app;
        Write-Output "Started function app: $app = $($webapp.State)";
    }
}
Parallel-Deploy -resourceGroupName $resourceGroupName -funcapps $funcapps -filePath $filePath
The code is the same, just moved into a Workflow to use "foreach -parallel".
If I run the script from a local PowerShell, everything works fine - but from the Azure DevOps pipeline, I get an error, No account found in the context. Please login using Connect-AzAccount.
I've found references to changes made in the Azure PowerShell DevOps task, meaning that the context now needs to be passed explicitly to background tasks. I tried to follow the example listed (Save-AzContext and Import-AzContext, updated from Save-AzureRmContext/Import-AzureRmContext in the example), but it's still giving me the same error.
Any suggestions on what I'm doing wrong, and how to get the context correctly set inside the "foreach -parallel" block?
Edit 1
I probably should have shown exactly what I did for Save-AzContext/Import-AzContext...
Param (
    [string] $resourceGroupName,
    [string[]] $funcapps,
    [string] $filePath,
    [string] $tmpDir
)
$contextPath = "$tmpDir/context.json"
Save-AzContext -Path $contextPath -Force
Workflow Parallel-Deploy {
    Param (
        [string] $resourceGroupName,
        [string[]] $funcapps,
        [string] $filePath,
        [string] $contextPath
    )
    foreach -parallel ($app in $funcapps)
    {
        # Output context - initially not set
        Get-AzContext;
        # Fetch and display context - now set
        Import-AzContext -Path $contextPath;
        Get-AzContext;
        Write-Output "Deploying function app: $app";
        $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupName -Name $app -ArchivePath $filePath -Force;
        ...
This still gave me the error that the account wasn't found in the context.
As per the suggestions, I changed to using a job instead:
foreach ($app in $funcapps) {
    $jobname = "$app-Job";
    Start-Job -Name $jobname -ScriptBlock {
        Param (
            [string] $resourceGroupName,
            [string[]] $funcapps,
            [string] $filePath,
            [string] $contextPath
        )
        # Output context - initially not set
        Get-AzContext;
        # Fetch and display context - now set
        Import-AzContext -Path $contextPath;
        Get-AzContext;
        Write-Output "Deploying function app: $app";
        $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupName -Name $app -ArchivePath $filePath -Force;
        ...
And this too said the context wasn't correct.
Save/Import should work just fine, as should simply allowing context autosave with Enable-AzContextAutosave.
Alternatively, you can use a native capability to launch cmdlets as jobs:
Publish-AzWebapp -ResourceGroupName $resourceGroupname -Name $app `
-ArchivePath $filePath -Force -AsJob
and then just wait for the jobs to finish and start the webapps.
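A rough sketch of that pattern, reusing the question's variable names (untested here, since it needs a live subscription and signed-in context):

```powershell
# Kick off all publishes in parallel using the cmdlet's built-in -AsJob,
# collecting the returned job objects.
$publishJobs = foreach ($app in $funcapps) {
    Publish-AzWebapp -ResourceGroupName $resourceGroupName -Name $app `
        -ArchivePath $filePath -Force -AsJob
}

# Block until every publish job has completed, surfacing their output.
$publishJobs | Wait-Job | Receive-Job

# Then start the apps; these calls are quick, so they run sequentially here.
foreach ($app in $funcapps) {
    $webapp = Start-AzWebApp -ResourceGroupName $resourceGroupName -Name $app
    Write-Output "Started function app: $app = $($webapp.State)"
}
```

Because -AsJob jobs inherit the credentials of the cmdlet invocation, this avoids the context-passing problem that Start-Job and workflows run into.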
I am trying to generate a backup report for my Azure VM backups.
I have many subscriptions, each with more than 50 resource groups. Almost 1,500 VMs are hosted in our environment.
I tried generating the backup report, but the PowerShell script took 2 hours to complete, so I am trying a workflow for some parallel processing.
However, I am getting the error below, even though I do not see any recursive call here.
System.Management.Automation.ParseException: At line:1 char:1
+ try
+ ~~~
A workflow cannot use recursion.
at System.Management.Automation.ExceptionHandlingOps.CheckActionPreference(FunctionContext funcContext, Exception exception)
at System.Management.Automation.Interpreter.ActionCallInstruction`2.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
try
{
    #$cred = Get-Credential
    Login-AzureRmAccount #-Credential $cred
    $tempCSVPath = Read-Host 'Please provide local path to store the report, Example- D:\temp\report.csv '
    $Path = 'C:\AzureRmProfile.json'
    $subs = Get-AzureRmSubscription
    Get-AzureRmRecoveryServicesVault
    foreach ($sub in $subs)
    {
        if ($sub.Name -ne 'IRMLAB')
        {
            Select-AzureRmSubscription -SubscriptionName $sub.Name
            $Vms = @()
            $RR = @()
            $rms = Get-AzureRmRecoveryServicesVault
            Foreach ($rm in $rms)
            {
                Set-AzureRmRecoveryServicesVaultContext -Vault $rm
                $container_list = Get-AzureRmRecoveryServicesBackupContainer -ContainerType AzureVM
                Workflow a {
                    param (
                        [parameter(Mandatory=$true)]
                        [psobject]$AzureRmConObject,
                        [parameter(Mandatory=$true)]
                        [psobject]$ProfilePath
                    )
                    foreach -parallel ($container_list_iterator in $AzureRmConObject)
                    {
                        $Profile = Select-AzureRmProfile -Path $ProfilePath
                        $backup_item = Get-AzureRmRecoveryServicesBackupItem -Container $container_list_iterator -WorkloadType AzureVM
                        $backup_item_array = ($backup_item.ContainerName).split(';')
                        $Vms += [pscustomobject]@{
                            Virtualmachine_name = $backup_item_array[2]
                            Vault_resourcegroup_name = $backup_item_array[1]
                            backup_item_last_backup_status = $backup_item.LastBackupStatus
                            backup_item_latest_recovery_point = $backup_item.LatestRecoveryPoint
                        }
                    }
                    a -AzureRmConObject $container_list -ProfilePath $Path
                    $Vms | Export-Csv -Path $tempCSVPath -Append -Force
                }
            }
        }
    }
}
catch
{
    Write-Host $_.Exception
}
Calling a function recursively is not permitted in workflows. Refer here -> Docs. In your script the call `a -AzureRmConObject $container_list -ProfilePath $Path` appears before the closing brace of `Workflow a`, so the call sits inside the workflow's own body, which is why PowerShell treats it as recursion. Move the invocation (and the `Export-Csv`) after the workflow's closing brace.
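To illustrate the scoping fix, here is a hypothetical minimal structure with the Azure calls omitted (names are placeholders, not the original cmdlets):

```powershell
# The workflow definition must be fully closed before it is invoked;
# otherwise the call sits inside the workflow's own body and PowerShell
# reports "A workflow cannot use recursion."
Workflow Get-BackupRows {
    param(
        [parameter(Mandatory = $true)]
        [psobject]$Containers
    )
    foreach -parallel ($container in $Containers) {
        # per-container backup-item lookup goes here
    }
}   # <-- workflow closed here, BEFORE the call

Get-BackupRows -Containers $container_list
```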
I'm trying to change a value on a tag using an automation script. The users will have a startup script, which will change the shutdown tag key from true to false.
When I set the tags individually using the script below, it sets the tag value to false. The current setting is true.
When I use the automation script it wipes all the tags; however, if I specify the VM in the script, the automation account works and changes the key value from false to true.
I can't see what I'm missing. This runs from a webhook as a PowerShell script, not a workflow.
[CmdletBinding()]
Param(
    [Parameter(Mandatory=$True)]
    [object]$WebhookData
)
Write-Output "------------------------------------------------"
Write-Output "`nConnecting to Azure Automation"
$Connection = Get-AutomationConnection -Name AzureRunAsConnection
Add-AzureRMAccount -ServicePrincipal -Tenant $Connection.TenantID `
-ApplicationId $Connection.ApplicationID -CertificateThumbprint $Connection.CertificateThumbprint
$RunbookVersion = "0.0.17"
$timeStartUTC = (Get-Date).ToUniversalTime()
Write-Output "Workflow started: Runbook Version is $RunbookVersion"
Write-Output "System time is: $(Get-Date)"
Write-Output "`nGetting tagged resources"
Write-Output "------------------------------------------------"
$ResourceGroupFilter = ""
$SupportedEnvironments = "DEV, Test, PREProd, Prod"
$isWebhookDataNull = $WebhookData -eq $null
Write-Output "Is webhook data null ? : $($isWebhookDataNull)"
# If runbook was called from Webhook, WebhookData will not be null.
If ($WebhookData -ne $null) {
    # Collect properties of WebhookData
    $WebhookName = $WebhookData.WebhookName
    $WebhookHeaders = $WebhookData.RequestHeader
    $WebhookBody = $WebhookData.RequestBody
    $body = $WebhookBody | ConvertFrom-Json
    $UserEmail = $body.user.email
    Write-Output "Runbook started from webhook '$WebhookName' by '$($body.user.email)' for environment '$($body.environment)'"
    Write-Output "Message body: " $WebhookBody
}
else {
    Write-Error "Runbook meant to be started only from webhook."
}
If ($body.environment.ToUpper() -eq 'DEV') {
    $ResourceGroupFilter = 'The-DEV-RG'
}
if ($ResourceGroupFilter -eq "") {
    Exit 1
}
if ($VMRG -eq '') {
    Write-Output "No resource groups matched for selected environment. Webhook cant progress further, exiting.."
    Write-Error "No resource groups matched for selected environment. Webhook cant progress further, exiting.."
    Exit 1
}
$rgs = Get-AzureRmResourceGroup | Where-Object {$_.ResourceGroupName -like "*$rg*"}
foreach ($rg in $rgs)
{
    $vms = Get-AzureRmVm -ResourceGroupName $rg.ResourceGroupName
    $vms.ForEach({
        $tags = $_.Tags
        $tags['ShutdownSchedule_AllowStop'] = "$False";
        Set-AzureRmResource -ResourceId $_.Id -Tag $tags -Force -Verbose
    })
}
ForEach ($vm in $vms) {
    Start-AzureRmVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -Verbose
}
Thanks in advance :)
The root cause is that your local Azure PowerShell is the latest version, but the version in your Azure Automation account is not. I tested this in my lab; the older version does not support it.
You need to upgrade the Azure PowerShell modules in the Automation account. For more information, please see this answer.
The following code snippet works in PowerShell v2, but not v4. The release notes for PowerShell v3 explain that you cannot set the IsFilter property on an unnamed script block. I believe that's exactly what I have, but I don't understand what change to make.
Any help would be appreciated.
function Stop-WindowsService
{
    param(
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)]
        $fromPipe,
        [Parameter(ParameterSetName='static',Mandatory=$true,Position=0)]
        [ValidateNotNullOrEmpty()]
        [string]$name,
        [Parameter(ParameterSetName='dynamic',Mandatory=$true,Position=0)]
        [ValidateNotNull()]
        [ScriptBlock]$scriptReturningName,
        [Parameter(Mandatory=$false)]
        [ValidateRange(1,86400)]
        [int]$timeout = 60
    )
    Process {
        $server = $_
        if ($PsCmdlet.ParameterSetName -eq 'dynamic') {
            $scriptReturningName.IsFilter = $true
            $name = ($server | &$scriptReturningName)
        }
        Write-Verbose "$($server.Name): $name ==> Checking"
        $service = $server | Get-WindowsServiceRaw $name