GlobalParametersUpdateScript.ps1 is resetting Azure Data Factory Public Network Access

I have an Azure Data Factory CI/CD pipeline. My ADF has a few global parameters, so I am following the Microsoft documentation for their CI/CD. That documentation page provides the 'Update global parameters' PowerShell script below. The issue is that whenever this script runs, it resets my ADF network access from 'Private endpoint' back to 'Public endpoint'.
param
(
    [parameter(Mandatory = $true)] [String] $globalParametersFilePath,
    [parameter(Mandatory = $true)] [String] $resourceGroupName,
    [parameter(Mandatory = $true)] [String] $dataFactoryName
)

Import-Module Az.DataFactory

$newGlobalParameters = New-Object 'System.Collections.Generic.Dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'

Write-Host "Getting global parameters JSON from: " $globalParametersFilePath
$globalParametersJson = Get-Content $globalParametersFilePath

Write-Host "Parsing JSON..."
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)

# ($gp in $factoryFileObject.properties.globalParameters.GetEnumerator())
# may be used in case you use a non-standard location for global parameters. It is not recommended.
foreach ($gp in $globalParametersObject.GetEnumerator()) {
    Write-Host "Adding global parameter:" $gp.Key
    $globalParameterValue = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification])
    $newGlobalParameters.Add($gp.Key, $globalParameterValue)
}

$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$dataFactory.GlobalParameters = $newGlobalParameters

Write-Host "Updating" $newGlobalParameters.Count "global parameters."
Set-AzDataFactoryV2 -InputObject $dataFactory -Force
I want network access to be via 'Private endpoint' ALWAYS. Has anyone else faced this issue?

Just change the last line of your global parameters script as follows:
Set-AzDataFactoryV2 -InputObject $dataFactory -PublicNetworkAccess "Disabled" -Force
Now your ADF network access won't be reset to public.
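If you would rather preserve whatever access setting the factory currently has instead of hardcoding "Disabled", a minimal sketch of that variant (assuming the object returned by Get-AzDataFactoryV2 exposes a PublicNetworkAccess property, which recent Az.DataFactory versions do) could be:

# Capture the current public network access setting and pass it back explicitly,
# so the global parameter update never changes it.
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$currentAccess = $dataFactory.PublicNetworkAccess   # e.g. 'Disabled' when only private endpoints are allowed

$dataFactory.GlobalParameters = $newGlobalParameters
Set-AzDataFactoryV2 -InputObject $dataFactory -PublicNetworkAccess $currentAccess -Force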

Related

SonarQube endpoint created using API script fails in build pipeline with "The token you provided doesn't have sufficient rights to check license"

I am attempting to create a SonarQube service endpoint in Azure DevOps using the API script below. It successfully generates the service connection, which visually looks identical to one created manually in the Azure DevOps portal. However, when used in a YAML pipeline, it fails at the "Prepare analysis on SonarQube" step with the following error:
The token you provided doesn't have sufficient rights to check license.
If I manually change every available field of this connection (URL/token/name), it still fails.
If I recreate it with the same values through the portal, it works.
function createDevOpsServiceConnection {
    <#
    .SYNOPSIS
    Creates a DevOps service connection through the REST API
    .DESCRIPTION
    This script helps you create an Azure DevOps service connection through the REST API.
    .PARAMETER PersonalToken
    Personal Access Token. Check https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops&tabs=preview-page
    .PARAMETER Organisation
    The Azure DevOps organisation name
    .PARAMETER ProjectName
    The Azure DevOps project where the service connection will be made
    .PARAMETER ManagementGroupId
    The management group id created in the tenant
    .PARAMETER ManagementGroupName
    The management group name created in the tenant
    .PARAMETER SubscriptionId
    The subscription id in the tenant to connect to
    .PARAMETER SubscriptionName
    The subscription name in the tenant to connect to
    .PARAMETER TenantId
    The tenant id to connect to
    .PARAMETER ApplicationId
    The application id (service principal) in Azure AD
    .PARAMETER ApplicationSecret
    The application secret, in plain text
    .EXAMPLE
    create-DevOpsServiceConnection.ps1 -personalToken xxx -organisation DevOpsOrganisation -ProjectName WVD -ManagementGroupId MGTGROUP1 -ManagementGroupName 'MGT GROUP 1' -TenantId xxx-xxx -ApplicationId xxx-xxx-xxx -ApplicationSecret 'verysecret'
    #>
    Param(
        [Parameter(Mandatory = $True)]
        [string]$PersonalToken,
        [Parameter(Mandatory = $True)]
        [string]$Organisation,
        [Parameter(Mandatory = $True)]
        [string]$ProjectName,
        [Parameter(Mandatory = $True, ParameterSetName = 'Subscription')]
        [string]$SubscriptionId,
        [Parameter(Mandatory = $True, ParameterSetName = 'Subscription')]
        [string]$SubscriptionName,
        [Parameter(Mandatory = $True)]
        [string]$TenantId,
        [Parameter(Mandatory = $True)]
        [string]$ApplicationId,
        [Parameter(Mandatory = $True)]
        [string]$ApplicationSecret
    )

    $AzureDevOpsAuthenicationHeader = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$($personalToken)")) }

    ## Get ProjectId
    $URL = "https://dev.azure.com/$($organisation)/_apis/projects?api-version=6.0"
    Try {
        $ProjectNameproperties = (Invoke-RestMethod $URL -Headers $AzureDevOpsAuthenicationHeader -ErrorAction Stop).Value
        Write-Verbose "Collected Azure DevOps Projects"
    }
    Catch {
        $ErrorMessage = $_ | ConvertFrom-Json
        Throw "Could not collect project: $($ErrorMessage.message)"
    }
    $ProjectID = ($ProjectNameproperties | Where-Object { $_.Name -eq $ProjectName }).id
    Write-Verbose "Collected ID: $ProjectID"

    $ConnectionName = "sonarqube"
    switch ($PsCmdlet.ParameterSetName) {
        Subscription {
            $data = @{
            }
        }
    }

    # Create body for the API call
    $Body = @{
        data          = $data
        name          = $ConnectionName
        type          = "sonarqube"
        url           = "http://mysonarqube:9000/"
        authorization = @{
            parameters = @{
                username = "admin"
                password = 'SONARQUBETOKEN'
            }
            scheme = "UsernamePassword"
        }
        isShared = $True
        isReady  = $True
        serviceEndpointProjectReferences = @(
            @{
                projectReference = @{
                    id   = "xxxxxxxx"
                    name = "projectname"
                }
                name = $ConnectionName
            }
        )
    }

    $URL = "https://dev.azure.com/$organisation/$ProjectName/_apis/serviceendpoint/endpoints?api-version=6.1-preview.4"
    $Parameters = @{
        Uri         = $URL
        Method      = "POST"
        Body        = ($Body | ConvertTo-Json -Depth 3)
        Headers     = $AzureDevOpsAuthenicationHeader
        ContentType = "application/json"
        Erroraction = "Stop"
    }
    try {
        Write-Verbose "Creating Connection"
        $Result = Invoke-RestMethod @Parameters
        Write-Host "$($Result.name) service connection created"
    }
    Catch {
        $ErrorMessage = $_ | ConvertFrom-Json
        Throw "Could not create Connection: $($ErrorMessage.message)"
    }
}
createDevOpsServiceConnection -personalToken $env:AZURE_DEVOPS_EXT_PAT -organisation $env:org -ProjectName "Bicep" `
-TenantId $env:ARM_TENANT_ID -ApplicationId $env:ARM_CLIENT_ID -ApplicationSecret $env:ARM_CLIENT_SECRET -SubscriptionId $ENV:ARM_SUBSCRIPTION_ID `
-SubscriptionName $env:sbname
The original script that this is based on can be found here: https://raw.githubusercontent.com/srozemuller/Azure/main/DevOps/Automation/create-DevOpsServiceConnection.ps1
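One way to narrow down the difference is to pull both the script-created and the portal-created connections back through the same REST endpoint and diff their JSON. A diagnostic sketch, reusing the variables from the function above; the name 'sonarqube-portal' for the manually created connection is just a placeholder:

# List all service endpoints in the project and compare the two SonarQube connections field by field.
$URL = "https://dev.azure.com/$organisation/$ProjectName/_apis/serviceendpoint/endpoints?api-version=6.1-preview.4"
$endpoints = (Invoke-RestMethod $URL -Headers $AzureDevOpsAuthenicationHeader).value

$fromScript = $endpoints | Where-Object { $_.name -eq 'sonarqube' }          # created by the script above
$fromPortal = $endpoints | Where-Object { $_.name -eq 'sonarqube-portal' }   # hypothetical name of the manually created one

$fromScript | ConvertTo-Json -Depth 10 | Set-Content scripted-endpoint.json
$fromPortal | ConvertTo-Json -Depth 10 | Set-Content portal-endpoint.json
Compare-Object (Get-Content scripted-endpoint.json) (Get-Content portal-endpoint.json)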

How to delete Global Parameters using Powershell command

I have created a number of global parameters in our DEV Azure Data Factory, which are used in many pipelines. I have deployed those global parameters to our QA environment using an ARM template.
I have now deleted a few parameters from the DEV environment, but when I redeploy the ARM template, those parameters are not deleted from the QA environment.
I have not found any other way to delete them from QA, as we do not have delete permissions there.
Can you please suggest the next steps? I am looking for a PowerShell way to delete those parameters.
param
(
    [parameter(Mandatory = $true)] [String] $globalParametersFilePath,
    [parameter(Mandatory = $true)] [String] $resourceGroupName,
    [parameter(Mandatory = $true)] [String] $dataFactoryName
)

Import-Module Az.DataFactory

$newGlobalParameters = New-Object 'System.Collections.Generic.Dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'

Write-Host "Getting global parameters JSON from: " $globalParametersFilePath
$globalParametersJson = Get-Content $globalParametersFilePath

Write-Host "Parsing JSON..."
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)

# Start from the parameters currently defined on the factory...
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
if ($null -ne $dataFactory.GlobalParameters) {
    foreach ($existing in $dataFactory.GlobalParameters.GetEnumerator()) {
        $newGlobalParameters.Add($existing.Key, $existing.Value)
    }
}

# ...then remove every parameter listed in the JSON file.
foreach ($gp in $globalParametersObject.GetEnumerator()) {
    Write-Host "Removing global parameter:" $gp.Key
    $newGlobalParameters.Remove($gp.Key) | Out-Null
}

$dataFactory.GlobalParameters = $newGlobalParameters

Write-Host "Updating factory with" $newGlobalParameters.Count "remaining global parameters."
Set-AzDataFactoryV2 -InputObject $dataFactory -Force

Azure DevOps pipeline, Azure PowerShell 4.* script loses AzContext in "ForEach -Parallel"?

We have a number of Azure function apps to deploy, using the same code. In our Azure DevOps pipeline we have an Azure PowerShell (4.*) script to deploy the code and start the function app:
Param (
    [string] $resourceGroupName,
    [string[]] $funcapps,
    [string] $filePath
)

Write-Output "Deploying the following function apps: $funcapps";
foreach ($app in $funcapps)
{
    Write-Output "Deploying function app: $app";
    $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupname -Name $app -ArchivePath $filePath -Force;
    Write-Output "Starting function app: $app";
    $webapp = Start-AzWebApp -ResourceGroupName $resourceGroupName -Name $app;
    Write-Output "Started function app: $app = $($webapp.State)";
}
This works fine (both from local PowerShell and from Azure DevOps), but with the number of apps we're deploying it can take a while. To try to improve performance, we tried to run the publish/start statements in parallel:
Param (
    [string] $resourceGroupName,
    [string[]] $funcapps,
    [string] $filePath
)

Workflow Parallel-Deploy {
    Param (
        [string] $resourceGroupName,
        [string[]] $funcapps,
        [string] $filePath
    )

    Write-Output "Deploying the following function apps: $funcapps";
    foreach -parallel ($app in $funcapps)
    {
        Write-Output "Deploying function app: $app";
        $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupname -Name $app -ArchivePath $filePath -Force;
        Write-Output "Starting function app: $app";
        $webapp = Start-AzWebApp -ResourceGroupName $resourceGroupName -Name $app;
        Write-Output "Started function app: $app = $($webapp.State)";
    }
}

Parallel-Deploy -resourceGroupName $resourceGroupName -funcapps $funcapps -filePath $filePath
The code is the same, just moved into a Workflow to use "foreach -parallel".
If I run the script from a local PowerShell session, everything works fine, but from the Azure DevOps pipeline I get an error: No account found in the context. Please login using Connect-AzAccount.
I've found references to changes in the Azure PowerShell DevOps task which mean the context needs to be passed explicitly to background tasks. I tried to follow the example given (Save-AzContext and Import-AzContext, updated from the Save-AzureRmContext/Import-AzureRmContext in the example), but it still gives me the same error.
Any suggestions on what I'm doing wrong, and how to get the context correctly set inside the "foreach -parallel" block?
Edit 1
I probably should have shown exactly what I did for SaveContext/ImportContext...
Param (
    [string] $resourceGroupName,
    [string[]] $funcapps,
    [string] $filePath,
    [string] $tmpDir
)

$contextPath = "$tmpDir/context.json"
Save-AzContext -Path $contextPath -Force

Workflow Parallel-Deploy {
    Param (
        [string] $resourceGroupName,
        [string[]] $funcapps,
        [string] $filePath,
        [string] $contextPath
    )

    foreach -parallel ($app in $funcapps)
    {
        # Output context - initially not set
        Get-AzContext;
        # Fetch and display context - now set
        Import-AzContext -Path $contextPath;
        Get-AzContext;

        Write-Output "Deploying function app: $app";
        $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupname -Name $app -ArchivePath $filePath -Force;
        ...
This still gave me the error that the account wasn't found in the context.
As per the suggestions, I changed to using a job instead:
foreach ($app in $funcapps) {
    $jobname = "$app-Job";
    Start-Job -Name $jobname -ScriptBlock {
        Param (
            [string] $resourceGroupName,
            [string[]] $funcapps,
            [string] $filePath,
            [string] $contextPath
        )

        # Output context - initially not set
        Get-AzContext;
        # Fetch and display context - now set
        Import-AzContext -Path $contextPath;
        Get-AzContext;

        Write-Output "Deploying function app: $app";
        $webapp = Publish-AzWebapp -ResourceGroupName $resourceGroupname -Name $app -ArchivePath $filePath -Force;
        ...
And this too said the context wasn't correct.
Save-AzContext/Import-AzContext should work just fine, as should simply enabling context autosave with Enable-AzContextAutosave.
Alternatively, you can use the native capability to launch cmdlets as jobs:
Publish-AzWebapp -ResourceGroupName $resourceGroupname -Name $app `
    -ArchivePath $filePath -Force -AsJob
and then just wait for the jobs to finish and start the web apps.
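Put together, a minimal sketch of that -AsJob approach (reusing the parameter names from the question) might look like this:

# Kick off all deployments at once; -AsJob returns a background job object per app.
$publishJobs = foreach ($app in $funcapps) {
    Publish-AzWebapp -ResourceGroupName $resourceGroupName -Name $app -ArchivePath $filePath -Force -AsJob
}

# Block until every publish job has finished, then surface its output.
$publishJobs | Wait-Job | Receive-Job

# Start the function apps once the code has been deployed.
foreach ($app in $funcapps) {
    $webapp = Start-AzWebApp -ResourceGroupName $resourceGroupName -Name $app
    Write-Output "Started function app: $app = $($webapp.State)"
}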

Powershell Split function not working in VSTS Release pipeline

I have written a PowerShell script which takes multiple web apps (comma separated) as input.
I split these web apps using the PowerShell Split function and configure each web app by traversing them in a foreach loop.
Everything works fine in the PowerShell editor, but when I run the same script in a VSTS release pipeline, the split doesn't work, which results in a failure.
Input : devopstestwebapp1,devopstestwebapp2
Code : $WebAppName = $WebAppName.Split(',')
Output (After Split) : devopstestwebapp1 devopstestwebapp2
Error : The Resource 'Microsoft.Web/sites/devopstestwebapp1
devopstestwebapp2' under resource group 'DevOpsResourseGroup' was not found.
Following is my PowerShell script:
# Parameters
param (
    [Parameter(Position=0,mandatory=$true)]
    [string] $AADAppID,
    [Parameter(Position=1,mandatory=$true)]
    [string] $AADKey,
    [Parameter(Position=2,mandatory=$true)]
    [string] $TenantId,
    [Parameter(Position=3,mandatory=$true)]
    [string] $ResourceGroupName,
    [Parameter(Position=4,mandatory=$true)]
    [string] $ServerName,
    [Parameter(Position=5,mandatory=$true)]
    [string] $RGLocation,
    [Parameter(Position=6,mandatory=$true)]
    [string] $WebAppName,
    [Parameter(Position=7,mandatory=$true)]
    [string] $SubscriptionName
)

# Connect to Azure
$ssAADKey = ConvertTo-SecureString $AADKey -AsPlainText -Force
$psCredential = New-Object System.Management.Automation.PSCredential($AADAppID, $ssAADKey)
Connect-AzureRmAccount -ServicePrincipal -Credential $psCredential -Subscription $SubscriptionName -TenantId $TenantId

write-host $WebAppName
$WebAppName = $WebAppName.Split(',')
write-host $WebAppName

Foreach ($servicename in $WebAppName)
{
    write-host $servicename
}
The below works perfectly with the VSTS PowerShell task. Store the app name in a variable:
$WebAppName = '$(WebAppName)'
write-host $WebAppName
foreach ($servicename in $WebAppName.Split(','))
{
    write-host $servicename
}
Output :
2019-04-22T11:02:02.7680996Z devopstestwebapp1,devopstestwebapp2,devopstestwebapp3
2019-04-22T11:02:02.7737101Z devopstestwebapp1
2019-04-22T11:02:02.7750490Z devopstestwebapp2
2019-04-22T11:02:02.7765756Z devopstestwebapp3
The problematic line is this one:
$WebAppName = $WebAppName.Split(',')
You are reassigning the result of Split to the same variable, $WebAppName, which is declared as a string in the parameter list. The array returned by Split is therefore cast back to a string, so it is no longer an array.
The solution is to assign the result of split to a new variable:
$WebAppNameSplit = $WebAppName.Split(',')
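Applied to the script in the question, a minimal sketch of that fix looks like this (only the relevant lines are shown):

write-host $WebAppName                               # still the raw comma-separated string
[string[]]$webAppNames = $WebAppName.Split(',')      # new variable keeps the array intact
foreach ($servicename in $webAppNames)
{
    write-host $servicename
    # configure each web app here, e.g. with Set-AzureRmWebApp
}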

VSTS - Deleting previous deployments during a release

I'm working on an Azure project where deployments can only be made using ARM templates from Visual Studio CI, and we have only read access to the Azure Portal.
Currently I'm getting the error below and cannot make any releases. I cannot delete deployments from the Portal either, since I only have permission to configure build and release phases, so I was wondering if there is any phase I can create where the previous deployments are deleted.
So far I have tried a couple of things, such as using the inline PowerShell command Remove-AzureRmResourceGroupDeployment and deleting resources of type Microsoft.Resources/deployments before the resource deployment phase, but none of them worked.
[error]Creating the deployment 'azuredeploy-2017721-715' would exceed the quota of '800'. The current deployment count is '800', please delete some deployments before creating a new one. Please see https://aka.ms/arm-deploy for usage details.
Here is a script that allows you to delete such deployments in parallel. You can also use it in an Azure PowerShell task in Azure DevOps. If you have issues regarding authentication, have a look here: Azure credentials have not been set up or have expired, please run Connect-AzAccount
Param(
    [string]
    [Parameter(Mandatory = $true)]
    $subscriptionId,

    [string]
    [Parameter(Mandatory = $true)]
    $tenantId,

    [string]
    [Parameter(Mandatory = $true)]
    $resourceGroupName,

    [int]
    [Parameter(Mandatory = $true)]
    $numberOfDeploymentsToKeep,

    [int]
    [Parameter(Mandatory = $true)]
    $batchSize
)

try {
    $c = Get-AzContext
}
catch {
    $c = $null
}

if (!$c -or !$c.Account) {
    Connect-AzAccount -Subscription $subscriptionId -Tenant $tenantId
} else {
    Select-AzSubscription -Subscription $subscriptionId -Tenant $tenantId
}

# ----------------------------------
# Get Deployments
# ----------------------------------
#$dateBeforeDeleteDeployments = Get-Date -Year 2018 -Month 06 -Day 30
#$deploymentsToDelete = Get-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName | Where-Object { $_.Timestamp -le $dateBeforeDeleteDeployments }

$currentDeployments = Get-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName
$currentNumberOfDeployments = ($currentDeployments | Measure-Object).Count
$numberOfDeploymentsToRemove = $currentNumberOfDeployments - $numberOfDeploymentsToKeep

if ($numberOfDeploymentsToRemove -lt 0) {
    throw "Number of deployments to remove is < 0..."
}
if ($numberOfDeploymentsToRemove -eq 0) {
    Write-Host "Number of deployments to remove is 0..."
    return
}
Write-Host "Number of Deployments to remove: '$numberOfDeploymentsToRemove'..."

$deploymentsToDelete = $currentDeployments | Sort-Object -Property Timestamp | Select-Object -First $numberOfDeploymentsToRemove

$deploymentsToDelete | ForEach-Object { $i = 0; $j = 0; $deploymentsToDeleteBatched = @{} } {
    if ($i -ne $batchSize -and $deploymentsToDeleteBatched["Batch $j"]) {
        $deploymentsToDeleteBatched["Batch $j"] += $_
        $i += 1
    }
    else {
        $i = 1
        $j += 1
        $deploymentsToDeleteBatched["Batch $j"] = @($_)
    }
}
Write-Host "Created $($deploymentsToDeleteBatched.Count) batches..."

# ----------------------------------
# Execute deletion in parallel
# ----------------------------------
$jobNames = @()
foreach ($batchkey in $deploymentsToDeleteBatched.Keys) {
    $deploymentsToDeleteBatch = $deploymentsToDeleteBatched.$batchkey
    $logic = {
        Param(
            [object]
            [Parameter(Mandatory = $true)]
            $ctx,

            [object]
            [Parameter(Mandatory = $true)]
            $deploymentsToDeleteBatch,

            [string]
            [Parameter(Mandatory = $true)]
            $resourceGroupName
        )
        foreach ($deploymentToDelete in $deploymentsToDeleteBatch) {
            $deploymentName = $deploymentToDelete.DeploymentName
            Remove-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName -Name $deploymentName -DefaultProfile $ctx -ErrorAction Stop
            Write-Host "Deleted Deployment '$deploymentName' from '$($deploymentToDelete.Timestamp)'..."
        }
    }
    $jobName = ([System.Guid]::NewGuid()).Guid
    $jobNames += $jobName
    $jobObject = Start-Job $logic -Name $jobName -ArgumentList (Get-AzContext), $deploymentsToDeleteBatch, $resourceGroupName
}

while (Get-Job -State "Running") {
    Write-Host "---------------------------------------------------------------"
    Write-Host "Jobs still running..."
    Get-Job | Format-Table
    Write-Host "---------------------------------------------------------------"
    Start-Sleep -Seconds 10
}

Write-Host "Jobs completed, getting output..."
Write-Host "---------------------------------------------------------------"
foreach ($jobName in $jobNames) {
    Write-Host "Output of Job '$jobName'..."
    Receive-Job -Name $jobName
    Write-Host "---------------------------------------------------------------"
}
Write-Host "Done..."
Use the Azure PowerShell task. It takes care of authentication for you -- no need to call Login-AzureRmAccount.