Pass Variable Group as Dictionary To Python Script - azure-devops

I have a variable group that I'm using from a Python script. Something like this:
- task: PythonScript@0
  inputs:
    scriptSource: 'inline'
    script: |
      print('variableInVariableGroup: $(variableInVariableGroup)')
I'd like to write my script so that I can iterate over the variable group without explicitly referencing individual variables. Is there a way to feed in the entire variable group to the script as a dictionary or something similar?

You cannot do that directly. A workaround is to fetch the variables with the Azure CLI, set the result as an output variable in one task, and then read it in the Python script task.
Something like below:
# 'Allow scripts to access the OAuth token' was selected in pipeline. Add the following YAML to any steps requiring access:
#   env:
#     MY_ACCESS_TOKEN: $(System.AccessToken)
# Variable Group 'vargroup1' was defined in the Variables tab
resources:
  repositories:
  - repository: self
    type: git
    ref: refs/heads/testb2
jobs:
- job: Job_1
  displayName: Agent job 1
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
    persistCredentials: True
  - task: PowerShell@2
    name: TestRef
    displayName: PowerShell Script
    inputs:
      targetType: inline
      script: >-
        echo $(System.AccessToken) | az devops login

        $a=az pipelines variable-group variable list --org 'https://dev.azure.com/orgname/' --project testpro1 --group-id 3 --only-show-errors --output json

        echo "$a"

        echo "##vso[task.setvariable variable=allvars;isOutput=true]$a"
  - task: PythonScript@0
    displayName: Run a Python script
    inputs:
      scriptSource: inline
      script: "b=$(TestRef.allvars)\nprint(b)\n\n "
...
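If the goal is to iterate over the group like a dictionary, the JSON returned by the CLI can be parsed inside the Python task. A minimal sketch, assuming the TestRef.allvars output variable from the job above and the JSON shape that az pipelines variable-group variable list returns (one object per variable with a value field); mapping the value in through env (the name ALLVARS is arbitrary) avoids quoting problems with macro substitution:
- task: PythonScript@0
  displayName: Iterate over the variable group
  inputs:
    scriptSource: inline
    script: |
      import json
      import os
      # ALLVARS holds the JSON emitted by 'az pipelines variable-group variable list'
      group = json.loads(os.environ['ALLVARS'])
      for name, item in group.items():
          print(name, '=', item['value'])
  env:
    ALLVARS: $(TestRef.allvars)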

Related

Use variable from AWSCLI in azure pipelines for script

I have a build process where I need to use a token received through the AWS CLI. So far I have connected AWS to my Azure Pipelines, but I am having trouble setting up my YAML.
I want to fetch the relevant token and use it later as a variable in my script.
As you can see in my YAML, I am running a PowerShell script with CodeArtifact and saving the value to myOutputVar. The PowerShell script does not throw an error.
However, when the build script runs later, the variable is not present, resulting in ECHO is off.
How can I ensure the value received in the task can be used later in the script/build part?
trigger:
- azure-pipelines

pool:
  vmImage: windows-latest

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'
- task: AWSPowerShellModuleScript@1
  inputs:
    awsCredentials: 'AWS Connection'
    regionName: 'eu-central-1'
    scriptType: 'inline'
    inlineScript: '##vso[task.setvariable variable=myOutputVar;]aws codeartifact get-authorization-token --domain somedomain --domain-owner 444.... --query authorizationToken --output text; '
- script: |
    echo %myOutputVar%
    npm ci
    npm run build
  displayName: 'npm install and build'
The ##vso[task.setvariable] string is a logging command: it has to be written to the task's output (for example with Write-Host), not placed at the start of the script line, where PowerShell treats the whole line as a comment. Your inline script can be multiple lines, and since this is PowerShell you can do something like:
inlineScript: |
  $authToken = aws codeartifact get-authorization-token `
    --domain somedomain `
    --domain-owner 444.... `
    --query authorizationToken `
    --output text
  Write-Host "##vso[task.setvariable variable=myOutputVar;]$authToken"
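With the token captured that way, the later build step from the question should see the value; macro syntax is an alternative to the cmd-style %myOutputVar%, for example:
- script: |
    echo $(myOutputVar)
    npm ci
    npm run build
  displayName: 'npm install and build'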

Azure DevOps yaml: use a powershell task output parameter to generate a loop in dependent job

I have the following YAML used in an Azure DevOps pipeline (this is not the full pipeline; it is just a portion of YAML that lives in a template):
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false

# - ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
- job: NewJob
  dependsOn: CheckExcludedWorkspaces
  variables:
    testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
  pool:
    name: DefaultWindows
  steps:
  - powershell: |
      Write-Host "Test var = $(testVar)"
    displayName: Test workspaces output
This works correctly: the second job retrieves a variable from a PowerShell task in the previous job and outputs its value. The task in the second job prints a list of apps from the testVar variable. The output contains:
app1,app2,app3,app4 etc
I would like to take this to the next stage and create a loop of jobs that runs repeatedly over this application list. Something like:
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false

- ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
  - job: NewJob
    dependsOn: CheckExcludedWorkspaces
    variables:
      testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
    pool:
      name: DefaultWindows
    steps:
    - powershell: |
        Write-Host "Test var = ${{folder}}"
      displayName: Test workspaces output
This code gives me an error:
Unrecognized value: 'dependencies'. Located at position 1 within expression: dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList']
Is there a way I can use a PowerShell task output variable to create a list of jobs in a dependent job? The problem is that I don't know at design time what the list of applications will be (the pipeline should ideally find this out when it runs). The list of applications is based on the list of folders created within a repository, which changes over time.
In the current situation, you cannot use the 'each' keyword with variables. The 'each' keyword works on object-type parameters, which are resolved when the template is compiled, while a job output variable is a runtime string.
For more details, you can refer to the doc: Each keyword
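For comparison, a minimal sketch of what the each keyword does support, namely a compile-time object parameter (the parameter name workspaces and its default values here are hypothetical):
parameters:
- name: workspaces
  type: object
  default: [app1, app2, app3]

jobs:
- ${{ each folder in parameters.workspaces }}:
  - job: Deploy_${{ folder }}
    pool:
      name: DefaultWindows
    steps:
    - powershell: Write-Host "Workspace = ${{ folder }}"
      displayName: Test workspaces output
Because this expansion happens before any job runs, a list produced at runtime by the GetWorkspaces task cannot drive it; the list would have to be known when the pipeline is compiled (or the work moved into a single job that loops over the list in script).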

Create variables dynamically in azure pipeline

I'm trying to generate release notes in an Azure Pipelines stage and push the note to an Azure Service Bus.
How do I expose the variable in a Bash script and then consume it in a subsequent job in the same stage?
I'm using a Bash task to execute a git command and am trying to export the result as an environment variable, which I want to use in the following job.
- stage: PubtoAzureServiceBus
  variables:
    COMMIT_MSG: "alsdkfgjdsfgjfd"
  jobs:
  - job: gitlog
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          # Write your commands here
          export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
          env | grep C
  - job:
    pool: server
    dependsOn: gitlog
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'Slack Release Notifications'
        messageBody: |
          {
            "channel":"XXXXXXXXXX",
            "username":"bot",
            "iconEmoji":"",
            "text":":airhorn: release :airhorn: \n`$(COMMIT_MSG)`"
          }
        signPayload: false
        waitForCompletion: false
You need to use the logging command syntax and output variables, as shown here:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    variables:
      varFromA: $[ stageDependencies.A.A1.outputs['printvar.shouldrun'] ]
    steps:
    - script: echo $(varFromA) # this step uses the mapped-in variable
Please take a look here to check the documentation.
So you need to replace
export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
with a logging command that uses isOutput=true,
and then map it like this:
jobs:
- job: A
  steps:
  - bash: |
      echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
    name: ProduceVar # because we're going to depend on it, we need to name the step
- job: B
  dependsOn: A
  variables:
    # map the output variable from A into this job
    varFromA: $[ dependencies.A.outputs['ProduceVar.shouldrun'] ]
  steps:
  - script: echo $(varFromA) # this step uses the mapped-in variable
since you want to share the variable between jobs (not between stages, as shown in the first example).
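Applied to the pipeline from the question, a minimal sketch might look like the following (the step name exportGitLog and the job name publish are made up; note that a task.setvariable value must fit on a single line, so the multi-line git log format is reduced to a one-line summary here):
- stage: PubtoAzureServiceBus
  jobs:
  - job: gitlog
    steps:
    - task: Bash@3
      name: exportGitLog   # named so its output can be referenced
      inputs:
        targetType: 'inline'
        script: |
          MSG=$(git log -1 --pretty=format:"%aN: %s")
          echo "##vso[task.setvariable variable=COMMIT_MSG;isOutput=true]$MSG"
  - job: publish
    pool: server
    dependsOn: gitlog
    variables:
      COMMIT_MSG: $[ dependencies.gitlog.outputs['exportGitLog.COMMIT_MSG'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'Slack Release Notifications'
        messageBody: |
          {
            "text":":airhorn: release :airhorn: \n`$(COMMIT_MSG)`"
          }
        signPayload: false
        waitForCompletion: false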

Azure pipeline runtime-evaluated variable passed as a parameter

I have an AzurePowerShell script (powershell/gettenants.ps1) which sets the value of the tenants variable.
The following Bash task successfully echoes the new value, BUT the following template receives the default value (set at the top of the script). Note I'm using expression syntax when specifying the template parameter value.
Any ideas what I'm doing wrong?
variables:
  tenants: "default value"

- stage: Build_Shared_Update
  jobs:
  - job: Get_all_Tenants_Info
    pool:
      vmImage: 'windows-latest'
    steps:
    - checkout: self
      fetchDepth: 1
    - task: AzurePowerShell@4
      inputs:
        azureSubscription: 'Product Subscription(Guid)'
        targetType: 'filePath'
        scriptPath: powershell/gettenants.ps1
        errorActionPreference: 'stop'
        azurePowerShellVersion: 'latestVersion'
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: echo $(tenants)
    - template: pipeline-templates/shared-infrastructure-plan.yml # Template reference
      parameters:
        tenants: ${{variables.tenants}}
Inside the template the parameter is referenced like this:
-out=sharedplan -var=list_of_tenants=${{parameters.tenants}}
When using ${{ }} syntax, the variable is replaced at compile time.
Read more here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#understand-variable-syntax
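To illustrate the difference, a short sketch, assuming gettenants.ps1 updates the variable with a ##vso[task.setvariable] logging command earlier in the same job:
# Compile time: expanded when the YAML is processed, before gettenants.ps1 has run,
# so the template always receives "default value".
- template: pipeline-templates/shared-infrastructure-plan.yml
  parameters:
    tenants: ${{ variables.tenants }}

# Runtime: the literal text '$(tenants)' is passed into the template and is only
# macro-expanded when the step that uses ${{ parameters.tenants }} actually runs.
- template: pipeline-templates/shared-infrastructure-plan.yml
  parameters:
    tenants: $(tenants)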
The variables weren't in scope!
It's worth noting that "much to learn" also identified this :)

Azure Pipeline Task inputs won't accept variables

In the Azure Pipeline YAML files, the variable imgRepoName is trimmed from gitRepoName. A bash echo of gitRepoName shows core/cqb-api; a bash echo of imgRepoName shows cqb-api.
variables:
  vmImageName: 'ubuntu-18.04'
  gitRepoName: $(Build.Repository.Name)
  imgRepoName: $(basename $(gitRepoName))

- job: build_push_image
  pool:
    vmImage: $(vmImageName)
  steps:
  - task: Docker@2
    displayName: Build and Push image
    inputs:
      repository: imgRepoName
      command: buildAndPush
      containerRegistry: "coreContainerRegistry"
      tags: test2
Issues:
When I write repository: "cqb-api" as the input for the Docker task, it works just fine, while using the variable directly as shown above won't create any images in the container registry.
PS: I also tried repository: $(imgRepoName), which gives the following error:
invalid argument "***/$(basenamecore/cqb-api):test2" for "-t, --tag" flag: invalid reference format
It looks like the variable is expanded at runtime: gitRepoName is replaced, but the basename function is not executed in that context, so the Docker task receives the literal text. You can handle it like this:
variables:
  gitRepoName: $(Build.Repository.Name)

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $name = $(basename $(gitRepoName))
      Write-Host "##vso[task.setvariable variable=imgRepoName]$name"
- task: Docker@2
  displayName: Build and Push
  inputs:
    repository: $(imgRepoName)
    command: build
    Dockerfile: docker-multiple-apps/Dockerfile
    tags: |
      build-on-agent
It works for me.