How to do a Foreach loop or Each within Azure Pipeline

I am trying to run an Azure Pipeline that queries an Azure Storage table. I can do this with no problem by passing in $AppName from a normal pipeline variable, but I'd like to loop through multiple apps from an app list within the YAML file.
The yaml file I am using is below:
trigger:
- master

variables:
- name: AppNames
  value:
    [
      "7zip",
      "AdobeAcrobatReaderDC",
      "CitrixWorkspaceApp",
      "GoogleChrome",
      "LAPS",
      "Mimecast",
      "Nessus",
      "NotepadPlusPlus",
      "MicrosoftWvdRemoteDesktop",
    ]
- name: baseurl
  value: $(NexusProdRepo)
- name: genRepo
  value: $(ClientRepo)
- name: APIKey
  value: $(PRODAPIKey)

pool:
  name: $(PoolName)
  demands:
  - agentOS -equals $(agentOS)

stages:
- stage: Deployment
  jobs:
  - job: DeployApps
    steps:
    - script: echo "Deploying $(AppName)"
      env:
        AppName: ${{ each.value }}
      forEach: ${{ variables.AppNames }}

- stage: QueryAzureTableStorage_Stage
  dependsOn:
  - ConnectiontoAzure
  jobs:
  - job: QueryAzureTableStorage_Job
    steps:
    - task: PowerShell@2
      displayName: "Query Azure Table Storage"
      name: "Query_Azure_Table_Storage"
      inputs:
        targetType: filePath
        filePath: "$(Build.SourcesDirectory)/GetAndQueryStorageTable.ps1"
        arguments: "-StorageAccountName $(StorageAccountName) -ResourceGroupName $(ResourceGroupName) -TableName $(TableName) -AppName $(AppName)"
Is anyone able to point out where I'm going wrong with the foreach loop, or whether it's even possible?

loop through multiple apps from an app list within the yaml file.
Based on your requirement, I suggest you use parameters and the each expression in the YAML pipeline. Refer to this doc: Each keyword
Here is an example:
parameters:
- name: AppNames
  type: object
  default: [7zip,AdobeAcrobatReaderDC]

stages:
- ${{ each app in parameters.AppNames }}:
  - stage: Deployment_${{ app }}
    jobs:
    - job: DeployApps
      steps:
      - script: echo "Deploying ${{ app }}"
Result: the pipeline expands at compile time into one Deployment_<app> stage per entry in AppNames.
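To tie this back to the table-storage scenario in the question, the same each insertion can wrap the question's PowerShell task. A rough sketch reusing the script path and variable names from the question (untested):
parameters:
- name: AppNames
  type: object
  default: [7zip, AdobeAcrobatReaderDC]

stages:
- stage: QueryAzureTableStorage_Stage
  jobs:
  - job: QueryAzureTableStorage_Job
    steps:
    # one task is generated per app when the YAML is compiled
    - ${{ each app in parameters.AppNames }}:
      - task: PowerShell@2
        displayName: "Query Azure Table Storage for ${{ app }}"
        inputs:
          targetType: filePath
          filePath: "$(Build.SourcesDirectory)/GetAndQueryStorageTable.ps1"
          arguments: "-StorageAccountName $(StorageAccountName) -ResourceGroupName $(ResourceGroupName) -TableName $(TableName) -AppName ${{ app }}"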

Related

It doesn't work: pass a variable from one stage to another and iterate with an each loop

I need to do an each over a variable obtained earlier with stageDependencies.
It doesn't work to pass a variable from one stage to another and iterate with an each loop using a split function.
What am I doing wrong?
Code:
tasks.yml
steps:
- task: Bash@3
  name: env_string
  inputs:
    targetType: 'inline'
    script: |
      environmentsStr='dev,prd'
      echo "##vso[task.setvariable variable=environmentsString;isOutput=true]$environmentsStr"
pipeline
- stage: stage1
  jobs:
  - job: job_stage1
    steps:
    - template: tasks.yml

- stage: stage2
  dependsOn:
  - stage1
  variables:
  - name: envFromVar
    value: 'dev,prd'
  - name: envFromStageDependencies
    value: $[ stageDependencies.stage1.job_stage1.outputs['env_string.environmentsString'] ]
  jobs:
  - job: job_stage2
    steps:
    - template: stage3.yml
      parameters:
        envFromStageDependencies: $(envFromStageDependencies)
        environmentsFromVar: ${{ variables.envFromVar }}
stage3.yml
parameters:
- name: envFromStageDependencies
  type: string
- name: environmentsFromVar
  type: string

steps:
- ${{ each environment in split(parameters.envFromStageDependencies, ',') }}:
  - bash: |
      echo "env ${{ environment }}" # OUTPUT FAIL; (env dev,prd)
- ${{ each environment in split(parameters.environmentsFromVar, ',') }}:
  - bash: |
      echo "envVar ${{ environment }}" # OUTPUT OK, 2 iterations envVar [dev,prd]
OUTPUT
BASH -> env dev,prd
BASH -> envVar dev
BASH -> envVar prd
You should change your pipeline YAML as below:
. . .
jobs:
- job: job_stage2
  steps:
  - template: stage3.yml
    parameters:
      envFromStageDependencies: ${{ variables.envFromStageDependencies }}
      environmentsFromVar: ${{ variables.envFromVar }}
As @Daniel Mann has mentioned, the template is processed at compile time, which is before runtime. So, to pass the variable values into the template, you should use the compile-time variable syntax (${{ variables.varName }}) instead of the runtime variable syntax ($(varName)).
For more details, you can reference the document "Understand variable syntax".
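For a quick illustration of the difference (a minimal sketch; the inline variable name is only for illustration), only the compile-time syntax delivers a real value to the template parameter, while the macro syntax arrives as literal text and is only replaced later inside the task:
variables:
  environments: 'dev,prd'   # known at compile time because it is written in the YAML

steps:
- template: stage3.yml
  parameters:
    # template expression: expanded while the YAML is compiled, so the template
    # receives the string 'dev,prd' and split() in the template works
    envFromStageDependencies: ${{ variables.environments }}
    # macro syntax: at compile time the template only sees the text '$(environments)',
    # so split() finds no comma; the macro is replaced at runtime inside the task
    environmentsFromVar: $(environments)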

How to convert string variable to object and pass it to another pipeline?

I have two Azure DevOps pipelines: 'Starter' and 'Processing'. 'Starter' triggers 'Processing' and passes some parameters to it.
Starter:
trigger: none

pool:
  vmImage: 'windows-2019'

stages:
- stage: A
  jobs:
  - template: Processing.yml
    parameters:
      products: $(Products)
      creds: $(Creds)
Processing:
parameters:
- name: products
  type: object
  default: []
- name: creds
  default: ''

jobs:
- ${{ each product in parameters.products }}:
  - task: PowerShell@2
    displayName: Importing ${{ product }} solution
    inputs:
      targetType: 'inline'
      script: |
        #Code
The key detail here is the ability to loop through the 'products' variable (each product in parameters.products), which must be set in the 'Starter' pipeline's variables:
In other words, when starting my pipeline I must pass the list of products as a string and then loop through this list in the second pipeline. Is it generally possible? Maybe products should be another type? I tried a workaround but didn't get an appropriate solution:
- job: Prepare_Products_Array
  steps:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        # Write your PowerShell commands here.
        $productsArray = []
        $productsArray = $(Products)
        $productsArray = $productsArray.Split(',')
        Write-Host ("##vso[task.setvariable variable=productsArray;]$productsArray")
- template: Processing.yml
  parameters:
    products: $env:productsArray
Exception:
From your YAML sample, you are defining the variable in the YAML pipeline UI and using parameters in the YAML template.
In this case, the variables defined in the UI are assigned at runtime, but the parameters and expressions in the YAML template are expanded at compile time.
Therefore, YAML UI variables cannot be passed to the pipeline YAML template, and it will show the error:
Expected a...... Actual value $(Product)
This means that the pipeline variable is not expanded at compile time.
I am afraid there is no way to pass a UI pipeline variable to a YAML template.
Here are the workarounds:
Method 1: You can use parameters in the Starter YAML to pass the object-type parameters to the YAML template.
Starter:
trigger: none

parameters:
- name: products
  type: object
  default: []
- name: creds
  default: ''

pool:
  vmImage: 'windows-2019'

stages:
- stage: A
  jobs:
  - template: Processing.yml
    parameters:
      products: ${{ parameters.products }}
      creds: ${{ parameters.creds }}
Processing:
parameters:
- name: products
  type: object
  default: []
- name: creds
  default: ''

jobs:
- job: test
  steps:
  - ${{ each product in parameters.products }}:
    - task: PowerShell@2
      displayName: Importing ${{ product }} solution
      inputs:
        targetType: 'inline'
        script: |
          echo ${{ product }}
Result: You can input the value when you run the pipeline.
Method 2: Define the variable in the Starter pipeline and change the products parameter to the string type. Then you can use the expression - ${{ each product in split(parameters.products, ',') }}: to split the string.
Starter:
pool:
  vmImage: 'windows-2019'

variables:
  products: '1,2,3'
  creds: test

stages:
- stage: A
  jobs:
  - template: Processing.yml
    parameters:
      products: ${{ variables.products }}
      creds: ${{ variables.creds }}
Processing:
parameters:
- name: products
  type: string
  default: ''
- name: creds
  default: ''

jobs:
- job: test
  steps:
  - ${{ each product in split(parameters.products, ',') }}:
    - task: PowerShell@2
      displayName: Importing ${{ product }} solution
      inputs:
        targetType: 'inline'
        script: |
          echo ${{ product }}

Azure Pipeline: Unexpected value 'steps'

Can anybody point me in the right direction here?
When I attempt to kick off the pipeline from the main YAML template, the referenced template gives me the unexpected 'steps' value error.
I attempted to add a stage before the defined job and then received the unexpected 'stage' message. Looking at some similar previously asked questions, I understand a step can't be placed directly under a 'stage', but that is not the case here.
parameters:
- name: baseEnvironmentName
  type: string
- name: environmentSuffix
  type: string
- name: postDeploySteps
  type: stepList
  default: []
- name: publishedPackageName
  type: string
- name: serviceName
  type: string
- name: dnsHostName
  type: string

jobs:
- deployment: ${{ parameters.environmentSuffix }}Deployment
  displayName: '${{ parameters.environmentSuffix }} Deployment'
  pool:
    vmImage: 'windows-2019'
  environment: ${{ parameters.baseEnvironmentName }}-${{ parameters.environmentSuffix }}
  variables:
  - template: variables/variables.common.yaml
  - template: variables/variables.${{ parameters.environmentSuffix }}.yaml
  - name: MonitorExceptionAlert_Name
    value: '${{ variables.IdPrefix }}-${{ parameters.environmentSuffix }}-${{ parameters.dnsHostName }}-ExceptionAlert'
  steps:
  - checkout: self
  - checkout: common_iac
  - task: Random Task Name
    displayName: 'Create Environment Resource Group'
    inputs:
      ConnectedServiceName: '$(ADOS.ServiceConnectionName)'
      resourceGroupName: '$(ResourceGroup.Name)'
      location: '$(ResourceGroup.Location)'
    condition: and(succeeded(), ne(variables['CreateResourceGroupInTaskGroup'], 'false'))
  - task: Random Task Name 2
    displayName: 'Create Application Insights'
    inputs:
      ConnectedServiceName: '$(ADOS.ServiceConnectionName)'
      ResourceGroupName: '$(ResourceGroup.Name)'
      Location: '$(ResourceGroup.Location)'
      instanceName: '$(ApplicationInsights.Name)'
You are using a deployment job, which needs a strategy. Try replacing the inline steps: under the deployment with:
- deployment: ....
  strategy:
    runOnce:
      deploy:
        steps:
From the example at the Azure docs templates page, jobs: should contain a - job: which contains steps:. I don't know that jobs: can directly contain steps:.
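Applied to the template above, the deployment job would look roughly like the sketch below (the variables block and the remaining tasks are elided for brevity):
jobs:
- deployment: ${{ parameters.environmentSuffix }}Deployment
  displayName: '${{ parameters.environmentSuffix }} Deployment'
  pool:
    vmImage: 'windows-2019'
  environment: ${{ parameters.baseEnvironmentName }}-${{ parameters.environmentSuffix }}
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
        - checkout: common_iac
        # ...remaining tasks move here unchanged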

AzureDevops set variables based on parameter and pass to template at next stage

In Azure DevOps, I'm trying to create a pipeline which offers a simple selection of pre-set options to the user running it. These options will be converted into different combinations of parameters as specified by a templated stage (the definition of which I have no control over). The idea of my pipeline is that frequently used build configurations are easy to select correctly, rather than having to manually set 3 or 4 different parameters.
I need the "Build.Setup" job from immutable_pipeline to print config_one, profile_one when the first radio button is selected (buildType=type1), config_two, profile_two when buildType=type2, and so on.
Unfortunately I'm really struggling to get any variable value into the templated stage other than the defaults. Are ADO variables even mutable at all, or just constants?
I've read the MS docs extensively and understand the meanings of the different macro declaration types. I've tried many different combinations of the syntaxes ${{...}}, $(...) and $[...]; all behave differently but none seems able to deliver what's needed. Is this even possible? Is there a simple solution someone can suggest?
Pipeline:
name: $(Date:yyyyMMdd).$(Rev:r)

parameters:
- name: buildType
  displayName: 'Type of build'
  type: string
  default: 'type3'
  values: ['type1', 'type2', 'type3']

pool:
  name: default

variables:
- name: config
  value: 'defaultConfig'
- name: profile
  value: 'defaultProfile'

stages:
- stage: Stage1
  displayName: Prepare build config
  jobs:
  - job: Job1_1
    steps:
    - checkout: none
    - task: Bash@3
      name: SetVariables
      inputs:
        targetType: inline
        script: |
          p1='${{ parameters.buildType }}'
          v1='$(config)'
          v2='$(profile)'
          echo -e "BEFORE: p1='${p1}'\n v1='${v1}'\n v2='${v2}'"
          case ${p1} in
            type1)
              v1='config_one'
              v2='profile_one'
              ;;
            type2)
              v1='config_two'
              v2='profile_two'
              ;;
            type3)
              v1='config_three'
              v2='profile_three'
              ;;
          esac
          echo -e "AFTER: p1='${p1}'\n v1='${v1}'\n v2='${v2}'"
          echo "##vso[task.setvariable variable=config]${v1}"
          echo "##vso[task.setvariable variable=profile;isOutput=True]${v2}"
  - job: Job1_2
    dependsOn: Job1_1
    variables:
    - name: variable1
      value: $(config)
    - name: variable2
      value: $[ dependencies.Job1_1.outputs['SetVariables.profile'] ]
    steps:
    - task: Bash@3
      name: GetVariables2
      inputs:
        targetType: inline
        script: |
          echo -e 'SAME STAGE: v1="$(variable1)"\n v2="$(variable2)"'

# Next stage - use computed values for "config" and "profile"
- template: templates/immutable_pipeline.yml
  parameters:
    config: $(config)
    profile: ${{ variables.profile }}
templates/immutable_pipeline.yml:
Note that I don't have access to change this; I can't make it dependsOn: Stage1.Job1_1.
parameters:
- name: config
  displayName: 'Config'
  type: string
  default: 'unset'
- name: profile
  displayName: 'Profile'
  type: string
  default: 'unset'

stages:
- stage: Build
  displayName: Templated build
  jobs:
  - job: Setup
    pool:
      name: default
      demands:
      - Agent.OS -equals Linux
    steps:
    - checkout: none
    - script: |
        echo '##[info] parameters.config=${{ parameters.config }}'
        echo '##[info] parameters.profile=${{ parameters.profile }}'
I just found one solution (which is arguably simpler than using variables) using the ${{ if eq(...) }}: syntax:
name: $(Date:yyyyMMdd).$(Rev:r)

parameters:
- name: buildType
  displayName: 'Type of build'
  type: string
  default: 'type3'
  values: ['type1', 'type2', 'type3']

pool:
  name: default

stages:
- template: templates/immutable_pipeline.yml
  ${{ if eq(parameters.buildType, 'type1') }}:
    parameters:
      config: config_one
      profile: profile_one
  ${{ if eq(parameters.buildType, 'type2') }}:
    parameters:
      config: config_two
      profile: profile_two
  ${{ if eq(parameters.buildType, 'type3') }}:
    parameters:
      config: config_three
      profile: profile_three
Still interested in whether the original approach of setting variables is even possible, if only because I've spent so much time on it.

Execute Azure Devops job on a pool based on conditional parameter

I am trying to execute an Azure DevOps job on a specific pool based on a condition.
The goal is to switch between a self-hosted agent and a Microsoft-hosted agent.
Here is the configuration:
parameters:
  custom_agent: true

jobs:
- job: Test
  displayName: Test job
  - ${{ if eq(parameters.custom_agent, true) }}:
    - pool:
        name: mypool
        demands:
        - agent.os -equals Linux
  - ${{ if eq(parameters.custom_agent, false) }}:
    - pool:
        vmImage: 'ubuntu-latest'
  steps:
  - task: npmAuthenticate@0
Any ideas?
The example below solved my requirement:
parameters:
- name: 'vmImage'
  type: string
  default: 'ubuntu-latest'
- name: 'agentPool'
  type: string
  default: ''

jobs:
- job: 'Example'
  pool:
    ${{ if ne(parameters.agentPool, '') }}:
      name: ${{ parameters.agentPool }}
    ${{ if eq(parameters.agentPool, '') }}:
      vmImage: ${{ parameters.vmImage }}
  steps:
  - script: example
Another approach to conditionally select pools, if you use non-VM pools:
variables:
- ${{ if eq(parameters.custom_agent, true) }}:
  - name: testJobPool
    value: mypool
- ${{ if eq(parameters.custom_agent, false) }}:
  - name: testJobPool
    value: mypool_second

jobs:
- job: Test
  displayName: Test job
  pool:
    name: $(testJobPool)
  steps:
  - task: npmAuthenticate@0
This has proved to work.
We can specify conditions under which a step, job, or stage will run. We can configure the jobs in the pipeline with different condition entries, and set demands based on those conditions.
A skeleton version looks like this:
parameters:
- name: custom_agent
  displayName: Pool Image
  type: boolean
  default: True

jobs:
- job: selfhostedagent
  condition: eq(${{ parameters.custom_agent }}, True)
  displayName: 'self_hosted agent'
  pool:
    name: Default
    demands:
    - Agent.Name -equals WS-VITOL-01
  steps:
  - script: echo self_hosted agent

- job: hostedagent
  condition: eq(${{ parameters.custom_agent }}, False)
  displayName: 'hosted agent'
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - script: echo hosted agent
Update 1
In addition, we can put the steps into a step template and then reference that template from each job's steps, as sketched below.
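A minimal sketch of that idea (the template file name and its step contents are hypothetical):
steps/agent-info.yml
steps:
- script: echo "Running on $(Agent.Name)"
azure-pipelines.yml (relevant part)
jobs:
- job: selfhostedagent
  condition: eq(${{ parameters.custom_agent }}, True)
  pool:
    name: Default
  steps:
  - template: steps/agent-info.yml
- job: hostedagent
  condition: eq(${{ parameters.custom_agent }}, False)
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: steps/agent-info.yml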
It looks like pool is not a valid property of a job type
Try switching your job to a deployment type:
jobs:
- deployment: Test
  ${{ if eq(parameters.custom_agent, true) }}:
    pool:
      name: mypool
      demands:
      - agent.os -equals Linux
  strategy:
    runOnce:
      deploy:
        steps: