We are looking to create a pipeline to update our multi-tenant Azure environment. We need to perform some actions during the update per tenant. To accomplish this, we would like to create a job per tenant, so we can process tenants in parallel. I want to use a runtime parameter to pass the tenants to update to my pipeline, as follows:
parameters:
- name: tenants
  type: object
The value of the tenants parameter might look something like this:
- Name: "customer1"
Someotherproperty: "some value"
- Name: "customer2"
Someotherproperty: "some other value"
To generate the jobs, we do something like this:
stages:
- stage:
  jobs:
  - job: Update_Tenant
    strategy:
      matrix:
        ${{ each tenant in parameters.tenants }}:
          ${{ tenant.Name }}:
            name: ${{ tenant.Name }}
            someproperty: ${{ tenant.Someotherproperty }}
      maxParallel: 2
    steps:
    - checkout: none
    - script: echo $(name).$(someproperty)
What we need now is some way to fill this tenants parameter. I have tried a few solutions:
Ideally, I would like to put a build stage before the Update_Tenants stage that calls a REST API to get the tenants and expands the tenants parameter when the Update_Tenants stage starts, but as far as I know this is not supported, since parameter expansion is done when the pipeline starts.
A less ideal but still workable option would have been to create a variable group YAML file containing the tenants, include this variable group in my pipeline, and reference them with the ${{ variables.Tenants }} syntax. However, for some reason, variables can only be strings.
The only solution I can currently think of is to create a pipeline that calls a REST API to get the tenants to update, and then uses the Azure DevOps API to queue the actual update pipeline with the correct parameter value. But this feels like a bit of a clunky workaround.
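For illustration, such a dispatcher step might look like the following sketch (the tenant endpoint and the pipeline id 42 are placeholders; the call uses the Runs REST API's templateParameters property):

steps:
- script: |
    # Hypothetical endpoint that returns the tenants as a JSON array
    tenants=$(curl -s https://example.com/api/tenants)

    # Queue the update pipeline (id 42 is a placeholder) and pass the tenants
    # as the value of the 'tenants' runtime parameter via templateParameters
    curl -s -X POST \
      -H "Authorization: Bearer $(System.AccessToken)" \
      -H "Content-Type: application/json" \
      -d "{\"templateParameters\": {\"tenants\": $tenants}}" \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/pipelines/42/runs?api-version=6.0-preview.1"
  displayName: Queue the tenant update pipeline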
Now my question is, are there any (better?) alternatives to accomplish what I want to do?
Maybe this can help. I was able to use an external source (a .txt file) to fill an array variable in Azure Pipelines.
Working example
# Create a variable
- bash: |
    # Build a comma-separated list from the lines of the file
    imageList=""
    for image in $(cat my_images.txt); do
      imageList+="${image},"
    done
    echo "##vso[task.setvariable variable=list_images]$imageList"
# Use the variable
# "$(list_images)" is replaced by the contents of the `list_images` variable by Azure Pipelines
# before handing the body of the script to the shell.
- bash: |
    echo my pipeline variable is $(list_images)
Sources (there is also an example for a matrix)
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-a-job-scoped-variable-from-a-script
Other sources
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script
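Along the same lines, a job-scoped output variable holding a JSON object can feed a matrix directly at runtime, which gets closer to the original goal of one job per tenant without hard-coding them. A sketch (the tenant names and the source of the JSON are placeholders):

jobs:
- job: GetTenants
  steps:
  - bash: |
      # Hypothetical: build the matrix JSON from a REST call or a file
      matrix='{"customer1":{"name":"customer1"},"customer2":{"name":"customer2"}}'
      echo "##vso[task.setvariable variable=tenantMatrix;isOutput=true]$matrix"
    name: mtrx

- job: Update_Tenant
  dependsOn: GetTenants
  strategy:
    matrix: $[ dependencies.GetTenants.outputs['mtrx.tenantMatrix'] ]
    maxParallel: 2
  steps:
  - checkout: none
  - script: echo updating $(name)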
To accomplish this, we would like to create a job per tenant, so we can process tenants in parallel.
Apart from the rolling deployment strategy, you can also check Strategies and Matrix.
You can try something like this unless you have to use Runtime parameters:
jobs:
- job: Update
  strategy:
    matrix:
      tenant1:
        Someotherproperty1: '1.1'
        Someotherproperty2: '1.2'
      tenant2:
        Someotherproperty1: '2.1'
        Someotherproperty2: '2.2'
      tenant3:
        Someotherproperty1: '3.1'
        Someotherproperty2: '3.2'
    maxParallel: 3
  steps:
  - checkout: none
  - script: echo $(Someotherproperty1).$(Someotherproperty2)
    displayName: 'Echo something'
Related
I am trying to use variables defined at the root level of a YAML pipeline in Azure DevOps inside templates via the template syntax, but it seems that the variables are not available inside the templates; when adding the steps directly to the pipeline, the exact same thing works perfectly.
So with a pipeline snippet like this
variables:
- name: test
  value: asdf

stages:
- stage:
  jobs:
  - job: test_job
    steps:
    - script: echo "${{ variables.test }}"
  - template: ./test.yaml
And a test.yaml like this
jobs:
- job: test
  steps:
  - script: echo "${{ variables.test }}"
The script inside the test_job job writes out asdf, while the job inside the template just resolves to echo "".
Since my understanding of pipeline templates is that they basically get inserted into the main pipeline, this seems like a bug. Any ideas on how to use the root variables in template syntax inside templates, or why this is not working? (Macro syntax is not an option, as I need the variable inside a templated condition like ${{ if eq(variables['test'], 'asdf') }}.)
For security reasons, we only allow you to pass information into templated code via explicit parameters.
https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops
This means the author of the pipeline using your template needs to commit changes where they explicitly pass the needed info into your template code.
There are some exceptions to this, where the variable is statically defined in the same file or at pipeline compile time, but generally speaking, it's probably better to use parameters for everything that does not involve system-defined read-only dynamic variables and custom-defined dynamic output variables.
This behavior is by design; check this thread in the Developer Community.
So you can either pass the variable as a parameter to the template, or define a centralized variables file to include in the template, like here.
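For illustration, passing the variable into the template as an explicit parameter could look like this sketch (based on the snippets above; the final conditional step shows the templated-condition use case):

# azure-pipelines.yml
variables:
- name: test
  value: asdf

stages:
- stage:
  jobs:
  - template: ./test.yaml
    parameters:
      test: ${{ variables.test }}

# test.yaml
parameters:
- name: test
  type: string

jobs:
- job: test
  steps:
  - script: echo "${{ parameters.test }}"
  - ${{ if eq(parameters.test, 'asdf') }}:
    - script: echo "only runs when test is asdf"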
I have an Azure DevOps pipeline which runs npm run based on some runtime parameters.
Is there a possibility to trigger for example 3 jobs in the same pipeline, each of the job with different runtime parameters?
Thank you
Is there a possibility to trigger for example 3 jobs in the same pipeline, each of the job with different runtime parameters?
Yes. You can define an object-type parameter and use the each expression to iterate over its values.
Here is an example:
parameters:
- name: tests
  type: object
  default: [test1, test2, test3]

jobs:
- ${{ each test in parameters.tests }}:
  - job:
    steps:
    - script: echo ${{ test }}
For more detailed info, you can refer to this doc: Loop through parameters
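If the generated jobs need distinct, readable names, the loop value can also be folded into the job name, assuming the parameter values contain only characters that are valid in job names:

jobs:
- ${{ each test in parameters.tests }}:
  - job: run_${{ test }}
    displayName: Run ${{ test }}
    steps:
    - script: echo ${{ test }}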
How do you create a multi-stage pipeline with stage/job names derived from a parameter, where the stages first run in parallel and eventually one stage waits for all previous stages?
Here's what I've tried so far:
A multi-stage pipeline runs several stages in parallel depending on a tool parameter, with dependsOn passed as a parameter. Running it in parallel for each tool, waiting on the previous stage for that tool, works smoothly.
Main template: all wait for all
- ${{ each tool in parameters.Tools }}:
  - template: ../stages/all-wait-for-all.yml
    parameters:
      Tool: ${{ tool }}
stages/all-wait-for-all.yml
parameters:
- name: Tool
  type: string

stages:
- stage: ALL_WAIT_${{ parameters.Tool }}
  dependsOn:
  - PREPARE_STAGE
  - OTHER_TEMPLATE_EXECUTED_FOR_ALL_TOOLS_${{ parameters.Tool }}
Now there should be one stage that runs only once, not per tool, and only after the individual tool stages are done. It can't be hardcoded, as there are various tools. So I hoped defining the individual wait stages in a prepare job would work out:
Main template: prepare-stage
- script: |
    toolJson=$(echo '${{ convertToJson(parameters.Tools) }}')
    tools=$(echo "$toolJson" | jq '.[]' | xargs)
    stage="ALL_WAIT"
    for tool in $tools; do
      stageName="${stage}_${tool}"
      stageWaitArray+=($stageName)
    done
    echo "##vso[task.setvariable variable=WAIT_ON_STAGES]${stageWaitArray}"
    echo "##vso[task.setvariable variable=WAIT_ON_STAGES;isOutput=true]${stageWaitArray}"
  displayName: "Define wait stages"
  name: WaitStage
stages/one-waits-for-all.yml
stages:
- stage: ONE_WAITS
  dependsOn:
  - $[ stageDependencies.PREPARE_STAGE.PREPARE_JOB.outputs['WaitStage.WAIT_ON_STAGES'] ]
whereupon the error below is shown:
Stage ONE_WAITS depends on unknown stage $[ stageDependencies.PREPARE_STAGE.PREPARE_JOB.outputs['WaitStage.WAIT_ON_STAGES'] ].
As I understand it, dependsOn cannot take dynamic $[ ] or macro $( ) expressions, which are evaluated at runtime. You can use template expressions ${{ }}, which are evaluated at queue time.
I guess I was overthinking the solution, as eventually it was pretty obvious.
The first template can be called in a loop from the main template, so it is executed once per tool. The second template is called once, waiting on the previous stages for all tools; the job/stage prefix is known and only the tool name used as a postfix was unknown, so the stage names can simply be built in a loop directly in dependsOn.
Here you go:
stages:
- stage: ONE_WAITS
  dependsOn:
  - PREPARE_STAGE
  - ${{ each tool in parameters.Tools }}:
    - OTHER_TEMPLATE_EXECUTED_FOR_ALL_TOOLS_${{ tool }}
    - ALL_WAIT_${{ tool }}
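For completeness, both snippets assume an object parameter along these lines (the default values are placeholders):

parameters:
- name: Tools
  type: object
  default:
  - toolA
  - toolB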
We've been migrating some of our manual deployment processes from Octopus to Azure DevOps YAML pipelines. One of the quality-of-life features we're sorely missing is being able to select the environment from a drop-down list / auto-complete field, as we could in Octopus.
Is there a way to achieve this? Currently, the only way I can think of is to have a repo with a .yaml template file that gets updated with a list of new environments as part of our provisioning process... which seems less than ideal.
If you are going to trigger the pipeline manually, you can make use of runtime parameters in the Azure DevOps pipeline.
For example, to make the environment name selectable from a list of choices, you can use the following snippet.
parameters:
- name: EnvName
  displayName: EnvName
  type: string
  default: A
  values:
  - A
  - B
  - C
  - D
  - E
  - F

trigger: none # trigger is explicitly set to none

jobs:
- job: build
  displayName: build
  steps:
  - script: echo building $(Build.BuildNumber) with ${{ parameters.EnvName }}
Documentation about runtime parameters is here.
The downside is that trigger: none means the pipeline can only be triggered manually; I'm not sure how this works with other trigger options.
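The selected value can then drive the rest of the pipeline via template expressions, for example by conditionally inserting steps. A sketch building on the snippet above (the extra step is illustrative):

jobs:
- job: build
  displayName: build
  steps:
  - script: echo building with ${{ parameters.EnvName }}
  - ${{ if eq(parameters.EnvName, 'A') }}:
    - script: echo extra step that only runs when environment A is selected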
I am trying to use if/else conditions in an Azure DevOps YAML pipeline together with variable groups, following the latest Azure DevOps YAML pipeline syntax.
The following is sample code for the if/else condition in my scenario. test is a variable inside the my-global variable group.
variables:
- group: my-global
- name: fileName
  ${{ if eq(variables['test'], 'true') }}:
    value: 'product.js'
  ${{ elseif eq(variables['test'], false) }}:
    value: 'productCost.js'

jobs:
- job:
  steps:
  - bash:
      echo test variable value $(fileName)
When the above code is executed, the echo statement shows no value for fileName, i.e. it is empty, meaning neither of the if/else conditions was taken. However, when I test the if/else condition with the following:
- name: fileName
  ${{ if eq('true', 'true') }}:
    value: 'product.js'
fileName did echo the correct value, i.e. product.js. So my conclusion is that I am not referencing the variables from the variable group correctly. Any suggestions would be helpful and appreciated.
Thanks!
Unfortunately there is no ternary operator in Azure DevOps Pipelines, and it seems unlikely there will be one, considering the state of https://github.com/microsoft/azure-pipelines-yaml/issues/256 and https://github.com/microsoft/azure-pipelines-yaml/issues/278. So for the time being the only choices are:
conditional insertion: it works with parameters, and should work with variables according to the documentation (but it is difficult to use properly),
or the hacks you can find in this Stack Overflow question.
Another work-around has been posted by Simon Alling on GitHub (https://github.com/microsoft/azure-pipelines-yaml/issues/256#issuecomment-1077684972):
format(
  replace(replace(condition, True, '{0}'), False, '{1}'),
  valueIfTrue,
  valueIfFalse
)
It is similar to the solution provided by Tejas Nagchandi, but I find it a little bit better because the syntax looks closer to what it would be if there was a ternary operator.
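Applied to the file-name scenario, the work-around could look like the following sketch with a compile-time parameter (the parameter name is illustrative; a variable-group variable would still not be visible at template-expansion time, which is the root issue in this question):

parameters:
- name: useProduct
  type: boolean
  default: true

variables:
- name: fileName
  # '{0}' is picked when the condition is True, '{1}' when it is False
  value: ${{ format(replace(replace(eq(parameters.useProduct, true), True, '{0}'), False, '{1}'), 'product.js', 'productCost.js') }}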
I was able to achieve the goal using a somewhat dirty work-around, but I agree that using parameters would be a much better way unless ternary operators become available for Azure DevOps YAML pipelines.
The issue is that ${{ if condition }}: is a compile-time expression, so the variables in a variable group are not available.
I was able to use runtime expressions $[ <expression> ] instead.
Reference: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops
My pipeline:
trigger:
- none

variables:
- group: Temp-group-for-testing
- name: fileName
  value: $[replace(replace('True', eq(variables['test'], 'True'), 'value1'), 'True', 'value2')]

stages:
- stage: test
  jobs:
  - job: testvar
    continueOnError: false
    steps:
    - bash: echo $(fileName)
      displayName: "echo variable"
Results are available on github
After detailed investigation I realized that if/else doesn't work with variables in Azure DevOps YAML pipelines; it only works with parameters. However, the solution posted by Tejas Nagchandi is a workaround that can accomplish the same if/else logic, setting the variable value with replace expressions. Hats off to TN.