I would like to trigger a job template with an object as a parameter.
Unfortunately, even based on the examples, I couldn't find a way to do that.
I would appreciate it if someone could guide me on how to achieve this.
What I want to achieve is to replace the ["DEPLOY", "CONFIG"] part with a dynamic variable:
- template: job-template.yaml
  parameters:
    jobs: ["DEPLOY", "CONFIG"]
This is not possible. YAML is very limited here and you may read more about this here.
YAML variables have always been string-to-string mappings.
So, for instance, you can define parameters as a complex type.
Template file
parameters:
- name: 'instances'
  type: object
  default: {}
- name: 'server'
  type: string
  default: ''
steps:
- ${{ each instance in parameters.instances }}:
  - script: echo ${{ parameters.server }}:${{ instance }}
Main file
steps:
- template: template.yaml
  parameters:
    instances:
    - test1
    - test2
    server: someServer
But you are not able to do it dynamically/programmatically, as every output you create will end up as a simple string.
What you can do is pass the value as a string and then split that string using PowerShell. But it all depends on what you want to run afterwards, because you won't be able to simply iterate over a YAML structure that way. All you can do is loop over the values in PowerShell and do something with them, but that may not be enough for you.
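A rough sketch of that workaround, assuming a comma-separated string parameter (the jobList name and the splitting logic are illustrative, not part of the original answer):

Main file:
- template: job-template.yaml
  parameters:
    jobList: 'DEPLOY,CONFIG'

job-template.yaml:
parameters:
- name: jobList
  type: string
  default: ''
jobs:
- job: HandleItems
  steps:
  - pwsh: |
      # The template expression expands at compile time; the split and loop run at runtime
      foreach ($item in '${{ parameters.jobList }}'.Split(',')) {
        Write-Host "Handling $item"
      }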
It's possible with some logic, see below:
- template: job-template.yaml
  parameters:
    param: ["DEPLOY", "CONFIG"]
and in the job-template.yaml file you can define the following, so every job name will be different:
parameters:
  param: []
jobs:
- ${{ each jobName in parameters.param }}:
  - job: ${{ jobName }}
    steps:
    - task: Downl......
I have the following pipeline variables:
---
variables:
  models:
  - name: model1
    image_name: image1
  - name: model2
    image_name: image2
However, this is not allowed and fails with "A sequence was not expected." It seems that pipeline variables can only be single-line strings. Is there a clever way to work around this? I've read about convertToJson, but it's not desirable/readable to write the models variable as a single-line JSON string.
For context, I'm passing this variable as a parameter to a template, where I will be looping over parameters.models and running a stage for each model, like this:
- ${{ each model in parameters.models }}:
  - stage: ${{ model.name }}
Apparently there is a very old unresolved issue that could provide a solution... What would be the best workaround?
Azure DevOps YAML pipelines don't support such a YAML structure:
variables:
  models:
  - name: model1
    image_name: image1
  - name: model2
    image_name: image2
Variables in the DevOps pipeline concept are strings only.
And in this place:
- ${{ each model in parameters.models }}:
  - stage: ${{ model.name }}
here are two usages of the DevOps YAML syntax, called conditional insertion and template expression.
Both of them need a valid structure in DevOps YAML. Variables in the DevOps YAML concept don't support such a structure, so these usages are not possible here.
A possible way is to make your YAML pipeline like the one below:
trigger:
- none

pool:
  vmImage: ubuntu-latest

parameters:
- name: models
  type: object
  default:
  - name: model1
    image_name: image1
  - name: model2
    image_name: image2

stages:
- ${{ each model in parameters.models }}:
  - ${{ each modelcontent in model }}:
    - ${{ if eq(modelcontent.Key, 'name') }}:
      - stage: '${{ modelcontent.Value }}'
        jobs:
        - job:
          steps:
          - script: echo Hello, world!
            displayName: 'Run a one-line script'
Parameters can carry a YAML object, which can then be used in compile-time expressions; that's why I use parameters here.
If you want to handle the parameters automatically, the only way is to use another script/code/app to parse the YAML content fetched from the repository and then change it. After you finish the change, follow this to push the changed YAML back:
Git push back changes to repo
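A rough sketch of that idea as pipeline steps (the file name, the placeholder replacement, and the branch handling are all assumptions, and the build service account needs Contribute permission on the repository):

steps:
- checkout: self
  persistCredentials: true
- pwsh: |
    # Illustrative edit: replace a placeholder in the pipeline YAML
    (Get-Content azure-pipelines.yml) -replace 'REPLACE_ME', 'new-value' | Set-Content azure-pipelines.yml
    git config user.email "build@example.com"
    git config user.name "Build Service"
    git add azure-pipelines.yml
    git commit -m "Update pipeline YAML from build"
    # persistCredentials keeps the checkout token, so the push can reuse it
    git push origin HEAD:$(Build.SourceBranchName)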
John Folberth almost does what you want, except he isn't passing it as a parameter to the YAML but to underlying templates:
https://blog.johnfolberth.com/advanced-azure-devops-yaml-objects/
ConvertToJson is the other alternative; André van der Goes explains it in this blog post:
https://www.automagical.eu/posts/passing-complex-variables-devops-yaml/
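As a rough sketch of that alternative (not taken from the blog post; the models object and the PowerShell parsing are illustrative), the object parameter can be serialized with convertToJson at compile time and parsed back at runtime:

parameters:
- name: models
  type: object
  default:
  - name: model1
    image_name: image1
  - name: model2
    image_name: image2

steps:
- pwsh: |
    # ${{ }} expands at compile time to the JSON form of the object
    $models = '${{ convertToJson(parameters.models) }}' | ConvertFrom-Json
    foreach ($model in $models) {
      Write-Host "$($model.name) -> $($model.image_name)"
    }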
I have one variable group in the ADO library which stores different paths and some other variables.
In my main "master" pipeline I use it as below:
variables:
- group: myGroupName
- name: nameOfMyVariable (from the variable group) or JustAnyName
- value: $[variables.nameOfMyVariable] or $[variables.JustAnyName]
Then, in a job in the first stage (for testing, there is only one stage and one job for now), I'm trying to use a template YAML:
jobs:
- template: my-template.yaml
  parameters:
    path: $(nameOfMyVariable) or $(JustAnyName)
Then in my-template.yaml I have this code:
parameters:
- name: path
  type: string
  default: ''
jobs:
- job: BuildSomething
  steps:
  - task: CopyFiles@2
    inputs:
      Contents: |
        ${{ parameters.path }}
      TargetFolder: '$(Build.ArtifactStagingDirectory)'
....
The rest is not that important, as it just can't find the files to copy, and when I try to print parameters.path with echo I get this error:
syntax error: invalid arithmetic operator (error token is ".nameOfMyVariable")
I do not know how to fix it so I can access variables from the variable group in some of my templates. Do I need to use ##vso[task.setvariable] or something else?
If you want to use a variable from a variable group, it is enough to just include the group:
variables:
- group: myGroupName
And then use the variable by its name: $(nameOfMyVariable).
In your example it seems you unnecessarily try to declare this variable again in the first YAML example.
That example is additionally incorrect because you are adding a dash directly before the 'value' keyword, which may cause undefined behaviour.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#specify-variables
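Put together, a minimal corrected sketch of the question's first two snippets (keeping the group, variable, and template names from the question) could look like this:

variables:
- group: myGroupName

jobs:
- template: my-template.yaml
  parameters:
    path: $(nameOfMyVariable)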
Just like Kontekst says, when it comes to variable groups, you don't need to declare the variable names and values in your YAML. Once you declare the variable groups in your YAML, you can use the variables from the groups directly.
And if you are using parameters with templates, for your scenario I suppose you could declare the parameters in your main YAML and pass the parameter value into your template (you don't have to use the variable groups).
My main YAML is as below:
trigger:
- none

pool:
  vmImage: ubuntu-latest

parameters:
- name: pathmain
  displayName: Source Path
  type: string
  default: README.md
  values:
  - azure-pipelines.yml
  - README.md

jobs:
- job:
  steps:
  - script: echo ${{ parameters.pathmain }}
- template: test-template.yml
  parameters:
    path: ${{ parameters.pathmain }}
My test-template is as below:
parameters:
- name: path
  displayName: Source Path
  type: string

jobs:
- job: BuildSomething
  steps:
  - task: CopyFiles@2
    inputs:
      Contents: |
        ${{ parameters.path }}
      TargetFolder: '$(Build.ArtifactStagingDirectory)'
I have defined a variable in the pipeline and set it to false/true. However, when the pipeline runs, I can see that the value of parameters.RunUnitTest is the literal string $(RunUnitTest), which is not the value I set up in the pipeline. So what am I doing wrong here?
trigger: none

extends:
  template: ThunderPipeline.yaml
  parameters:
    MergeBetweenBranches: true
    FromBranch: 'master'
    ToBranch: 'R_Current_Sprint'
    RunUnitTest: '$(RunUnitTest)'
Variables can be defined in one YAML and included in another template. This could be useful if you want to store all of your variables in one file. If you are using a template to include variables in a pipeline, the included template can only be used to define variables. You can use steps and more complex logic when you are extending from a template. Use parameters instead of variables when you want to restrict type.
In this example, the variable favoriteVeggie is included in azure-pipelines.yml.
# File: vars.yml
variables:
  favoriteVeggie: 'brussels sprouts'

# File: azure-pipelines.yml
variables:
- template: vars.yml # Template reference

steps:
- script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.
Or something like:
# File: templates/steps-with-params.yml
parameters:
- name: 'runExtendedTests' # defaults for any parameters that aren't specified
  type: boolean
  default: false

steps:
- script: npm test
- ${{ if eq(parameters.runExtendedTests, true) }}:
  - script: npm test --extended

# File: azure-pipelines.yml
steps:
- script: npm install
- template: templates/steps-with-params.yml # Template reference
  parameters:
    runExtendedTests: 'true'
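Applied to the RunUnitTest question above, one possible sketch is to declare a typed runtime parameter in the extending pipeline instead of passing the macro string (this assumes ThunderPipeline.yaml accepts a boolean RunUnitTest parameter):

trigger: none

parameters:
- name: RunUnitTest
  type: boolean
  default: true

extends:
  template: ThunderPipeline.yaml
  parameters:
    MergeBetweenBranches: true
    FromBranch: 'master'
    ToBranch: 'R_Current_Sprint'
    RunUnitTest: ${{ parameters.RunUnitTest }}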
I would like to use a common pipeline definition for our solutions. Using variables, I would like to specify solution-specific settings. This works, except for a variable group.
I would like to use the pipeline definition name in my variable group reference.
For example:
group: $(Build.DefinitionName).Dev
But that does not work. Another option would be to use a pipeline variable, but that does not work either:
group: $(buildDefinitonName).Dev
group: {{ variables.buildDefinitonName }}.Dev
What does work is a parameter, but I do not want to specify it for each run.
group: ${{ parameters.buildDefinition }}.Dev
One option is to use your deployment in a template and scope the variable group to a job in that template. The parameter passed into the template would be the environment.
Here is how the template could look:
jobs:
- deployment: Deploy_JobName
  variables:
  - group: 'ProjectName${{ parameters.stage }}'
The parameter in the template would look like:
parameters:
- name: stage
  type: string
This template would be called from a job like this:
jobs:
- template: template.yml
  parameters:
    stage: ${{ parameters.stage }}
Thanks for your response. I found out that
group: ${{ variables['Build.Definition'] }}.Dev
also works. So you can use predefined variables, but not pipeline variables this way.
I have a scenario where I need to have both:
runtime parameters, so that the pipeline can be triggered manually from the UI, where users triggering it can choose from a predefined set of options (defined in YAML)
variables, so that the pipeline can be invoked via REST APIs
Regarding runtime parameters, I was able to create the following sample pipeline:
parameters:
- name: image
  displayName: Pool Image
  type: string
  default: ubuntu-latest
  values:
  - windows-latest
  - ubuntu-latest

trigger: none

stages:
- stage: A
  jobs:
  - job: A
    steps:
    - pwsh: |
        echo "This should be triggering against image: $env:MY_IMAGE_NAME"
      env:
        MY_IMAGE_NAME: ${{ parameters.image }}
When I run it, I can see the dropdown list where I can choose the image name and it is reflected in the output message of the PowerShell script.
Regarding variables, I have defined one called "image" here (notice the value is empty):
The idea now is to invoke the pipeline from REST APIs and have the image name replaced by the value coming from the variable:
{
  "definition": {
    "id": 1
  },
  "sourceBranch": "master",
  "parameters": "{\"image\": \"windows-latest\" }"
}
In order to make the step print the value I'm passing here, I need to correct the environment variable in some way. I thought it would be sufficient to write something like:
env:
  MY_IMAGE_NAME: ${{ coalesce(variables.image, parameters.image) }}
That's because I want to give priority to the variable, then to the parameter, so that in case neither is specified, I always have a default value the pipeline can use.
However, this approach doesn't work, probably because we're dealing with different expansion times for variables, but I don't really know what I should be writing instead (if there is a viable option, of course).
What I also tried is:
env:
  MY_IMAGE_NAME: ${{ coalesce($(image), parameters.image) }}
  MY_IMAGE_NAME: ${{ coalesce('$(image)', parameters.image) }}
  MY_IMAGE_NAME: $[ coalesce(variables.image, parameters.image) ]
  MY_IMAGE_NAME: $[ coalesce($(image), parameters.image) ]
None of those are working, so I suspect this may not be feasible at all.
There is a workaround that I'm currently thinking of, which is to create two different pipelines so that those can be invoked independently, but while this is quite easy for me to accomplish, given I'm using a lot of templates, I don't find it the right way to proceed, so I'm open to any suggestion.
I tested and found you might need to define a variable and assign the parameter's value to it (e.g. Mimage: ${{ parameters.image }}), then define another variable (e.g. Vimage) and assign $[coalesce(variables.image, variables.Mimage)] to it. Then refer to $(Vimage) in the env field of the PowerShell task. Please check out the YAML below.
parameters:
- name: image
  displayName: Pool Image
  type: string
  default: ubuntu-latest
  values:
  - windows-latest
  - ubuntu-latest

trigger: none

stages:
- stage: A
  jobs:
  - job: A
    variables:
      Mimage: ${{ parameters.image }}
      Vimage: $[coalesce(variables.image, variables.Mimage)]
    steps:
    - pwsh: |
        echo "This should be triggering against image: $env:MY_IMAGE_NAME"
      env:
        MY_IMAGE_NAME: $(Vimage)
The env field of the PowerShell task is usually for mapping secret variables. You can directly refer to $(Vimage) in the PowerShell script: echo "This should be triggering against image: $(Vimage)".
Note: To queue a build via the REST API with provided parameters, you need to check "Let users override this value when running this pipeline" to make the variable settable at queue time.
Update:
You can try passing the variables to the parameters of the template to make the template's parameters dynamic. Please check the simple YAML below.
jobs:
- template: template.yaml
  parameters:
    MTimage: ${{ parameters.image }}
    VTimage: $(Vimage)
template.yaml:
parameters:
  MTimage:
  VTimage:

jobs:
- job: buildjob
  steps:
  - powershell: |
      echo "${{ parameters.VTimage }}"
      echo "${{ parameters.MTimage }}"