How can I invoke a YAML pipeline that has both variables and runtime parameters? - azure-devops

I have a scenario where I need to have both:
runtime parameters, so that the pipeline can be triggered manually from the UI, where users triggering it can choose from a predefined set of options (defined in YAML)
variables, so that the pipeline can be invoked via REST APIs
Regarding runtime parameters, I was able to create the following sample pipeline:
parameters:
- name: image
  displayName: Pool Image
  type: string
  default: ubuntu-latest
  values:
  - windows-latest
  - ubuntu-latest

trigger: none

stages:
- stage: A
  jobs:
  - job: A
    steps:
    - pwsh: |
echo "This should be triggering against image: $MY_IMAGE_NAME"
      env:
        MY_IMAGE_NAME: ${{ parameters.image }}
When I run it, I can see the dropdown list where I can choose the image name and it is reflected in the output message of the PowerShell script.
Regarding variables, I have defined one called "image" here (notice the value is empty):
The idea now is to invoke the pipeline from REST APIs and have the image name replaced by the value coming from the variable:
{
  "definition": {
    "id": 1
  },
  "sourceBranch": "master",
  "parameters": "{\"image\": \"windows-latest\" }"
}
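For reference, that body is posted to the Builds - Queue endpoint. Below is a minimal sketch of the call from a pwsh step; the {organization}/{project} placeholders are illustrative, and from outside a pipeline you would authenticate with a PAT (Basic auth) rather than the pipeline's $(System.AccessToken).
steps:
- pwsh: |
    # Sketch only: queue the build with the JSON body shown above.
    # Replace {organization}/{project}; the definition id and branch come
    # from the example request body.
    $body = @{
      definition   = @{ id = 1 }
      sourceBranch = "master"
      parameters   = '{"image": "windows-latest"}'
    } | ConvertTo-Json -Depth 5

    Invoke-RestMethod `
      -Method Post `
      -Uri "https://dev.azure.com/{organization}/{project}/_apis/build/builds?api-version=6.0" `
      -Headers @{ Authorization = "Bearer $(System.AccessToken)" } `
      -ContentType "application/json" `
      -Body $body
  displayName: Queue a build via REST (sketch)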
In order to make the step print the value I'm passing here, I need to correct the environment variable in some way. I thought it would be sufficient to write something like:
env:
  MY_IMAGE_NAME: ${{ coalesce(variables.image, parameters.image) }}
That's because I want to give the priority to the variables, then to parameters, so that in case none is specified, I always have a default value the pipeline can use.
However, this approach doesn't work, probably because we're dealing with different expansion times for variables, but I don't really know what I should be writing instead (if there is a viable option, of course).
What I also tried is:
env:
  MY_IMAGE_NAME: ${{ coalesce($(image), parameters.image) }}
  MY_IMAGE_NAME: ${{ coalesce('$(image)', parameters.image) }}
  MY_IMAGE_NAME: $[ coalesce(variables.image, parameters.image) ]
  MY_IMAGE_NAME: $[ coalesce($(image), parameters.image) ]
None of those are working, so I suspect this may not be feasible at all.
There is a workaround I'm currently considering: creating two different pipelines so they can be invoked independently. That would be quite easy to accomplish, given I'm using a lot of templates, but it doesn't feel like the right way to proceed, so I'm open to any suggestion.

I tested this and found you need to define a variable and assign the parameter's value to it (e.g. Mimage: ${{parameters.image}}), then define another variable (e.g. Vimage) and assign $[coalesce(variables.image, variables.Mimage)] to it. You can then refer to $(Vimage) in the env field of the PowerShell task. Please check out the YAML below.
parameters:
- name: image
  displayName: Pool Image
  type: string
  default: ubuntu-latest
  values:
  - windows-latest
  - ubuntu-latest

trigger: none

stages:
- stage: A
  jobs:
  - job: A
    variables:
      Mimage: ${{ parameters.image }}
      Vimage: $[coalesce(variables.image, variables.Mimage)]
    steps:
    - pwsh: |
        echo "This should be triggering against image: $env:MY_IMAGE_NAME"
      env:
        MY_IMAGE_NAME: $(Vimage)
The env field of the PowerShell task is usually for mapping secret variables. You can also refer to $(Vimage) directly in the PowerShell script: echo "This should be triggering against image: $(Vimage)".
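For illustration, the step from the pipeline above could then drop the env mapping entirely:
steps:
- pwsh: |
    # Macro syntax is expanded before the script runs, so no env mapping is needed.
    echo "This should be triggering against image: $(Vimage)"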
Note: To queue a build via the REST API with provided parameters, you need to check Let users override this value when running this pipeline so that the variable is settable at queue time.
Update:
You can try passing the variables to the parameters of the template to make the template parameters dynamic. Please check the simple YAML below.
jobs:
- template: template.yaml
  parameters:
    MTimage: ${{ parameters.image }}
    VTimage: $(Vimage)
template.yaml:
parameters:
  MTimage:
  VTimage:

jobs:
- job: buildjob
  steps:
  - powershell: |
      echo "${{ parameters.VTimage }}"
      echo "${{ parameters.MTimage }}"

Related

How do I pass the job name into a github action's input?

jobs:
  my-name:
    name: "My Name"
    ...
    steps:
      - name: Slack Notification
        uses: my-action
        with:
          slack-msg: ${{ jobs.${{ env.GITHUB_JOB }}.name }}
I want slack-msg to evaluate to "My Name". I'm using my-action in multiple jobs, and I always want to pass in the job name, but I don't know how to do that. When I tried the above, the job simply didn't run, and I don't know how to troubleshoot why: the GitHub workflow log for my-name doesn't even exist.
How do I pass job-name into an input parameter?
As nested expressions are not supported, you can use a trick like the one below to obtain the matrix job name.
jobs:
  test:
    env:
      # to expose the matrix job name to steps, which is not possible with expansions
      JOB_NAME: ${{ matrix.name || format('{0} ({1})', matrix.tox-target, matrix.os) }}
    name: ${{ matrix.name || format('{0} ({1})', matrix.tox-target, matrix.os) }}
Note that you cannot really access the matrix name, but you can ensure you save the same name into an environment variable and use that.
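A step can then read that environment variable through the env context, e.g. (my-action and slack-msg are the hypothetical names from the question above):
jobs:
  test:
    # env and name as above
    steps:
      - name: Slack Notification
        uses: my-action
        with:
          slack-msg: ${{ env.JOB_NAME }}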

How to set Azure DevOps yaml variables conditionally based on parameter value

I am trying to set variables based on a parameter value in a yaml pipeline. It seems that I've read many other posts which show examples like the one below that the authors have said worked, but I cannot get past issues when trying to do something like below.
I've tried many variations on this example as well, too many to list here. Sometimes it will show 'values' as a duplicate key. In other cases I've been able to try and start a run and get the prompt with environment selection, but then opening the stage dialog throws a parse error.
Is there some sort of difference between variable declaration at the top of the file vs in a stage or job? That seems to be the difference that I notice when reading through other examples.
Ultimately what I'm trying to do is set the ServiceConnection variable value based on the value of the environment parameter.
parameters:
- name: environment
  displayName: Environment
  type: string
  values:
  - DEV
  - TEST

pr: none
trigger: none

pool: PrivateAgentPool

variables:
- name: 'isMain'
  value: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]
- name: 'buildConfiguration'
  value: 'Release'
- name: 'environment'
  value: ${{ parameters.environment }}
- name: 'ServiceConnection'
  ${{ if eq(variables['environment'], 'DEV') }}:
    value: 'svcConnectionDev'
  ${{ if eq(variables['environment'], 'TEST') }}:
    value: 'svcConnectionTest'
Looks like your solution is almost correct. Consider the below example.
parameters:
- name: region
  type: string
  default: westeurope
  values:
  - westeurope
  - northeurope

variables:
  ${{ if eq(parameters['region'], 'westeurope') }}:
    ServiceConnection: "svcConnectionDev"
  ${{ else }}:
    ServiceConnection: "svcConnectionTest"
If you want to use this ServiceConnection variable elsewhere, you can do so just by referencing $(ServiceConnection).
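Applied to the question's DEV/TEST scenario, a minimal sketch (using the names from the question, with a DEV default added so the pipeline can also run unattended) would condition on the parameter rather than on a variable, since at template-expansion time the variables being defined in that same block are not available yet:
parameters:
- name: environment
  displayName: Environment
  type: string
  default: DEV
  values:
  - DEV
  - TEST

variables:
- name: buildConfiguration
  value: 'Release'
- ${{ if eq(parameters.environment, 'DEV') }}:
  - name: ServiceConnection
    value: 'svcConnectionDev'
- ${{ if eq(parameters.environment, 'TEST') }}:
  - name: ServiceConnection
    value: 'svcConnectionTest'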
you could use bash with conditions:
steps:
- bash: |
    echo "##vso[task.setvariable variable=ServiceConnection]svcConnectionDev"
  condition: eq('${{ parameters.environment }}', 'DEV')
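Keep in mind that a variable set with the ##vso logging command only becomes available in subsequent steps; a later step could then read it with macro syntax, for example:
steps:
- bash: |
    echo "##vso[task.setvariable variable=ServiceConnection]svcConnectionDev"
  condition: eq('${{ parameters.environment }}', 'DEV')
- bash: |
    # Reads the value set by the previous step.
    echo "Using service connection: $(ServiceConnection)"
  displayName: 'Show selected service connection'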

Setting parameter value dynamically for automatic pipelines

If I create a parameter, I can set its value when I run a pipeline manually. But when the pipeline runs automatically, it uses the default value. When a pipeline runs automatically (say in response to pushing to a repo) is there any way to pass it the value of a parameter?
This is the yaml file I was playing around with. The goal is to be able to control which tests get run in the pipeline.
parameters:
- name: testFiles
  type: string
  default: cypress/integration/first.spec.js

trigger:
- azure-test

pool:
  vmImage: ubuntu-latest

steps:
- task: NodeTool@0
  inputs:
    versionSpec: "10.x"
  displayName: "Install Node.js"
- script: npm install
  displayName: "npm install"
- script: npx cypress run --record --key [record key removed] --spec ${{ parameters.testFiles }}
  displayName: "run cypress"
When a pipeline runs automatically (say in response to pushing to a repo) is there any way to pass it the value of a parameter?
When the pipeline runs automatically, it only uses the default parameter values, so we can achieve this requirement by changing the parameter's default value.
Based on my tests, I found a convenient method: you can use an if expression to check the trigger method (manual or CI) and then set the parameter's value accordingly.
Note: We cannot use if expressions when defining parameters, so we need to use variables to pass values.
You could refer to this ticket.
Here is my example:
trigger:
- master

variables:
  ${{ if eq( variables['Build.Reason'], 'IndividualCI' ) }}:
    buildVersion: $(BUILD.SOURCEVERSIONMESSAGE)
  ${{ if ne( variables['Build.Reason'], 'IndividualCI' ) }}:
    buildVersion: cypress/integration/first.spec.js

parameters:
- name: testFiles
  type: string
  default: $(buildVersion)

pool:
  vmImage: ubuntu-latest

steps:
- script: echo ${{ parameters.testFiles }}
  displayName: 'Run a one-line script'
The variable $(Build.Reason) is used to confirm the trigger method.
The variable $(BUILD.SOURCEVERSIONMESSAGE) contains the message of the commit.
Here are the steps:
When you push changes to the repo, use the test file path as the commit message.
The CI-triggered pipeline will read this commit message and set it as the parameter's default value.
In this case, it is similar to manually running the pipeline and setting the parameter's value.
When you manually run the pipeline, it will use the defined default value. You can also modify the value when you manually run the pipeline.
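As a small variant (just a sketch, not part of the answer above), you could also skip routing the value through a parameter and consume the conditionally defined variable directly with macro syntax in the script step:
steps:
- script: npx cypress run --spec "$(buildVersion)"
  displayName: "run cypress (spec chosen by trigger type)"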

azure-pipelines.yml, how to override variables for pipeline

I'm trying to learn how to use the new yaml configured pipeline system for Azure Devops, and I'm having a bit of trouble getting my head around the way the variables are supposed to work.
When I setup the pipeline, it created a file azure-pipelines.yml and committed this to the master branch.
By default, this file looks like so...
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
My project is setup with the following build configurations... "prod", "staging", "develop".
What I'm confused about, is where am I supposed to override these default variables for the actual pipelines?
I can modify the values directly in this file, but that's not really going to work. When I merge the changes back from "master" to "staging" etc, then presumably the pipelines for these lower environments will then be trying to build with "prod" configuration.
Surely there must be some way to configure variables independent of the source code.
There are 2 places where I can see an option to add Variables...
When I choose "Edit" for the pipeline, up in the top right, there is a "Variables" button next to run.
I can add variables there, but they don't appear to do anything. They are not applied when I run the pipeline.
Also, to make things more confusing, when I choose to "Run pipeline", there is also an option to define variables, but likewise, these don't seem to do anything. The build still just runs with the pre-defined values from the yaml file.
I agree with Shayki Abramczyk; that method lets you manually override the variable value in the UI.
I would like to share a method for assigning values to variables automatically.
You can use expressions to check the situation (e.g. the build branch) and then set the value accordingly.
Here is an example:
trigger:
- '*'

pool:
  vmImage: 'windows-latest'

variables:
  ${{ if eq(variables['Build.SourceBranchName'], 'master') }}:
    buildConfiguration: Prod
  ${{ if eq(variables['Build.SourceBranchName'], 'staging') }}:
    buildConfiguration: Staging

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      Write-Host $(buildConfiguration)
This sample selects the corresponding value according to the triggering branch name (master: Prod, staging: Staging).
Hope this helps.
You can define the variables with Let users override this value when running this pipeline:
Use the variable in the build step with $(BuildConfiguration).
When you run the build you can override the value:
Maybe you need parameters rather than variables?
parameters:
- name: STAND_NAME
  displayName: Select Stand to deploy
  type: string
  default: none
  values:
  - dev
  - stage
  - prod

variables:
- group: global-variables # use global variables from the library
- name: STAND_NAME
  value: ${{ parameters.STAND_NAME }}
- ${{ if eq(parameters['STAND_NAME'], 'prod') }}:
  - name: variable_depends_on_stand
    value: "prod_value"
- ${{ if eq(parameters['STAND_NAME'], 'stage') }}:
  - name: variable_depends_on_stand
    value: "stage_value"
- ${{ if eq(parameters['STAND_NAME'], 'dev') }}:
  - name: variable_depends_on_stand
    value: "dev_value"
- name: SOME_OTHER_GLOBAL_VARIABLE
  value: some_other_value
It would be displayed like this in the pipeline:
(screenshot of the pipeline run UI)
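For illustration, a step could then consume the selected values with macro syntax (a sketch reusing the variable names from the answer above):
steps:
- script: |
    echo "Deploying to stand: $(STAND_NAME)"
    echo "Stand-specific value: $(variable_depends_on_stand)"
  displayName: 'Show selected stand'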

Azure Devops - passing variables between job templates

Normal (non-template) jobs in Azure DevOps yaml support inter-job variable passing as follows:
jobs:
- job: A
  steps:
  - script: "echo ##vso[task.setvariable variable=skipsubsequent;isOutput=true]false"
    name: printvar
- job: B
  condition: and(succeeded(), ne(dependencies.A.outputs['printvar.skipsubsequent'], 'true'))
  dependsOn: A
  steps:
  - script: echo hello from B
How do I do something similar in the following, given that templates don't support the dependsOn syntax? I need to get an output from the first template and pass it as 'environmentSlice' to the second template.
- stage: Deploy
  displayName: Deploy stage
  jobs:
  - template: build-templates/get-environment-slice.yml@templates
    parameters:
      configFileLocation: 'config/config.json'
  - template: build-templates/node-app-deploy.yml@templates
    parameters:
      # Build agent VM image name
      vmImageName: $(Common.BuildVmImage)
      environmentPrefix: 'Dev'
      environmentSlice: '-$(dependencies.GetEnvironmentSlice.outputs['getEnvironmentSlice.environmentSlice'])'
The reason I want the separation between the two templates is the second one is a deployment template and I would like input from the first template in naming the environment in the second template. I.e. initial part of node-app-deploy.yml (2nd template) is:
jobs:
- deployment: Deploy
  displayName: Deploy
  # Because we use the environmentSlice to name the environment, we have to have it passed in rather than
  # extracting it from the config file in steps below
  environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}
Update:
The accepted solution does allow you to pass variables between separate templates, but won't work for my particular use case. I wanted to be able to name the 'environment' section of the 2nd template dynamically, i.e. environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}, but this can only be named statically since templates are compiled on pipeline startup.
The downside of the solution is that it introduces a hidden coupling between the templates. I would have preferred the calling pipeline to orchestrate the parameter passing between templates.
You can use dependsOn and dependency output variables with templates.
See below sample:
To make the sample clearer, there are 2 template files here: one is azure-pipelines-1.yml and the other is azure-pipeline-1-copy.yml.
In azure-pipelines-1.yml, expose the environment value as an output variable:
parameters:
  environment: ''
jobs:
- job: preDeploy
  variables:
    EnvironmentName: preDeploy-${{ parameters.environment }}
  steps:
  - checkout: none
  - pwsh: |
      echo "##vso[task.setvariable variable=EnvironmentName;isOutput=true]$($env:ENVIRONMENTNAME)"
    name: outputVars
And then, in azure-pipeline-1-copy.yml use dependency to get this output variable:
jobs:
- job: deployment
  dependsOn: preDeploy
  variables:
    EnvironmentNameCopy: $[dependencies.preDeploy.outputs['outputVars.EnvironmentName']]
  steps:
  - checkout: none
  - pwsh: |
      Write-Host "$(EnvironmentNameCopy)"
    name: outputVars
Finally, in the YAML pipeline, you just need to pass the environment value:
stages:
  - stage: deployQA
    jobs:
    - template: azure-pipelines-1.yml
      parameters:
        environment: FromTemplate1
    - template: azure-pipeline-1-copy.yml
Now you can see the value is picked up successfully in the second template job:
It is possible to avoid the dependency in the called template. However, as the OP says, the environment name cannot be created dynamically.
Here is an example of the "calling" template, which firstly calls another template (devops-variables.yml) that sets some environment variables that we wish to consume in a later template (devops-callee.yml):
stages:
- stage: 'Caller_Stage'
  displayName: 'Caller Stage'
  jobs:
  - template: 'devops-variables.yml'
    parameters:
      InitialEnvironment: "Development"
  - template: 'devops-callee.yml'
    parameters:
      SomeParameter: $[dependencies.Variables_Job.outputs['Variables_Job.Variables.SomeParameter']]
In the devops-variables.yml file, I have this:
"##vso[task.setvariable variable=SomeParameter;isOutput=true;]Wibble"
Then, in the "devops-callee.yml", I just consume it something like this:
parameters:
- name: SomeParameter
  default: ''

jobs:
- deployment: 'Called_Job'
  condition: succeeded()
  displayName: 'Called Job'
  environment: "Development"
  pool:
    vmImage: 'windows-2019'
  dependsOn:
  - Variables_Job
  variables:
    SomeParameter: ${{ parameters.SomeParameter }}
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none
        - task: AzureCLI@2
          condition: succeeded()
          displayName: 'An Echo Task'
          inputs:
            azureSubscription: "$(TheServiceConnection)"
            scriptType: pscore
            scriptLocation: inlineScript
            inlineScript: |
              echo "Before"
              echo "$(SomeParameter)"
              echo "After"
Output:
2021-04-10T09:22:29.6188535Z Before
2021-04-10T09:22:29.6196620Z Wibble
2021-04-10T09:22:29.6197124Z After
This way, the callee doesn't reference the caller. Unfortunately, setting the environment in the callee thus:
environment: "$(SomeParameter)"
doesn't work - you'll just get an environment with the literal characters '$(SomeParameter)'.
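For completeness, here is a sketch of the compile-time alternative already shown earlier in this thread: if the caller can compute the value with template expressions, pass it in as a template parameter and expand it with ${{ }}, which does work for naming the environment (the environmentName parameter below is illustrative; runtime variables still cannot be used here).
# devops-callee.yml (sketch)
parameters:
- name: environmentName
  type: string
  default: 'Development'

jobs:
- deployment: 'Called_Job'
  displayName: 'Called Job'
  # ${{ }} is expanded when the pipeline is compiled, so this names the
  # environment correctly, but only for values known at compile time
  # (parameters, not runtime variables).
  environment: ${{ parameters.environmentName }}
  pool:
    vmImage: 'windows-2019'
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "Deploying to ${{ parameters.environmentName }}"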