In Azure DevOps pipelines there's an option to conditionally run a task based on a pipeline variable. This is handled under the Run this task > Custom conditions field and it uses the syntax:
eq(variables['VarName'], 'Desired Value')
An agent job has a similar field for conditional execution under Run this job > Custom condition using variable expressions.
However, when I use the same syntax as for a conditional task, the result always evaluates to 'false'.
So how can I conditionally run an agent job?
Something like this worked for me:
- job: Job1
  steps:
  - powershell: |
      if (some condition)
      {
        Write-Host ("##vso[task.setvariable variable=RunJob2;isOutput=true]True")
      }
    name: ScriptStep
- job: Job2
  dependsOn: Job1
  condition: and(succeeded(), eq(dependencies.Job1.outputs['ScriptStep.RunJob2'], 'True'))
I discovered the answer. Unfortunately, it is not possible to conditionally run an agent job with a variable that is modified during build execution.
From the Azure DevOps Pipeline documentation under Pipeline Variables:
To define or modify a variable from a script, use the task.setvariable
logging command. Note that the updated variable value is scoped to
the job being executed, and does not flow across jobs or stages.
Try this one: https://stefanstranger.github.io/2019/06/26/PassingVariablesfromStagetoStage/
This lets you pass variables from one stage/job to another stage/job in the same release pipeline. I tried it and it works fine.
Also, to run this you need to grant some permissions to the release pipeline. To allow the Release Definition to be updated during the Release, you need to grant the Manage releases permission to the Project Collection Build Service.
I have set up a task in my Azure pipeline which creates a Redis cache instance through the Azure CLI.
There is another task which runs afterwards that sets the values in my application config file from a pipeline variable named "CacheConnectionKey".
I have to set the variable value manually in the pipeline. I want to automate this process by adding a new task in between the two described above. The new task should get the PrimaryKey from the Redis cache instance and set the value in the pipeline variable (i.e. CacheConnectionKey).
There is a command I have tried in PowerShell which gives me the access keys:
Get-AzRedisCacheKey -ResourceGroupName "MyResourceGroup" -Name "MyCacheKey"
PrimaryKey : pJ+jruGKPHDKsEC8kmoybobH3TZx2njBR3ipEsquZFo=
SecondaryKey : sJ+jruGKPHDKsEC8kmoybobH3TZx2njBR3ipEsquZFo=
Now I want the PrimaryKey returned by this command to be set in the pipeline variable CacheConnectionKey so that the next task can use the value properly.
The "process" referred in the question could be anything like a run/job/stage/pipeline I suppose. Regardless, in YAML pipelines, you can set variables at the root, stage, and job level. You can also use a variable group to make variables available across multiple pipelines. Some tasks define output variables, which you can consume in downstream steps, jobs, and stages.
In YAML, you can access variables across jobs and stages by using dependencies. By default, each stage in a pipeline depends on the one just before it in the YAML file. So if you need to refer to a stage that isn't immediately prior to the current one, you can override this automatic default by adding a dependsOn section to the stage.
For example, let's assume you have a task called MyTask, which sets an output variable called MyVar.
To use outputs in the same job:
steps:
- task: MyTask@1    # this step generates the output variable
  name: ProduceVar  # because we're going to depend on it, we need to name the step
- script: echo $(ProduceVar.MyVar) # this step uses the output variable
To use outputs in a different job:
jobs:
- job: A
  steps:
  # assume that MyTask generates an output variable called "MyVar"
  - task: MyTask@1
    name: ProduceVar # because we're going to depend on it, we need to name the step
- job: B
  dependsOn: A
  variables:
    # map the output variable from A into this job
    varFromA: $[ dependencies.A.outputs['ProduceVar.MyVar'] ]
  steps:
  - script: echo $(varFromA) # this step uses the mapped-in variable
For more details on the syntax and examples, check the following articles:
Use output variables from tasks
Dependencies
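Applied to the Redis scenario above, a minimal sketch could look like the step below; the AzurePowerShell@5 task and the service connection name 'MyServiceConnection' are assumptions, and the ##vso[task.setvariable] logging command exposes the key to the steps that follow in the same job:
- task: AzurePowerShell@5
  displayName: 'Fetch Redis primary key'
  inputs:
    azureSubscription: 'MyServiceConnection'  # assumed service connection name
    azurePowerShellVersion: 'LatestVersion'
    ScriptType: 'InlineScript'
    Inline: |
      # Read the access keys and publish the primary key as CacheConnectionKey
      $keys = Get-AzRedisCacheKey -ResourceGroupName "MyResourceGroup" -Name "MyCacheKey"
      Write-Host "##vso[task.setvariable variable=CacheConnectionKey;issecret=true]$($keys.PrimaryKey)"
Marking it with issecret=true keeps the key masked in the logs; later tasks in the same job can then read $(CacheConnectionKey) as before.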
Hi, I have a pipeline dependency challenge and have thought up a number of possible solutions. I can try them all out in a lab, but I am wondering whether they work well "in the field", so I'd like to know if anyone has tried them.
I have 3 stages, each in their own YML file. Each one is called from a main YML which is called from a main pipeline.
- template: 'build.yml'
- template: 'deploy.yml'
- template: 'test.yml'
The 'deploy.yml' generates a large number of output environment variables and 4 of these are consumed by the 'test.yml' using the "stageDependencies" syntax:
stages:
- stage: 'Test_Stage'
  dependsOn: Deploy_Stage
  jobs:
  - job: 'Test_Job'
    variables:
      MyWebSite: $[ stageDependencies.Deploy_Stage.Variables_Job.outputs['Variables_Job.Variables.MyWebSite'] ]
This works nicely.
But, I'd like to be able to create a pipeline that just does the test stage (to test a pre-existing web site). That doesn't work of course because of the dependency dependsOn: Deploy_Stage.
I can think of a few possible solutions:
Instead of having a dependency and using the [ stageDependencies... ] syntax, send MyWebSite as a pipeline parameter between stages. (Note that there are actually 4 parameters, not 1; I just simplified to demonstrate the challenge.) If I do that, the tester gets prompted to fill out (or choose from a list) the various parameters. But it does create linkage between Deploy_Stage and Test_Stage - I don't know if that's a bad thing?
Pass a Boolean parameter from Deploy_Stage to Test_Stage such as "CalledFromDeployStage" and then in Test_Stage, do this:
stages:
- stage: 'Test_Stage'
  ${{ if eq(parameters.CalledFromDeployStage, true) }}:
    dependsOn: Deploy_Stage
  jobs:
  - job: 'Test_Job'
    variables:
      MyWebSite: $[ stageDependencies.Deploy_Stage.Variables_Job.outputs['Variables_Job.Variables.MyWebSite'] ]
This feels a bit clunky.
Create a new YML called "Test_Stage_Manual" and get it to prompt for the various parameters and leave the rest as-is. (If I do this, I would probably put the jobs into their own YML file and call that YML from both Test stages.)
Something else?
You can try like as below:
Create an individual YAML pipeline to only run the test.
In the "Deploy_Stage" of the main pipeline, add a step or job at the end of this stage to execute the API "Runs - Run Pipeline" to trigger the pipeline for test after all the previous steps and jobs in this stage are completed successfully.
When calling the "Runs - Run Pipeline" API, you can pass the variables and parameters generated in the "Deploy_Stage" to the pipeline for test via the Request Body (JSON type) of the API.
Because the test is in an individual pipeline, you can also trigger it manually if you like. When triggering manually, you can set the values of the required variables and parameters yourself.
This way, you can trigger the test pipeline both from the "Deploy_Stage" and manually.
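As a rough sketch of that approach (the test pipeline's definition id 123, the parameter name MyWebSite, and the job/step names are assumptions for illustration), the last job of the "Deploy_Stage" could call the "Runs - Run Pipeline" API like this:
- job: Trigger_Test_Pipeline
  dependsOn: Variables_Job
  variables:
    MyWebSite: $[ dependencies.Variables_Job.outputs['Variables.MyWebSite'] ]
  steps:
  - powershell: |
      # Build the request body for the "Runs - Run Pipeline" REST API
      $body = @{
        templateParameters = @{ MyWebSite = "$(MyWebSite)" }  # hypothetical parameter of the test pipeline
      } | ConvertTo-Json -Depth 5
      # 123 is an assumed definition id of the stand-alone test pipeline
      $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/pipelines/123/runs?api-version=6.0-preview.1"
      Invoke-RestMethod -Uri $url -Method Post -Body $body -ContentType "application/json" `
        -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
    displayName: 'Trigger test pipeline'
Note that the project's build service account needs permission to queue the test pipeline for this call to succeed.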
I have a pipeline in Azure DevOps somewhat like this:
parameters:
- name: Scenario
  displayName: Scenario suite
  type: string
  default: 'Default'

variables:
  Scenario: ${{ parameters.Scenario }}

...

steps:
- script: echo Scenario is $(Scenario)
And I'm executing the pipeline via the VSTS CLI like this:
vsts build queue ... --variables Scenario=Test
When I run my pipeline, it seems that the parameter default value overwrites my cmd line specified variable value and I get the step output Scenario is Default. I tried something like Scenario: $[coalesce(variables['Scenario'], ${{ parameters.Scenario }})] but I think I got the syntax wrong because that caused a parsing issue.
What would be the best way to only use the parameter value if the Scenario variable has not already been set?
What would be the best way to only use the parameter value if the Scenario variable has not already been set?
Sorry, but as far as I know your scenario is not supported by design. The Note here states that:
When you set a variable in the YAML file, don't define it in the web editor as settable at queue time. You can't currently change variables that are set in the YAML file at queue time. If you need a variable to be settable at queue time, don't set it in the YAML file.
The --variables switch in the command can only be used to overwrite variables which are marked as Settable at queue time. Since YAML pipelines don't support settable variables by design, your --variables Scenario=Test won't actually be passed when queuing the YAML pipeline.
Here are several tests of mine to prove that:
1. YAML pipeline, which doesn't support settable variables at queue time:
pool:
  vmImage: 'windows-latest'
variables:
  Scenario: Test
steps:
- script: echo Scenario is $(Scenario)
I ran the command vsts build queue ... --variables Scenario=Test123; the pipeline run started, but the output log was always Scenario is Test instead of the expected Scenario is Test123. This proves that it's not the pipeline parameter overwriting the variable value; rather, --variables Scenario=xxx doesn't get passed because the YAML pipeline doesn't support settable variables.
2. Create a Classic UI build pipeline with a pipeline variable Scenario:
Queuing it via the command az pipelines build queue ... --variables Scenario=Test12345 (which has the same function as vsts build queue ... --variables Scenario=Test) only gives this error:
Could not queue the build because there were validation errors or warnings.
3. Then enable the Settable at queue time option for this variable:
Run the same command again and now queuing the build works. It also succeeds in overwriting the original pipeline variable with the new value set on the command line.
You can do similar tests to figure out the cause of the behavior you're seeing.
In addition:
The VSTS CLI has been deprecated and replaced by the Azure CLI with the Azure DevOps extension for quite a while now, so it's recommended to use az pipelines build queue instead.
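For example (the definition name MyPipeline is just a placeholder), the equivalent Azure CLI call would be:
az pipelines build queue --definition-name MyPipeline --variables Scenario=Test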
Lance had a great suggestion, but here is how I ended up solving it:
parameters:
- name: Scenario
  displayName: Scenario suite
  type: string
  default: 'Default'

variables:
  ScenarioFinal: $[coalesce(variables['Scenario'], '${{ parameters.Scenario }}')]

...

steps:
- script: echo Scenario is $(ScenarioFinal)
In this case we use the coalesce expression to assign the value of a new variable, ScenarioFinal. This way we can still use --variables Scenario=Test via the CLI or use the parameter via the pipeline UI. coalesce takes the first non-null value, effectively "reordering" the precedence Lance linked to here: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#expansion-of-variables
(Note that there need to be single quotes around the parameter reference '${{ }}' because ${{ }} is simply replaced with the value, and the coalesce expression doesn't know how to interpret the raw value unless the single quotes denote it as a string.)
Note that the ability to set parameters via the CLI is a current feature suggestion here: https://github.com/Azure/azure-devops-cli-extension/issues/972
Is it possible in Azure DevOps YAML pipelines to dynamically create additional steps based on some variable data (without creating our own plugin)?
The thing is, I want to iterate through several directories, but I don't want to just lump it all into a single step since that makes it harder to scan through to find an error.
Is it possible in Azure Devops YAML pipelines to dynamically create additional steps based on some variable data (without creating our own plugin)
No. YAML pipelines (azure-pipeline.yml) are under version control, so what you want (going by your original title) is to dynamically commit changes to the azure-pipeline.yml file when executing the pipeline. That's not a recommended workflow.
1. Instead, you can consider using Azure DevOps Conditions to dynamically enable/disable the additional steps:
Use a template parameter as part of a condition
Use the output variable from a job in a condition in a subsequent job
Or use some predefined variables:
condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
2. If you're not using conditions, you can check conditional templates as Simon suggests above.
Also, both #1 and #2 can work with the new runtime parameters feature.
3. However, if the dynamic variable you mean comes from the result of a command like components = result of ls -1 $(Pipeline.Workspace)/components, the tips above won't work for that situation. For this you can try something like:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      # Some logic to run `components = result of ls -1 $(Pipeline.Workspace)/components`
      # and determine whether to set WhetherToRun=true.
      Write-Host "##vso[task.setvariable variable=WhetherToRun]True"
- task: CmdLine@2
  inputs:
    script: |
      echo Hello world
  condition: eq(variables['WhetherToRun'], 'True')
It is possible to include steps conditionally with an if statement.
I think the example of extending a template on the same page will give you a good indication of how to iterate through a list parameter and create / run a step based on each value.
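A minimal sketch of that idea (the parameter name directories and its default values are made up for illustration), generating one script step per list entry at template expansion time:
parameters:
- name: directories
  type: object
  default: ['dirA', 'dirB', 'dirC']

steps:
- ${{ each dir in parameters.directories }}:
  - script: ls -1 $(Pipeline.Workspace)/${{ dir }}
    displayName: 'Process ${{ dir }}'
Because the list is expanded at compile time, each directory shows up as its own step in the run, which keeps the logs easy to scan for errors.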
I have a pipeline in Azure DevOps that essentially looks like this:
- stage: BuildStage
  jobs:
  - job: SetUp
  - job: Compile
- stage: DeployStage
  jobs:
  - job: Deploy
In the SetUp job I define an output variable, which I can consume in the Compile job using e.g.
variables:
  MyVariableFromSetUp: $[ dependencies.SetUp.outputs['MyVariable'] ]
The question is, how can I do the same in the Deploy job? I don't want to run the SetUp stage twice as it is time consuming to calculate the value of MyVariable hence I must cache it.
The DeployStage has a dependsOn to BuildStage, but it appears I cannot use the dependencies as I expected. The documentation fails to mention the multi-stage case when dealing with variables.
Hey, there is no direct way to do this at present based on what I have found; you would be following one of these methods:
Passing it as an artifact, using this method: https://medium.com/microsoftazure/how-to-pass-variables-in-azure-pipelines-yaml-tasks-5c81c5d31763
Passing it via an API endpoint using the VSTeam PowerShell module: https://www.donovanbrown.com/post/Passing-variables-from-stage-to-stage-in-Azure-DevOps-release; a similar method can also be found here: https://stefanstranger.github.io/2019/06/26/PassingVariablesfromStagetoStage/
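A minimal sketch of the first (artifact) approach, with stage, job, and file names assumed for illustration: write the value to a file in the build stage, publish it as a pipeline artifact, then download and re-read it in the deploy stage:
stages:
- stage: BuildStage
  jobs:
  - job: SetUp
    steps:
    - powershell: |
        # Assume $myValue holds the expensive-to-compute value
        $myValue = "computed-value"
        Set-Content -Path "$(Build.ArtifactStagingDirectory)/myvariable.txt" -Value $myValue
    - publish: $(Build.ArtifactStagingDirectory)/myvariable.txt
      artifact: variables
- stage: DeployStage
  dependsOn: BuildStage
  jobs:
  - job: Deploy
    steps:
    - download: current
      artifact: variables
    - powershell: |
        # Re-read the value and expose it as a job-scoped variable
        $myValue = Get-Content "$(Pipeline.Workspace)/variables/myvariable.txt"
        Write-Host "##vso[task.setvariable variable=MyVariableFromSetUp]$myValue"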