displayName with variable in Azure DevOps pipeline

I have built a YAML pipeline which has multiple stages.
To ensure all the stages use the same git tag, I am trying to display the git tag in the displayName of each stage.
I have a job within the stage which goes and finds the git tag:
stages:
- stage: stageA
  jobs:
  - job: Tag
    steps:
    - bash: |
        tag=`git describe --tags --abbrev=0` && echo "##vso[task.setvariable variable=version_tag;isOutput=true]$tag"
        echo "$tag"
      name: setTag
How can I use the $tag under displayName in the next stage of the pipeline?

How can I use the $tag under displayName in the next stage of the pipeline?
No. Runtime variables simply cannot be read at compile time. The displayName values are captured before any stage actually runs; unless you pass a compile-time expression, whatever you put there is handled as a literal string (see the sketch after the example below).
Take a look at this:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#understand-variable-syntax
Only a setup like the following works:
trigger:
- none

pool:
  vmImage: ubuntu-latest

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: |
        tag="testtag"
        echo "##vso[task.setvariable variable=myStageVal;isOutput=true]$tag"
      name: MyOutputVar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    variables:
      myStageAVar: $[stageDependencies.A.A1.outputs['MyOutputVar.myStageVal']]
    steps:
    - bash: echo $(myStageAVar)
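If the tag can be supplied when the run is queued instead of being discovered by a script, a compile-time value can still appear in a displayName. A minimal sketch, not from the original answer, assuming a hypothetical versionTag runtime parameter:

parameters:
- name: versionTag   # hypothetical queue-time parameter
  type: string
  default: 'untagged'

stages:
- stage: A
  displayName: Build ${{ parameters.versionTag }}   # expanded at compile time, so this works
  jobs:
  - job: Show
    steps:
    - bash: echo "Using tag ${{ parameters.versionTag }}"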

Related

Condition issue with Azure DevOps pipeline

variables:
  branches: $[ or(eq(variables['Build.SourceBranch'], 'refs/heads/branch/ayush'), contains(variables['Build.SourceBranch'], 'refs/heads/releases/')) ]
This ${{ if eq(variables.branches, 'true') }} condition is not being executed, so the value of var is not being set. However, if I set branches to true directly, it works.
I do not understand what the issue is. What data type does branches have, boolean or string?
According to your description, this problem is related to the expression syntax.
The runtime expression "$[]" is evaluated while the pipeline runs, whereas "${{ }}" is compile-time syntax that is evaluated before the run starts. So the "${{ if ... }}" condition cannot see the value of the "branches" variable.
As a workaround, you can use a stage-level condition to gate setting the variable (see also the sketch after the example below).
variables:
  branches: $[ or(eq(variables['Build.SourceBranch'], 'refs/heads/branch/ayush'), contains(variables['Build.SourceBranch'], 'refs/heads/releases/')) ]

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - script: echo Hello Stage A!
- stage: B
  condition: and(succeeded(), eq(variables.branches, 'true'))
  jobs:
  - job: B1
    steps:
    - script: echo Hello Stage B!
    - powershell: |
        Write-Host "##vso[task.setvariable variable=var;]value"

Azure Pipelines - Stages are skipped despite no conditions

I'm trying to set up a pipeline in Azure.
Actual behavior
On pull request, a build validation triggers the pipeline and all stages and jobs are triggered.
After the merge, all jobs are skipped.
Expected behavior
After the merge of the pull request, I would expect stage B to be triggered.
Question
What am I missing so the pipeline triggers correctly on merge?
azure.pipelines.yml
trigger:
  branches:
    include:
    - master

stages:
- template: 'main.yml'
main.yml
stages:
- stage: 'A'
  condition: startsWith(variables['build.reason'], 'PullRequest')
  jobs:
  - job: A
    steps:
    - script: echo A
- stage: 'B'
  jobs:
  - job: 'B'
    steps:
    - script: echo B
The trigger feature only works for the whole pipeline, not for an individual stage or job in the pipeline.
Normally, we use the different trigger types (CI/PR, resources) and filters (branches, paths, tags) to define when the pipeline should be triggered.
Within the pipeline, we generally specify conditions on a stage, job, or step to define whether it should run. These conditions are evaluated after the pipeline has been triggered.
To specify conditions, you can use one of the following ways (see the sketch after this list):
use the condition key.
use the if expression.
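As a rough side-by-side sketch (not from the original answer, and the stage names are made up), the same pull-request check expressed both ways:

stages:
# condition key: the stage is always part of the run and is evaluated (and possibly skipped) at runtime
- stage: WithCondition
  condition: eq(variables['Build.Reason'], 'PullRequest')
  jobs:
  - job: A
    steps:
    - script: echo A

# if expression: the stage is only inserted into the pipeline at compile time
- ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
  - stage: WithIf
    jobs:
    - job: B
      steps:
      - script: echo B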
I found the issue and applied suggestions from #bright-ran-msft.
Stages A and B are implicitly linked: since stage A was not triggered on the merge, stage B would not start either.
Solution
Instead of using condition, it is required to use if.
Example
stages:
- ${{ if eq( variables['build.reason'], 'PullRequest') }}:
  - stage: 'A'
    jobs:
    - job: A
      steps:
      - script: echo A
- stage: 'B'
  jobs:
  - job: 'B'
    steps:
    - script: echo B

Dynamic variables not available in other stages of azure pipeline

I am new to Azure Pipelines and am currently experimenting with passing variables to later jobs. Here is the current snippet, where I am trying to extract the project version from the pom file using the maven help:evaluate plugin. The pomVersion variable is populated, but it is not available in the same step or in later steps of the second job via the projectVersionDynamic variable.
stages:
- stage: FirstStage
  jobs:
  - job: FirstJob
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
          echo $pomVersion ##Prints semantic version 2.27.0-SNAPSHOT
          echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion'
          echo '##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0'
          echo $(Task1.projectVersionDynamic) ##Error message
          echo $projectVersionDynamic ##Is empty
      name: Task1
      displayName: Task1 in JobOne of FirstStage
  - job: SecondJob
    dependsOn: FirstJob
    variables:
      DynVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionDynamic'] ]
      StaVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionStatic'] ]
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          echo 'SecondJob'
          echo $(DynVar) ##Is empty
          echo $DynVar ##Is empty
          echo $(StaVar) ##Prints 2.27.0
          echo $StaVar ##Is empty
      displayName: Task in JobTwo of FirstStage
Observation: the projectVersionDynamic value does not get populated and is not available in the same task, in subsequent tasks, or in later jobs/stages. However, the static value gets populated in projectVersionStatic without any issues.
Is it possible to set dynamic values for user-defined variables in Azure Pipelines, or am I doing something wrong? I see an example here under the Stages section where it seems to be working.
Variables in Azure Pipelines can be really tricky sometimes. The documentation isn't always crystal clear on how things are supposed to work. Looking at your example, a couple of observations:
echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion' - your single quotes need to be double quotes so that the value of $pomVersion is expanded into the echo statement.
(and this is where things get fuzzy) The purpose of task.setvariable is to communicate values to other tasks or jobs. From the documentation: "This doesn't update the environment variables, but it does make the new variable available to downstream steps within the same job."
The $variable syntax won't work because task.setvariable doesn't inject into the running environment - rather, it's a signal to the pipeline to capture the output and store it away for later use.
The $(variable) syntax won't work because it's expanded just before the job starts, so it's too late to capture the new value there.
If you use my suggestion in point 1) about double-quoting the task.setvariable call, you should see the value available in the second job, as shown below.
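Applied to the snippet from the question, the corrected step would look roughly like this (only the quoting changes):

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
      # double quotes so $pomVersion is expanded before the logging command is written
      echo "##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion"
      echo "##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0"
  name: Task1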
For anybody with a use case where variables defined in one Azure pipeline stage need to be used in another stage, this is how it can be done.
The following snippet of an Azure pipeline defines a variable and makes it available to a job in the same stage and to a job in another stage:
stages:
- stage: stage1
  jobs:
  - job: job1
    steps:
    - task: "AzureCLI@2"
      name: step1
      inputs:
        inlineScript: |
          my_variable=a-value
          echo "##vso[task.setvariable variable=var_name;isOutput=true]$my_variable"
  - job: job2
    variables:
      stage1_var: $[ dependencies.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage1_var)
- stage: stage2
  jobs:
  - job: job1
    variables:
      stage2_var: $[ stageDependencies.stage1.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage2_var)

How to send a post-build message in an Azure DevOps YAML pipeline?

I'm trying to send a post-build Slack message after the job is done or has failed in an Azure DevOps YAML pipeline, but it seems I can't find a proper condition setting.
Basically, I have three stages: test, build, and notification.
I tried the following last, but dependencies.UnitTest.result returns null, so it does not give me Succeeded or Failed.
I also tried a bunch of different conditions, but they didn't work: for example, a plain succeeded() and failed() without dependencies, or succeeded('Test') at stage level or succeeded('UnitTest') at job level.
In most cases they send the success message even if the Test stage failed, or raise a syntax error for job names passed as arguments to succeeded() or failed().
What is the proper condition to send a post-build message like Jenkins does?
stages:
- stage: Test
  jobs:
  - job: UnitTest
    steps:
    - script: echo UnitTest
    - script: exit 1
- stage: Build
  jobs:
  - job: Build
    steps:
    - script: echo Build
- stage: Notify
  dependsOn:
  - Test
  - Build
  condition: succeededOrFailed()
  jobs:
  - job: Succeed
    condition: eq(dependencies.UnitTest.result, 'Succeeded')
    steps:
    - script: echo Succeed #(slack)
  - job: Fail
    condition: eq(dependencies.UnitTest.result, 'Failed')
    steps:
    - script: echo Fail #(slack)
--- EDIT ---
MS support confirmed that the YAML syntax itself does not support referencing jobs in other stages from job-level conditions.
It is not the same flow as originally intended, but you can split the succeed and fail notifications into separate stages as follows.
(It may add quite a number of stages just for the notification if you want a different message for each job.)
...
- stage: Notify_Succeeded
  condition: succeeded()
  jobs:
  - job: Succeed
    steps:
    - script: echo Succeed #(slack)
- stage: Notify_Fail
  condition: failed()
  jobs:
  - job: Fail
    steps:
    - script: echo Fail #(slack)
It is possible, but you have to use the REST API. With the YAML below you will get what you described:
variables:
  orgName: 'thecodemanual'

stages:
- stage: Test
  jobs:
  - job: UnitTest
    steps:
    - script: echo UnitTest
    - script: exit 1
- stage: Build
  jobs:
  - job: Build
    steps:
    - script: echo Build
- stage: Notify
  dependsOn:
  - Test
  - Build
  condition: succeededOrFailed()
  jobs:
  - job: InitialJob
    condition: always()
    steps:
    - pwsh: |
        $url = "https://dev.azure.com/$(orgName)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=5.1"
        $timeline = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"}
        Write-Host "Pipeline = $($timeline | ConvertTo-Json -Depth 100)"
        $test = $timeline.records | where { $_.identifier -eq "Test.UnitTest" }
        $result = $test.result
        Write-Host "##vso[task.setvariable variable=testResult;isOutput=true]$result"
      name: initial
      env:
        SYSTEM_ACCESSTOKEN: $(system.accesstoken)
  - job: Succeed
    dependsOn: InitialJob
    condition: eq(dependencies.InitialJob.outputs['initial.testResult'], 'succeeded')
    steps:
    - script: echo Succeed #(slack)
  - job: Fail
    dependsOn: InitialJob
    condition: eq(dependencies.InitialJob.outputs['initial.testResult'], 'failed')
    steps:
    - script: echo Fail #(slack)
Let me explain what I did above:
I made a call to the REST API to get the result of a step from the previous stage (at the moment it is not possible to get a task result directly).
I assigned the status of that step to an output variable.
I used this variable as the condition to run the specific job, Succeed or Fail.
Remark: before you run the code, please change orgName to yours.
EDIT
The URL below returns details for your build, but you will not find information about specific tasks or stages there. What you do get is the URL for a timeline.
https://dev.azure.com/$(orgName)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)?api-version=5.1
This REST endpoint returns the timeline, which includes details for tasks and stages.
https://dev.azure.com/$(orgName)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=5.1
You can specify the conditions under which each stage runs. By default, a stage runs if it does not depend on any other stage, or if all of the stages that it depends on have completed and succeeded.
Example of running a stage based upon the status of a previous stage:
stages:
- stage: A

# stage B runs if A fails
- stage: B
  condition: failed()

# stage C runs if B succeeds
- stage: C
  dependsOn:
  - A
  - B
  condition: succeeded('B')
Example of using a custom condition:
stages:
- stage: A
- stage: B
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
Note: You cannot currently specify that a stage run based on the value of an output variable set in a previous stage.
Documentation:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml#conditions

Azure DevOps Conditional execution of Job that depends on a Job in another Stage

I have a pipeline.yaml that looks like this
pool:
  vmImage: image

stages:
- stage: A
  jobs:
  - job: a
    steps:
    - script: |
        echo "This is stage build"
        echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes"
      name: BuildStageRun
- stage: B
  jobs:
  - job: b
    steps: #do something in steps
  - job: c
    dependsOn: a
    condition: eq(dependencies.build.outputs['BuildStageRun.doThing'], 'Yes')
    steps:
    - script: echo "I am scripting"
So, there are 2 stages: A with one job a, and B with 2 jobs b and c. I would like job c to be executed only when job a has executed. I tried to do so by setting the variable doThing in job a to Yes and then checking this variable in job c.
But I get an error:
Stage plan job c depends on unknown job a.
The variable definition and the condition were taken from the Azure documentation.
Do you have any suggestion on how to get this working?
While Shayki is correct that it is not supported, there is a workaround that I am currently using, with the help of this blog: https://medium.com/microsoftazure/how-to-pass-variables-in-azure-pipelines-yaml-tasks-5c81c5d31763
Basically, you create your output variable as normal and then publish the variables as pipeline artifacts. In the next stage, you read the artifact in the first job and use it to construct your conditionals, e.g.:
stages:
- stage: firststage
  jobs:
  - job: setup_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - powershell: |
        $ShouldBuildThing1 = $true
        # Write to normal output for other jobs in this stage
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
        # Write to file to publish later
        mkdir -p $(Pipeline.Workspace)/variables
        Write-Output "$ShouldBuildThing1" > $(Pipeline.Workspace)/variables/BuildThing1
      name: variablesStep
    # Publish the folder as pipeline artifact
    - publish: $(Pipeline.Workspace)/variables
      artifact: VariablesArtifact
  - job: build_something
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: setup_variables
    condition: eq(dependencies.setup_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.setup_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse:
        someInputArgs: $(BuildThing1)
- stage: secondstage
  jobs:
  - job: read_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    # If you download all artifacts then the folder name will be the same as the artifact name under $(Pipeline.Workspace). Artifacts are also auto-downloaded on deployment jobs.
    - task: DownloadPipelineArtifact@2
      inputs:
        artifact: "VariablesArtifact"
        path: $(Pipeline.Workspace)/VariablesArtifact
    - powershell: |
        $ShouldBuildThing1 = $(Get-Content $(Pipeline.Workspace)/VariablesArtifact/BuildThing1)
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
      name: variablesStep
  - job: secondjob
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: read_variables
    condition: eq(dependencies.read_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.read_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse:
        someInputArgs: $(BuildThing1)
Looks like a few options are available from Microsoft now.
The first is job-to-job dependencies across stages.
From Microsoft:
In this example, job B1 will run whether job A1 is successful or skipped. Job B2 will check the value of the output variable from job A1 to determine whether it should run.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    condition: eq(stageDependencies.A.A1.outputs['printvar.shouldrun'], 'true')
    steps:
    - script: echo hello from Job B2
Also, there's another option where you can consume output variables across stages.
From the Microsoft site:
Stages can also use output variables from another stage. In this example, stage B depends on a variable in stage A.
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.printvar.shouldrun'], 'true'))
  dependsOn: A
  jobs:
  - job: B1
    steps:
    - script: echo hello from Stage B
It's because you can't depend on a job from another stage; you can make stage B depend on stage A, or job c depend on job b.
You also can't achieve your goal with YAML conditions, because you want to use a variable that you declared in the first stage, and the second stage doesn't know that variable. Azure DevOps doesn't support this yet:
You cannot currently specify that a stage run based on the value of an output variable set in a previous stage.
You can make stage B depend on A, so if stage A contains only one job, you simply depend stage B on stage A:
- stage: B
  dependsOn: A