Azure Pipelines YAML: how to use variables between jobs that have strategies

Let's say I'm starting with this configuration that works:
jobs:
- job: ARelease
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:
  - script: |
      echo "##vso[task.setvariable variable=ReleaseVar;isOutput=true]1"
    name: JobResult
- job: C
  pool:
    vmImage: 'Ubuntu-16.04'
  dependsOn: ARelease
  variables:
    AVar: $[ dependencies.ARelease.outputs['JobResult.ReleaseVar'] ]
  steps:
  - script: |
      echo $(AVar)
As expected, job C outputs 1.
Now let's say that I have to add a new job ADebug, which is almost identical to ARelease, so I use a strategy:
jobs:
- job: A
  strategy:
    matrix:
      Release:
        BUILD_TYPE: Release
      Debug:
        BUILD_TYPE: Debug
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:
  - script: |
      echo "##vso[task.setvariable variable=$(BUILD_TYPE)Var;isOutput=true]1"
    name: JobResult
- job: C
  pool:
    vmImage: 'Ubuntu-16.04'
  dependsOn: A
  variables:
    AReleaseVar: $[ dependencies.A.outputs['JobResult.ReleaseVar'] ]
    ADebugVar: $[ dependencies.A.outputs['JobResult.DebugVar'] ]
  steps:
  - script: |
      echo $(AReleaseVar)
      echo $(ADebugVar)
I would expect everything to work and the variables to show up in the output, but the output is empty.
Queuing the run with diagnostics enabled, it seems that $[ dependencies.A.outputs['JobResult.ReleaseVar'] ] and $[ dependencies.A.outputs['JobResult.DebugVar'] ] both evaluate to Null.
I've tried different variations to access those variables, but they always evaluate to Null.
Any idea what the correct way is?

https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-a-multi-job-output-variable
If you're setting a variable from a matrix or slice, then to reference
the variable, you have to include the name of the job as well as the
step when you access it from a downstream job.
The format is as follows: dependencies.{job}.outputs['{matrix/slice key}.{step name}.{variable name}']
In your scenario, you have job A running with a matrix strategy (Release, Debug), and you set the variable names to ReleaseVar and DebugVar respectively. The correct way to access these variables is:
dependencies.A.outputs['Release.JobResult.ReleaseVar']
dependencies.A.outputs['Debug.JobResult.DebugVar']
On a side note, perhaps just use the same variable name, since you are already able to distinguish between the values based on the matrix name.
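A minimal sketch of the corrected downstream job, keeping the matrix and step names from the question:

- job: C
  pool:
    vmImage: 'Ubuntu-16.04'
  dependsOn: A
  variables:
    # include the matrix key (Release/Debug) in the lookup
    AReleaseVar: $[ dependencies.A.outputs['Release.JobResult.ReleaseVar'] ]
    ADebugVar: $[ dependencies.A.outputs['Debug.JobResult.DebugVar'] ]
  steps:
  - script: |
      echo $(AReleaseVar)
      echo $(ADebugVar)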

Related

Condition issue with Azure DevOps pipeline

variables:
  branches: $[ or(eq(variables['Build.SourceBranch'], 'refs/heads/branch/ayush'), contains(variables['Build.SourceBranch'], 'refs/heads/releases/'))]
This ${{ if eq(variables.branches, 'true') }} condition is not being executed, and so the value of var is not being set. However, if I set branches to true, it works.
I do not understand what the issue is. What data type does branches have? Boolean or string?
According to your description, this problem is related to the expression syntax.
The runtime expression "$[ ]" is evaluated while the pipeline runs, whereas "${{ }}" is compile-time syntax and is evaluated before the run starts. The "${{ if ... }}" condition therefore cannot see the value of the "branches" variable, which only exists at runtime.
As a workaround, you can use a specific condition to control when the variable is set.
variables:
  branches: $[ or(eq(variables['Build.SourceBranch'], 'refs/heads/branch/ayush'), contains(variables['Build.SourceBranch'], 'refs/heads/releases/'))]
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - script: echo Hello Stage A!
- stage: B
  condition: and(succeeded(), eq(variables.branches, 'true'))
  jobs:
  - job: B1
    steps:
    - script: echo Hello Stage B!
    - powershell: |
        Write-Host "##vso[task.setvariable variable=var;]value"
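To illustrate the difference with the branches variable from the question (a rough sketch; the job name demo is illustrative):

variables:
  branches: $[ or(eq(variables['Build.SourceBranch'], 'refs/heads/branch/ayush'), contains(variables['Build.SourceBranch'], 'refs/heads/releases/'))]

jobs:
- job: demo
  steps:
  # ${{ }} is expanded when the YAML is compiled, before branches has been evaluated,
  # so this step is never inserted into the run, whatever the branch is.
  - ${{ if eq(variables.branches, 'true') }}:
    - script: echo "compile time - never runs"
  # A step condition is evaluated at runtime, after branches has a value, so this works.
  - script: echo "runtime - runs on matching branches"
    condition: eq(variables.branches, 'true')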

Dynamic variables not available in other stages of Azure pipeline

I am new to Azure Pipelines and am currently experimenting with passing variables to later jobs. Here is the current snippet, in which I am trying to extract the project version from the pom file using the Maven help:evaluate plugin. The pomVersion variable is populated, but it is not available in the same step or in later steps in the second job via the projectVersionDynamic variable.
stages:
- stage: FirstStage
  jobs:
  - job: FirstJob
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
          echo $pomVersion ## Prints semantic version 2.27.0-SNAPSHOT
          echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion'
          echo '##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0'
          echo $(Task1.projectVersionDynamic) ## Error message
          echo $projectVersionDynamic ## Is empty
      name: Task1
      displayName: Task1 in JobOne of FirstStage
  - job: SecondJob
    dependsOn: FirstJob
    variables:
      DynVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionDynamic'] ]
      StaVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionStatic'] ]
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          echo 'SecondJob'
          echo $(DynVar) ## Is empty
          echo $DynVar ## Is empty
          echo $(StaVar) ## Prints 2.27.0
          echo $StaVar ## Is empty
      displayName: Task in JobTwo of FirstStage
Observation: the projectVersionDynamic value does not get populated and is not available in the same task or in subsequent tasks within later jobs/stages. However, the static variable gets populated in projectVersionStatic without any issues.
Is it possible to set dynamic values for user-defined variables in Azure Pipelines, or am I doing something wrong? I see an example here under the Stages section where it seems to be working.
Variables in Azure Pipelines can be really tricky sometimes. The documentation isn't always crystal clear on how they are supposed to work. Looking at your example, a couple of observations:
1. echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion' - the single quotes need to be double quotes so that the value of $pomVersion is expanded into the echo statement.
2. (and this is where things get fuzzy) The purpose of task.setvariable is to communicate values to other tasks or jobs. From the documentation: "This doesn't update the environment variables, but it does make the new variable available to downstream steps within the same job."
3. $variable syntax won't work because task.setvariable doesn't inject into the running environment; rather, it's a signal to the pipeline to capture the output and store it away for later use.
4. $(variable) syntax won't work because it's expanded just before the job starts, so it's too late to capture the value here.
If you use the suggestion in point 1 about double-quoting the task.setvariable echo, you should see the value become available in the second job.
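Based on point 1, a minimal sketch of the corrected first task from the question (nothing else changed) might look like this:

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
      # double quotes so the shell expands $pomVersion into the logging command
      echo "##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion"
  name: Task1

With that change, $[ dependencies.FirstJob.outputs['Task1.projectVersionDynamic'] ] in SecondJob should receive the value.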
For anybody with a use case where variables defined in one Azure Pipelines stage need to be used in another stage, this is how it can be done.
The following snippet of an Azure pipeline defines a variable and makes it available to a job in the same stage and to a job in another stage:
stages:
- stage: stage1
  jobs:
  - job: job1
    steps:
    - task: "AzureCLI@2"
      name: step1
      inputs:
        inlineScript: |
          my_variable=a-value
          echo "##vso[task.setvariable variable=var_name;isOutput=true]$my_variable"
  - job: job2
    variables:
      stage1_var: $[ dependencies.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage1_var)
- stage: stage2
  jobs:
  - job: job1
    variables:
      stage2_var: $[ stageDependencies.stage1.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage2_var)
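Note that stage2 here relies on the default behavior that stages run sequentially, which gives it an implicit dependency on stage1. If the stage order changes or the implicit dependency is removed, declare it explicitly so the stageDependencies lookup can resolve, e.g.:

- stage: stage2
  dependsOn: stage1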

Create variables dynamically in azure pipeline

I'm trying to generate release notes in an Azure Pipelines stage and push the note to an Azure Service Bus.
How do I expose the variable in a bash script and then consume it in a subsequent job in the same stage?
I'm using a Bash task to execute a git command and trying to export the result as an environment variable that I want to use in the following job.
- stage: PubtoAzureServiceBus
  variables:
    COMMIT_MSG: "alsdkfgjdsfgjfd"
  jobs:
  - job: gitlog
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          # Write your commands here
          export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
          env | grep C
  - job:
    pool: server
    dependsOn: gitlog
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'Slack Release Notifications'
        messageBody: |
          {
            "channel":"XXXXXXXXXX",
            "username":"bot",
            "iconEmoji":"",
            "text":":airhorn: release :airhorn: \n`$(COMMIT_MSG)`"
          }
        signPayload: false
        waitForCompletion: false
You need to use logging command syntax and output variables, as shown here:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    variables:
      varFromA: $[ stageDependencies.A.A1.outputs['printvar.shouldrun'] ]
    steps:
    - script: echo $(varFromA) # this step uses the mapped-in variable
Please take a look at the documentation for the details.
So you need to replace
export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
with a logging command that uses isOutput=true,
and then map it as shown here:
jobs:
- job: A
  steps:
  - bash: |
      echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
    name: ProduceVar # because we're going to depend on it, we need to name the step
- job: B
  dependsOn: A
  variables:
    # map the output variable from A into this job
    varFromA: $[ dependencies.A.outputs['ProduceVar.shouldrun'] ]
  steps:
  - script: echo $(varFromA) # this step uses the mapped-in variable
as you want to share a variable between jobs (not between stages, as shown in the first example).
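Applied to the pipeline in the question, a minimal sketch could look like the following (the step name gitlogstep and the job name publish are illustrative):

- stage: PubtoAzureServiceBus
  jobs:
  - job: gitlog
    steps:
    - bash: |
        COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
        # publish the commit message as an output variable instead of exporting it
        echo "##vso[task.setvariable variable=COMMIT_MSG;isOutput=true]$COMMIT_MSG"
      name: gitlogstep
  - job: publish
    pool: server
    dependsOn: gitlog
    variables:
      # map the output variable from the gitlog job into this job
      COMMIT_MSG: $[ dependencies.gitlog.outputs['gitlogstep.COMMIT_MSG'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'Slack Release Notifications'
        messageBody: |
          {
            "channel":"XXXXXXXXXX",
            "username":"bot",
            "text":":airhorn: release :airhorn: \n`$(COMMIT_MSG)`"
          }
        signPayload: false
        waitForCompletion: false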

Azure DevOps Conditional execution of Job that depends on a Job in another Stage

I have a pipeline.yaml that looks like this:
pool:
  vmImage: image

stages:
- stage: A
  jobs:
  - job: a
    steps:
    - script: |
        echo "This is stage build"
        echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes"
      name: BuildStageRun
- stage: B
  jobs:
  - job: b
    steps: #do something in steps
  - job: c
    dependsOn: a
    condition: eq(dependencies.build.outputs['BuildStageRun.doThing'], 'Yes')
    steps:
    - script: echo "I am scripting"
So there are 2 stages: A with one job a, and B with 2 jobs b and c. I would like job c to be executed only when job a has executed. I tried to do so by setting the variable doThing in job a to Yes and then checking this variable in job c.
But I get an error:
Stage plan job c depends on unknown job a.
The variable definition and the condition definition were taken from the Azure documentation.
Do you have any suggestion on how to get this working?
While Shayki is correct that it is not supported, there is a workaround that I am currently using, which I put together with the help of this blog: https://medium.com/microsoftazure/how-to-pass-variables-in-azure-pipelines-yaml-tasks-5c81c5d31763
Basically, you create your output variables as normal and then publish them as pipeline artifacts. In the next stage, you read the artifact in the first job and use it to construct your conditionals, e.g.:
stages:
- stage: firststage
  jobs:
  - job: setup_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - powershell: |
        $ShouldBuildThing1 = $true
        # Write to normal output for other jobs in this stage
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
        # Write to file to publish later
        mkdir -p $(Pipeline.Workspace)/variables
        Write-Output "$ShouldBuildThing1" > $(Pipeline.Workspace)/variables/BuildThing1
      name: variablesStep
    # Publish the folder as a pipeline artifact
    - publish: $(Pipeline.Workspace)/variables
      artifact: VariablesArtifact
  - job: build_something
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: setup_variables
    condition: eq(dependencies.setup_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.setup_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse: # placeholder for a task that consumes the variable
        someInputArgs: $(BuildThing1)
- stage: secondstage
  jobs:
  - job: read_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    # If you download all artifacts, the folder name will be the same as the artifact name under $(Pipeline.Workspace).
    # Artifacts are also downloaded automatically in deployment jobs.
    - task: DownloadPipelineArtifact@2
      inputs:
        artifact: "VariablesArtifact"
        path: $(Pipeline.Workspace)/VariablesArtifact
    - powershell: |
        $ShouldBuildThing1 = $(Get-Content $(Pipeline.Workspace)/VariablesArtifact/BuildThing1)
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
      name: variablesStep
  - job: secondjob
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: read_variables
    condition: eq(dependencies.read_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.read_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse: # placeholder for a task that consumes the variable
        someInputArgs: $(BuildThing1)
It looks like a few options are available from Microsoft now.
The first is job-to-job dependencies across stages.
From Microsoft:
In this example, job B1 will run whether job A1 is successful or skipped. Job B2 will check the value of the output variable from job A1 to determine whether it should run.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    condition: eq(stageDependencies.A.A1.outputs['printvar.shouldrun'], 'true')
    steps:
    - script: echo hello from Job B2
There is also another option, where you can consume output variables across stages.
From the Microsoft site:
Stages can also use output variables from another stage. In this example, stage B depends on a variable in stage A.
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.printvar.shouldrun'], 'true'))
  dependsOn: A
  jobs:
  - job: B1
    steps:
    - script: echo hello from Stage B
It's because you can't depend on a job from another stage; you can make stage B depend on stage A, or job c depend on job b.
You can't achieve your goal with YAML conditions, because you want to use a variable that you declared in the first stage; the second stage doesn't know about this variable, and Azure DevOps doesn't support it yet:
You cannot currently specify that a stage run based on the value of an
output variable set in a previous stage.
You can make stage B depend on A, so if stage A contains only one job, depend stage B on stage A:
- stage: B
  dependsOn: A
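Applying the first option above to the pipeline in the question, a minimal sketch (keeping the job and step names from the question) would condition job c on job a's output through stageDependencies:

stages:
- stage: A
  jobs:
  - job: a
    steps:
    - script: |
        echo "This is stage build"
        echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes"
      name: BuildStageRun
- stage: B
  dependsOn: A
  jobs:
  - job: b
    steps:
    - script: echo "do something in steps"
  - job: c
    # read the output variable from job a in stage A
    condition: eq(stageDependencies.A.a.outputs['BuildStageRun.doThing'], 'Yes')
    steps:
    - script: echo "I am scripting"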

Azure Pipelines overwrite pipeline variable

How does one overwrite a pipeline variable or how does one create a pipeline variable from a job?
I'm running a prepare job where I extract the current git tag into a variable that I need in the following jobs, so I decided to create a pipeline variable and overwrite its value in the first job:
variables:
  GIT_TAG: v0.0.1

jobs:
- job: job1
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - powershell: |
      Write-Host "##vso[task.setvariable variable=GIT_TAG]$(git describe --tags --always)"
But, in the next job GIT_TAG has the initial value of v0.0.1.
By default, if you overwrite a variable, the new value is available only within that job, not to subsequent jobs.
Passing variables between jobs in the same stage is a bit more complex, as it requires working with output variables.
Similarly to the example above, to pass the FOO variable:
1. Make sure you give a name to the job, for example job: firstjob.
2. Likewise, make sure you give a name to the step as well, for example name: mystep.
3. Set the variable with the same command as before, but add ;isOutput=true, like: echo "##vso[task.setvariable variable=FOO;isOutput=true]some value".
4. In the second job, define a variable at the job level, giving it the value $[ dependencies.firstjob.outputs['mystep.FOO'] ] (remember to use single quotes for expressions).
A full example:
jobs:
- job: firstjob
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:
  # Sets FOO to "some value", then mark it as an output variable
  - bash: |
      FOO="some value"
      echo "##vso[task.setvariable variable=FOO;isOutput=true]$FOO"
    name: mystep
  # Show the output variable in the same job
  - bash: |
      echo "$(mystep.FOO)"
- job: secondjob
  # Need to explicitly mark the dependency
  dependsOn: firstjob
  variables:
    # Define the variable FOO from the previous job
    # Note the use of single quotes!
    FOO: $[ dependencies.firstjob.outputs['mystep.FOO'] ]
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:
  # The variable is now available for expansion within the job
  - bash: |
      echo "$(FOO)"
  # To send the variable to the script as an environment variable, it needs to be set in the env dictionary
  - bash: |
      echo "$FOO"
    env:
      FOO: $(FOO)
More info can be found in the documentation.
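Applied to the GIT_TAG scenario from the question, a minimal sketch (the job name prepare and the step name tagstep are illustrative) would be:

jobs:
- job: prepare
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - powershell: |
      # publish the tag as an output variable instead of overwriting the pipeline variable
      Write-Host "##vso[task.setvariable variable=GIT_TAG;isOutput=true]$(git describe --tags --always)"
    name: tagstep
- job: job2
  dependsOn: prepare
  variables:
    # map the output variable from the prepare job into this job
    GIT_TAG: $[ dependencies.prepare.outputs['tagstep.GIT_TAG'] ]
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - bash: |
      echo "$(GIT_TAG)"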