I'm trying to send a post-build Slack message after the job succeeds or fails in an Azure DevOps YAML pipeline, but I can't find a proper condition setting.
Basically, I have three stages: test, build, and notification.
I last tried the following, but dependencies.UnitTest.result returns null, so it doesn't give me Succeeded or Failed.
I also tried a bunch of other conditions, but they didn't work either: a plain succeeded() and failed() without dependencies, succeeded('Test') at stage level, and succeeded('UnitTest') at job level.
In most cases they send the success message even when the Test stage failed, or raise a syntax error when a job name is passed as an argument to succeeded() or failed().
What is the proper condition to send a post-build message, like in Jenkins?
stages:
- stage: Test
  jobs:
  - job: UnitTest
    steps:
    - script: echo UnitTest
    - script: exit 1
- stage: Build
  jobs:
  - job: Build
    steps:
    - script: echo Build
- stage: Notify
  dependsOn:
  - Test
  - Build
  condition: succeededOrFailed()
  jobs:
  - job: Succeed
    condition: eq(dependencies.UnitTest.result, 'Succeeded')
    steps:
    - script: echo Succeed #(slack)
  - job: Fail
    condition: eq(dependencies.UnitTest.result, 'Failed')
    steps:
    - script: echo Fail #(slack)
--- EDIT ---
MS support confirmed that referencing jobs from other stages is not supported by the YAML syntax itself.
It's not the same flow as originally intended, but you can split the succeed and fail notifications into separate stages as follows.
(This may add quite a number of stages just for notifications if you want a different message for each job.)
...
- stage: Notify_Succeeded
  condition: succeeded()
  jobs:
  - job: Succeed
    steps:
    - script: echo Succeed #(slack)
- stage: Notify_Fail
  condition: failed()
  jobs:
  - job: Fail
    steps:
    - script: echo Fail #(slack)
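A sketch of another option that later became available (assuming a service version that supports it, as shown in the answers further down this thread): a job-level condition can reference the result of a job in an earlier stage through stageDependencies, so the original single Notify stage can stay intact:

```yaml
- stage: Notify
  dependsOn:
  - Test
  - Build
  condition: succeededOrFailed()
  jobs:
  - job: Succeed
    # stageDependencies.<StageName>.<JobName>.result resolves across stages
    condition: eq(stageDependencies.Test.UnitTest.result, 'Succeeded')
    steps:
    - script: echo Succeed #(slack)
  - job: Fail
    condition: eq(stageDependencies.Test.UnitTest.result, 'Failed')
    steps:
    - script: echo Fail #(slack)
```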
It is possible, but you have to use the REST API. With the YAML below you will get what you described:
variables:
  orgName: 'thecodemanual'

stages:
- stage: Test
  jobs:
  - job: UnitTest
    steps:
    - script: echo UnitTest
    - script: exit 1
- stage: Build
  jobs:
  - job: Build
    steps:
    - script: echo Build
- stage: Notify
  dependsOn:
  - Test
  - Build
  condition: succeededOrFailed()
  jobs:
  - job: InitialJob
    condition: always()
    steps:
    - pwsh: |
        $url = "https://dev.azure.com/$(orgName)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=5.1"
        $timeline = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"}
        Write-Host "Pipeline = $($timeline | ConvertTo-Json -Depth 100)"
        # Timeline record identifiers follow the pattern <StageName>.<JobName>
        $test = $timeline.records | where { $_.identifier -eq "Test.UnitTest" }
        $result = $test.result
        Write-Host "##vso[task.setvariable variable=testResult;isOutput=true]$result"
      name: initial
      env:
        SYSTEM_ACCESSTOKEN: $(system.accesstoken)
  - job: Succeed
    dependsOn: InitialJob
    condition: eq(dependencies.InitialJob.outputs['initial.testResult'], 'succeeded')
    steps:
    - script: echo Succeed #(slack)
  - job: Fail
    dependsOn: InitialJob
    condition: eq(dependencies.InitialJob.outputs['initial.testResult'], 'failed')
    steps:
    - script: echo Fail #(slack)
Let me explain what I did above:
I made a call to the REST API to get the result of a job from the previous stage (at the moment it is not possible to get a task result directly).
I assigned the status of that job to an output variable.
I used this variable in a condition to run the specific job, Succeed or Fail.
Remark: before you run the code, please change orgName to your own organization name.
EDIT
The URL below returns details for your build, but you will not find information there about specific tasks or stages. You will, however, get a URL for the timeline.
https://dev.azure.com/$(orgName)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)?api-version=5.1
This REST endpoint returns a timeline which includes details for tasks and stages:
https://dev.azure.com/$(orgName)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=5.1
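For illustration, a trimmed, hypothetical timeline response might look like this (the real payload has many more records and fields); the identifier and result fields are the ones the PowerShell step above filters on:

```json
{
  "records": [
    {
      "type": "Stage",
      "name": "Test",
      "identifier": "Test",
      "state": "completed",
      "result": "failed"
    },
    {
      "type": "Job",
      "name": "UnitTest",
      "identifier": "Test.UnitTest",
      "state": "completed",
      "result": "failed"
    }
  ]
}
```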
You can specify the conditions under which each stage runs. By default, a stage runs if it does not depend on any other stage, or if all of the stages that it depends on have completed and succeeded.
Example to run a stage based upon the status of running a previous stage:
stages:
- stage: A

# stage B runs if A fails
- stage: B
  condition: failed()

# stage C runs if B succeeds
- stage: C
  dependsOn:
  - A
  - B
  condition: succeeded('B')
Example of using a custom condition:
stages:
- stage: A

- stage: B
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
Note: You cannot currently specify that a stage run based on the value of an output variable set in a previous stage.
Documentation:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml#conditions
Related
I have built a YAML pipeline which has multiple stages.
To ensure all the stages use the same git tag, I am trying to display the git tag in the displayName of each stage.
I have a job within the first stage which finds the git tag:
stages:
- stage: stageA
  jobs:
  - job: Tag
    steps:
    - bash: |
        tag=`git describe --tags --abbrev=0` && echo "##vso[task.setvariable variable=version_tag;isOutput=true]$tag"
        echo "$tag"
      name: setTag
How can I use the $tag under displayName in the next stage of the pipeline?
How can I use the $tag under displayName in the next stage of the pipeline?
No, runtime variables cannot be read at compile time. The displayName values are captured before any stage actually runs; unless you pass a compile-time expression, anything else will be handled as a plain string.
Take a look of this:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#understand-variable-syntax
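As a quick sketch of the difference between the three variable syntaxes described in that doc (the variable names here are made up for illustration):

```yaml
variables:
  # ${{ }} template expression: expanded at compile time, before anything runs
  compileTime: ${{ variables['Build.Reason'] }}
  # $[ ] runtime expression: evaluated when the job is set up
  runTime: $[ variables['Build.Reason'] ]
steps:
# $( ) macro syntax: replaced just before the task executes
- script: echo "$(compileTime) / $(runTime)"
```

Only the compile-time form is available when displayName is rendered, which is why a runtime tag cannot appear there.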
Only this situation can work:
trigger:
- none

pool:
  vmImage: ubuntu-latest

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: |
        tag="testtag"
        echo "##vso[task.setvariable variable=myStageVal;isOutput=true]$tag"
      name: MyOutputVar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    variables:
      myStageAVar: $[stageDependencies.A.A1.outputs['MyOutputVar.myStageVal']]
    steps:
    - bash: echo $(myStageAVar)
Let's say I have the following stage and pipeline:
stages:
- stage: A
  lockBehavior: sequential
  jobs:
  - job: Job
    steps:
    - script: Hey!

lockBehavior: runLatest
stages:
- stage: A
  jobs:
  - job: Job
    steps:
    - script: Hey!
How would I print out the lockBehavior parameter in Azure DevOps when running the pipeline? I have tried printing all variables using this code:
jobs:
- job: testing
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: 'env | sort'
But this does not work.
I checked the "lockBehavior" property from here, but I haven't found a method to show its value in the pipeline log. However, there is a note: "If you do not specify a lockBehavior, it is assumed to be runLatest." In my view, it's not hard to know the value if you set it up yourself.
In my experience, you can print the predefined variables and runtime parameters by following the official docs (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script), but lockBehavior is not among them.
So I guess this property will perhaps be added to the predefined variables in the future.
I have a stage set up in my build pipeline to detect changes in specific folders of my repo, and I use a condition based on the output variable to trigger or skip a build stage for each path. This works perfectly fine between my Detect and Build stages, but when I reference a template, this variable is no longer accessible.
Code example:
#azure-pipelines.yml file
stages:
- stage: DetectChanges
  displayName: Detect Changes
  jobs:
  - job: DetectChanges
    displayName: 'Detect changes'
    steps:
    - powershell: |
        $pathfilters = @("ui/", "api/", "publicapi/")
        foreach($path in $pathfilters) {
          $changed = $(git diff HEAD HEAD~ --name-only $path)
          if ($changed.count -gt 0) {
            echo "$($changed.count) change$(if ($changed.count -gt 1) {echo s}) detected on $path"
            echo "##vso[task.setvariable variable=$($path.Substring(0,$path.Length-1))_changed;isOutput=true]true"
          }
          else {
            echo "No changes detected on $path"
            echo "##vso[task.setvariable variable=$($path.Substring(0,$path.Length-1))_changed;isOutput=true]false"
          }
        }
      name: detect_changes
      displayName: Detect Changes
- stage: Build
  displayName: Build
  jobs:
  - job: UIBuild
    displayName: UI-Build
    condition: eq(stageDependencies.DetectChanges.DetectChanges.outputs['detect_changes.ui_changed'], 'true')
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: 'Write-Host "UI-Build"'
The above code works perfectly. I output a variable in my DetectChanges stage that I then reference in my Build stage for each of my paths (only one of the 3 placeholder builds is shown in the example above).
The problem comes in my next stages where I have a template defined:
#azure-pipelines.yml next stage snippet (after the code above)
- stage: DevDeploy
  displayName: Dev Deploy
  dependsOn: Build
  variables:
  - group: dev
  jobs:
  - template: templates/deploy.yml
This references my template with the below code:
#templates/deploy.yml snippet
- deployment: ui
  condition: eq(stageDependencies.DetectChanges.DetectChanges.outputs['detect_changes.ui_changed'], 'true')
  environment: '$(myEnv)'
  strategy:
    runOnce:
      deploy:
        steps:
        - template: uideploy.yml
This stage then is always skipped due to the stageDependencies reference being null when executed from the template.
Stage execution results below show the value as null:
Started: Today at 3:39 PM
Duration: 1h 37m 22s
Evaluating: eq(stageDependencies['DetectChanges']['DetectChanges']['outputs']['detect_changes.ui_changed'], 'true')
Expanded: eq(Null, 'true')
Result: False
Why does running a stage in a template vs in line on the same file result in a variable reference not working?
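One thing worth checking (an assumption, since the full pipeline isn't shown): stageDependencies only resolves for stages that the current stage declares in dependsOn, and DevDeploy above depends only on Build. Declaring the dependency on DetectChanges explicitly may make the output variable visible inside the template:

```yaml
- stage: DevDeploy
  displayName: Dev Deploy
  dependsOn:
  - Build
  - DetectChanges   # explicit dependency so stageDependencies.DetectChanges resolves
  variables:
  - group: dev
  jobs:
  - template: templates/deploy.yml
```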
I'm trying to generate release notes in an Azure Pipelines stage and push the notes to an Azure Service Bus.
How do I expose the variable in a bash script and then consume it in a subsequent job in the same stage?
I'm using a Bash task to execute a git command, and I'm trying to export the result as an environment variable which I want to use in the following job.
- stage: PubtoAzureServiceBus
  variables:
    COMMIT_MSG: "alsdkfgjdsfgjfd"
  jobs:
  - job: gitlog
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          # Write your commands here
          export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
          env | grep C
  - job:
    pool: server
    dependsOn: gitlog
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'Slack Release Notifications'
        messageBody: |
          {
            "channel":"XXXXXXXXXX",
            "username":"bot",
            "iconEmoji":"",
            "text":":airhorn: release :airhorn: \n`$(COMMIT_MSG)`"
          }
        signPayload: false
        waitForCompletion: false
You need to use a logging command and output variables, as shown here:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    variables:
      varFromA: $[ stageDependencies.A.A1.outputs['printvar.shouldrun'] ]
    steps:
    - script: echo $(varFromA) # this step uses the mapped-in variable
Please take a look at the documentation here.
So you need to replace
export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
with a logging command using isOutput=true, and then map it as here:
jobs:
- job: A
  steps:
  - bash: |
      echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
    name: ProduceVar # because we're going to depend on it, we need to name the step
- job: B
  dependsOn: A
  variables:
    # map the output variable from A into this job
    varFromA: $[ dependencies.A.outputs['ProduceVar.shouldrun'] ]
  steps:
  - script: echo $(varFromA) # this step uses the mapped-in variable
since you want to share a variable between jobs (not between stages, as shown in the first example).
I have a pipeline.yaml that looks like this
pool:
  vmImage: image

stages:
- stage: A
  jobs:
  - job: a
    steps:
    - script: |
        echo "This is stage build"
        echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes"
      name: BuildStageRun
- stage: B
  jobs:
  - job: b
    steps: #do something in steps
  - job: c
    dependsOn: a
    condition: eq(dependencies.build.outputs['BuildStageRun.doThing'], 'Yes')
    steps:
    - script: echo "I am scripting"
So there are 2 stages: A with one job a, and B with 2 jobs b and c. I would like job c to be executed only when job a has executed. I tried to do so by setting the variable doThing in job a to Yes and then checking this variable in job c.
But I get an error:
Stage plan job c depends on unknown job a.
The variable definition and the condition were taken from the Azure documentation.
Do you have any suggestions on how to get this working?
While Shayki is correct that it is not supported, there is a workaround that I am currently using, with the help of this blog: https://medium.com/microsoftazure/how-to-pass-variables-in-azure-pipelines-yaml-tasks-5c81c5d31763
Basically, you create your output variable as normal and then also publish it as a pipeline artifact. In the next stage, you read the artifact in the first job and use that to construct your conditions, e.g.:
stages:
- stage: firststage
  jobs:
  - job: setup_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - powershell: |
        $ShouldBuildThing1 = $true
        # Write to normal output for other jobs in this stage
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
        # Write to file to publish later
        mkdir -p $(Pipeline.Workspace)/variables
        Write-Output "$ShouldBuildThing1" > $(Pipeline.Workspace)/variables/BuildThing1
      name: variablesStep
    # Publish the folder as a pipeline artifact
    - publish: $(Pipeline.Workspace)/variables
      artifact: VariablesArtifact
  - job: build_something
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: setup_variables
    condition: eq(dependencies.setup_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.setup_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse:
      someInputArgs: $(BuildThing1)
- stage: secondstage
  jobs:
  - job: read_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    # If you download all artifacts then the folder name will be the same as the artifact name under $(Pipeline.Workspace). Artifacts are also auto-downloaded on deployment jobs.
    - task: DownloadPipelineArtifact@2
      inputs:
        artifact: "VariablesArtifact"
        path: $(Pipeline.Workspace)/VariablesArtifact
    - powershell: |
        $ShouldBuildThing1 = $(Get-Content $(Pipeline.Workspace)/VariablesArtifact/BuildThing1)
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
      name: variablesStep
  - job: secondjob
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: read_variables
    condition: eq(dependencies.read_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.read_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse:
      someInputArgs: $(BuildThing1)
Looks like there are a few options available from Microsoft now.
The first is job-to-job dependencies across stages.
From Microsoft:
In this example, job B1 will run whether job A1 is successful or skipped. Job B2 will check the value of the output variable from job A1 to determine whether it should run.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    condition: eq(stageDependencies.A.A1.outputs['printvar.shouldrun'], 'true')
    steps:
    - script: echo hello from Job B2
Also, there's another option where you could consume output variables across stages.
From the Microsoft site:
Stages can also use output variables from another stage. In this example, Stage B depends on a variable in Stage A.
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.printvar.shouldrun'], 'true'))
  dependsOn: A
  jobs:
  - job: B1
    steps:
    - script: echo hello from Stage B
It's because you can't make a job depend on a job from another stage; you can make stage B depend on stage A, or job c depend on job b.
You can't achieve your goal with YAML conditions because you want to use a variable that you declared in the first stage; the second stage doesn't know this variable, and Azure DevOps doesn't support it yet:
You cannot currently specify that a stage run based on the value of an output variable set in a previous stage.
You can make stage B depend on A, so since there is only one job in stage A, depend stage B on stage A:
- stage: B
  dependsOn: A