I am trying to set Azure DevOps to skip a stage on a multi-stage pipeline if a message does not start with a given text.
From the examples in the documentation, I think it is just:
- stage: t1
  condition: not(startsWith(variables['Build.SourceVersionMessage'], '[maven-release-plugin]'))
  jobs:
  - job: ReleasePrepare
    displayName: Prepare release
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
    - script: |
        env | sort
However, the stage gets executed regardless. Here's an example where I expect the t1 stage not to run, based on the commit message: https://dev.azure.com/trajano/experiments/_build/results?buildId=110&view=results
The output of env shows that the message was passed in correctly.
Just in case it is a bug, I reported it here as well: https://developercommunity.visualstudio.com/content/problem/697290/startswith-buildsourceversionmessage-variable-not.html
It appears that Build.SourceVersionMessage, at the time of this post, is only resolvable in steps.
Here's a working example that stores the value in a variable in one step and uses it in the next job (which can be a deployment):
trigger:
  batch: true
  branches:
    include:
    - master

stages:
- stage: ci
  displayName: Continuous Integration
  jobs:
  - job: Build
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
    - script: |
        env | sort
        echo "$(Build.SourceVersionMessage)"

- stage: t1
  displayName: Release
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/master')
  jobs:
  - job: GetCommitMessage
    displayName: Get commit message
    steps:
    - bash: |
        echo "##vso[task.setvariable variable=commitMessage;isOutput=true]$(Build.SourceVersionMessage)"
        echo "Message is '$(Build.SourceVersionMessage)'"
      name: SetVarStep
      displayName: Store commit message in variable
  - job: ReleasePrepare
    displayName: Prepare release
    dependsOn: GetCommitMessage
    pool:
      vmImage: 'ubuntu-16.04'
    condition: not(startsWith(dependencies.GetCommitMessage.outputs['SetVarStep.commitMessage'], '[maven-release-plugin]'))
    steps:
    - script: |
        echo this would be a candidate for release
        env | sort
      displayName: Don't do it if maven release
  - job: NotReleasePrepare
    displayName: Don't Prepare Release
    dependsOn: GetCommitMessage
    pool:
      vmImage: 'ubuntu-16.04'
    condition: startsWith(dependencies.GetCommitMessage.outputs['SetVarStep.commitMessage'], '[maven-release-plugin]')
    steps:
    - script: |
        echo this would not be a candidate for release because it was created by the plugin
        env | sort
      displayName: Do it if maven release
The build can be found at https://dev.azure.com/trajano/experiments/_build/results?buildId=133&view=logs&s=6fc7e65a-555d-5fab-c78f-9502ae9436c4&j=b5187b8c-216e-5267-fcdb-c2c33d846e05
I am trying to set Azure DevOps to skip a stage if a message does not start with a given text.
If I am not misunderstanding, the condition you want is: if the message starts with [maven-release-plugin], the current stage should be queued.
If so, the condition you wrote is not correct; I think you should specify it as:
startsWith(variables['Build.SourceVersionMessage'], '[maven-release-plugin]')
I tested this on my pipeline, where the actual value of this variable was Deleted 121321, and the result was that the stage was skipped successfully. My logic is: the condition expects the value of Build.SourceVersionMessage to start with othermessage, but in my pipeline its value is Deleted 121321. No match, so the stage is skipped.
(Deleted 121321 is just my PR name; I set the commit message to the default PR name.)
Update 2:
Though my test logic was not incorrect, I reproduced this with YAML and tested many other approaches, such as using Build.SourceVersion, which can only be obtained after the source has been pulled.
Yes, you are right that Build.SourceVersionMessage does not have a value at the job level. In my tests it is indeed null at the job level.
But, unfortunately, this is not a bug; it is in fact by design.
The source repository is only pulled locally once the stage's job begins to execute. You can see this in the Checkout log, which records the process of pulling the source files down.
If the stage is not executed, the source is not pulled down. And if no source has been pulled, the server cannot get the value of Build.SourceVersionMessage, because there is no source history. That is also why I tested the variable Build.SourceVersion at the job level.
We cannot use these two variables at the agent job level because the agent hasn't pulled the sources yet, so Build.SourceVersionMessage is null. You'll need to copy it into a variable in a step of your pipeline. This was confirmed by our Product Group team.
But I still need to say sorry: our docs are not clear enough in stating that this variable cannot be used at the agent job level.
Related
I have a multi-stage YAML pipeline with 3 deployment stages: Dev, QA and Prod. Now, using the Azure DevOps REST API, I would like to fetch the latest build number when a particular deployment stage was deployed successfully. For example, fetch the build number of the last successful QA stage deployment.
The build number is the name of the completed build; different stages in the same build share the same build number. You can use the predefined variable $(Build.BuildNumber) to get the value.
If you'd like to distinguish between stages, you can also get the stage name with the predefined variable $(System.StageName). The sample below combines the build number and the stage name.
trigger: none

stages:
- stage: Dev
  pool:
    vmImage: 'ubuntu-20.04'
  jobs:
  - job:
    steps:
    - bash: |
        echo $(Build.BuildNumber)-$(System.StageName)
- stage: QA
  pool:
    vmImage: 'ubuntu-20.04'
  jobs:
  - job:
    steps:
    - bash: |
        echo $(Build.BuildNumber)-$(System.StageName)
- stage: Prod
  dependsOn: [Dev, QA]
  pool:
    vmImage: 'ubuntu-20.04'
  jobs:
  - job:
    steps:
    - bash: |
        echo $(Build.BuildNumber)-$(System.StageName)
For example, the output of the Dev stage is the build number with -Dev appended.
Edit:
Regarding getting the build number of the last successful deployment of a particular deployment stage, it's more complicated.
Step 1: Get all the builds for the target build definition with the REST API "Builds - List"; the response contains the URL, id and build number of each run:
https://dev.azure.com/{org}/{project}/_apis/build/builds?definitions={definitionid}&api-version=6.1-preview.7
Step 2: Once you have the build URLs, you can use the REST API below to check the stage result for each build (more details here):
Numbers represent the result: 0: succeeded, 1: succeeded with issues, 2: failed, 3: canceled, 4: skipped, 5: abandoned.
GET https://dev.azure.com/{org}/{pro}/_build/results?buildId={id}&__rt=fps&__ver=2
You need to check the stage result of each build in a loop, from the highest build id downwards, until you find one where the stage result is successful, then return that build's build number; see the sketch below.
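To make the loop concrete, here is a minimal, hedged sketch written as a pipeline bash step. It assumes jq is available on the agent, that $(System.AccessToken) is allowed to read builds, and that ORG, PROJECT, DEFINITION_ID and STAGE_NAME are placeholders you would replace; it also reads the per-stage result from the documented Timeline REST API rather than the undocumented __rt=fps endpoint shown above.

steps:
- bash: |
    ORG=myorg              # placeholder: your organization
    PROJECT=myproject      # placeholder: your project
    DEFINITION_ID=123      # placeholder: the build definition id
    STAGE_NAME=QA          # the deployment stage you care about

    # Builds - List, newest runs first
    builds=$(curl -s -H "Authorization: Bearer $SYSTEM_ACCESSTOKEN" \
      "https://dev.azure.com/$ORG/$PROJECT/_apis/build/builds?definitions=$DEFINITION_ID&queryOrder=queueTimeDescending&api-version=6.1-preview.7")

    for id in $(echo "$builds" | jq -r '.value[].id'); do
      # Timeline - Get: records of type "Stage" carry the per-stage result.
      # Matching on .identifier (the stage name); depending on your definition you may need .name instead.
      result=$(curl -s -H "Authorization: Bearer $SYSTEM_ACCESSTOKEN" \
        "https://dev.azure.com/$ORG/$PROJECT/_apis/build/builds/$id/timeline?api-version=6.0" \
        | jq -r --arg s "$STAGE_NAME" '[.records[] | select(.type=="Stage" and .identifier==$s)][0].result')
      if [ "$result" = "succeeded" ]; then
        # Print the build number of the newest build whose stage succeeded, then stop
        echo "$builds" | jq -r --argjson id "$id" '.value[] | select(.id==$id) | .buildNumber'
        break
      fi
    done
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: Find build number of last successful QA stage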
I have two pipelines - build and publish. The build pipeline can produce up to two artifacts, but that depends on the given parameters.
The publish pipeline is triggered automatically when the build pipeline completes. The publish pipeline then takes the published artifacts and deploys them. However, I want to run the publish tasks only if the particular artifacts exist in the triggering build.
Right now, if an artifact does not exist, the "download" task fails.
Simplified to the important parts, with some secret info redacted:
resources:
  pipelines:
  - pipeline: buildDev # Internal name of the source pipeline, used elsewhere within app-ci YAML, e.g. to reference published artifacts
    source: "Build"
    trigger:
      branches:
      - dev
      - feat/*

stages:
- stage: publish
  displayName: "🚀🔥 Publish to Firebase"
  jobs:
  - job: publish_firebase_android
    displayName: "🔥🤖Publish Android to Firebase"
    steps:
    - download: buildDev
      artifact: android
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload>
      displayName: "Deploy APK to Firebase"
      workingDirectory: "$(Pipeline.Workspace)/buildDev/android/"
  - job: publish_firebase_ios
    displayName: "🔥🍏Publish iOS to Firebase"
    steps:
    - download: buildDev
      artifact: iOS
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload...>
      workingDirectory: "$(Pipeline.Workspace)/buildDev/iOS/"
      displayName: "Deploy IPA to Firebase"
I've tried to find a solution, but every solution I found only addresses the problem within the same pipeline. Based on the MS docs I can't find a predefined environment variable that points to the "pipeline resources". With such a variable I could, in theory, run a script that checks for the presence of the artifact, sets a variable, and uses that variable as a condition for the steps.
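For reference, the triggering run's id is exposed to the consuming pipeline as resources.pipeline.<alias>.runID, so one option along the lines described above is a script step that asks the Artifacts - List REST API whether a given artifact exists and records the answer in a variable. A minimal sketch, assuming the buildDev alias from the question, jq on the agent, and that $(System.AccessToken) may read the triggering run:

steps:
- bash: |
    # Count artifacts named "android" in the run that triggered this pipeline
    RUN_ID=$(resources.pipeline.buildDev.runID)
    count=$(curl -s -H "Authorization: Bearer $SYSTEM_ACCESSTOKEN" \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$RUN_ID/artifacts?api-version=6.0" \
      | jq '[.value[] | select(.name == "android")] | length')
    echo "##vso[task.setvariable variable=hasAndroidArtifact]$count"
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: Check whether the android artifact exists
- download: buildDev
  artifact: android
  condition: eq(variables['hasAndroidArtifact'], '1')   # only download when the artifact was published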
I think you can use stage filters in the trigger. I don't know the structure of your build pipeline, but you can set up a stage that publishes the artifacts: execute that stage if there are artifacts to publish, otherwise skip it. You can do this using conditions. Here is a simple sample:
stages:
- stage: Build
  jobs:
  - job: build
    steps:
    ...
- stage: Artifact
  condition: ... # Set the condition based on your parameter
  jobs:
  - job: artifact
    steps:
    ...
Then use the stage filter in the publishing pipeline's resource trigger. If the Artifact stage executes successfully, the publish pipeline will run; otherwise it will not.
resources:
  pipelines:
  - pipeline: buildpipeline
    source: buildpipeline
    trigger:
      stages:
      - Artifact
Using variable groups is an option as well. You can use a variable group to pass a value from one pipeline to another. Here are the detailed steps:
(1). Create a variable group in Pipelines/Library and add a new variable. I will call this variable "var" below.
(2). In your build pipeline, update "var" based on your parameters:
variables:
- group: {group name}

steps:
- bash: |
    az pipelines variable-group variable update --group-id {id} --name var --value yes
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  condition: ... # decide via your parameter whether to update var (see Tip 3)
Tip 1. If you don't know your variable group id, go to Pipelines/Library and select your variable group. You can find it in the URL: https://dev.azure.com/...&variableGroupId={id}&...
Tip 2. If you meet the error "You do not have permissions to perform this operation on the variable group.", go to Pipelines/Library and select your variable group. Click on "Security" and give "{pipeline name} Build Service" user the Administrator role.
Tip 3. Use your parameter in condition to decide whether to update var.
(3). In your publish pipeline, link the same variable group and use var in a condition:
condition: eq(variables['var'], 'yes')
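Putting step (3) together, the publish pipeline links the same variable group and gates the job on var; a small sketch (the group name my-flags is only an illustration):

variables:
- group: my-flags

jobs:
- job: publish
  condition: eq(variables['var'], 'yes')   # var was updated by the build pipeline
  steps:
  - script: echo publishing because var is yes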
Is it possible to have a single azure-pipelines.yaml file that can:
Trigger a job A on a push from any branch BUT main
Trigger a job B on a PR to main and all subsequent commits on that PR
Trigger a job C when main is merged
I have tried to play around with the trigger and pr keywords, and even with condition(), variables['Build.Reason'], or System.PullRequest.TargetBranch, but I didn't manage to reach the expected result.
I'm starting to think it cannot be done with a single file - am I wrong?
You can set conditions on your stages to run depending on a variable, but I am not entirely sure this will cover all your cases. Maybe you could also combine some variable values.
For example, source branch is main and a PR is created:
and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
Azure documentation sample:
variables:
  isMain: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - script: echo Hello Stage A!
- stage: B
  condition: and(succeeded(), eq(variables.isMain, 'true'))
  jobs:
  - job: B1
    steps:
    - script: echo Hello Stage B!
    - script: echo $(isMain)
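Building on that idea, a hedged single-file sketch for the A/B/C split could look like the following. It assumes Azure Repos Git, so System.PullRequest.TargetBranch has the refs/heads/main form, and the job names are only illustrative:

trigger:
  branches:
    include:
    - '*'        # pushes from any branch; the job conditions filter out main
pr:
  branches:
    include:
    - main       # PR validation builds targeting main

jobs:
- job: A   # push to any branch except main
  condition: and(ne(variables['Build.Reason'], 'PullRequest'), ne(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - script: echo Job A
- job: B   # PR to main and every new commit on that PR
  condition: and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/main'))
  steps:
  - script: echo Job B
- job: C   # the merge lands on main
  condition: and(ne(variables['Build.Reason'], 'PullRequest'), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - script: echo Job C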
Keep in mind that triggers are additive. This means that if you specify triggers like the ones below, the pipeline will run when either the branch filter matches OR a PR is created.
trigger:
  branches:
    include:
    - '*'
pr:
  branches:
    include:
    - current
As you said, this can certainly be accomplished with separate files for the pipelines.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml
Your first question is possible by using a trigger with include and exclude branches, as below:
trigger:
  branches:
    include:
    - stage
    - releases/*
    exclude:
    - master
Refer to the CI triggers in Azure Repos Git documentation for more details.
I'm porting one of our release pipelines from ADO classic to YAML.
Neither classic nor YAML allow you to select jobs/tasks while creating the run.
At least in classic, once the release is created, you can disable a stage's jobs/tasks for that specific release before triggering the stage.
In YAML, however, only stages are selectable on the run creation screen. But after the run is started, I can't find a way to disable tasks and/or jobs like I can in classic.
Oddly enough, after a YAML stage runs there's an option to "Rerun failed jobs."
So my question is, is it possible to enable/disable jobs and/or tasks when creating a new YAML pipeline run (or after creating, but before triggering a stage)?
After the run is created, I am afraid there is no existing option to cancel a single stage before it is triggered.
As a workaround, you can create a Pipeline environment and add an Approval check, then use that environment in your stage.
In this case, before the stage runs, you can stop it by rejecting the approval.
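A minimal sketch of that approach (assuming an environment named approval-gate with an Approvals check already configured on it in the UI):

stages:
- stage: Deploy
  jobs:
  - deployment: DeployApp
    environment: approval-gate   # the approval check on this environment gates the stage
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo deploying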
enable/disable jobs and/or tasks when creating a new YAML pipeline run
To achieve this requirement, you can use if expressions and runtime parameters in the YAML pipeline.
Here is an example:
parameters:
- name: test1
  type: string
  default: false
  values:
  - true
  - false

pool:
  vmImage: ubuntu-latest

stages:
- stage: testA
  jobs:
  - ${{ if eq(parameters['test1'], 'true') }}:
    - job: test1
      steps:
      - script: echo 1
  - job: test2
    steps:
    - script: echo 1
- stage: testB
  jobs:
  - job: test3
    steps:
    - script: echo 1
When you run the pipeline, you can select the parameter value to control which jobs/tasks run.
I am creating YAML pipeline in Azure DevOps that consists of two stages.
The first stage (Prerequisites) is responsible for reading the git commit and creating a comma-separated variable containing the list of services affected by the commit.
The second stage (Build) is responsible for building and unit testing the project. This stage consists of many templates, one for each service. In the template script, the job checks whether the relevant service is in the variable created in the previous stage. If the job finds the service, it continues to build and test it; if it cannot find the service, it skips that job.
Run.yml:
stages:
- stage: Prerequisites
  jobs:
  - job: SetBuildQueue
    steps:
    - task: PowerShell@2
      name: SetBuildQueue
      displayName: 'Set.Build.Queue'
      inputs:
        targetType: inline
        script: |
          ## ... PowerShell script to get changes - working as expected
          Write-Host "Build Queue Auto: $global:buildQueueVariable"
          Write-Host "##vso[task.setvariable variable=buildQueue;isOutput=true]$global:buildQueueVariable"

- stage: Build
  jobs:
  - job: StageInitialization
  - template: Build.yml
    parameters:
      projectName: Service001
      projectLocation: src/Service001
  - template: Build.yml
    parameters:
      projectName: Service002
      projectLocation: src/Service002
Build.yml:
parameters:
  projectName: ''
  projectLocation: ''

jobs:
- job:
  displayName: '${{ parameters.projectName }} - Build'
  dependsOn: SetBuildQueue
  continueOnError: true
  condition: and(succeeded(), contains(dependencies.SetBuildQueue.outputs['SetBuildQueue.buildQueue'], '${{ parameters.projectName }}'))
  steps:
  - task: NuGetToolInstaller@1
    displayName: 'Install Nuget'
Issue:
When the first stage runs, it creates a variable called buildQueue, which is populated as seen in the console output of the PowerShell script task:
Service001 Changed
Build Queue Auto: Service001;
However, when it gets to stage two and tries to run the build template, the condition evaluation returns the following output:
Started: Today at 12:05 PM
Duration: 16m 7s
Evaluating: and(succeeded(), contains(dependencies['SetBuildQueue']['outputs']['SetBuildQueue.buildQueue'], 'STARS.API.Customer.Assessment'))
Expanded: and(True, contains(Null, 'service001'))
Result: False
So my question is: how do I set dependsOn and the condition to get the information from the previous stage?
That's because you want to access the variable in a different stage from the one where you defined it; currently that's not possible this way, as each stage runs on a fresh agent instance.
In this blog you can find a workaround that involves writing the variable to disk and then passing it along as a file, leveraging pipeline artifacts.
To pass the variable FOO from a job to another job in a different stage:
Create a folder that will contain all the variables you want to pass; any folder will work, but something like mkdir -p $(Pipeline.Workspace)/variables might be a good idea.
Write the contents of the variable to a file, for example echo "$FOO" > $(Pipeline.Workspace)/variables/FOO. Even though the name could be anything you'd like, giving the file the same name as the variable might be a good idea.
Publish the $(Pipeline.Workspace)/variables folder as a pipeline artifact named variables.
In the second stage, download the variables pipeline artifact.
Read each file into a variable, for example FOO=$(cat $(Pipeline.Workspace)/variables/FOO).
Expose the variable in the current job, just like in the first example: echo "##vso[task.setvariable variable=FOO]$FOO".
You can then access the variable by expanding it within Azure Pipelines ($(FOO)) or use it as an environment variable inside a bash script ($FOO). The sketch below puts these steps together.
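A minimal end-to-end sketch of those steps in YAML (assuming bash steps and the publish/download step shortcuts for pipeline artifacts):

stages:
- stage: One
  jobs:
  - job: ProduceVar
    steps:
    - bash: |
        # Write the variable to a file in a dedicated folder
        mkdir -p $(Pipeline.Workspace)/variables
        echo "some value" > $(Pipeline.Workspace)/variables/FOO
      displayName: Write FOO to a file
    - publish: $(Pipeline.Workspace)/variables
      artifact: variables

- stage: Two
  dependsOn: One
  jobs:
  - job: ConsumeVar
    steps:
    - download: current
      artifact: variables
    - bash: |
        # Read the file back and re-expose FOO as a pipeline variable in this job
        FOO=$(cat $(Pipeline.Workspace)/variables/FOO)
        echo "##vso[task.setvariable variable=FOO]$FOO"
      displayName: Re-expose FOO in this job
    - bash: |
        echo "FOO is $(FOO)"
      displayName: Use FOO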