Azure pipelines - Stages are skipped despite no conditions - azure-devops

I'm trying to set up a pipeline in Azure.
Actual behavior
On pull request, a build validation triggers the pipeline, and all stages and jobs run.
After the merge, all jobs are skipped.
Expected behavior
After the merge of the pull request, I would expect stage B to be triggered.
Question
What am I missing so pipeline triggers correctly on merge?
azure.pipelines.yml
trigger:
  branches:
    include:
    - master

stages:
- template: 'main.yml'
main.yml
stages:
- stage: 'A'
  condition: startsWith(variables['build.reason'], 'PullRequest')
  jobs:
  - job: A
    steps:
    - script: echo A
- stage: 'B'
  jobs:
  - job: 'B'
    steps:
    - script: echo B

The trigger feature only works for the whole pipeline, not for an individual stage or job within it.
Normally, we use the different trigger types (CI/PR, resources) and filters (branches, paths, tags) to define when the pipeline should be triggered.
Within the pipeline, we specify conditions on a stage, job, or step to define whether it should run. These conditions are evaluated after the pipeline has been triggered.
To specify conditions, you can use one of the following ways:
use the condition key.
use the if expression.
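For illustration, a minimal sketch of both forms (the stage and job names here are hypothetical):
stages:
# Runtime condition: the stage still appears in the run,
# but is skipped when the expression evaluates to false.
- stage: WithCondition
  condition: eq(variables['Build.Reason'], 'PullRequest')
  jobs:
  - job: A
    steps:
    - script: echo runs only for pull requests
# Compile-time if: the stage is removed from the expanded
# pipeline entirely when the expression is false.
- ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
  - stage: WithIf
    jobs:
    - job: B
      steps:
      - script: echo also runs only for pull requests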

I found the issue and applied the suggestions from @bright-ran-msft.
Stages A and B are implicitly linked: stage B implicitly depends on stage A. Since stage A was not triggered on the merge, stage B would not start either.
Solution
Instead of using a runtime condition, use a compile-time ${{ if }} expression. This removes stage A from the expanded pipeline entirely when the expression is false, so stage B no longer depends on a skipped stage.
Example
stages:
- ${{ if eq(variables['build.reason'], 'PullRequest') }}:
  - stage: 'A'
    jobs:
    - job: A
      steps:
      - script: echo A
- stage: 'B'
  jobs:
  - job: 'B'
    steps:
    - script: echo B

Related

Trigger pipeline when merged in and created new branch

I have one pipeline created with two stages.
Let's say:
Stage A
Stage B
I would like to run Stage A when a PR is merged into the qa branch, and run Stage B when a release/* or hotfix/* branch is created.
Can this be achieved in one pipeline? If so, how?
You can try to use a CI trigger:
trigger:
- qa
- release/*
- hotfix/*
Then detect the branch in each stage using conditions (see Specify conditions):
variables:
  isQA: $[eq(variables['Build.SourceBranch'], 'refs/heads/qa')]

stages:
- stage: B
  condition: and(succeeded(), eq(variables.isQA, 'true'))
  jobs:
  - job: B1
    steps:
    - script: echo Hello Stage B!
    - script: echo $(isQA)
For additional condition functions such as contains(), check the expressions documentation.
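Putting it together for this question, a hedged sketch might condition Stage A on the qa branch and Stage B on release/hotfix branches. The isRelease variable and the startsWith checks are my assumption, not a verified solution:
variables:
  isQA: $[eq(variables['Build.SourceBranch'], 'refs/heads/qa')]
  # assumed flag: true for branches named release/* or hotfix/*
  isRelease: $[or(startsWith(variables['Build.SourceBranch'], 'refs/heads/release/'), startsWith(variables['Build.SourceBranch'], 'refs/heads/hotfix/'))]

stages:
- stage: A
  condition: and(succeeded(), eq(variables.isQA, 'true'))
  jobs:
  - job: A1
    steps:
    - script: echo Hello Stage A!
- stage: B
  dependsOn: []   # run independently of Stage A, which may be skipped
  condition: and(succeeded(), eq(variables.isRelease, 'true'))
  jobs:
  - job: B1
    steps:
    - script: echo Hello Stage B!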

Run first two stages in parallel in Azure DevOps pipeline

Is it possible to run the first two stages in parallel in Azure DevOps pipelines? By default, each stage starts only after the preceding stage completes, unless otherwise specified via the dependsOn property.
The current situation is:
I would like to run both stages, iOS_Dev_Build and iOS_QA_Build, in parallel. No dependsOn condition is added to iOS_QA_Build, but by default it waits for the iOS_Dev_Build stage to complete before it starts.
Please add dependsOn: [] to the iOS_QA_Build stage.
My example:
stages:
- stage: DEV
  jobs:
  - job: A
    steps:
    - bash: echo "A"
- stage: QA
  dependsOn: []
  jobs:
  - job: A
    steps:
    - bash: echo "A"
When you define multiple stages in a pipeline, by default they run sequentially in the order in which you define them in the YAML file. dependsOn: [] removes the implicit dependency on the previous stage and causes this stage to run in parallel.
For more details, please refer to Specify dependencies.
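As a further sketch under the same mechanism, a later stage can fan in on both parallel stages by listing them in dependsOn (the Publish stage below is illustrative, not from the original answer):
stages:
- stage: DEV
  jobs:
  - job: A
    steps:
    - bash: echo "A"
- stage: QA
  dependsOn: []   # runs in parallel with DEV
  jobs:
  - job: A
    steps:
    - bash: echo "A"
- stage: Publish
  dependsOn:      # waits for both parallel stages to finish
  - DEV
  - QA
  jobs:
  - job: P
    steps:
    - bash: echo "publish"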

Azure DevOps multiple trigger conditions in a single file

Is it possible to have a single file azure-pipelines.yaml that can :
Trigger a job A on a push from any branch BUT main
Trigger a job B on a PR to main and all subsequent commits on that PR
Trigger a job C when a PR is merged into main
I have tried to play around with the trigger and pr keywords, and even with condition(), variables['Build.Reason'], or System.PullRequest.TargetBranch, but I didn't manage to reach the expected result.
I'm starting to think it cannot be done with a single file - am I wrong?
You can set conditions on your stages to run depending on a variable, but I am not entirely sure this will work for all of your cases. Maybe you could also combine some variable values.
For example: the source branch is main and a PR has been created.
and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
Azure documentation sample:
variables:
  isMain: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - script: echo Hello Stage A!
- stage: B
  condition: and(succeeded(), eq(variables.isMain, 'true'))
  jobs:
  - job: B1
    steps:
    - script: echo Hello Stage B!
    - script: echo $(isMain)
Keep in mind that triggers are additive. This means that if you specify triggers like the ones below, the pipeline runs whether the branch filter matches OR a PR is created.
trigger:
  branches:
    include:
    - '*'
pr:
  branches:
    include:
    - current
As you said, this can certainly be accomplished with separate files for the pipelines.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml
Your first question is possible by using trigger with include and exclude branches, as below:
trigger:
  branches:
    include:
    - stage
    - releases/*
    exclude:
    - master
Refer to the CI triggers in Azure Repos Git documentation for more understanding.
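Putting the pieces together, a single-file sketch along these lines might route the three jobs with conditions. This is my assumption of how the routing could look, not a verified solution; the job names and Build.Reason checks are illustrative:
trigger:
  branches:
    include:
    - '*'
pr:
  branches:
    include:
    - main

pool:
  vmImage: ubuntu-latest

jobs:
# A: CI push on any branch except main
- job: A
  condition: and(ne(variables['Build.Reason'], 'PullRequest'), ne(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - script: echo push to a non-main branch
# B: PR targeting main, re-run on each new commit to the PR
- job: B
  condition: eq(variables['Build.Reason'], 'PullRequest')
  steps:
  - script: echo PR to main
# C: CI run on main, i.e. after the merge
- job: C
  condition: and(ne(variables['Build.Reason'], 'PullRequest'), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - script: echo merged into main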

Trigger Azure pipelines in a specific order

My team is responsible for 10 microservices and it would be useful to have a single pipeline that triggers their individual CI/CD pipelines.
I know it is possible to use pipeline triggers, like:
resources:
  pipelines:
  - pipeline: MasterPipeline
    source: DeployAllMicroservices
    trigger: true
and I can add this to the pipelines and create a very simple DeployAllMicroservices pipeline. This works, but the pipelines are triggered in a random order.
The thing is, two services need to be rolled out before the other eight can be deployed. Is there a way to trigger pipelines A & B first, with pipelines C-J triggered after their completion?
Something else I've tried is to load the pipeline files A.yml and B.yml as templates from the master pipeline:
steps:
- template: /CmcBs/Pipelines/A.yml
- template: /CmcBs/Pipelines/B.yml
but that doesn't work with full-fledged pipelines (starting with trigger, pool, parameters, et cetera).
Currently, DevOps does not support triggering one pipeline off the joint completion of multiple pipelines.
There is a workaround you can refer to:
Set pipelineA as the triggering pipeline of pipelineB.
Set pipelineB as the triggering pipeline of the other pipelines (pipelines C-J).
For more info about the triggering pipeline, please see Trigger one pipeline after another.
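For example, a minimal sketch of what pipelineB's YAML could declare (the alias and source names are illustrative):
# azure-pipelines.yml for pipelineB
resources:
  pipelines:
  - pipeline: upstreamA   # local alias for the resource
    source: pipelineA     # name of the triggering pipeline
    trigger: true         # run pipelineB when pipelineA completes

trigger: none             # no CI trigger of its own

steps:
- script: echo pipelineB runs after pipelineA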
Another approach is to use stages in order to execute pipelines A and B first, and then C to J.
An example .yml for this approach would be the one below.
trigger: none

pool:
  vmImage: ubuntu-latest

stages:
- stage: FirstBatch
  displayName: Build First Batch
  jobs:
  - job: pipelineA
    displayName: Build pipelineA
    steps:
    - script: echo pipelineA
      displayName: pipelineA
  - job: pipelineB
    displayName: Build pipelineB
    steps:
    - script: echo pipelineB
      displayName: pipelineB
- stage: SecondBatch
  displayName: Build Second Batch
  jobs:
  - job: pipelineC
    displayName: Build pipelineC
    steps:
    - checkout: none
    - script: echo Build pipelineC
      displayName: Build pipelineC
  - job: pipelineD
    displayName: Build pipelineD
    steps:
    - checkout: none
    - script: echo Build pipelineD
      displayName: Build pipelineD
  - job: pipelineE
    displayName: Build pipelineE
    steps:
    - checkout: none
    - script: echo Build pipelineE
      displayName: Build pipelineE
The drawback of this approach is that you have a single pipeline rather than separate pipelines for your microservices. To decouple this solution further, you could use templates.
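For instance, a sketch of the same batched pipeline with the per-service steps moved into step templates (the template paths are illustrative):
stages:
- stage: FirstBatch
  jobs:
  - job: pipelineA
    steps:
    - template: templates/pipelineA-steps.yml   # build steps for service A
  - job: pipelineB
    steps:
    - template: templates/pipelineB-steps.yml
- stage: SecondBatch
  jobs:
  - job: pipelineC
    steps:
    - template: templates/pipelineC-steps.yml
  - job: pipelineD
    steps:
    - template: templates/pipelineD-steps.yml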

Manage dependencies in Azure DevOps YAML pipelines

Our solution consists of a few microservices calling and supporting each other.
To simplify, please consider this dependency graph:
MS1 --> MS2
MS1 --> MS3 --> MS4
MS1 depends on MS2 and MS3
MS3 depends on MS4
MS4 is independent
GOAL: Zero downtime during deployment
Currently we are analyzing possible ways to solve the two scenarios described below:
Deploy all microservices in order, to ensure that all end-to-end tests pass. This means deploying MS4 first, then MS3, MS2, and MS1, then running the tests (all of this in slots), then switching the slots if everything passes.
Deploy any single service individually (when the others have not changed at all), run the tests (slots again), then switch the slot if everything succeeds.
Our first approach is to have a single (big) pipeline with separate stages per microservice, each checking whether that microservice has changed before deploying it. If no change is detected for the microservice, we would like to cancel the stage and proceed with the next one.
This pipeline contains templates for each stage, for example:
- stage: MS1
  jobs:
  - job: CheckMS1Changes
    steps:
    - template: templates/ms1-check-changes.yml
  - job: BuildMS1
    dependsOn: CheckMS1Changes
    displayName: Build MS1
    steps:
    - template: templates/ms1-build.yml
  - job: ReleaseMS1
    dependsOn: BuildMS1
    displayName: Release MS1
    steps:
    - template: templates/ms1-release.yml
We think this will cover the described scenarios. The "cancel command" would be placed inside the templates/ms1-check-changes.yml file.
The problem is that we have not found in the documentation how to cancel a complete stage, which makes me think that maybe our whole approach is wrong.
We also have not found how to cancel a job or a group of jobs, and we have doubts about whether we should have one stage per microservice at all.
As you can see, we are new to this stuff.
Could you give some advice on what could be a good strategy for the described scenarios?
Based on your pipeline, I think you can move the CheckChanges jobs into a separate stage, then use the logging command ##vso[task.setvariable variable=MS1;isOutput=true]true to set an output flag variable (e.g. MS1) that indicates whether changes were detected for each microservice. You can then reference these flags in condition expressions via dependencies.dependencyStageName.outputs['dependencyStageJobName.taskName.variableName'].
Then you can make the following stages depend on this stage, and add conditions to decide whether each stage is skipped or run. See the simple example below:
stages:
- stage: ChangeStage
  pool:
    vmImage: windows-latest
  jobs:
  - job: ChangeJob
    steps:
    - powershell: |
        echo "##vso[task.setvariable variable=MS1;isOutput=true]true" # set the MS1 flag to true if changes were made to MS1
        echo "##vso[task.setvariable variable=MS2;isOutput=true]true"
        echo "##vso[task.setvariable variable=MS3;isOutput=true]true"
      name: ChangeTask
- stage: MS3Stage
  dependsOn: ChangeStage
  condition: eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS3'], 'true')
  pool:
    vmImage: windows-latest
  jobs:
  - template: ...
- stage: MS2Stage
  dependsOn:
  - MS3Stage
  - ChangeStage
  condition: |
    and
    (
      eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS2'], 'true'),
      in(dependencies.MS3Stage.result, 'Succeeded', 'Canceled', 'Skipped')
    )
  pool:
    vmImage: windows-latest
  jobs:
  - template: ...
- stage: MS1Stage
  dependsOn:
  - MS2Stage
  - ChangeStage
  condition: |
    and
    (
      eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS1'], 'true'),
      in(dependencies.MS2Stage.result, 'Succeeded', 'Canceled', 'Skipped')
    )
  pool:
    vmImage: windows-latest
  jobs:
  - template: ...
In the above pipeline, the top stage (ChangeStage) runs first, checks whether changes were made to the microservices, and sets the output variables to true accordingly.
MS2Stage depends on MS3Stage, with the condition shown below: MS2Stage only runs if the output flag MS2 is true and MS3Stage succeeded, was skipped, or was canceled.
MS3Stage and MS1Stage follow the same pattern.
condition: |
  and
  (
    eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS2'], 'true'),
    in(dependencies.MS3Stage.result, 'Succeeded', 'Canceled', 'Skipped')
  )