Automatically create sequential stages in Azure Pipeline - azure-devops

I have an Azure Pipeline azure-pipelines.yml:
parameters:
- name: "stages"
  type: object
  default:
  - stage1
  - stage2
  # possibly more...

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: InitialStage
  displayName: I must run first
  jobs: ...

- ${{ each stage in parameters.stages }}:
  - stage: ${{ stage }}
    dependsOn: InitialStage
    condition: succeeded()
    jobs: ...
What I want to achieve is a pipeline where the first stage is InitialStage, and the generated stages then run after it sequentially. So the final pipeline should look like:
InitialStage ==> stage1 ==> stage2 ==> ...
Each stage must run after the previous one completes, and the very first stage must be InitialStage.
If I use the syntax above, the generated stages run in parallel after InitialStage:

InitialStage ==> stage1
             ==> stage2
             ==> ...
How can I achieve that?

You just need to remove the dependsOn keyword. Per the Microsoft docs:
When you define multiple stages in a pipeline, by default, they run sequentially in the order in which you define them in the YAML file. The exception to this is when you add dependencies. With dependencies, stages run in the order of the dependsOn requirements.
You can also remove the condition keyword:
You can specify the conditions under which each stage runs. By default, a stage runs if it does not depend on any other stage, or if all of the stages that it depends on have completed and succeeded. You can customize this behavior by forcing a stage to run even if a previous stage fails or by specifying a custom condition.
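Applied to the question's pipeline, a minimal sketch (keeping the question's placeholder jobs) would be:

```yaml
parameters:
- name: "stages"
  type: object
  default:
  - stage1
  - stage2

stages:
- stage: InitialStage
  displayName: I must run first
  jobs: ...

# No dependsOn / condition: each generated stage implicitly depends on the
# stage defined immediately before it, so they run sequentially.
- ${{ each stage in parameters.stages }}:
  - stage: ${{ stage }}
    jobs: ...
```

This produces InitialStage ==> stage1 ==> stage2 ==> ... with each stage waiting for the previous one to succeed.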
If you do need to map custom stage dependencies, I have done this before as well by creating a stage object. Something like:
parameters:
- name: environmentObjects
  type: object
  default:
  - environmentName: 'dev'
    dependsOn: ''
  - environmentName: 'tst'
    dependsOn: 'dev'
and then looped through it to dynamically build the dependsOn. This allows for scenarios such as two test environments that depend on dev but not on each other.
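A sketch of that loop, assuming the environmentObjects parameter above (the stage, job, and step contents are illustrative):

```yaml
stages:
- ${{ each env in parameters.environmentObjects }}:
  - stage: ${{ env.environmentName }}
    # Only emit dependsOn when the object declares a dependency;
    # otherwise the stage keeps the default sequential behavior.
    ${{ if ne(env.dependsOn, '') }}:
      dependsOn: ${{ env.dependsOn }}
    jobs:
    - job: Deploy
      steps:
      - script: echo Deploying to ${{ env.environmentName }}
```

Because the `${{ if }}` is a template expression, the dependsOn key is inserted or omitted when the YAML is compiled, before the run starts.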

Related

Run the first two stages in parallel in an Azure DevOps pipeline

Is it possible to run the first two stages of an Azure DevOps pipeline in parallel? By default, each stage starts only after the preceding stage is complete unless otherwise specified via the dependsOn property.
The current situation is:
I would like to run both stages, iOS_Dev_Build and iOS_QA_Build, in parallel. No dependsOn condition is added to iOS_QA_Build, but by default it waits for the iOS_Dev_Build stage to complete before it starts.
Add dependsOn: [] to the iOS_QA_Build stage.
My example:
stages:
- stage: DEV
  jobs:
  - job: A
    steps:
    - bash: echo "A"
- stage: QA
  dependsOn: []
  jobs:
  - job: A
    steps:
    - bash: echo "A"
When you define multiple stages in a pipeline, by default, they run sequentially in the order in which you define them in the YAML file. dependsOn: [] removes the implicit dependency on the previous stage and causes this stage to run in parallel.
For more details, please refer to Specify dependencies.
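If a later stage should then wait for both parallel stages, it can list them explicitly in its dependsOn (the Release stage here is an illustrative addition, not part of the question):

```yaml
- stage: Release
  dependsOn:
  - DEV
  - QA
  jobs:
  - job: A
    steps:
    - bash: echo "Release"
```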

Azure pipelines - Stages are skipped despite no conditions

I'm trying to set up a pipeline in Azure.
Actual behavior
On pull-request a build validation triggers the pipeline and all stages and jobs are triggered.
After the merge, all jobs are skipped.
Expected behavior
After the merge of the pull request, I would expect stage B to be triggered.
Question
What am I missing so pipeline triggers correctly on merge?
azure.pipelines.yml
trigger:
  branches:
    include:
    - master

stages:
- template: 'main.yml'
main.yml
stages:
- stage: 'A'
  condition: startsWith(variables['build.reason'], 'PullRequest')
  jobs:
  - job: A
    steps:
    - script: echo A
- stage: 'B'
  jobs:
  - job: 'B'
    steps:
    - script: echo B
The trigger feature only works for the whole pipeline, not for an individual stage or job in the pipeline.
Normally, we use the different trigger types (CI/PR, resources) and filters (branches, paths, tags) to define when the pipeline should be triggered.
Within the pipeline, we generally attach conditions to a stage, job, or step to define when that stage, job, or step should run. Conditions are evaluated after the pipeline has been triggered.
To specify conditions, you can use one of the following ways:
use the condition key.
use the if expression.
I found the issue and applied the suggestions from @bright-ran-msft.
Stages A and B are implicitly linked: since stage A was not triggered on the merge, stage B would not start either.
Solution
Instead of using condition, use an if expression, so that stage A is removed from the pipeline entirely when the build reason is not a pull request and stage B no longer depends on it.
Example
stages:
- ${{ if eq(variables['build.reason'], 'PullRequest') }}:
  - stage: 'A'
    jobs:
    - job: A
      steps:
      - script: echo A
- stage: 'B'
  jobs:
  - job: 'B'
    steps:
    - script: echo B
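If stage A must stay in the pipeline with a runtime condition, an alternative sketch is to relax stage B's condition so it also runs when A is skipped:

```yaml
- stage: 'B'
  dependsOn: 'A'
  # Run B when A succeeded or was skipped by its condition;
  # the default behavior would skip B whenever A is skipped.
  condition: in(dependencies.A.result, 'Succeeded', 'Skipped')
  jobs:
  - job: 'B'
    steps:
    - script: echo B
```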

Azure pipelines approval is required before the condition is evaluated

I have a CI/CD pipeline for a solution with several projects. I check the changes and only build the projects that have changed, as opposed to building all of them.
I accomplish this using a condition at the build stage of each project. This is the relevant part:
- stage: S_BuildChannelUpdate
  dependsOn: 'PreSteps'
  jobs:
  - job: 'BuildChannelUpdate'
    variables:
      BuildCondition: $[ stageDependencies.PreSteps.Check_changes.outputs['pwsh_script.BuildChannelUpdate'] ]
    condition: eq(variables['BuildCondition'], True)
This works as I expect: the build steps are only executed if the conditions are met. So far so good.
For the deployment part, I only want to deploy if there is something new to be deployed, i.e. the project was changed and the build was successful. Again, here is the relevant part:
- stage: 'S_ReleaseChannelUpdate'
  dependsOn:
  - PreSteps
  - S_BuildChannelUpdate
  jobs:
  - deployment: 'ReleaseChannelUpdate'
    variables:
      ReleaseCondition: $[ stageDependencies.PreSteps.Check_changes.outputs['pwsh_script.BuildChannelUpdate'] ]
    condition: eq(variables['ReleaseCondition'], True)
    environment: 'dev'
    strategy:
      runOnce:
        deploy:
          steps:
The problem here is that I want to set an approval for the releases and the pipeline asks me to approve it before evaluating the condition. I would like to get the approval request only if the ReleaseCondition is True.
I was also expecting that since the stage S_BuildChannelUpdate was skipped (condition not met), the stage S_ReleaseChannelUpdate will consider its dependencies not met.
Any suggestions?
The problem here is that I want to set an approval for the releases and the pipeline asks me to approve it before evaluating the condition. I would like to get the approval request only if the ReleaseCondition is True.
For this issue, I agree with PaulVrugt. Approvals are executed at the stage level: Azure Pipelines pauses the execution of a pipeline prior to each stage and waits for all pending checks to complete. A condition set at the job level is not evaluated until after the approval, so the solution is to set the condition at the stage level.
For example:
- stage: 'S_ReleaseChannelUpdate'
  dependsOn:
  - PreSteps
  - S_BuildChannelUpdate
  # Note: the job-level ReleaseCondition variable is not visible here; at stage
  # level the output can be read via the dependencies context, e.g.
  # dependencies.PreSteps.outputs['Check_changes.pwsh_script.BuildChannelUpdate']
  condition: eq(variables['ReleaseCondition'], True)
  jobs:
  - deployment: 'ReleaseChannelUpdate'
    environment: 'dev'
    strategy:
      runOnce:
        deploy:
          steps:
With this definition, before executing the approval, the pipeline first determines whether ReleaseCondition is True; if it is False, the stage is skipped and the approval is never checked.
- stage: 'S_ReleaseChannelUpdate'
  dependsOn:
  - S_BuildChannelUpdate
As for this part: if stage S_BuildChannelUpdate was skipped (condition not met), the stage S_ReleaseChannelUpdate will also be skipped.
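If the release stage should still be evaluated even when the build stage was skipped, the stage-level condition can combine the output flag with a check on the build stage's result; a sketch under the question's stage and job names:

```yaml
- stage: 'S_ReleaseChannelUpdate'
  dependsOn:
  - PreSteps
  - S_BuildChannelUpdate
  # Release when the change flag is set AND the build stage either
  # succeeded or was skipped (but not when it failed).
  condition: |
    and(
      eq(dependencies.PreSteps.outputs['Check_changes.pwsh_script.BuildChannelUpdate'], 'True'),
      in(dependencies.S_BuildChannelUpdate.result, 'Succeeded', 'Skipped')
    )
```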

Manage dependencies in Azure YAML Devops Pipelines

Our solution consists of a few microservices calling and supporting each other.
To simplify, please consider this dependency graph:
MS1 --> MS2
MS1 --> MS3 --> MS4
MS1 depends on MS2 and MS3
MS3 depends on MS4
MS4 is independent
GOAL: Zero downtime during deployment
Currently we are analyzing possible ways to solve the two scenarios described below:
Deploy all microservices in order, to ensure that all End-To-End tests passes. This means to deploy first MS4, then MS3, MS2, MS1, then run the tests (all of this in slots), then switch the slots if everything passes
Deploy any service individually (the others have not changed at all), run the tests (slots again), then switch the slot if everything succeeds
Our first approach is to have a single (big) pipeline with separate stages per microservice, each of which checks whether that microservice has changed before deploying it. If no change is detected for the microservice, we would like to cancel the stage and proceed with the next one.
This pipeline contains templates for each stage, for example:
- stage: MS1
  jobs:
  - job: CheckMS1Changes
    steps:
    - template: templates/ms1-check-changes.yml
  - job: BuildMS1
    dependsOn: CheckMS1Changes
    displayName: Build MS1
    steps:
    - template: templates/ms1-build.yml
  - job: ReleaseMS1
    dependsOn: BuildMS1
    displayName: Release MS1
    steps:
    - template: templates/ms1-release.yml
We think that this will cover the described scenarios. The "cancel command" should be placed inside the templates/ms1-check-changes.yml file
The problem is that we have not found in the documentation how to cancel a complete stage, which makes me think that maybe our whole approach is wrong.
We also have not found how to cancel a job or a group of jobs, and we have doubts about whether we should have one stage per microservice at all.
You can see, we are new to this stuff.
Could you give some advice on what could be a good strategy for the described scenarios?
Based on your pipeline, I think you can move the CheckChanges jobs into a separate stage, and then use the logging command ##vso[task.setvariable variable=MS1;isOutput=true]true to set an output flag variable (e.g. MS1) which indicates whether changes were detected for each microservice. You can then read these flags in condition expressions of the form dependencies.dependencyStageName.outputs['dependencyStageJobName.taskName.variableName'].
Then you can make the following stages dependsOn this stage, and add conditions to decide whether to skip or run each stage. See the simple example below:
stages:
- stage: ChangeStage
  pool:
    vmImage: windows-latest
  jobs:
  - job: ChangeJob
    steps:
    - powershell: |
        echo "##vso[task.setvariable variable=MS1;isOutput=true]true"  # set the MS1 flag to true if changes were made to MS1
        echo "##vso[task.setvariable variable=MS2;isOutput=true]true"
        echo "##vso[task.setvariable variable=MS3;isOutput=true]true"
      name: ChangeTask

- stage: MS3Stage
  dependsOn: ChangeStage
  condition: eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS3'], 'true')
  pool:
    vmImage: windows-latest
  jobs:
  - template: ...

- stage: MS2Stage
  dependsOn:
  - MS3Stage
  - ChangeStage
  condition: |
    and(
      eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS2'], 'true'),
      in(dependencies.MS3Stage.result, 'Succeeded', 'Canceled', 'Skipped')
    )
  pool:
    vmImage: windows-latest
  jobs:
  - template: ...

- stage: MS1Stage
  dependsOn:
  - MS2Stage
  - ChangeStage
  condition: |
    and(
      eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS1'], 'true'),
      in(dependencies.MS2Stage.result, 'Succeeded', 'Canceled', 'Skipped')
    )
  pool:
    vmImage: windows-latest
  jobs:
  - template: ...
In the above pipeline, the top stage (ChangeStage) runs first, checks whether changes were made to the microservices, and sets the output variables to true accordingly.
MS2Stage depends on MS3Stage, with the condition below, which means MS2Stage will only run if the output flag MS2 is true and MS3Stage succeeded, was skipped, or was canceled.
MS3Stage and MS1Stage work the same way as MS2Stage.
condition: |
  and(
    eq(dependencies.ChangeStage.outputs['ChangeJob.ChangeTask.MS2'], 'true'),
    in(dependencies.MS3Stage.result, 'Succeeded', 'Canceled', 'Skipped')
  )

Can I have 1 pipeline and dynamically target multiple environments?

Is it possible to have one pipeline and dynamically pass stages to it, rather than predefining the stages in the pipeline? I am trying to avoid copying pipelines over and over for different stages.
I am trying to get out of copying pipelines over and over for different stages.
You could use a template in the YAML stages instead of copying pipelines over and over for different stages:
stages:
- stage: QA
  jobs:
  - job:
    steps:
    - template: ChildForTemplate.yml
      parameters:
        param1: $(Var1)
- stage: Test
  jobs:
  - job:
    steps:
    - template: ChildForTemplate.yml
      parameters:
        param2: $(Var2)
But this method still requires you to predefine the stages in the pipeline.
Is it possible to have 1 pipeline and dynamically pass stages to the pipeline rather than predefining the stages in the pipeline?
We do not currently support this feature.
To pass stages to the pipeline completely dynamically, we would need YAML like:
stages:
- stage: $(StageValue)
  jobs:
  - job:
    steps:
    - template: ChildForTemplate.yml
      parameters:
        param1: $(Var1)
However, there are no specific rules for the value of $(StageValue): it could be only QA, or QA and Test. If we pass multiple values, YAML will not neatly expand them into separate stages to execute the pipeline.
Hope this helps.
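For what it's worth, the template-expression syntax shown in the first question can generate stages from an object parameter at compile time (runtime macros like $(StageValue) still cannot name a stage); a sketch, reusing the ChildForTemplate.yml from above:

```yaml
parameters:
- name: stageNames
  type: object
  default: [QA, Test]

stages:
# ${{ each }} is expanded when the YAML is compiled, so the set of
# stages can be changed per run via runtime parameters.
- ${{ each name in parameters.stageNames }}:
  - stage: ${{ name }}
    jobs:
    - job:
      steps:
      - template: ChildForTemplate.yml
```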