Can an Azure pipeline have multiple pipeline resources referencing the same source? - azure-devops

I'm using the pipeline resource to trigger a second pipeline from a first pipeline. The two pipelines are in different repositories. They may even be in different projects.
This is the pipeline resource definition in the second pipeline.
resources:
  pipelines:
  - pipeline: xyz_build
    source: company.xyz_source
    trigger:
      enabled: true
      branches:
        include:
        - develop
        - release/*
I want the second pipeline to take different actions, based on whether it was triggered by a build on the develop branch or a release branch. If an Azure pipeline can handle multiple pipeline resources which reference the same source, then I can rewrite my pipeline resources like this, and then choose different execution paths based on the value of $(Resources.TriggeringAlias).
resources:
  pipelines:
  - pipeline: xyz_develop_build
    source: company.xyz_source
    trigger:
      enabled: true
      branches:
        include:
        - develop
  - pipeline: xyz_release_build
    source: company.xyz_source
    trigger:
      enabled: true
      branches:
        include:
        - release/*
Does Azure Pipelines support this?
EDITED TO ADD: It runs when I trigger it manually. I guess we'll find out what happens when somebody runs a build on company.xyz_source.
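For reference, branching on the triggering alias at runtime could look roughly like the sketch below (job names and steps are placeholders; Resources.TriggeringAlias is empty when the run is started manually):
jobs:
- job: HandleDevelopBuild
  condition: eq(variables['Resources.TriggeringAlias'], 'xyz_develop_build')
  steps:
  - script: echo Triggered by a develop build of company.xyz_source
- job: HandleReleaseBuild
  condition: eq(variables['Resources.TriggeringAlias'], 'xyz_release_build')
  steps:
  - script: echo Triggered by a release build of company.xyz_source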

Update:
You can still capture the branch of the triggering resource even when it comes from a different repo or a different project. However, you cannot use compile-time conditions to remove unneeded tasks when the pipeline is compiled (within the same repo, you can); instead, evaluate conditions at runtime to skip the tasks.
Here is an example with pipelines from two different projects, one in ProjectA and the other in ProjectB:
ProjectA
trigger:
- none
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
ProjectB
resources:
  pipelines:
  - pipeline: ProjectA
    project: ProjectA
    source: ProjectA
    trigger:
      enabled: true
      branches:
        include:
        - develop
        - release
trigger:
- none
pool:
  vmImage: ubuntu-latest
jobs:
- job:
  displayName: Handle ProjectA develop branch
  condition: eq(variables['resources.pipeline.ProjectA.sourceBranch'], 'refs/heads/develop')
  steps:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        # Write your PowerShell commands here.
        Write-Host "Hello World"
        Write-Host $(resources.pipeline.ProjectA.sourceBranch)
- job:
  displayName: Handle ProjectA release branch
  condition: eq(variables['resources.pipeline.ProjectA.sourceBranch'], 'refs/heads/release')
  steps:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        # Write your PowerShell commands here.
        Write-Host "Hello World"
        Write-Host $(resources.pipeline.ProjectA.sourceBranch)
Original Answer:
For example, PipelineA triggers PipelineB:
PipelineA
trigger:
- none
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
PipelineB
resources:
  pipelines:
  - pipeline: PipelineA
    project: xxx
    source: PipelineA
    trigger:
      enabled: true
      branches:
        include:
        - develop
        - release
trigger:
- none
pool:
  vmImage: ubuntu-latest
steps:
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/develop') }}:
  - script: echo "this is develop branch"
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/release') }}:
  - script: echo "this is release branch"
Please make sure both branches contain the above YAML file (the pipeline looks for a YAML file with the same name in every branch).

Related

Manually triggering Devops pipeline with pipeline resource should use latest resource pipeline run for that branch

I have 2 pipelines in the same repo:
Build
Deploy
The Build pipeline is declared as a pipeline resource in the Deploy pipeline:
resources:
  pipelines:
  - pipeline: Build
    source: BuildPipelineName
    trigger: true
When I run the Build pipeline, the Deploy pipeline is correctly triggered on the same branch. However, when I run the Deploy pipeline manually, it does not use the latest pipeline run from same branch.
I tried adding a couple of variations of the line below to the pipeline resource, but the variable does not expand:
branch: ${{ variables.Build.SourceBranchName }}
Is there any way to make this work?
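For reference, a literal branch name is accepted on the pipeline resource, but runtime variables are not expanded there; a minimal sketch, with main as a placeholder branch:
resources:
  pipelines:
  - pipeline: Build
    source: BuildPipelineName
    branch: main # a hard-coded branch works; $(Build.SourceBranchName) does not expand here
    trigger: true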
Workaround that achieves the result I am looking for, but is not very elegant:
- ${{ if ne(variables['Build.Reason'], 'ResourceTrigger') }}:
  - task: DeleteFiles@1
    displayName: 'Remove downloaded artifacts from pipeline resource'
    inputs:
      SourceFolder: $(Pipeline.Workspace)
  - task: DownloadPipelineArtifact@2
    displayName: 'Download artifacts for branch'
    inputs:
      source: 'specific'
      project: 'myProject'
      pipeline: <BuildPipelineId>
      runVersion: 'latestFromBranch'
      runBranch: $(Build.SourceBranch)
For example, if I have a build pipeline named 'BuildPipelineAndDeployPipeline',
then the below YAML definition can get the latest build pipeline run from a specific branch:
resources:
  pipelines:
  - pipeline: BuildPipelineAndDeployPipeline
    project: xxx
    source: BuildPipelineAndDeployPipeline
    trigger:
      branches:
      - main
pool:
  vmImage: 'windows-latest'
steps:
- task: CmdLine@2
  inputs:
    script: |
      echo Write your commands here
      echo Hello world
      echo $(resources.pipeline.BuildPipelineAndDeployPipeline.runID)

Extending my Azure DevOps Pipeline to include Build validation for all Pull Requests

I have 30 or so Java microservices that run off the same CI and CD templates, i.e. each of my microservices has a build pipeline as follows, and as shown below it runs automatically on a merge to master:
name: 0.0.$(Rev:r)
trigger:
- master
pr: none
variables:
- group: MyCompany
resources:
  repositories:
  - repository: templates
    type: git
    name: <id>/yaml-templates
stages:
- stage: Build
  jobs:
  - job: build
    displayName: Build
    steps:
    - template: my-templates/ci-build-template.yaml@templates
- stage: PushToContainerRegistry
  dependsOn: Build
  condition: succeeded()
  jobs:
  - job: PushToContainerRegistry
    displayName: PushToContainerRegistry
    steps:
    - template: my-templates/ci-template.yaml@templates
Where ci-build-template.yaml contains...
steps:
- checkout: self
  path: s
- task: PowerShell@2
- task: Gradle@2
  displayName: 'Gradle Build'
- task: SonarQubePrepare@4
  displayName: SonarQube Analysis
- task: CopyFiles@2
  displayName: Copy build/docker to Staging Directory
I would like to implement PR build validation whenever someone raises a PR to merge into master. In the PR build, only the Build stage should run, and from the build template only some of the tasks within ci-build-template.yaml should run.
A few questions for my learning:
How can I uplift the YAML pipeline above to make the "PushToContainerRegistry" stage skip if it is a PR build?
How can I uplift ci-build-template.yaml to make the "SonarQubePrepare@4" and "CopyFiles@2" tasks skip if it is a PR build?
And lastly, how can I uplift the YAML pipeline above to enable build validation for all PRs that have a target branch of master?
While doing my own research I learned you can do this via click-ops, but I am keen on learning how to implement it via YAML.
thanks
How can I uplift the YAML pipeline above to make the "PushToContainerRegistry" stage skip if it is a PR build?
How can I uplift ci-build-template.yaml to make the "SonarQubePrepare@4" and "CopyFiles@2" tasks skip if it is a PR build?
You just need to use a condition on the task.
For example,
pool:
  name: Azure Pipelines
steps:
- script: |
    echo Write your commands here
    echo Hello world
    echo $(Build.Reason)
  displayName: 'Command Line Script'
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
The above definition will skip the step if a pull request triggered the pipeline.
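Applied to the pipeline in the question, the same Build.Reason check can be used as a stage-level condition and on individual template tasks. A rough sketch (fragments only, not complete files; task inputs omitted):
# azure-pipelines.yml: skip the push stage for PR builds
- stage: PushToContainerRegistry
  dependsOn: Build
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
  jobs:
  - job: PushToContainerRegistry
    displayName: PushToContainerRegistry
    steps:
    - template: my-templates/ci-template.yaml@templates
# ci-build-template.yaml: skip selected tasks for PR builds
steps:
- checkout: self
  path: s
- task: Gradle@2
  displayName: 'Gradle Build'
- task: SonarQubePrepare@4
  displayName: SonarQube Analysis
  condition: ne(variables['Build.Reason'], 'PullRequest')
- task: CopyFiles@2
  displayName: Copy build/docker to Staging Directory
  condition: ne(variables['Build.Reason'], 'PullRequest')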
Please refer to these documents:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=yaml#using-the-trigger-type-in-conditions
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#build-variables-devops-services
And lastly, how can I uplift the YAML pipeline above to enable build validation for all PRs that have a target branch of master?
You can use this expression in the condition:
eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master')
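For example, a step that should run only for PR builds targeting master could combine both checks (a sketch):
steps:
- script: echo Validating a PR into master
  condition: and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))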
If you are using an Azure DevOps Git repo, then you just need to add a branch policy:
https://learn.microsoft.com/en-us/azure/devops/repos/git/branch-policies?view=azure-devops&tabs=browser#configure-branch-policies

Azure DevOps - Trigger another pipeline

I got two pipelines in my project, one for test and one for build. The reason for this is that the tests need to be run on a self-hosted agent to be able to run integration tests.
I don't want to run the build pipeline if the tests are failing. This is my configuration:
Test (Pipeline name)
name: Test
trigger:
- azure-pipelines
pool:
  vmImage: "windows-latest"
steps:
- script: echo Test pipeline
Build (Pipeline name)
name: Build
trigger: none
resources:
  pipelines:
  - pipeline: test
    source: Test
    trigger: true
pool:
  vmImage: "windows-latest"
steps:
- script: echo Build pipeline
The Test pipeline is running as expected but the Build pipeline never gets triggered even if I run it in the cloud as in the example above. Anyone see what the problem is?
It is possible to call another pipeline as shown in the other answer, but to run on a different agent OS I would suggest using a multi-stage pipeline or a strategy matrix.
Each stage can run with its own VM or Agent pool.
Here is an example:
trigger:
- main
stages:
- stage: Build
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: BuildJob
    steps:
    - script: echo Building
- stage: TestWithLinux
  dependsOn: Build
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: Testing
    steps:
    - script: echo Test with Linux OS
- stage: TestWithWindows
  dependsOn: Build
  pool:
    vmImage: windows-latest
  jobs:
  - job: Testing
    steps:
    - script: echo Test with Windows OS
- stage: Final
  dependsOn: [TestWithLinux, TestWithWindows]
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: FinalJob
    steps:
    - script: echo Final Job
You can replace vmImage: xxx with your own self-hosted agent pool, like:
pool: AgentNameX
When run, the stages execute in the order defined above.
Alternatively, it is possible to use a strategy matrix. Let's say we have code that should be run on 3 different agents; we can do the following:
jobs:
- job:
  strategy:
    matrix:
      Linux:
        imageName: 'ubuntu-latest'
      Mac:
        imageName: 'macOS-latest'
      Windows:
        imageName: 'windows-latest'
  pool:
    vmImage: $(imageName)
  steps:
  - powershell: |
      "OS = $($env:AGENT_OS)" | Out-Host
    displayName: 'Test with Agent'
It can work as a stand-alone pipeline or within multiple stages as well.
Here is a list of supported hosted agents.
Disclaimer: I wrote 2 articles about this in my personal blog.
Make sure you use the correct pipeline name. I would also suggest adding the project inside the pipeline resource.
For example, I have a pipeline named first.
first.yml
trigger:
- none
pr: none
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Running the first pipeline, should trigger the second.
  displayName: 'First pipeline'
second.yml
trigger:
- none
pool:
  vmImage: ubuntu-latest
resources:
  pipelines:
  - pipeline: first
    source: first
    project: test-project
    trigger: true # Run second pipeline when the run of the first pipeline ends
steps:
- script: echo this pipeline was triggered from the first pipeline
  displayName: 'Second pipeline'
The two pipeline definitions seem to be correct. You may try checking the following configurations in the Azure DevOps portal (I guess you are using GitHub).
Make sure the Build pipeline definition is in the default branch. Otherwise, you can configure its branch as the default branch for the Build pipeline from the Azure DevOps portal.
Go to the pipeline -> Edit -> More options -> Triggers -> YAML -> Get sources, then change the value of Default branch for manual and scheduled builds to the branch that holds the pipeline YAML.
Configure the build completion trigger under Triggers.
Go to the pipeline -> Edit -> More options -> Triggers -> Build completion -> Add, then select the Test pipeline as the triggering build from the drop-down. Also add a branch filter if needed.
This will make sure the build completion trigger is set up correctly, since the portal configuration holds the highest priority.
P.S. It is better to disable the PR trigger as well with pr: none in the YAML if it is not required, since it is enabled by default.

Trigger Azure pipelines in a specific order

My team is responsible for 10 microservices and it would be useful to have a single pipeline that triggers their individual CI/CD pipelines.
I know it is possible to use pipeline triggers, like
resources:
  pipelines:
  - pipeline: MasterPipeline
    source: DeployAllMicroservices
    trigger: true
and I can add this to the pipelines and create a very simple DeployAllMicroservices pipeline. This works, but the pipelines will be triggered in a random order.
The thing is, two services need to be rolled out first before the other 8 can be deployed. Is there a way to first trigger pipeline A & B, where pipelines C-J are triggered after their completion?
Something else I've tried is to load the pipeline files A.yml, B.yml as templates from the master pipeline.
steps:
- template: /CmcBs/Pipelines/A.yml
- template: /CmcBs/Pipelines/B.yml
but that doesn't work with full-fledged pipelines (starting with trigger, pool, parameters, et cetera).
Currently, Azure DevOps does not support using multiple pipelines as the triggering pipelines of a single pipeline at the same time.
There is a workaround you can refer to:
Set pipelineA as the triggering pipeline of pipelineB.
Set pipelineB as the triggering pipeline of the other pipelines (pipelines C-J).
For more info about triggering pipelines, please see Trigger one pipeline after another.
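A minimal sketch of that chaining (the source names are placeholders for your actual pipeline definition names):
# pipelineB.yml
resources:
  pipelines:
  - pipeline: pipelineA
    source: PipelineA
    trigger: true
# pipelineC.yml (and likewise for pipelines D-J)
resources:
  pipelines:
  - pipeline: pipelineB
    source: PipelineB
    trigger: true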
Another approach is to use stages in order to execute pipelines A and B first, and then C to J.
An example .yml for this approach would be the one below.
trigger:
- none
pool:
  vmImage: ubuntu-latest
stages:
- stage: FirstBatch
  displayName: Build First Batch
  jobs:
  - job: pipelineA
    displayName: Build pipelineA
    steps:
    - script: echo pipelineA
      displayName: pipelineA
  - job: pipelineB
    displayName: Build pipelineB
    steps:
    - script: echo pipelineB
      displayName: pipelineB
- stage: SecondBatch
  displayName: Build Second Batch
  jobs:
  - job: pipelineC
    displayName: Build pipelineC
    steps:
    - checkout: none
    - script: echo Build pipelineC
      displayName: Build pipelineC
  - job: pipelineD
    displayName: Build pipelineD
    steps:
    - checkout: none
    - script: echo Build pipelineD
      displayName: Build pipelineD
  - job: pipelineE
    displayName: Build pipelineE
    steps:
    - checkout: none
    - script: echo Build pipelineE
      displayName: Build pipelineE
The drawback of this approach is that you end up with a single pipeline rather than separate pipelines for your microservices. To decouple this solution further, you could use templates, as sketched below.
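For example, each microservice's jobs could live in their own template file and be included in the appropriate batch (a sketch; file, stage, and job names are made up):
# orchestrator pipeline
stages:
- stage: FirstBatch
  jobs:
  - template: pipelines/serviceA-jobs.yml
  - template: pipelines/serviceB-jobs.yml
- stage: SecondBatch
  dependsOn: FirstBatch
  jobs:
  - template: pipelines/serviceC-jobs.yml
# pipelines/serviceA-jobs.yml
jobs:
- job: BuildServiceA
  steps:
  - script: echo Build and deploy service A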

Azure Devops Build Pipeline - Unexpected value stages

I'm refactoring a pipeline to use a stage template so I don't have duplicate code in my test-publish build pipeline and release build pipeline. But I get the error shown in the comment in the following .yml lines.
resources:
- repo: self
  clean: true
trigger:
  branches:
    include:
    - development
stages: # error on this line: unexpected value 'stages'
- template: build-job.yml
- stage: Publish
  jobs:
  - job: PublishClickOnce
    steps:
    - task: PublishSymbols@2
      displayName: 'Publish symbols path'
      inputs:
        SearchPattern: '**\bin\**\*.pdb'
        PublishSymbols: false
      continueOnError: true
The example provided by Microsoft:
# File: azure-pipelines.yml
trigger:
- master
pool:
  vmImage: 'ubuntu-latest'
stages:
- stage: Install
  jobs:
  - job: npminstall
    steps:
    - task: Npm@1
      inputs:
        command: 'install'
- template: templates/stages1.yml
- template: templates/stages2.yml
I checked against the documentation but can't see anything wrong with it. Can you point out my mistake and what I should change?
Azure Devops Build Pipeline - Unexpected value stages
The error may come from the template. Since the template is nested directly under stages, you should make sure the template file itself defines stages.
Like the following YAML:
resources:
- repo: self
  clean: true
trigger:
  branches:
    include:
    - master
pool:
  vmImage: 'windows-latest'
stages:
- template: build-job.yml
- stage: Publish
  jobs:
  - job: PublishClickOnce
    steps:
    - task: PowerShell@2
      inputs:
        targetType: inline
        script: |
          Write-Host "Hello world!"
Then the build-job.yml:
stages:
- stage: test
  jobs:
  - job: test
    steps:
    - script: echo testdemo
      displayName: 'templateTest'
It works fine on my side; you could check whether it works for you.
Besides, if the template is nested directly under steps, then the template file should start with steps.
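For example (a small sketch), a template referenced under steps would itself begin with steps:
# azure-pipelines.yml
steps:
- template: steps-template.yml
# steps-template.yml
steps:
- script: echo step from the template
  displayName: 'templateStep'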
If it does not work for you, please share your detailed build error log in your question.
Hope this helps.
When I was refactoring from a job to a stages layout, I was getting the same "unexpected value 'stages'" error in the editor until I indented the rest of the YAML.
Although this may not directly answer the original poster's issue, it does answer the question in the title, and this was the first result I selected when searching.