I have enabled CI and set a pipeline trigger on main, with path filters for two folders, A and B.
I want to create two agent jobs, X and Y, which run tasks based on the trigger.
Example: if the trigger is due to changes in folder A, then job X should run; if the trigger is due to changes in folder B, then job Y should run.
What exactly is the condition I should use to implement this scenario?
In your situation, you can create two YAML files in the repo and create two pipelines based on them.
Just like below:
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - A/
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
And
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - B/
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
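If you would rather keep a single pipeline, a possible sketch is to detect which folders changed in a script step, publish output variables, and gate each job on them. The folder names A/ and B/ come from the question; the job, step, and variable names below are hypothetical:

```yaml
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - A/
    - B/

pool:
  vmImage: ubuntu-latest

jobs:
- job: DetectChanges
  steps:
  - script: |
      # Compare the triggering commit with its parent and flag changed folders.
      # Assumes the checkout fetched enough history for HEAD~1 (set fetchDepth if needed).
      CHANGED=$(git diff --name-only HEAD~1 HEAD)
      echo "$CHANGED" | grep -q '^A/' && echo "##vso[task.setvariable variable=AChanged;isOutput=true]true"
      echo "$CHANGED" | grep -q '^B/' && echo "##vso[task.setvariable variable=BChanged;isOutput=true]true"
      exit 0
    name: detect
- job: X
  dependsOn: DetectChanges
  condition: eq(dependencies.DetectChanges.outputs['detect.AChanged'], 'true')
  steps:
  - script: echo Folder A changed
- job: Y
  dependsOn: DetectChanges
  condition: eq(dependencies.DetectChanges.outputs['detect.BChanged'], 'true')
  steps:
  - script: echo Folder B changed
```

Jobs X and Y then run only when their folder appears in the diff of the triggering commit.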
Migrating from .gitlab-ci.yml to azure-pipelines.yml. In the .gitlab-ci.yml, I have a scenario where one job (in the deploy stage) needs two other jobs (from test stages) for its execution:
.deploy:
  stage: deploy
  needs:
    - testmethod1
    - testmethod2

deployPROD:
  extends: .deploy
Now, does the deployPROD job execute the test jobs again, or does it just check that they have completed?
Moving to the Azure context, I created a templates folder in my repository, with a test file, just to replicate this scenario.
My azure-pipelines.yml file is as shown below:
trigger:
- azure-pipelines
pool:
  vmImage: ubuntu-latest
jobs:
- job: InitialA
  steps:
  - script: echo hello from initial A
- job: InitialB
  steps:
  - script: echo hello from initial B
- job: Subsequent
  dependsOn:
  - templates/test1.yml/testme
  steps:
  - script: echo hello from subsequent
I used the dependsOn key to declare the depending jobs. Now the structure of the repo, along with the template file, looks like this:
But I end up getting the following error:
So is my approach correct? Am I using the correct keywords in Azure? If yes, what is the path that I need to provide in the dependsOn key?
Suggestions welcome.
You can add a job that consumes the template, and the Subsequent job can then depend on that job via dependsOn.
Please refer to Template types & usage for more on template usage in Azure Pipelines.
Code sample:
trigger:
- none
pool:
  vmImage: ubuntu-latest
jobs:
- job: InitialA
  steps:
  - script: echo hello from initial A
- job: InitialB
  steps:
  - script: echo hello from initial B
- job: templates
  steps:
  - template: test.yml # Template reference
- job: Subsequent
  dependsOn: templates
  steps:
  - script: echo hello from subsequent
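For reference, test.yml here would be a steps template. A minimal sketch of what it could contain (the script content is hypothetical); note that dependsOn only waits for the templates job to finish, it does not re-run it:

```yaml
# test.yml - a steps template included by the 'templates' job
steps:
- script: echo hello from the template
  displayName: 'testme'
```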
I have 3 environments in Azure: Sandbox, Test and Prod.
I have a YAML pipeline in Azure DevOps which builds the infrastructure. The environment built depends on the variables in the Terraform code.
The same pipeline is used to deploy to all environments, depending on conditions in the YAML file. I want Dev to trigger on a merge to master, but I only want Test and Prod to deploy manually. How can I set this up in the YAML file?
For Dev, you can set up build validation of branch 'Master':
And in the Master branch, the YML file should be like this:
trigger:
- master
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
This makes sure the pipeline of the master branch (the Dev environment) only triggers after the PR merge is completed.
For the Test and Prod environments, you can create a branch containing a YML file with the same name the pipeline is looking for, and use the YML definition below:
trigger:
- none
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
This makes sure Test and Prod can only be triggered manually.
The above solution needs just one pipeline.
Based on your requirement, you can add the condition in your YAML pipeline.
Dev to trigger on a merge to master
You can use the predefined variables Build.Reason and System.PullRequest.TargetBranch.
For example:
condition: and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
want test and prod to deploy manually
You can check the variable Build.Reason:
condition: eq(variables['Build.Reason'], 'Manual')
YAML example:
stages:
- stage: Dev
  condition: and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
  jobs:
  - job: A
    steps:
    - xx
- stage: Prod
  condition: eq(variables['Build.Reason'], 'Manual')
  jobs:
  - job: B
    steps:
    - xx
- stage: Test
  condition: eq(variables['Build.Reason'], 'Manual')
  jobs:
  - job: C
    steps:
    - xx
When the pipeline is triggered by a pull request whose target branch is master, the Dev stage will run.
When the pipeline is triggered manually, the Test and Prod stages will run.
Refer to these docs: Conditions and Predefined variables.
I need to run some tasks if a pull request is opened. I'm new to this, so apologies.
E.g. a pull request is raised on GitHub.
If this happens, I want to build some review apps based on whether the above condition is true.
I need to do this using YAML.
This is how the pr trigger works. The example below defines two triggers. When a new PR is created targeting the develop branch, the pipeline is triggered, and it keeps triggering while the PR is open against develop, until it merges. The second trigger is for the main branch: when you merge or commit code to main, the pipeline is also triggered.
trigger:
  branches:
    include:
    - 'main'
pr:
  branches:
    include:
    - develop
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=yaml#pr-triggers
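If the same pipeline also runs for other reasons (CI pushes, manual runs), you can additionally gate the review-app steps on the build reason. A minimal sketch; the step content is hypothetical:

```yaml
pr:
  branches:
    include:
    - develop

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Deploying review app for PR $(System.PullRequest.PullRequestId)
  displayName: 'Build review app'
  condition: eq(variables['Build.Reason'], 'PullRequest')
```

The condition skips the step whenever the run was not started by a pull request.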
This worked for me
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pr:
  branches:
    include:
    - '*'

#trigger:
#- main

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      Write-Host "pullrequest"
      Write-Host "PR Number is:- $(System.PullRequest.PullRequestId)"
      Write-Host "PR Number is:- $(System.PullRequest.PullRequestNumber)"
I have two pipelines in my project, one for test and one for build. The reason for this is that the tests need to run on a self-hosted agent to be able to run integration tests.
I don't want to run the build pipeline if the tests are failing. This is my configuration:
Test (Pipeline name)
name: Test
trigger:
- azure-pipelines
pool:
  vmImage: "windows-latest"
steps:
- script: echo Test pipeline
Build (Pipeline name)
name: Build
trigger: none
resources:
  pipelines:
  - pipeline: test
    source: Test
    trigger: true
pool:
  vmImage: "windows-latest"
steps:
- script: echo Build pipeline
The Test pipeline runs as expected, but the Build pipeline never gets triggered, even when I run the Test pipeline in the cloud as in the example above. Does anyone see what the problem is?
It is possible to call another pipeline, as shown in the other answer, but to run on a different agent OS I would suggest using a multi-stage pipeline or a strategy matrix.
Each stage can run with its own VM or Agent pool.
Here is an example:
trigger:
- main

stages:
- stage: Build
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: BuildJob
    steps:
    - script: echo Building
- stage: TestWithLinux
  dependsOn: Build
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: Testing
    steps:
    - script: echo Test with Linux OS
- stage: TestWithWindows
  dependsOn: Build
  pool:
    vmImage: windows-latest
  jobs:
  - job: Testing
    steps:
    - script: echo Test with Windows OS
- stage: Final
  dependsOn: [TestWithLinux, TestWithWindows]
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: FinalJob
    steps:
    - script: echo Final Job
You can replace vmImage: xxx with your own self-hosted agent pool, like:
pool: AgentNameX
And the final result would look like this:
Or it is possible to use a strategy matrix. Let's say we have code that should run on 3 different agents; we can do the following:
jobs:
- job:
  strategy:
    matrix:
      Linux:
        imageName: 'ubuntu-latest'
      Mac:
        imageName: 'macOS-latest'
      Windows:
        imageName: 'windows-latest'
  pool:
    vmImage: $(imageName)
  steps:
  - powershell: |
      "OS = $($env:AGENT_OS)" | Out-Host
    displayName: 'Test with Agent'
It can work stand-alone or in multiple stages, as shown in the image:
Here is a list of supported hosted agents.
Disclaimer: I wrote 2 articles about this in my personal blog.
Make sure you use the correct pipeline name. I would also suggest adding the project key inside the pipeline resource.
For example, I have a pipeline named first.
first.yml
trigger:
- none
pr: none
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Running the first pipeline, should trigger the second.
  displayName: 'First pipeline'
second.yml
trigger:
- none
pool:
  vmImage: ubuntu-latest
resources:
  pipelines:
  - pipeline: first
    source: first
    project: test-project
    trigger: true # Run second pipeline when the run of the first pipeline ends
steps:
- script: echo this pipeline was triggered from the first pipeline
  displayName: 'Second pipeline'
The two pipeline definitions seem to be correct. You may try checking the following configurations from the Azure DevOps portal.
(I guess you are using GitHub.)
Make sure the Build pipeline definition is in the default branch. Otherwise, configure its branch as the default branch for the Build pipeline from the Azure DevOps portal.
Go to Pipeline -> Edit -> More options -> Triggers -> YAML -> Get sources ->
Then change the value in Default branch for manual and scheduled builds to the branch which holds the pipeline YAML.
Configure the build completion trigger under Triggers:
Go to Pipeline -> Edit -> More options -> Triggers -> Build completion -> Add -> then select the Test pipeline as the Triggering build from the drop-down. Also add the listed branch filter, if needed.
This will make sure that the pipeline completion trigger is configured correctly, since the portal configuration holds the highest priority.
P.S. Better to disable the PR trigger as well, with pr: none in the YAML, if it is not required by default.
My team is responsible for 10 microservices and it would be useful to have a single pipeline that triggers their individual CI/CD pipelines.
I know it is possible to use pipeline triggers, like
resources:
  pipelines:
  - pipeline: MasterPipeline
    source: DeployAllMicroservices
    trigger: true
and I can add this to the pipelines and create a very simple DeployAllMicroservices pipeline. This works, but the pipelines are triggered in a random order.
The thing is, two services need to be rolled out first before the other 8 can be deployed. Is there a way to first trigger pipelines A & B, with pipelines C-J triggered after their completion?
Something else I've tried is to load the pipeline files A.yml, B.yml as templates from the master pipeline.
steps:
- template: /CmcBs/Pipelines/A.yml
- template: /CmcBs/Pipelines/B.yml
but that doesn't work with full-fledged pipelines (ones starting with trigger, pool, parameters, et cetera).
Currently, Azure DevOps does not support ordering multiple triggering pipelines for one pipeline at the same time.
There is a workaround you can refer to:
Set pipelineA as the triggering pipeline of pipelineB.
Set pipelineB as the triggering pipeline of the other pipelines (pipelines C-J).
For more info about the triggering pipeline, please see Trigger one pipeline after another.
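A sketch of the chained setup described above (pipeline aliases and source names here are hypothetical): pipelineB declares pipelineA as its triggering resource, and each of pipelines C-J declares pipelineB the same way:

```yaml
# In pipelineB's YAML: run after pipelineA completes
trigger: none
resources:
  pipelines:
  - pipeline: upstreamA   # local alias for the resource
    source: PipelineA     # name of the triggering pipeline in Azure DevOps
    trigger: true

# Each of pipelines C-J would contain the same resource block,
# but with source: PipelineB, so they start once B finishes.
```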
Another approach is to use stages in order to execute pipelines A and B first, and then C to J.
An example .yml for this approach would be the one below.
trigger:
- none
pool:
  vmImage: ubuntu-latest
stages:
- stage: FirstBatch
  displayName: Build First Batch
  jobs:
  - job: pipelineA
    displayName: Build pipelineA
    steps:
    - script: echo pipelineA
      displayName: pipelineA
  - job: pipelineB
    displayName: Build pipelineB
    steps:
    - script: echo pipelineB
      displayName: pipelineB
- stage: SecondBatch
  displayName: Build Second Batch
  jobs:
  - job: pipelineC
    displayName: Build pipelineC
    steps:
    - checkout: none
    - script: echo Build pipelineC
      displayName: Build pipelineC
  - job: pipelineD
    displayName: Build pipelineD
    steps:
    - checkout: none
    - script: echo Build pipelineD
      displayName: Build pipelineD
  - job: pipelineE
    displayName: Build pipelineE
    steps:
    - checkout: none
    - script: echo Build pipelineE
      displayName: Build pipelineE
The drawback of this approach is that you have a single pipeline rather than separate pipelines for your microservices. To decouple this solution further, you could use templates.
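For example, each microservice's job could be extracted into a parameterized jobs template and reused in each batch. A sketch; the file name and parameter name are hypothetical:

```yaml
# build-service.yml - a jobs template
parameters:
- name: serviceName
  type: string

jobs:
- job: Build_${{ parameters.serviceName }}
  displayName: Build ${{ parameters.serviceName }}
  steps:
  - checkout: none
  - script: echo Build ${{ parameters.serviceName }}
```

Each stage in the pipeline above would then include the template with a jobs-level template reference, passing a different serviceName per microservice.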