I'm new to Azure DevOps and am trying to set up the correct pipeline/release structure for my projects. I'm struggling with how to build my code while passing in the correct value for my build configurations.
I have created my repo and branches:
main (main branch for released/to be released code)
feature/add_new_customer (feature branches per piece of work)
uat (a branch that merges in multiple feature commits, specifically for testing prior to releases)
I created a build pipeline, which generated an "azure-pipelines.yaml" file that I have used to build and publish the build output. It is triggered from my branches:
trigger:
- main
- feature/*
- uat
All good up to now, but that works by hardcoding the build config in the YAML:
variables:
solution: '**/*.sln'
buildPlatform: 'Any CPU'
**buildConfiguration: 'Release'**
I have now added a variable to my pipeline, BuildConfig, set it to "UAT", and referenced it in my YAML like so:
variables:
solution: '**/*.sln'
buildPlatform: 'Any CPU'
**buildConfiguration: '$(BuildConfig)'**
This lets me set the build config "dynamically", but if I create another pipeline, say to produce my release build from the main branch, it creates an azure-pipelines-1.yaml. That must be wrong, as I'd then have to duplicate these files just to change the trigger branch?
Is there a "proper" way to create builds for different environments based on the branch I am checking in? I've seen the Environments but they just seem to offer me VM's or Kubernetes? I'm simply looking to build a .net framework legacy web form application so don't need anything fancy.
I've not started on the deployment process yet!!! :D
Sure, you can have a single pipeline that changes its behavior based on variables within the pipeline itself. You'll want to resolve these at the time you run the build, instead of at the root variable level.
There are two ways you can accomplish this:
The condition: parameter on a build task (a brief sketch of this follows after the list)
YAML If Statements that selectively set the variables.
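As a rough sketch of the first option (the VSBuild task, its inputs, and the configuration names below are assumptions for illustration, not taken from your pipeline), a condition: on each build task could pick the configuration:
steps:
- task: VSBuild@1
  displayName: "Build Solution (Release)"
  # only build Release when the run comes from main
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: 'Release'
- task: VSBuild@1
  displayName: "Build Solution (UAT)"
  # build UAT for every other branch
  condition: and(succeeded(), ne(variables['Build.SourceBranch'], 'refs/heads/main'))
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: 'UAT'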
Either would work in your scenario; I believe using an if statement would be best in your case. Here's what it would look like in a simple example:
name: Stackoverflow-Example-Variables
trigger:
- main
- feature/*
- uat
stages:
- stage: your_build_stage
variables:
- name: solution
value: '**/*.sln'
- name: buildPlatform
value: 'Any CPU'
- name: buildConfiguration
${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
value: 'Release'
${{ if ne(variables['Build.SourceBranch'], 'refs/heads/main') }}:
value: 'UAT'
displayName: "Build Solution"
jobs:
- job: output_message_job
displayName: "Output Message Job"
pool:
vmImage: "ubuntu-latest"
steps:
- powershell: |
Write-Host ${{ variables.buildConfiguration }}
Write-Host $(Build.SourceBranch)
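Since you're building a .NET Framework Web Forms solution, here's a minimal sketch of how the resolved buildConfiguration could then be consumed. The NuGet restore and VSBuild tasks are my assumption of what your build step looks like, and they would need a Windows image rather than the ubuntu-latest used in the echo example:
steps:
- task: NuGetCommand@2
  displayName: "Restore NuGet packages"
  inputs:
    command: 'restore'
    restoreSolution: '$(solution)'
- task: VSBuild@1
  displayName: "Build $(buildConfiguration)"
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'  # resolves to Release or UAT per the if expressions above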
Related
I have two pipelines in my project, one for test and one for build. The reason for this is that the tests need to run on a self-hosted agent to be able to run integration tests.
I don't want to run the build pipeline if the tests are failing. This is my configuration:
Test (Pipeline name)
name: Test
trigger:
- azure-pipelines
pool:
vmImage: "windows-latest"
steps:
- script: echo Test pipeline
Build (Pipeline name)
name: Build
trigger: none
resources:
pipelines:
- pipeline: test
source: Test
trigger: true
pool:
vmImage: "windows-latest"
steps:
- script: echo Build pipeline
The Test pipeline runs as expected, but the Build pipeline never gets triggered, even when I run it in the cloud as in the example above. Does anyone see what the problem is?
It is possible to call another pipeline as shown in the other answer, but to run on a different agent OS I would suggest using a multi-stage pipeline or a strategy matrix.
Each stage can run with its own VM or Agent pool.
Here is an example:
trigger:
- main
stages:
- stage: Build
pool:
vmImage: ubuntu-latest
jobs:
- job: BuildJob
steps:
- script: echo Building
- stage: TestWithLinux
dependsOn: Build
pool:
vmImage: ubuntu-latest
jobs:
- job: Testing
steps:
- script: echo Test with Linux OS
- stage: TestWithWindows
dependsOn: Build
pool:
vmImage: windows-latest
jobs:
- job: Testing
steps:
- script: echo Test with Windows OS
- stage: Final
dependsOn: [TestWithLinux,TestWithWindows]
pool:
vmImage: ubuntu-latest
jobs:
- job: FinalJob
steps:
- script: echo Final Job
You can replace vmImage: xxx with your own self-hosted agent, like:
pool: AgentNameX
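If your agents sit in a named pool and you want to be explicit, a slightly fuller (assumed) form would be:
pool:
  name: MySelfHostedPool   # placeholder pool name
  demands:
  - msbuild                # optional: only route to agents that have this capability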
And the final result is a run with four stages: Build first, the two test stages in parallel, and then Final.
Alternatively, it is possible to use a strategy matrix. Let's say we have code that should run on 3 different agents; we can do the following:
jobs:
- job:
strategy:
matrix:
Linux:
imageName: 'ubuntu-latest'
Mac:
imageName: 'macOS-latest'
Windows:
imageName: 'windows-latest'
pool:
vmImage: $(imageName)
steps:
- powershell: |
"OS = $($env:AGENT_OS)" | Out-Host
displayName: 'Test with Agent'
It can work stand-alone or as part of a multi-stage pipeline as well.
Here is a list of supported hosted agents.
Disclaimer: I wrote 2 articles about this in my personal blog.
Make sure you use the correct pipeline name. I would also suggest adding the project property inside the pipeline resource.
For example I have a pipeline named first.
first.yml
trigger:
- none
pr: none
pool:
vmImage: ubuntu-latest
steps:
- script: echo Running the first pipeline, should trigger the second.
displayName: 'First pipeline'
second.yml
trigger:
- none
pool:
vmImage: ubuntu-latest
resources:
pipelines:
- pipeline: first
source: first
project: test-project
trigger: true # Run second pipeline when the run of the first pipeline ends
steps:
- script: echo this pipeline was triggered from the first pipeline
displayName: 'Second pipeline'
The two pipeline definitions seem to be correct. You may try checking the following configurations in the Azure DevOps portal.
(I guess you are using GitHub)
Make sure the Build pipeline definition is in the default branch. Otherwise, you can configure its branch as the default branch for the Build pipeline in the Azure DevOps portal.
Go to pipeline -> Edit -> More options -> Triggers -> YAML -> Get Sources ->
Then change the value in Default branch for manual and scheduled builds to the branch which holds pipeline yaml.
Configure the Build Completion trigger under triggers.
Go to pipeline -> Edit -> More options -> Triggers -> Build completion -> Add -> then select the Test pipeline as the Triggering build from the drop-down. Also add the listed branch filter (if needed).
This will make sure that the pipeline completion trigger is configured correctly, since the portal configuration holds the highest priority.
P.S. It is better to disable the PR trigger as well, with the pr: none config in the YAML, if it is not required by default.
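Putting these suggestions together with the YAML from the question, the Build pipeline might look roughly like this (the project value is a placeholder you would fill in):
name: Build
trigger: none
pr: none   # disable the PR trigger if it is not required
resources:
  pipelines:
  - pipeline: test
    source: Test             # name of the Test pipeline as registered in Azure DevOps
    project: <your-project>  # recommended when referencing the pipeline by name
    trigger: true
pool:
  vmImage: "windows-latest"
steps:
- script: echo Build pipeline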
I have a solution where a git branch is directly related to an environment (this has to be this way, so please do not discuss whether this is good or bad, I know it is not best practice).
We have the option to run a verification deployment (including automatic tests) against an environment, without actually deploying the solution to that environment. Because of this, I would like to set up a pipeline that runs this verification for an environment whenever a pull request is opened towards that environment's branch. Moreover, I am using a template for the majority of the pipeline: the actual pipeline in the main repository is just a tiny stub that points to the template pipeline in another repository. That template, in turn, has stages for each respective environment.
I have, in the main pipeline, successfully added a solution that identifies the current branch, which for pull requests should be the target branch:
variables:
- name: currentBranch
${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
value: $(System.PullRequest.TargetBranch)
${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
value: $(Build.SourceBranch)
I would like to send this variable currentBranch down to the template through a parameter, as my template pipeline has different stages depending on the branch. My solution was to use the pipeline like this:
extends:
template: <template-reference>
parameters:
branch: $(currentBranch)
...and then for a stage in my pipeline do this:
- stage: TestAndDeployBranchName
condition: eq('${{ parameters.branch }}', 'refs/heads/branchName')
jobs:
- job1... etc.
Basically, the stage should run if the current branch is either "branchName", or (for pull requests) when the target branch is "branchName", which comes from the "branch" parameters that is sent to the template.
However, I have seen here that System.PullRequest.TargetBranch is not available to templates, and further here that the parameters are not available to templates (the variable is empty) when the template is expanded. Thus my pipeline does not work as expected (the condition does not trigger when it should, i.e. when there is a match on the branch name).
Is there any way that I can use System.PullRequest.TargetBranch in a condition within a template, or should I look for another solution?
After investigating this further I concluded that what I am trying to do is not possible.
In short, System.PullRequest.TargetBranch (and, I assume, at least some other variables within System.PullRequest) is not available at compile time for templates, which is when conditions are evaluated. Thus, using these variables in a condition in a template is not possible.
As my goal was to have certain steps run for pull requests only, based on the target branch of the pull request, I solved this by creating duplicate pipelines. Each pipeline is the same and references the same template, except that the input parameter for the template is different. I then added each "PR pipeline" to run as part of the branch policy of each respective branch where this was applicable.
This works great, however it requires me to create a new pipeline if I have the same requirement for another branch. Moreover, I have to maintain each PR pipeline separately (which can be both good and bad).
Not an ideal solution, but it works.
Reference PR pipeline:
trigger: none # no trigger as PR triggers are set by branch policies
#This references the template repository to reuse the basic pipeline
resources:
repositories:
- repository: <template repo>
type: git # "git" means azure devops repository
name: <template name> # Syntax: <project>/<repo>
ref: refs/heads/master # Grab latest pipeline template from the master branch
stages:
- stage: VerifyPullRequest
condition: |
and(
not(failed()),
not(canceled()),
eq(variables['Build.Reason'], 'PullRequest')
)
displayName: 'Verify Pull Request'
jobs:
- template: <template reference> # Template reference
parameters:
image: <image>
targetBranch: <targetBranch> # Adjust this to match each respective relevant branch
The targetBranch parameter is then used in the relevant places in the template to run the PR verification.
Example of branch policy:
(Set this up for each relevant branch)
(Picture of the branch policy setup)
After checking your script, we find that we cannot use the following
variables:
- name: currentBranch
${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
value: $(System.PullRequest.TargetBranch)
${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
value: $(Build.SourceBranch)
in the variables section.
The variables block duplicates the second value into the first one.
This is what causes your issue.
So, on my side, I created a workaround and hope this will help you. Here is my main YAML:
parameters:
- name: custom_agent
displayName: Use Custom Agent
type: boolean
default: true
- name: image
type: string
default: default
resources:
repositories:
- repository: templates
type: git
name: Tech-Talk/template
trigger: none
pool:
vmImage: windows-latest
# vmImage: ubuntu-20.04
stages:
- stage: A
jobs:
- job: A1
steps:
- task: PowerShell@2
name: printvar
inputs:
targetType: 'inline'
script: |
If("$(Build.Reason)" -eq "PullRequest"){
Write-Host "##vso[task.setvariable variable=currentBranch;isOutput=true]$(System.PullRequest.TargetBranch)"
}
else{
Write-Host "##vso[task.setvariable variable=currentBranch;isOutput=true]$(Build.SourceBranch)"
}
- stage: B
condition: eq(dependencies.A.outputs['A1.printvar.currentBranch'], 'refs/heads/master')
dependsOn: A
jobs:
- job: B1
variables:
varFromA: $[ stageDependencies.A.A1.outputs['printvar.currentBranch'] ]
steps:
- task: PowerShell@2
inputs:
targetType: 'inline'
script: |
# Write your PowerShell commands here.
Write-Host "$(varFromA)"
- template: temp.yaml@templates
parameters:
branchName: $(varFromA)
agent_pool_name: ''
db_resource_path: $(System.DefaultWorkingDirectory)
Please note:
If we use this, we need to modify your temp YAML.
We need to move the condition into the main YAML so that only the steps are left in the temp YAML; a rough sketch follows below.
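Here is a sketch of what the trimmed-down temp YAML could look like, assuming it is consumed as a steps template; the parameter names come from the main YAML above, and the PowerShell step is only a placeholder:
# temp.yaml in the Tech-Talk/template repository
parameters:
- name: branchName
  type: string
- name: agent_pool_name
  type: string
  default: ''
- name: db_resource_path
  type: string
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Target branch: ${{ parameters.branchName }}"
      Write-Host "DB resource path: ${{ parameters.db_resource_path }}"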
I have four YAML "release" pipelines where I use the same YAML syntax to define a continuation trigger. Here is the YAML definition for the trigger.
resources:
pipelines:
- pipeline: Build # Name of the pipeline resource
source: BuildPipeline # Name of the pipeline as registered with Azure DevOps
trigger: true
I'm not really sure about this syntax where I don't specify any branch, but everything was working fine until recently. I then updated two of the YAML release pipelines, and they are now not getting triggered when the build pipeline completes. All pipelines work fine if executed manually.
All release pipelines have the same YAML for the continuation trigger definition (see above) and have the same branch set for "Default branch for manual and scheduled builds".
I don't know how to investigate why some of the release pipelines are not triggered (is there a log available somewhere?). I don't see them executing and failing; they are simply not being triggered. How do I investigate this issue?
For your question about investigating the logs - you can see what pipeline runs were created, but unfortunately you can't see what wasn't. So far as Azure DevOps is concerned, if "nothing occurred" to set off a trigger, then there's nothing to log.
As for the pipelines themselves not triggering, check the trigger settings from the pipeline editor to ensure that nothing is set there, since UI and YAML settings tend to cancel one another out.
Finally, if you want to specify a branch, you can use some combination of the following options:
resources:
pipelines:
- pipeline: Build # Name of the pipeline resource
source: BuildPipeline # Name of the pipeline as registered with Azure DevOps
trigger:
branches:
include: # branch names which will trigger a build
exclude: # branch names which will not
tags:
include: # tag names which will trigger a build
exclude: # tag names which will not
paths:
include: # file paths which must match to trigger a build
exclude: # file paths which will not trigger a build
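For example, to fire the completion trigger only for runs of the source pipeline on main while ignoring release branches (the branch names here are just illustrative):
resources:
  pipelines:
  - pipeline: Build
    source: BuildPipeline
    trigger:
      branches:
        include:
        - main
        exclude:
        - releases/*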
I believe I found the issue: it's the removal of the following statements from my deploy pipelines:
pool:
vmImage: windows-2019
I removed these statements because I transformed all jobs into deployment jobs as follows
- deployment: MyDeployJob
displayName: 'bla bla bla'
environment:
name: ${{ parameters.AzureDevopsEnv }}
resourceType: VirtualMachine
resourceName: ${{ parameters.AzureDevopsVM }}
The pipelines with no pool statement run perfectly well when started manually, but I'm convinced they fail to start when triggered via the pipeline completion trigger. I do not understand this behavior, but I placed the pool statement back in all deploy pipelines and they are all now getting triggered as the build pipeline completes.
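For reference, a sketch of the shape that ended up triggering correctly, with the pool put back at the pipeline level; the runOnce strategy and the echo step are placeholders I've assumed, since the original snippet elides them:
pool:
  vmImage: windows-2019

jobs:
- deployment: MyDeployJob
  displayName: 'bla bla bla'
  environment:
    name: ${{ parameters.AzureDevopsEnv }}
    resourceType: VirtualMachine
    resourceName: ${{ parameters.AzureDevopsVM }}
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo deploying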
I found that when defining the resource pipeline (trigger) in a template that you extend in the depending pipeline, there are two things that can prevent builds from being triggered:
There are syntax errors in the template (or the parent .yaml)
The depending pipeline needs to be updated before Azure DevOps realizes you made edits to the template it extends
This worked for me:
template.yaml
stages:
- stage: SomeBuildStage
displayName: Build The Project
jobs:
- job: SomeJob
displayName: Build NuGet package from Project
pool:
name: My Self-hosted Agent Pool # Using a pool here works fine for me, contrary to @whatever's answer
steps:
- pwsh: |
echo "This template can be extended by multiple pipelines in order to define a trigger only once."
# I still use CI triggers as well here (optional)
trigger:
branches:
include:
- '*'
# This is where the triggering pipeline is defined
resources:
pipelines:
- pipeline: trigger-all-builds # This can be any string
source: trigger-all-builds # This is the name defined in the Azure Devops GUI
trigger: true
depending-pipeline.yaml
extends:
template: template.yaml
# I still use CI triggers as well here (optional)
trigger:
paths:
include:
- some/subfolder
triggering-pipeline.yaml
stages:
- stage: TriggerAllBuilds
displayName: Trigger all package builds
jobs:
- job: TriggerAllBuilds
displayName: Trigger all builds
pool:
name: My Self-hosted Agent Pool
steps:
- pwsh: |
echo "Geronimooo!"
displayName: Geronimo
trigger: none
pr: none
Let's suppose I have 3 environments on Azure: Dev, Test and Prod. I have the same pipeline for building and deploying the resources and the code for each one of the environments except for two differences:
different trigger branch
different variable values
What is the correct approach for this scenario? Because at least 3 come to my mind, none of which is perfect:
Option 1: I guess I could create a single pipeline on Azure DevOps (triggered by any of the 3 branches) with 3 stages, one per environment, and for each stage add a condition to run depending on the source branch, like this:
condition: eq(variables['Build.SourceBranch'], 'refs/heads/a-branch-name')
and in each stage reference different variables. But this would introduce code duplication in each stage - when adding or modifying a step I would have to remember to edit 3 stages - not desirable.
Option 2: Create 3 separate YAML files in my repository, each one of them with specified trigger branch and referencing the same variable names, then create 3 different pipeline on Azure DevOps, each one of them with different variable values. But this would also introduce code duplication.
Option 3: Create 1 build-and-deploy.yaml file as a template with the steps defined in it and then create another 3 YAML files referring to that template, each with different trigger branch and with different variable values in each Azure Pipeline, like this:
trigger:
branches:
include:
- a-branch-name
steps:
- template: build-and-deploy.yaml
parameters:
parameterName1: $(parameterValue1)
parameterName2: $(parameterValue2)
This seems to be the best option but I haven't seen it used anywhere in the examples so maybe I'm just unaware of downsides of it, if there are any.
Here's how to do it with a shared pipeline config that gets included into env-specific pipelines.
To support 2 environments (dev and prod) you'd need:
1 shared pipeline yaml
2 env-specific yamls, one for each env
2 pipelines created in Azure DevOps, one for each env; each pipeline referencing corresponding yaml
pipeline-shared.yml:
variables:
ARTIFACT_NAME: ApiBuild
NPM_CACHE_FOLDER: $(Pipeline.Workspace)/.npm
stages:
- stage: Build
displayName: Build
pool:
vmImage: 'ubuntu-latest'
demands: npm
jobs:
...
- stage: Release
displayName: Release
dependsOn: Build
pool:
vmImage: 'ubuntu-latest'
jobs:
...
pipeline-dev.yml:
# Trigger builds on commits to branches
trigger:
- dev
# Do not trigger builds on PRs
pr: none
extends:
template: pipeline-shared.yml
pipeline-prod.yml
trigger:
- master
pr: none
extends:
template: pipeline-shared.yml
According to your description, you want different stages to share the same repo resource, but with different trigger branches and variable values.
Regarding the trigger branch, you can use the ${{ if ... }} expression to determine the trigger branch condition.
Regarding variable values, you can define templates and variable groups and pass the values in through parameters.
Here is an example, you can refer to it:
First, go to Library under Pipelines and click Variable group to add a variable group. You can add multiple variables to this variable group.
Repo structure: azure-pipelines.yml at the repo root, plus a stage/test.yml template.
azure-pipelines.yml sample:
stages:
- template: stage/test.yml
parameters:
${{if contains(variables['Build.SourceBranch'], 'master')}}:
variableGroup: devGroup
stageName: Dev
test: a
${{if contains(variables['Build.SourceBranch'], 'test')}}:
stageName: test
test: b
stage/test.yml:
parameters:
- name: stageName
displayName: Test
type: string
default: test
- name: test
displayName: Test
type: string
default: test
- name: variableGroup
displayName: Test
type: string
default: test
stages:
- stage: Test_${{ parameters.stageName }}
variables:
- group: ${{parameters.variableGroup}}
jobs:
- job: Test1
pool:
vmImage: vs2017-win2016
steps:
- script: echo "Hello Test1"
- script: echo ${{ parameters.test }}
- script: echo $(dev1)
Of course, if you only want to use a single variable, you can define the variable directly in the YAML without adding a variable group.
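For that single-variable case, a minimal sketch (the branch and value names are illustrative) would be:
variables:
  ${{ if contains(variables['Build.SourceBranch'], 'master') }}:
    myVariable: value-for-master
  ${{ if contains(variables['Build.SourceBranch'], 'test') }}:
    myVariable: value-for-test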
I'm trying to learn how to use the new YAML-configured pipeline system for Azure DevOps, and I'm having a bit of trouble getting my head around the way the variables are supposed to work.
When I set up the pipeline, it created a file azure-pipelines.yml and committed it to the master branch.
By default, this file looks like so...
trigger:
- master
pool:
vmImage: 'windows-latest'
variables:
solution: '**/*.sln'
buildPlatform: 'Any CPU'
buildConfiguration: 'Release'
My project is setup with the following build configurations... "prod", "staging", "develop".
What I'm confused about, is where am I supposed to override these default variables for the actual pipelines?
I can modify the values directly in this file, but that's not really going to work. When I merge the changes back from "master" to "staging" etc., presumably the pipelines for these lower environments will be trying to build with the "prod" configuration.
Surely there must be some way to configure variables independent of the source code.
There are 2 places where I can see an option to add Variables...
When I choose "Edit" for the pipeline, up in the top right, there is a "Variables" button next to run.
I can add variables there, but they don't appear to do anything. They are not applied when I run the pipeline.
Also, to make things more confusing, when I choose to "Run pipeline", there is also an option to define variables, but likewise, these don't seem to do anything. The build still just runs with the pre-defined values from the yaml file.
I agree with Shayki Abramczyk; that method lets you manually override the variable value in the UI.
I would like to share a method for automatically assigning values to variables.
You can use expressions to evaluate different situations (e.g. the build branch) and then set the value accordingly.
Here is an example:
trigger:
- '*'
pool:
vmImage: 'windows-latest'
variables:
${{ if eq(variables['Build.SourceBranchName'], 'master') }}:
buildConfiguration: Prod
${{ if eq(variables['Build.SourceBranchName'], 'staging') }}:
buildConfiguration: Staging
steps:
- task: PowerShell@2
inputs:
targetType: 'inline'
script: |
# Write your PowerShell commands here.
Write-Host $(buildConfiguration)
This sample code selects the corresponding value according to the triggering branch name (master: Prod, staging: Staging).
Hope this helps.
You can define the variable in the pipeline UI with "Let users override this value when running this pipeline" checked.
Use the variable in the build step with $(BuildConfiguration).
When you run the build, you can override the value.
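As a sketch, a build step consuming that variable might look like this; the VSBuild task is an assumption based on the question's solution build, and any other task input would work the same way:
steps:
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(BuildConfiguration)'  # picks up the UI value, or the override supplied at queue time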
Maybe you don't need a variable, but a parameter?
parameters:
- name: STAND_NAME
displayName: Select Stand to deploy
type: string
default: none
values:
- dev
- stage
- prod
variables:
- group: global-variables # use global variable from library
- name: STAND_NAME
value: ${{ parameters.STAND_NAME }}
- ${{ if eq(parameters['STAND_NAME'], 'prod') }}:
- name: variable_depends_on_stand
value: "prod_value" #
- ${{ if eq(parameters['STAND_NAME'], 'stage') }}:
- name: variable_depends_on_stand
value: "stage_value"
- ${{ if eq(parameters['STAND_NAME'], 'dev') }}:
- name: variable_depends_on_stand
value: "dev_value"
- name: SOME_OTHER_GLOBAL_VARIABLE
value: some_other_value
It would be displayed like this in the pipeline run UI, with the stand selectable from a drop-down:
(screenshot of the pipeline web UI)