Doing a task after a looping, template-ized YAML Azure DevOps pipeline

I have a YAML Azure DevOps pipeline that loops through a series of configurations, copying artifacts to various places. What I want to do, after the looping is done, is something else (I'd like to send an email, but the question is more general than that).
But I can't insert anything after the looping part of the YAML, at least not with any of the experiments I've tried. Here's the YAML that calls the YAML template, with a comment for where I'd like to do another step. How might I do this?
parameters:
- name: configuration
  type: object
  default:
  - Texas
  - Japan
  - Russia
  - Spaghetti
  - Philosophy

trigger:
- dev
- master

resources:
  repositories:
  - repository: templates
    name: BuildTemplates
    type: git

stages:
- ${{ each configuration in parameters.configuration }}:
  - template: build.yml@templates
    parameters:
      configuration: ${{ configuration }}
      appName: all
# Where I'd like to have another task or job or step or stage that can send an email or perhaps other things

Just define a new stage:
stages:
- ${{ each configuration in parameters.configuration }}:
  - template: build.yml@templates
    parameters:
      configuration: ${{ configuration }}
      appName: all
- stage: secondStage
  jobs:
  - job: jobOne
    steps:
    - task: PowerShell@2
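Because stages with no explicit dependsOn run in the order they are defined, the extra stage only starts after all of the generated stages have finished. Below is a minimal sketch of what that final stage could look like; the inline script, addresses, and SMTP server are placeholders (assumptions, not anything defined elsewhere in this pipeline), and Send-MailMessage needs a Windows agent:
- stage: notify
  displayName: 'Notify after all configurations'
  jobs:
  - job: sendEmail
    steps:
    - task: PowerShell@2
      inputs:
        targetType: inline
        script: |
          # Placeholder values: swap in your own addresses and SMTP server,
          # or replace this script with a marketplace email task.
          Send-MailMessage -From 'build@example.com' -To 'team@example.com' `
            -Subject 'All configuration builds finished' -SmtpServer 'smtp.example.com'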

Related

Run Azure DevOps deployment pipeline in parallel

I have a deployment pipeline in Azure DevOps which deploys database changes to a list of databases. Rather than having these run sequentially, I would like to run them in parallel. The rolling deployment strategy supports running in parallel, but I don't know how to pass variables in this configuration. For jobs there is a matrix option to pass variables to different executions; however, I can't find an equivalent for deployments.
Here is the relevant pipeline portion:
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
- group: LibraryData

parameters:
- name: 'qaDatabases'
  type: object
  default:
  - databaseSet:
      databases: ['DB1','DB1-A']
  - databaseSet:
      databases: ['DB2','DB2-A']
  - databaseSet:
      databases: ['DB3','DB3-A']

# Build details skipped, working fine

- stage: DeployToQA
  jobs:
  - deployment: DeployQA
    environment:
      name: QA
      resourceType: VirtualMachine
      tags: Database
    strategy:
      rolling:
        maxParallel: 2
        deploy:
          steps:
          - ${{ each databaseSet in parameters.qaDatabases }}:
            - template: Pipeline-Templates/DBDeploy.yml
              parameters:
                DatabaseServer: "$(lib-QADBServer)"
                DBName: ${{ databaseSet.databases[0] }}
                DBaName: ${{ databaseSet.databases[1] }}
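One possible variation (a sketch, not taken from the question) is to move the ${{ each }} loop up to the jobs level so that each database set becomes its own deployment job; jobs without dependencies can then run in parallel, subject to available agents and parallel-job limits. The rolling strategy is replaced with runOnce here purely to keep the sketch short, and the parameter is flattened so each entry is just a databases list:
parameters:
- name: 'qaDatabases'
  type: object
  default:
  - databases: ['DB1','DB1-A']
  - databases: ['DB2','DB2-A']
  - databases: ['DB3','DB3-A']

stages:
- stage: DeployToQA
  jobs:
  # One deployment job per database set; independent jobs can run in parallel.
  - ${{ each databaseSet in parameters.qaDatabases }}:
    - deployment: Deploy_${{ databaseSet.databases[0] }}
      environment:
        name: QA
        resourceType: VirtualMachine
        tags: Database
      strategy:
        runOnce:
          deploy:
            steps:
            - template: Pipeline-Templates/DBDeploy.yml
              parameters:
                DatabaseServer: "$(lib-QADBServer)"
                DBName: ${{ databaseSet.databases[0] }}
                DBaName: ${{ databaseSet.databases[1] }}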

For-Each an Object in Azure Devops pipeline?

I'm starting to write an application using microservices and want a build step that pushes the images from my pipeline. At the moment I have 3 services to push:
- stage: build_and_publish_containers
  displayName: 'Docker (:Dev Push)'
  jobs:
  - template: "docker/publish.yaml"
    parameters:
      appName: Authorization_Service
      projectPath: "Services/AuthorizationService"
      imageName: authorization-service
      imageTag: ${{ variables.imageTag }}
  - template: "docker/publish.yaml"
    parameters:
      appName: Registration_Service
      projectPath: "Services/RegistrationService"
      imageName: registration-service
      imageTag: ${{ variables.imageTag }}
  - template: "docker/publish.yaml"
    parameters:
      appName: Tennant_Service
      projectPath: "Services/TennantService"
      imageName: tennant-service
      imageTag: ${{ variables.imageTag }}
Even with only these 3 services (and I want to have many more) there is a lot of duplicated code that I want to reduce.
I tried it with an array and an each-function, but I have several pieces of information here (name / path / image name) and that could grow.
Is there a better way?
If this were a programming language I would have an array of a data model; is that something that is possible in Azure DevOps?
Or could each piece of information be saved in a JSON file (so 3 files at the moment, and growing), and Azure could read all the files and pull the information from them?
You could check the example below, which shows how to define nested loops over a complex object parameter in Azure Pipelines. You could also look at the GitHub docs for more reference.
parameters:
- name: environmentObjects
  type: object
  default:
  - environmentName: 'dev'
    result: ['123']
  - environmentName: 'uat'
    result: ['223', '323']

pool:
  vmImage: ubuntu-latest

steps:
- ${{ each environmentObject in parameters.environmentObjects }}:
  - ${{ each result in environmentObject.result }}:
    - script: echo ${{ result }}
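Applying the same pattern to the services from the question, each service becomes one object in the parameter list and the template is expanded once per entry; the parameter name services and the loop are a sketch, while the property names simply mirror the question:
parameters:
- name: services
  type: object
  default:
  - appName: Authorization_Service
    projectPath: "Services/AuthorizationService"
    imageName: authorization-service
  - appName: Registration_Service
    projectPath: "Services/RegistrationService"
    imageName: registration-service
  - appName: Tennant_Service
    projectPath: "Services/TennantService"
    imageName: tennant-service

stages:
- stage: build_and_publish_containers
  displayName: 'Docker (:Dev Push)'
  jobs:
  # One template expansion per service object.
  - ${{ each service in parameters.services }}:
    - template: "docker/publish.yaml"
      parameters:
        appName: ${{ service.appName }}
        projectPath: ${{ service.projectPath }}
        imageName: ${{ service.imageName }}
        imageTag: ${{ variables.imageTag }}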

Azure DevOps: An error occurred while loading the YAML build pipeline. An item with the same key has already been added

Lately I've been trying to make an Azure DevOps pipeline that deploys to 3 environments, with 2 different data centers and 2 different service connections. I've been trying to achieve this using as few lines of YAML as possible.
After a lot of trial and error, I'm stuck on this message: "An error occurred while loading the YAML build pipeline. An item with the same key has already been added."
deploy-env.yaml:
parameters:
- name: OPENSHIFT_NAMESPACE
  displayName: 'OpenShift namespace'
  type: object
  values: []
- name: DCR
  displayName: 'Data Center'
  type: object
  values: []
- name: OSC
  displayName: 'Openshift service connection'
  type: object
  values: []

stages:
- ${{ each namespace in parameters.OPENSHIFT_NAMESPACE }}:
  - ${{ each dcr in parameters.DCR }}:
    - ${{ each osc in parameters.OSC }}:
      - stage: deploy-${{ convertToJson(namespace) }}
        jobs:
        - deployment: deploy_to_dcr
          environment: ${{ namespace }}
          displayName: 'Deploy to DCR1'
          strategy:
            runOnce:
              deploy:
                steps:
                - template: steps/deploy_to_cluster_with_helm_templating.yml@pipeline_templates
                  parameters:
                    DATA_CENTER: ${{ dcr }}
                    OPENSHIFT_NAMESPACE: ${{ namespace }}
                    OPENSHIFT_SERVICE_CONNECTION: '${{ osc }}'
                    HELM_VALUES:
                    - 'global.namespace=${{ namespace }}'
                    - 'global.data_center=${{ dcr }}'
azure-pipeline.yaml:
resources:
  repositories:
  - repository: pipeline_templates
    type: git
    name: pipeline-templates

stages:
- template: deploy-env.yaml
  parameters:
    OPENSHIFT_NAMESPACE:
    - development
    - test
    - acceptance
    DCR:
    - dcr1
    - dcr2
    OSC:
    - OCP4DCR1
    - OCP4DCR2
Does anyone know why this error occurs? I've found other articles where stage/job names were not unique, but that is not the case in this example.
Thanks in advance.
This line gets repeated twelve times, but produces only three distinct values:
- stage: deploy-${{ convertToJson(namespace) }}
Stage names must be unique.
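A minimal sketch of one way to make them unique is to include all three loop variables in the stage name (stage names may only contain alphanumeric characters and underscores, so the hyphen from the original is dropped here):
stages:
- ${{ each namespace in parameters.OPENSHIFT_NAMESPACE }}:
  - ${{ each dcr in parameters.DCR }}:
    - ${{ each osc in parameters.OSC }}:
      - stage: deploy_${{ namespace }}_${{ dcr }}_${{ osc }}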

Azure Devops pipeline finish with warning badge when a job failed

I have a pipeline (Azure DevOps YAML) with some jobs.
They run in parallel.
But when one fails, the build ends with a warning badge and not an error badge.
Why?
I expected the pipeline to end in error.
Edit:
pipelines.yml
pool:
  name: Default

jobs:
- job: FirstJob
  steps:
  - script: echo First!
- template: template.yml
  parameters:
    name: Second
- template: template.yml
  parameters:
    name: Third
- template: template.yml
  parameters:
    name: Fourth
- template: template.yml
  parameters:
    name: Fifth
- template: template.yml
  parameters:
    name: Sixth
- template: template.yml
  parameters:
    name: Seventh
template.yml:
parameters:
- name: name
  type: string
  default: --

jobs:
- job: ${{ parameters.name }}
  dependsOn: FirstJob
  continueOnError: true
  variables:
    BuildTag: ${{ parameters.name }}
  steps:
  - task: AddTag@0
    inputs:
      tags: '$(BuildTag)'
  - task: DotNetCoreCLI@2
    inputs:
      command: 'build'
      projects: 'foo'
    condition: and(succeeded(), eq(variables['BuildTag'], 'Sixth'))
I see my mistake with continueOnError: true.
Thanks for your comments and sample, which helped me find the reason for this issue:
The property continueOnError. This behavior is by design and is not a bug; there is no way to change it at present.
If we set continueOnError: true, subsequent jobs should run even if this job fails. To achieve that, Azure DevOps treats the error as a warning so that it does not stop the build. That is why the job failed but the pipeline shows it with a warning badge.
We can reproduce the same behavior with the Control Options setting "Continue on error" on a non-YAML task.
Besides, it does not affect the completion of the PR.
To resolve this issue, you could comment out continueOnError in the YAML. If you still need the subsequent jobs to run, set condition: always() on them instead:
jobs:
- job: ${{ parameters.name }}
  dependsOn: FirstJob
  variables:
    BuildTag: ${{ parameters.name }}
  steps:
  ...

- job: Second
  condition: always()
- job: Third
  condition: always()
Hope this helps.
Maybe you need to uncheck "Advanced" => "Fail on STDERR" (if this option is selected, the build will fail when the remote commands or script write to STDERR).

Different agent pool or demands for the same Azure Pipeline based on trigger

I have an Azure Pipeline written in YAML which runs from a CI trigger whenever changes are made to the master branch. It can also be manually triggered from Pull Requests or by users against any branch.
Due to the use of a number of licensed components, the build from master needs to run on a specific agent. The other builds do not, and in fact I would rather they run on other agent(s).
So my question is, is there any way to specify a different agent/pool within the YAML pipeline based on what triggered the build, or what branch the build is building? I'd like this to be behaviour which is configured in the pipeline permanently, rather than requiring users to update the YAML on each branch they wish to build elsewhere.
I can't see anything obvious in the sections of the documentation on the pool/demands/condition keywords.
I solved this by putting the steps for the job into a template, and then creating a set of jobs in the pipeline with different condition entries, so that each job can carry its own demands and only run for the matching branches.
A skeleton version looks like this:
- stage: Build
  jobs:
  - job: TopicBranchAndPullRequestBuild
    condition: or(startsWith(variables['Build.SourceBranch'], 'refs/heads/topic'), startsWith(variables['Build.SourceBranch'], 'refs/pull'))
    displayName: 'Build topic Branch or Pull Request'
    pool:
      name: the-one-and-only-pool
      demands:
      - HasLicensedComponents -equals false
    steps:
    - template: build-template.yml
  - job: MasterAndReleaseBranchBuild
    condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), startsWith(variables['Build.SourceBranch'], 'refs/heads/release'))
    displayName: 'Build master or release Branch'
    pool:
      name: the-one-and-only-pool
      demands:
      - HasLicensedComponents -equals true
    steps:
    - template: build-template.yml
Obviously the values given here are for example only, but otherwise this is what I have working.
Could you try expressions? I've used this with success on variable groups so it might work on agent pools.
- ${{ if eq(variables['build.SourceBranchName'], 'prod') }}:
  - pool: Host1
- ${{ if eq(variables['build.SourceBranchName'], 'staging') }}:
  - pool: Host2
- ${{ if not(or(eq(variables['build.SourceBranchName'], 'staging'), eq(variables['build.SourceBranchName'], 'prod'))) }}:
  - pool: Host3
Credit for original workaround for dynamically pulling variable groups here: https://github.com/MicrosoftDocs/vsts-docs/issues/3702#issuecomment-574278829
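A variation on the same idea (a sketch only, with hypothetical pool names, assuming the branch variable is available at template-expansion time) is to insert the pool mapping conditionally inside the job itself:
jobs:
- job: Build
  pool:
    # Hypothetical pool names; use vmImage instead of name for hosted pools.
    ${{ if eq(variables['Build.SourceBranchName'], 'master') }}:
      name: LicensedPool
    ${{ if ne(variables['Build.SourceBranchName'], 'master') }}:
      name: GeneralPool
  steps:
  - template: build-template.yml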
The pool keyword specifies which pool to use for a job of the pipeline. A pool specification also holds information about the job's strategy for running. You can specify a pool at the pipeline, stage, or job level. The pool specified at the lowest level of the hierarchy is used to run the job.
is there any way to specify a different agent/pool within the YAML
pipeline based on what triggered the build, or what branch the build
is building?
For this issue, you can create different YAML files for different branches and specify different agent pools in the corresponding YAML file. You can then switch the pipeline to the YAML file of a different branch in the pipeline settings.
Another option is to let the person queuing the build choose the pool with runtime parameters:
parameters:
- name: self_hosted_agent
  displayName: Choose your pool agent
  type: string
  default: windows-latest
  values:
  - windows-latest
  - self-hosted
- name: agent_name
  displayName: Choose your agent
  type: string
  default: 'Laptop Clint'
  values:
  - 'Laptop Clint'
  - 'Laptop Eastwood'

variables:
- name: pool_agent_key
  ${{ if eq(parameters.self_hosted_agent, 'windows-latest') }}:
    value: vmImage
  ${{ if eq(parameters.self_hosted_agent, 'self-hosted') }}:
    value: name
- name: pool_agent_value
  ${{ if eq(parameters.self_hosted_agent, 'windows-latest') }}:
    value: windows-latest
  ${{ if eq(parameters.self_hosted_agent, 'self-hosted') }}:
    value: Iot-Windows
- name: agent_name
  value: ${{ parameters.agent_name }}

stages:
- stage: build
  displayName: Build
  pool:
    ${{ variables.pool_agent_key }}: ${{ variables.pool_agent_value }}
    ${{ if eq(parameters.self_hosted_agent, 'self-hosted') }}:
      demands:
      - agent.name -equals ${{ variables.agent_name }}