Why do we need Azure DevOps YAML template extends? - azure-devops

My issue
Azure DevOps YAML offers the extends block. This is the kind of sample one typically uses:
# main.yaml
trigger: none
extends:
  template: ./extends-pipeline1.yaml
And:
# extends-pipeline1.yaml
stages:
- stage: Build
  jobs:
  - job:
    steps:
    - script: echo Build solution
- stage: Deploy
  jobs:
  - job:
    steps:
    - script: echo Deploy solution
Works fine, but this works as well without extends:
# main.yaml
trigger: none
stages:
- template: extends-pipeline1.yaml
What I did
I read the release note:
Release Note
My question
What is the benefit of using extends?
Thanks

What is the benefit of using extends?
There are two benefits to this feature.
First, 'extends' is more flexible than 'stages'.
When you use stages, the pipeline expects stages in the template: you must write a stages section in the template, otherwise validation will fail.
For example, when you use stages, your pipeline must be like this:
azure-pipelines.yml
trigger:
- none
pool:
  vmImage: ubuntu-latest
stages:
- template: extends-pipeline1.yaml
extends-pipeline1.yaml
stages: # You must have this section in the template.
- stage: Build
  jobs:
  - job:
    steps:
    - script: echo Build solution
- stage: Deploy
  jobs:
  - job:
    steps:
    - script: echo Deploy solution
But if you use extends, your pipeline can look like this instead; you do not have to write a stages section in the template:
azure-pipelines.yml
trigger:
- none
pool:
  vmImage: ubuntu-latest
extends:
  template: extends-pipeline1.yaml
extends-pipeline1.yaml
jobs:
- job:
  steps:
  - script: echo Build solution
Second, 'extends' has a feature named 'required template' (Daniel has already mentioned this).
If you set a 'required template' check on a service connection, that service connection can only be used by pipelines that extend the specified YAML template.
Using the service connection directly in the main YAML file, or in a YAML file referenced via a 'stages' template, will both make the pipeline fail.
This feature can help you increase security, and it is exclusive to the 'extends' feature.
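For illustration, here is a minimal sketch of that pattern. The file name required-template.yml, the service connection name my-arm-connection, and the AzureCLI task are assumptions for the example; the check itself is not defined in YAML but configured on the service connection under Approvals and checks > Required template.
# azure-pipelines.yml - the pipeline that wants to use the protected service connection
trigger: none
extends:
  template: required-template.yml

# required-template.yml - the file registered as the 'Required template' check on the service connection
stages:
- stage: Deploy
  jobs:
  - job: UseConnection
    steps:
    - task: AzureCLI@2
      inputs:
        azureSubscription: my-arm-connection  # usable only because the run extends this template
        scriptType: bash
        scriptLocation: inlineScript
        inlineScript: az account show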

Related

Pipeline trigger from another pipeline yaml using resource tag

I have 2 basic pipelines: pipeline 1 and pipeline 2.
I have read the following article:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema#change-default-branch-for-triggers-optional
In summary, I put the following in my pipeline2 so that it listens for pipeline1's execution:
# azure-pipelines-trigger.yml
name: Pipeline2
trigger: none
# this pipeline will be triggered by another pipeline
resources:
  pipelines:
  - pipeline: Pipeline1
    project: DevOpsProject
    source: Pipeline1
    trigger:
      branches:
        include:
        - releases/*
        - main
        exclude:
        - topic/*
      tags:
      - Verified
      - Signed
      stages:
      - Production
      - PreProduction
pool:
  vmImage: ubuntu-latest
stages:
- stage: Dev
  displayName: 'Build_Deploy'
  jobs:
  - job: 'Build_Deploy'
    steps:
    - script: echo This pipeline was set to be triggered after first pipeline completes.
- stage: Prod
  displayName: 'Build_Deploy'
  jobs:
  - job: 'Build_Deploy'
    steps:
    - script: echo This pipeline was set to be triggered after first pipeline completes.
Therefore, the default branch of pipeline 2 would be triggered when Pipeline1's Production and PreProduction stages have run and finished. The requirement is more complex than that. The requirements are:
Pipeline 2 should only listen to the trigger when it is on a release/xxxx branch, not the default/master branch.
We don't want the trigger to create another pipeline run. The trigger is only meant to kick off the Prod stage of pipeline 2.
Any thoughts? Thanks.

Doing a task after a looping YAML templatized Azure DevOps pipeline

I have a YAML Azure DevOps pipeline that loops through a series of configurations, copying artifacts to various places. What I want to do, after the looping is done, is something else (I'd like to send an email, but the question is more general than that).
But I can't insert anything after the looping part of the YAML, at least not with any of the experiments I've tried. Here's the YAML that calls the YAML template, with a comment for where I'd like to add another step. How might I do this?
parameters:
- name: configuration
  type: object
  default:
  - Texas
  - Japan
  - Russia
  - Spaghetti
  - Philosophy
trigger:
- dev
- master
resources:
  repositories:
  - repository: templates
    name: BuildTemplates
    type: git
stages:
- ${{ each configuration in parameters.configuration }}:
  - template: build.yml@templates
    parameters:
      configuration: ${{ configuration }}
      appName: all
# Where I'd like to have another task or job or step or stage that can send an email or perhaps other things
Just define a new stage after the loop. Stages without an explicit dependsOn run sequentially in the order they are defined, so secondStage will only start once all of the generated stages have finished:
stages:
- ${{ each configuration in parameters.configuration }}:
  - template: build.yml@templates
    parameters:
      configuration: ${{ configuration }}
      appName: all
- stage: secondStage
  jobs:
  - job: jobOne
    steps:
    - task: PowerShell@2
      inputs:
        targetType: inline
        script: Write-Host "Send the email or do other follow-up work here"  # placeholder step

Why can't I use a variable to define the environment property in the Azure Pipeline YAML config file?

I'm trying to create a deploy pipeline YAML template for all environments/stages. I've set up the Environments on Azure DevOps so that I can add checks and approvals on the Test and Prod environments before they get deployed. I've set up a library group for each stage and each one of them has a variable called 'env' which defines the current stage running in the pipeline. For some reason, the environment property under the deployment job (see code snippet below) doesn't read that variable.
Has anyone faced this issue before, or is there a reason why the variable won't be read for that specific property?
Note: I've tested the variables and they do work, for example, the stage property outputs as 'deploy-dev/test/prod' (depending on the environment)
- stage: deploy-$(env)
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool:
      vmImage: 'Ubuntu-latest'
    # creates an environment if it doesn't exist
    environment: 'smarthotel-$(env)'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Hello world
You can't do this because the value has to be known at compile time (when the templates are expanded), and $(env) macro variables are only resolved at runtime.
But you can try this instead (let's name this file deploy.yml):
parameters:
- name: env
  type: string
  default: 'dev'
stages:
- stage: deploy-${{ parameters.env }}
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool:
      vmImage: 'Ubuntu-latest'
    # creates an environment if it doesn't exist
    environment: 'smarthotel-${{ parameters.env }}'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Hello world
and then you need to call it as follows (in the build.yml file):
stages:
- template: deploy.yml
  parameters:
    env: dev
- template: deploy.yml
  parameters:
    env: qa
- template: deploy.yml
  parameters:
    env: prod
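As a variation on the same idea, you could also generate the stages from a list with a template expression loop, the same pattern used in the looping question above. This is only a sketch assuming the deploy.yml shown here; the parameter name environments and the loop variable environmentName are assumptions for the example.
parameters:
- name: environments
  type: object
  default:
  - dev
  - qa
  - prod
stages:
- ${{ each environmentName in parameters.environments }}:
  - template: deploy.yml
    parameters:
      env: ${{ environmentName }}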

Azure Pipelines environments approvals

I have set up 2 environments and protected only one environment.
However, the pipeline run expects me to approve it even before it starts.
I am assuming that the Build and DevEnv deployments should happen unattended and should stop only for QAEnv. Am I missing anything?
You need to add dependsOn: <environment> to your jobs. As it stands, it's trying to run all of the stages at once.
You also have all of those jobs within a single stage, which looks off to me.
You need to split them into multiple stages:
stages:
- stage: Build
  jobs: ...
- stage: DEV
  jobs: ...
- stage: QA
  jobs: ...
Agree with Daniel Mann.
You could split the jobs into two stages (Dev stage and QA stage).
Here is an example:
stages:
- stage: Dev_Stage
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool:
      vmImage: 'Ubuntu-latest'
    environment: 'env1'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Hello world
- stage: QA_Stage
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool:
      vmImage: 'Ubuntu-latest'
    environment: 'env2'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Hello world
Result:
In this case, stage 1 (Dev_Stage) has no checks, while stage 2 (QA_Stage) requires approval.
If you set the environment for the two stages separately, the two stages are independent of each other and will not interfere with one another.
Hope this helps.

Azure DevOps Pipeline define variable in deployment and reuse in subsequent job

I'm using an Azure DevOps pipeline to deploy my code and now I'm in need of passing a variable value from a deployment job to a subsequent job that depends on it. I've read up on this example but it does not seem to work at all.
What I'm trying to do is run an Azure ARM Deployment that provisions a Key Vault. The name of the key vault is outputted from the ARM deployment job and I'm then trying to pass that name to another job which needs to add specific secrets. Access control is taken care of, but I still need to pass the name.
I've boiled the problem down to the basics of passing a variable from a deployment to a job. Here is my complete test pipeline (almost entirely copied from here):
trigger: none
stages:
- stage: X
  jobs:
  - deployment: A
    pool:
      vmImage: "ubuntu-16.04"
    environment: test
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]this is the deployment variable value"
            name: setvarStep
          - script: echo $(setvarStep.myOutputVar)
            name: echovar
  - job: B
    dependsOn: A
    pool:
      vmImage: "ubuntu-16.04"
    variables:
      myVarFromDeploymentJob: $[ dependencies.A.outputs['deploy.setvarStep.myOutputVar'] ]
    steps:
    - script: "echo $(myVarFromDeploymentJob)"
      name: echovar
Once I run this, the echoed value is blank in job B, but defined in deployment A. Why is this? And is there a way to dump everything in dependencies.A.outputs so that I can see what I have to work with?
How can I pass a variable from a runOnce deployment job to a regular job?
I've solved it. The problem is that the documentation here specifies this schema for fetching the variable for a runOnce deployment:
$[dependencies.<job-name>.outputs['<lifecycle-hookname>.<step-name>.<variable-name>']]
This is in fact WRONG. The <lifecycle-hookname> part should be replaced with the name of the deployment job, like this:
$[dependencies.<job-name>.outputs['<job-name>.<step-name>.<variable-name>']]
The example from this documentation (scroll down a bit) is correct.
A full example pipeline that I've tested and works:
trigger: none
stages:
- stage: X
  jobs:
  - deployment: A # This name is important
    pool:
      vmImage: 'ubuntu-16.04'
    environment: staging
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]this is the deployment variable value"
            name: setvarStep # This name is also important
          - script: echo $(setvarStep.myOutputVar)
            name: echovar
  - job: B
    dependsOn: A
    pool:
      vmImage: 'ubuntu-16.04'
    variables:
      myVarFromDeploymentJob: $[ dependencies.A.outputs['A.setvarStep.myOutputVar'] ]
    steps:
    - script: "echo $(myVarFromDeploymentJob)"
      name: echovar