Azure DevOps Release Jobs/Pipeline

We are using Azure DevOps for our CI/CD. Typically, all the CI is written as Azure YAML files, while the release jobs have to be created on the DevOps portal (using the GUI). One of the general principles we want to follow is to have everything as code.
Questions:
Can Azure release pipelines be created as code (YAML, etc.)?
I spent some time on it and it seems support is limited. Please correct me if I am wrong here.
Release pipelines have numerous features like approvals, auto triggers, release triggers, etc. Are these possible with release pipelines in YAML?

Azure deployments can be configured as code. You can add multiple release triggers (pipeline, pull request, etc.). Approvals can be configured per environment (https://www.programmingwithwolfgang.com/deployment-approvals-yaml-pipeline/); you then reference that environment in your pipeline.
The example below is triggered when its own YAML code changes and when the Build pipeline completes.
trigger:
  branches:
    include:
    - myBranch
  paths:
    include:
    - '/Deployment/azure-deploy.yml'

resources:
  pipelines:
  - pipeline: BuildPipeline
    project: myProjectName
    source: 'myBuildPipeline'
    trigger:
      enabled: true

jobs:
- deployment: Deploy
  displayName: Deploy
  environment: $(environment)
  pool:
    vmImage: 'windows-latest'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: AzureRmWebAppDeployment@4
          displayName: Deploy Web App
          inputs:
            ConnectionType: 'AzureRM'
            azureSubscription: $(azureSubscription)
            appType: 'webApp'
            appSettings: '-SETTING-1 "$(mySetting1)"'
            WebAppName: '$(myAppName)'
            package: '$(Pipeline.Workspace)/**/*.zip'
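In a deployment job, artifacts from declared pipeline resources are downloaded automatically into $(Pipeline.Workspace), which is why the package wildcard above works. If you only need one artifact from the BuildPipeline resource, an explicit download step is a possible sketch (the artifact name 'drop' is a placeholder, not from the original answer):

        steps:
        - download: BuildPipeline   # the pipeline resource declared above
          artifact: drop            # placeholder artifact name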

Related

Azure DevOps: Passing build id as parameter to YAML deployment pipeline

I want to create a parameter in the YAML deploy pipeline so that the user can specify the build id they want to deploy when running the pipeline manually.
How can I use that specific build id, passed as a parameter, during deployment inside the deployment pipeline?
The deployment pipeline's resource definition is:
resources:
  pipelines:
  - pipeline: build
    source: build_pipeline_name
    trigger:
      branches:
      - master
Choosing from Resources is not an option due to access restrictions on the Environments we are using in the pipeline.
If you want to download just a specific artifact, you won't be able to do this using just the resource, as you cannot parameterize resources. However, if this is your goal, you can parameterize this task:
parameters:
- name: runId
  type: number

# Download an artifact named 'WebApp' from a specific build run to 'bin' in $(Build.SourcesDirectory)
steps:
- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    artifact: 'WebApp'
    path: $(Build.SourcesDirectory)/bin
    project: 'FabrikamFiber'
    pipeline: 12
    runVersion: 'specific'
    runId: ${{ parameters.runId }}
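If you also want manual runs to work without typing a build id, one possible sketch is to give the parameter a default and fall back to the latest run of the pipeline resource from the question; the default value of 0 is just a placeholder meaning "no specific run requested":

parameters:
- name: runId
  type: number
  default: 0

steps:
# Placeholder sketch: no run id supplied, so take the latest run of the 'build' pipeline resource
- ${{ if eq(parameters.runId, 0) }}:
  - download: build
# Otherwise download the artifact from the run id supplied at queue time
- ${{ if ne(parameters.runId, 0) }}:
  - task: DownloadPipelineArtifact@2
    inputs:
      source: 'specific'
      artifact: 'WebApp'
      path: $(Build.SourcesDirectory)/bin
      project: 'FabrikamFiber'
      pipeline: 12
      runVersion: 'specific'
      runId: ${{ parameters.runId }}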
However, I'm not sure if I understood you.

Azure DevOps pipeline repository trigger doesn't fire

Context
I'm creating a CI/CD configuration for an application with this repository layout (each repository is in the same Organization and Project):
Frontend repository (r1)
API Service repository (r2)
Infrastructure As Code repo (r3)
Within repository r3 live the solution's Azure DevOps pipelines; each of them has been configured for manual and scheduled triggers on the develop branch:
Frontend CI Pipeline p1
Backend CI Pipeline p2
Deployment Pipeline p3
The behavior I want is
Git commit on r1 repo
Pipeline p1 on repo r3 triggered (this will create artifacts, apply a tag and notify)
Pipeline p3 triggered by p1 completion (this will deploy the artifacts)
Pipeline p1 looks like the following
trigger: none

resources:
  containers:
  - container: running-image
    image: ubuntu:latest
    options: "-v /usr/bin/sudo:/usr/bin/sudo -v /usr/lib/sudo/libsudo_util.so.0:/usr/lib/sudo/libsudo_util.so.0 -v /usr/lib/sudo/sudoers.so:/usr/lib/sudo/sudoers.so -v /etc/sudoers:/etc/sudoers"
  repositories:
  - repository: frontend
    name: r1
    type: git
    ref: develop
    trigger:
      branches:
        include:
        - develop
        exclude:
        - main

name: $(SourceBranchName)_$(date:yyyyMMdd)$(rev:.r) - Frontend App [CI]

variables:
- name: imageName
  value: fronted-app
- name: containerRegistryConnection
  value: apps-registry-connection

pool:
  vmImage: "ubuntu-latest"

stages:
- stage: Build
  displayName: Build and push
  jobs:
  - job: JobBuild
    displayName: Build job
    container: running-image
    steps:
    - checkout: frontend
      displayName: Checkout Frontend repository
      path: fe
      persistCredentials: true
...
Pipeline p3 looks like the following
name: $(SourceBranchName)_$(date:yyyyMMdd)$(rev:.r) - App [CD]

trigger: none

resources:
  containers:
  - container: running-image
    image: ubuntu:latest
    options: "-v /usr/bin/sudo:/usr/bin/sudo -v /usr/lib/sudo/libsudo_util.so.0:/usr/lib/sudo/libsudo_util.so.0 -v /usr/lib/sudo/sudoers.so:/usr/lib/sudo/sudoers.so -v /etc/sudoers:/etc/sudoers"
  pipelines:
  - pipeline: app-fe-delivery
    source: "p1"
    trigger:
      stages:
      - Build
      branches:
        include:
        - develop

pool:
  vmImage: "ubuntu-latest"

stages:
- stage: Delivery
  jobs:
  - job: JobDevelopment
    steps:
    - template: ../templates/template-setup.yaml # Template reference
      parameters:
        serviceDisplayName: ${{ variables.serviceDisplayName }}
        serviceName: ${{ variables.serviceName }}
...
Issue
Even though I followed, step by step, all the rules set out in the official documentation:
Pipeline p1 is never triggered by any commit on the develop branch of the r1 repository
Even when Pipeline p1 is run manually, Pipeline p3 is never triggered
Remarks
As stated in the pipelines YAML reference, triggers are enabled by default
In the same documentation: if no branch include filter is specified, the trigger fires on all branches
As stated in the documentation on checking out multiple repositories in a pipeline, repository resource triggers only fire for repositories hosted in Azure DevOps
It should therefore be possible to disable the pipeline's own CI trigger (trigger: none) and still have the repository resource triggers fire
The build agent user has been authorized to access and queue new builds
A couple of possible solutions.
First off, I believe your issue is with:
trigger: none
This means the pipeline will only run manually. From the documentation you referenced:
Triggers are enabled by default on all the resources. However, you can choose to override/disable triggers for each resource.
The way this is configured, all push triggers are disabled.
One possible way to achieve what you are attempting, I believe, is to remove trigger: none from p1 and p3.
If I read your question correctly, you are trying to do a CI/CD build and deployment on the repository. If so, and if the scenario is appropriate (i.e. a build will always trigger a deployment), may I suggest combining these pipelines into one and putting an if statement around the deployment stage, similar to:
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/master') }}:
Also, if deploying to multiple environments, this can be followed up with a loop indented one level in:
- ${{ each environmentName in parameters.environmentNames }}:
I noticed you are already using templates, so this would just mean moving the template call up from the job to the stage and having it act as a wrapper; a rough sketch of the combined pipeline follows below. Feel free to provide feedback; if this answer isn't appropriate, I can update it accordingly.
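Putting those pieces together, a self-contained sketch of a single combined CI/CD pipeline might look like the following; the parameter, stage, and job names are placeholders rather than the asker's actual configuration:

parameters:
- name: environmentNames
  type: object
  default: ['dev', 'tst']   # placeholder environment list

trigger:
  branches:
    include:
    - develop

stages:
- stage: Build
  jobs:
  - job: JobBuild
    steps:
    - script: echo "build, tag and publish artifacts"   # placeholder for the real CI steps

# Deployment stages are only inserted at compile time for runs of the develop branch
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/develop') }}:
  - ${{ each environmentName in parameters.environmentNames }}:
    - stage: Deploy_${{ environmentName }}
      dependsOn: Build
      jobs:
      - job: JobDeploy
        steps:
        - script: echo "deploy to ${{ environmentName }}"   # placeholder for the real CD steps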

use files checked out from previous job in another job in an Azure pipeline

I have a pipeline I created in Azure DevOps that builds an Angular application and runs some tests on it. I separated the pipeline into two jobs, Build and Test. The Build job completes successfully. The Test job checks out the code from Git again even though the Build job already did it. The Test job needs the files created in the Build job, such as the npm packages, in order to run successfully.
Here is my YAML file:
trigger:
- develop

variables:
  npm_config_cache: $(Pipeline.Workspace)/.npm
  system.debug: false

stages:
- stage: Client
  pool:
    name: Windows
  jobs:
  - job: Build
    displayName: Build Angular
    steps:
    - template: templates/angularprodbuild.yml
  - job: Test
    displayName: Run Unit and Cypress Tests
    dependsOn: Build
    steps:
    - template: templates/angularlinttest.yml
    - template: templates/angularunittest.yml
    - template: templates/cypresstest.yml
My agent pool is declared at the stage level, so both jobs should use the same agent. I also added a dependsOn to the Test job to ensure the same agent would be used. After checking the logs, the same agent is in fact used.
How can I get the Test job to use the files that were created in the Build job instead of checking out the code again? I'm using Angular 11 and Azure DevOps Server 2020, if that helps.
If you are using a self-hosted agent, by default none of the workspace is cleaned in between two consecutive jobs. As a result, you can do incremental builds and deployments, provided that tasks are implemented to make use of that.
So we can use - checkout: none in the next job to skip checking out the code that was already checked out in the Build job:
- job: Test
  displayName: Run Unit and Cypress Tests
  dependsOn: Build
  steps:
  - checkout: none
  - template: templates/angularlinttest.yml
But just as Bo Søborg Petersen said, dependsOn does not ensure that the same agent is used. You need to add a User Capability to that specific build agent, and then in the pipeline definition you put that capability as a demand:
pool:
  name: string
  demands: string | [ string ]
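For illustration, a demand that pins a job to a single agent in the question's Windows pool could look like this; the agent name is a placeholder:

pool:
  name: Windows
  demands:
  - Agent.Name -equals MyBuildAgent01   # placeholder agent name / capability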
Please check the document How to send TFS build to a specific agent or server for some more info.
In the test job, we could use predefined variables like $(System.DefaultWorkingDirectory) to access the files for Node and npm.
On the other hand, if you are using a hosted agent, you need to use the PublishBuildArtifacts task to publish the artifact, so that the DownloadBuildArtifacts task can download it in the next job:
jobs:
- job: Build
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - script: npm test
  - task: PublishBuildArtifacts@1
    inputs:
      pathtoPublish: '$(System.DefaultWorkingDirectory)'
      artifactName: WebSite

# download the artifact and deploy it only if the build job succeeded
- job: Deploy
  dependsOn: Build
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - checkout: none # skip checking out the default repository resource
  - task: DownloadBuildArtifacts@0
    displayName: 'Download Build Artifacts'
    inputs:
      artifactName: WebSite
      downloadPath: $(System.DefaultWorkingDirectory)
You can check the official documents and examples for some more details.
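As a side note, if your server supports pipeline artifacts, the shorter publish/download keywords express the same idea; this is a minimal sketch reusing the names from the example above, not the original answer's wording:

jobs:
- job: Build
  steps:
  - script: npm test
  - publish: $(System.DefaultWorkingDirectory)   # shorthand for publishing a pipeline artifact
    artifact: WebSite

- job: Deploy
  dependsOn: Build
  steps:
  - checkout: none
  - download: current                            # downloads artifacts published earlier in this run
    artifact: WebSite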
Assume that the agent is cleaned between jobs; to access the files, you need to create an artifact during the build job and then download it during the test job.
Also, dependsOn does not ensure that the same agent is used, only that the second job runs after the first job.
You can also set the second job to not check out the code with - checkout: none.

Conditional Approval gate in deployment jobs in azure pipelines

Since conditional approval doesn't work in Azure YAML pipelines, I've been trying a workaround using 2 environments in the deployment stage, shown in the YAML below.
Using a condition on the jobs and a variable, I want to check whether approval is required or not.
But when I run the pipeline, I see it is still asking for approval even though the condition is not satisfied for the deployment job that requires approval. After approval, though, the job that required approval skips as expected. I don't understand why it's asking for approval.
Are approvals executed for a stage before job conditions are evaluated?
Did I miss something in the YAML?
trigger:
- none

variables:
- group: pipelinevariables
# Agent VM image name
- name: vmImageName
  value: 'ubuntu-latest'

stages:
- stage: Deploy
  displayName: Deploy stage
  jobs:
  - deployment: DeployWebWithoutApprval
    displayName: deploy Web App without approval
    condition: and(succeeded(), ne(variables.DEV_APPROVAL_REQUIRED, 'true'))
    pool:
      vmImage: $(vmImageName)
    # creates an environment if it doesn't exist
    environment: 'app-dev'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo No approval
  - deployment: DeployWebWithApprval
    displayName: deploy Web App with approval
    dependsOn: DeployWebWithoutApprval
    condition: and(eq(dependencies.DeployWebWithoutApprval.result, 'Skipped'), eq(variables.DEV_APPROVAL_REQUIRED, 'true'))
    pool:
      vmImage: $(vmImageName)
    # creates an environment if it doesn't exist
    environment: 'app-dev-with-approval'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo requires approval
Update:
This works if I define 2 stages with the same set of conditions, but that would show 2 stages on the build details page, which we don't want.
Another question is:
Can we conditionally insert a stage template based on a variable value from a variable group?
stages:
- ${{ if eq(variables['Policy_Approval_Required'], 'true') }}:
Inserting a template conditionally is supported; you can check the following link: https://github.com/microsoft/azure-pipelines-agent/issues/1749. Check the following example:
- ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
  - template: sharedstep.yml@templates
    parameters:
      value: true
I have had the exact same issue with approval gates and conditions. It is unfortunately not supported as of yet, but it has been reported to Microsoft (here). There is also this issue.
It seems to be an issue with the order in which approvals and conditions are evaluated.
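One compile-time-safe variant of the conditional-insertion idea is to drive the choice with a runtime parameter rather than the variable group value, since parameters are always resolved at template-expansion time; the parameter name, template path, and environment names below are placeholders:

parameters:
- name: approvalRequired
  type: boolean
  default: false

stages:
# Only one of these entries survives template expansion, so only one stage shows in the run
- ${{ if eq(parameters.approvalRequired, false) }}:
  - template: deploy-stage.yml      # hypothetical stage template
    parameters:
      environmentName: 'app-dev'
- ${{ if eq(parameters.approvalRequired, true) }}:
  - template: deploy-stage.yml
    parameters:
      environmentName: 'app-dev-with-approval'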

How to use different Service Connection for every stage in Azure Pipelines?

When using multi-stage YAML pipelines in Azure Pipelines where every stage deploys resources to a separate environment, I'd like to use a dedicated service connection for each stage. In my case every stage makes use of the same deployment jobs, i.e. YAML templates, so I'm using a lot of variables whose values depend on the environment. This works fine, except for the service connection.
Ideally, the variable that contains the service connection name is added at the stage level, like this:
stages:
- stage: Build
  # (Several build-stage specific jobs here)

- stage: DeployToDEV
  dependsOn: Build
  condition: succeeded()
  variables:
    AzureServiceConnection: 'AzureSubscription_DEV' # This seems like a logical solution
  jobs:
  # This job would ideally reside in a yaml template
  - job: DisplayDiagnostics
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - checkout: none
    - task: AzurePowerShell@4
      inputs:
        azureSubscription: $(AzureServiceConnection)
        scriptType: inlineScript
        inline: |
          Get-AzContext
        azurePowerShellVersion: LatestVersion

- stage: DeployToTST
  dependsOn: Build
  condition: succeeded()
  variables:
    AzureServiceConnection: 'AzureSubscription_TST' # Same variable, different value
  jobs:
  # (Same contents as DeployToDEV stage)
When this code snippet is executed, it results in the error message:
There was a resource authorization issue: "The pipeline is not valid.
Job DisplayDiagnostics: Step AzurePowerShell input
ConnectedServiceNameARM references service connection
$(AzureServiceConnection) which could not be found. The service
connection does not exist or has not been authorized for use. For
authorization details, refer to https://aka.ms/yamlauthz.
So it probably can't expand the variable AzureServiceConnection early enough when the run is started. But if that's indeed the case, what is the alternative for using separate service connections for every stage?
One option that certainly works is setting the service connection name directly on all tasks, but that would mean duplicating identical YAML tasks for every stage, which I obviously want to avoid.
Does anyone have a clue about this? Thanks in advance!
Currently you cannot pass a variable as a serviceConnection.
Apparently the service connection name is picked up when the pipeline is parsed (on push/commit), and whatever is there at that point is what gets used.
E.g. if you have $(variable), it will pick up the literal $(variable) instead of the value.
The workaround I have used so far is to use a template for the steps at each stage and pass a different parameter with the service connection.
Refer to https://github.com/venura9/azure-devops-yaml/blob/master/azure-pipelines.yml for a sample implementation; you are more than welcome to submit a pull request with updates.
- stage: DEV
  displayName: 'DEV(CD)'
  condition: and(succeeded('BLD'), eq(variables['Build.SourceBranch'], 'refs/heads/develop'))
  dependsOn:
  - BLD
  variables:
    stage: 'dev'
  jobs:
  - job: Primary_AustraliaSouthEast
    pool:
      vmImage: $(vmImage)
    steps:
    - template: 'pipelines/infrastructure/deploy.yml'
      parameters: {type: 'primary', spn: 'SuperServicePrincipal', location: 'australiasoutheast'}
    - template: 'pipelines/application/deploy.yml'
      parameters: {type: 'primary', spn: 'SuperServicePrincipal'}
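For completeness, here is a hypothetical sketch of what such a step template could look like, with the service connection arriving as a compile-time parameter rather than a variable; the file path, parameter names, and task choice are assumptions based on the snippets above:

# pipelines/infrastructure/deploy.yml (hypothetical)
parameters:
- name: type
  type: string
- name: spn
  type: string
- name: location
  type: string
  default: ''

steps:
- task: AzurePowerShell@4
  displayName: Show Azure context (${{ parameters.type }})
  inputs:
    azureSubscription: ${{ parameters.spn }}   # resolved at compile time, unlike $(variable)
    scriptType: inlineScript
    inline: |
      Get-AzContext
    azurePowerShellVersion: LatestVersion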