I defined the following stage in my Azure DevOps release:
steps:
- bash: |
    # Write your commands here
    echo 'Hello world'
    curl -X POST -H "Authorization: Bearer dapiXXXXXXXX" -d @conf/dbfs_api.json https://adb-YYYYYYYY.X.azuredatabricks.net/api/2.0/jobs/create > file.json
  displayName: 'Bash Script'
My repo has a folder called conf with the file dbfs_api.json inside of it. Unfortunately, this file is not found during the deployment of this stage and I get the following error:
Couldn't read data from file "D:\a\r1\a/conf/dbfs_api.json", this makes an empty POST.
The release stages of an Azure Pipelines workflow don't perform a checkout by default. You can manually add a checkout task to the release stage to perform a checkout:
A deployment job doesn't automatically clone the source repo. You can check out the source repo within your job with checkout: self. Deployment jobs only support one checkout step.
jobs:
- deployment:
  environment: 'prod'
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
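With the checkout in place, the bash step from the question can then read the file relative to the sources directory, for example (a sketch reusing the redacted token and workspace URL from the question):
steps:
- checkout: self
- bash: |
    # conf/dbfs_api.json is now available in the checked-out sources
    curl -X POST -H "Authorization: Bearer dapiXXXXXXXX" -d @conf/dbfs_api.json https://adb-YYYYYYYY.X.azuredatabricks.net/api/2.0/jobs/create > file.json
  workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Bash Script'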
Or you can create a pipeline artifact in the build stage and consume the artifact in a later stage.
stages:
- stage: build
  jobs:
  - job:
    steps:
    - publish: '$(Build.ArtifactStagingDirectory)/scripts'
      artifact: drop
- stage: test
  dependsOn: build
  jobs:
  - job:
    steps:
    - download: current
      artifact: drop
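After the download step, the artifact contents end up under $(Pipeline.Workspace)/drop, so, assuming the build stage copied conf/dbfs_api.json into the published folder, the later stage could reference it like this:
- bash: curl -X POST -H "Authorization: Bearer dapiXXXXXXXX" -d @$(Pipeline.Workspace)/drop/dbfs_api.json https://adb-YYYYYYYY.X.azuredatabricks.net/api/2.0/jobs/create > file.json
  displayName: 'Bash Script'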
See:
Use Artifacts across stages
Deployment jobs
I have 3 environments in Azure: Sandbox, Test and Prod.
I have a YAML pipeline in Azure DevOps which builds the infrastructure. The environment built depends on the variables in the Terraform code.
The same pipeline is used to deploy to all environments depending on conditions in the YAML file. I want Dev to trigger on a merge to master but only want Test and Prod to deploy manually. How can I set this up in the YAML file?
For Dev, you can set up build validation of branch 'Master':
And in the Master branch, the YML file should be like this:
trigger:
- master
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
This makes sure the pipeline of the master branch (Dev environment) only triggers after the PR merge is completed.
For the Test and Prod environments, you can create a branch containing a YML file with the same name that the pipeline is looking for, and use the below YML definition:
trigger: none
pool:
  vmImage: ubuntu-latest
steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
This makes sure Test and Prod can only be triggered manually.
The above solution needs just one pipeline.
Based on your requirement, you can add the condition in your YAML pipeline.
Dev to trigger on a merge to master
You can check the variables Build.Reason and System.PullRequest.TargetBranch.
For example:
condition: and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
want test and prod to deploy manually
You can check the variable Build.Reason:
condition: eq(variables['Build.Reason'], 'Manual')
YAML example:
stages:
- stage: Dev
  condition: and(eq(variables['Build.Reason'], 'PullRequest'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
  jobs:
  - job: A
    steps:
    - xx
- stage: Prod
  condition: eq(variables['Build.Reason'], 'Manual')
  jobs:
  - job: B
    steps:
    - xx
- stage: Test
  condition: eq(variables['Build.Reason'], 'Manual')
  jobs:
  - job: C
    steps:
    - xx
When the pipeline is triggered by Pull Request and the target branch is master, the Dev stage will run.
When the pipeline is triggered manually, the Test and Prod stages will run.
Refer to these docs: Conditions and Predefined variables.
I have 30 or so Java microservices that run off the same CI and CD templates, i.e. each of my microservices has a build pipeline as follows, and as shown below it runs automatically on a merge to master:
name: 0.0.$(Rev:r)
trigger:
- master
pr: none
variables:
- group: MyCompany
resources:
  repositories:
  - repository: templates
    type: git
    name: <id>/yaml-templates
stages:
- stage: Build
  jobs:
  - job: build
    displayName: Build
    steps:
    - template: my-templates/ci-build-template.yaml@templates
- stage: PushToContainerRegistry
  dependsOn: Build
  condition: succeeded()
  jobs:
  - job: PushToContainerRegistry
    displayName: PushToContainerRegistry
    steps:
    - template: my-templates/ci-template.yaml@templates
Where ci-build-template.yaml contains...
steps:
- checkout: self
  path: s
- task: PowerShell@2
- task: Gradle@2
  displayName: 'Gradle Build'
- task: SonarQubePrepare@4
  displayName: SonarQube Analysis
- task: CopyFiles@2
  displayName: Copy build/docker to Staging Directory
I would like to implement PR build validation whenever someone raises a PR to merge into master. In the PR build, only the Build stage should run, and from the build template only some of the tasks within ci-build-template.yaml should run.
A few questions for my learning:
How can i uplift the yaml pipeline above to make the "PushToContainerRegistry" skip if it is a pr build?
How can i uplift ci-build-template.yaml to make the "SonarQubePrepare#4" and "CopyFiles#2" skip if it is a pr build?
And lastly how can i uplift the yaml pipeline above to enable build validation for all pr's that have a target branch of master?
Whilst doing my own research i know you can do this via clickops but I am keep on learning on how to implement via yaml.
thanks
How can I uplift the YAML pipeline above to make "PushToContainerRegistry" skip if it is a PR build?
How can I uplift ci-build-template.yaml to make "SonarQubePrepare@4" and "CopyFiles@2" skip if it is a PR build?
You just need to use a condition on the task:
For example,
pool:
  name: Azure Pipelines
steps:
- script: |
    echo Write your commands here
    echo Hello world
    echo $(Build.Reason)
  displayName: 'Command Line Script'
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
The above definition will skip the step if a pull request triggered the pipeline.
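Applied to the pipeline from the question, a hedged sketch of the stage-level condition might look like this (other stage details omitted):
- stage: PushToContainerRegistry
  dependsOn: Build
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
  jobs:
  - job: PushToContainerRegistry
    steps:
    - template: my-templates/ci-template.yaml@templates
And inside ci-build-template.yaml the same expression can go on the individual tasks you want to skip for PR builds:
- task: SonarQubePrepare@4
  displayName: SonarQube Analysis
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
- task: CopyFiles@2
  displayName: Copy build/docker to Staging Directory
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))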
Please refer to these documents:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=yaml#using-the-trigger-type-in-conditions
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#build-variables-devops-services
And lastly, how can I uplift the YAML pipeline above to enable build validation for all PRs that have a target branch of master?
You can use this expression in the condition:
eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master')
If you are using an Azure DevOps Git repo, then you just need to add a branch policy:
https://learn.microsoft.com/en-us/azure/devops/repos/git/branch-policies?view=azure-devops&tabs=browser#configure-branch-policies
I have two pipelines - build and publish. The build pipeline can produce up to two artifacts, but it depends on the given parameters.
The publish pipeline is automatically triggered when the build pipeline completes. The publish pipeline then takes the published artifacts and deploys them. However, I want to run the publish tasks only if the particular artifacts exist from the build pipeline.
Right now, if an artifact does not exist, the "download" task fails.
Simplified to the important parts, with some secret info redacted:
resources:
  pipelines:
  - pipeline: buildDev # Internal name of the source pipeline, used elsewhere within app-ci YAML, e.g. to reference published artifacts
    source: "Build"
    trigger:
      branches:
      - dev
      - feat/*
stages:
- stage: publish
  displayName: "🚀🔥 Publish to Firebase"
  jobs:
  - job: publish_firebase_android
    displayName: "🔥🤖Publish Android to Firebase"
    steps:
    - script: |
    - download: buildDev
      artifact: android
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload>
      displayName: "Deploy APK to Firebase"
      workingDirectory: "$(Pipeline.Workspace)/buildDev/android/"
  - job: publish_firebase_ios
    displayName: "🔥🍏Publish iOS to Firebase"
    steps:
    - download: buildDev
      artifact: iOS
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload...>
      workingDirectory: "$(Pipeline.Workspace)/buildDev/iOS/"
      displayName: "Deploy IPA to Firebase"
I've tried to find a solution, but every other solution only solves the problem within the same pipeline. Based on the MS docs I can't find whether there is a predefined environment variable that could point to the "pipeline resources". With such a variable I could theoretically run a script which checks the presence of the artifact, sets a variable, and uses that variable as a condition for the steps.
I think you can use stage filters in the trigger. I don't know what the structure of your build pipeline is, but you can set up a stage to publish artifacts. Execute that stage if there are artifacts to publish, otherwise skip it. You can do this using conditions. Here is a simple sample:
stages:
- stage: Build
  jobs:
  - job: build
    steps:
    ...
- stage: Artifact
  condition: ... # Set the condition based on your parameter
  jobs:
  - job: artifact
    steps:
    ...
Then use the stage filter in the publishing pipeline. If the stage executes successfully, then the publish pipeline will run, otherwise, the publish pipeline will not run.
resources:
  pipelines:
  - pipeline: buildpipeline
    source: buildpipeline
    trigger:
      stages:
      - Artifact
Using variable groups is an option as well. You can use variable groups to pass a variable from one pipeline to another. Here are the detailed steps:
(1). Create a variable group in Pipelines/Library and add a new Variable. I will call this variable "var" later.
(2). In your build pipeline, you can update "var" based on your parameters:
variables:
- group: {group name}

steps:
- bash: |
    az pipelines variable-group variable update --group-id {id} --name var --value yes
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  condition: ...
Tip 1. If you don't know your variable group id, go to Pipelines/Library and select your variable group. You can find it in the URL: https://dev.azure.com/...&variableGroupId={id}&...
Tip 2. If you meet the error "You do not have permissions to perform this operation on the variable group.", go to Pipelines/Library and select your variable group. Click on "Security" and give "{pipeline name} Build Service" user the Administrator role.
Tip 3. Use your parameter in condition to decide whether to update var.
(3). In your publish pipeline, you can use "var" from the variable group in a condition:
condition: eq(variables['var'], 'yes')
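For example, a minimal sketch of a publish job gated on that flag (the group name and job layout are placeholders, not taken from your pipeline):
variables:
- group: {group name}

jobs:
- job: publish_firebase_android
  condition: eq(variables['var'], 'yes')
  steps:
  - download: buildDev
    artifact: android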
Problem
I have two repositories A and B in one project, each with their own pipeline A-CI and B-CI. The repos are Azure Repos Git (so not external ones). I got it working to trigger pipeline B-CI whenever A-CI has completed. If A-CI got triggered by a commit to branch develop, then B-CI is triggered to build master although B also has a develop branch.
I want to build a new release of B for the dev environment, when a new dev version of A has been built.
Is it possible to let a pipeline resource trigger the B-CI pipeline to build the branch with the same name as the branch which was just built by the pipeline resource? It would be fine for me if it fell back to master in case a matching branch is not available in B.
This scenario does work, however, if A-CI and B-CI both refer to different pipeline YAMLs of the same repository.
Pipeline YAMLs
A-CI
trigger:
- '*'
stages:
- stage: Build
  jobs:
  - job: BuildJob
    pool:
      name: 'MyBuildPool'
    steps:
    - powershell: |
        Write-Host "Building A"
B-CI
resources:
  pipelines:
  - pipeline: Pipeline_A
    source: 'A-CI'
    trigger:
      branches:
      - master
      - develop
      - feature/*
trigger:
- '*'
stages:
- stage: Build
  jobs:
  - job: BuildJob
    pool:
      name: MyBuildPool
    steps:
    - powershell: |
        Write-Host $(Build.SourceBranch) # is always refs/heads/master
        Write-Host $(Build.Reason)       # is ResourceTrigger
Background Info
The main idea behind that is, that A contains the IaC project and whenever the infrastructure of the project changes, then all apps should be deployed, too.
I do not want to put the IaC into the app repo because we have multiple apps, so I would have to split the IaC code into several chunks.
And then I would probably still have the same problem because some resources, like Azure KeyVault, are shared among the apps so A would still include the common stuff used by all apps and changes to it would require re-deployments of all apps.
Please check pipeline triggers:
If the triggering pipeline and the triggered pipeline use the same repository, then both the pipelines will run using the same commit when one triggers the other. This is helpful if your first pipeline builds the code, and the second pipeline tests it.
However, if the two pipelines use different repositories, then the triggered pipeline will use the latest version of the code from its default branch.
In this case, since master is the default branch of your B-CI, $(Build.SourceBranch) is always refs/heads/master.
As a workaround:
You can create a new YAML pipeline for repository B, using content similar to the YAML file for B-CI. You only need to change the resource trigger in it to:
resources:
  pipelines:
  - pipeline: Pipeline_A
    source: 'A-CI'
    trigger:
      branches:
      - develop
When we create the new YAML file, it's always placed in the master branch. For me, I created a file with the same name in the dev branch and copied the same content into it. Then I deleted the new YAML file in the master branch; now when the dev branch of the A-CI pipeline is built, the dev branch of repository B will be used.
I think I have found a nicely working workaround as the built-in pipeline-triggers are not addressing our specific problem (though I can't say if we have an odd approach and there are better ways).
What I am doing now is to use the Azure CLI DevOps extension based on this docs entry and trigger the pipelines manually.
Pipeline YAMLs
A-CI
trigger:
- '*'
stages:
- stage: Build
  displayName: Build something and create a pipeline artifact
  jobs:
  - job: BuildJob
    pool:
      name: 'MyBuildPool'
    steps:
    - powershell: |
        Write-Host "Building A"
    # steps to publish artifact omitted
- stage: TriggerAppPipelines
  displayName: Trigger App Pipeline
  jobs:
  - job: TriggerAppPipelinesJob
    displayName: Trigger App Pipeline
    steps:
    - bash: az extension list | grep azure-devops
      displayName: 'Ensure Azure CLI DevOps extension is installed'
    - bash: |
        echo ${AZURE_DEVOPS_CLI_PAT} | az devops login
        az devops configure --defaults organization=https://dev.azure.com/MyOrg project="MyProject" --use-git-aliases true
      displayName: 'Login Azure CLI'
      env:
        AZURE_DEVOPS_CLI_PAT: $(System.AccessToken)
    # By passing the build Id of this A-CI run, I can use that in B-CI to download the matching artifact of A-CI.
    # If there is no matching branch, then the command fails.
    - bash: |
        az pipelines run --branch $(Build.SourceBranch) --name "B-CI" --variables a_Version="$(Build.BuildId)" -o none
      displayName: 'Trigger pipeline'
B-CI
trigger:
- '*'
stages:
- stage: Build
  jobs:
  - job: BuildJob
    pool:
      name: MyBuildPool
    steps:
    - powershell: |
        Write-Host $(Build.SourceBranch) # is the same as the triggering A-CI branch
        Write-Host $(Build.Reason)       # B-CI is triggered manually, but the user is Project Collection Build Service, so automated runs can be distinguished
As B-CI is triggered manually now, there is no need for a resource node anymore.
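If B-CI also needs the matching artifact from that A-CI run, the a_Version variable passed above could be fed into a DownloadPipelineArtifact task, roughly like this (project, pipeline ID and artifact name are placeholders, not from the original pipelines):
- task: DownloadPipelineArtifact@2
  displayName: 'Download artifact of the triggering A-CI run'
  inputs:
    source: 'specific'
    project: 'MyProject'     # placeholder
    pipeline: 42             # definition ID of A-CI (placeholder)
    runVersion: 'specific'
    runId: $(a_Version)      # the build Id passed via --variables above
    artifact: 'drop'         # placeholder artifact name
    path: '$(Pipeline.Workspace)/a-ci'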
I'm trying to convert, but after reading some sources I'm not sure how I can convert this part of YAML code that works with GitLab CI to Azure Pipelines YAML:
build:
  stage: build
  script:
    - npm run build
  artifacts:
    paths:
      - dist
  only:
    - master

deploy:
  stage: deploy
  script:
    - npm i -g netlify-cli
    - netlify deploy --site $NETLIFY_SITE_ID --auth $NETLIFY_AUTH_TOKEN --prod
  dependencies:
    - build
  only:
    - master
In particular, I want to set the artifact path in the build stage and then somehow use that in the deploy stage.
Here is how it looks now in my azure-pipelines YAML:
- script: |
    npm run build
  displayName: 'Build'
- script: |
    npm i -g netlify-cli
    netlify deploy --site $NETLIFY_SITE_ID --auth $NETLIFY_AUTH_TOKEN --prod
  displayName: 'Deploy'
See the sample below:
variables:
- name: netlify.site.id
  value: {value}
- name: netlify.auth.token
  value: {token}

trigger:
- master

pool:
  vmImage: 'vs2017-win2016'

stages:
- stage: Build
  jobs:
  - job: ARM
    steps:
    - script: npm -version
    - publish: $(System.DefaultWorkingDirectory)
      artifact: dist
- stage: Deploy
  dependsOn: Build
  condition: succeeded()
  jobs:
  - job: APP
    steps:
    - download: current   # pipeline artifacts are not downloaded automatically in a regular job
      artifact: dist
    - bash: |
        npm i -g netlify-cli
        netlify deploy --site $(netlify.site.id) --auth $(netlify.auth.token) --prod
Tip 1: If the values of netlify.auth.token and netlify.site.id are private and you do not want them public in the YAML, you can store them in a variable group and then change the variables part to:
variables:
- group: {group name}
See this doc.
Tip 2: For the stage dependency, you can use the dependsOn keyword in VSTS YAML to achieve the dependency. See this.
Tip 3: In VSTS, you must specify stages, jobs and steps, which are used as the entry points for the server to compile the respective parts.
A stage is a collection of related jobs.
A job is a collection of steps to be run by an agent or on the
server.
Steps are a linear sequence of operations that make up a job.
Tip 4: To publish artifacts in VSTS with YAML, there are 2 different formats. One is what I showed above: the publish keyword is a shortcut for the Publish Pipeline Artifact task.
For the other format, see Publishing artifacts.
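For reference, the long-hand task form that the publish shortcut maps to looks roughly like this (reusing the paths from the sample above):
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(System.DefaultWorkingDirectory)'
    artifact: 'dist'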