Gitlab yaml to azure pipelines yaml file - azure-devops

I'm trying to convert, but after reading some sources I'm not sure how to convert this part of a YAML file that works with GitLab CI to Azure Pipelines YAML:
build:
  stage: build
  script:
    - npm run build
  artifacts:
    paths:
      - dist
  only:
    - master

deploy:
  stage: deploy
  script:
    - npm i -g netlify-cli
    - netlify deploy --site $NETLIFY_SITE_ID --auth $NETLIFY_AUTH_TOKEN --prod
  dependencies:
    - build
  only:
    - master
In particular, I want to set the artifact path in the build stage and then somehow consume it in the deploy stage.
Here is how it looks now in my azure-pipelines YAML:
- script: |
    npm run build
  displayName: 'Build'
- script: |
    npm i -g netlify-cli
    netlify deploy --site $NETLIFY_SITE_ID --auth $NETLIFY_AUTH_TOKEN --prod
  displayName: 'Deploy'

See the sample below:
variables:
- name: netlify.site.id
  value: {value}
- name: netlify.auth.token
  value: {token}

trigger:
- master

pool:
  vmImage: 'vs2017-win2016'

stages:
- stage: Build
  jobs:
  - job: ARM
    steps:
    - script: npm -version
    - publish: $(System.DefaultWorkingDirectory)
      artifact: dist
- stage: Deploy
  dependsOn: Build
  condition: succeeded()
  jobs:
  - job: APP
    steps:
    - bash: |
        npm i -g netlify-cli
        netlify deploy --site $(netlify.site.id) --auth $(netlify.auth.token) --prod
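One gap worth noting in the sample: the Deploy stage never downloads the dist artifact that the Build stage published. A minimal sketch of the APP job with a download step added (the netlify --dir flag and the deploy-from-artifact layout are assumptions, not part of the original answer):

```yaml
- job: APP
  steps:
  # Download the 'dist' artifact published by the Build stage
  - download: current
    artifact: dist
  - bash: |
      npm i -g netlify-cli
      # Assumes the built site should be deployed from the downloaded artifact
      netlify deploy --dir $(Pipeline.Workspace)/dist --site $(netlify.site.id) --auth $(netlify.auth.token) --prod
```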
Tip1: If the values of netlify.auth.token and netlify.site.id are private and you do not want them exposed in the YAML, you can store them in a variable group and then change the variables part as:
variables:
- group: {group name}
See this doc.
Tip2: For the stage dependency, you can use the dependsOn keyword in the YAML to achieve it. See this.
Tip3: In VSTS, you must specify stages, jobs and steps, which serve as the entry points the server uses to compile the respective parts.
A stage is a collection of related jobs.
A job is a collection of steps to be run by an agent or on the server.
Steps are a linear sequence of operations that make up a job.
Tip4: To publish artifacts in VSTS with YAML, there are two different formats. One is shown above; the publish keyword is a shortcut for the Publish Pipeline Artifact task.
For the other format, see Publishing artifacts.
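For reference, the longer task form equivalent to the publish shortcut above looks roughly like this (a sketch; the artifact name is arbitrary):

```yaml
steps:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: $(System.DefaultWorkingDirectory)
    artifact: dist
```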

Related

Extending my Azure DevOps Pipeline to include Build validation for all Pull Requests

I have 30 or so Java microservices that run off the same CI and CD template, i.e. each of my microservices has a build pipeline as follows, and as shown below it runs automatically on a merge to master:
name: 0.0.$(Rev:r)

trigger:
- master

pr: none

variables:
- group: MyCompany

resources:
  repositories:
    - repository: templates
      type: git
      name: <id>/yaml-templates

stages:
- stage: Build
  jobs:
  - job: build
    displayName: Build
    steps:
    - template: my-templates/ci-build-template.yaml@templates
- stage: PushToContainerRegistry
  dependsOn: Build
  condition: succeeded()
  jobs:
  - job: PushToContainerRegistry
    displayName: PushToContainerRegistry
    steps:
    - template: my-templates/ci-template.yaml@templates
Where ci-build-template.yaml contains...
steps:
- checkout: self
  path: s
- task: PowerShell@2
- task: Gradle@2
  displayName: 'Gradle Build'
- task: SonarQubePrepare@4
  displayName: SonarQube Analysis
- task: CopyFiles@2
  displayName: Copy build/docker to Staging Directory
I would like to implement PR build validation whenever someone raises a PR to merge into master. In a PR build, only the Build stage should run, and from ci-build-template.yaml only some of the tasks should run.
A few questions for my learning:
How can I uplift the YAML pipeline above to make "PushToContainerRegistry" skip if it is a PR build?
How can I uplift ci-build-template.yaml to make "SonarQubePrepare@4" and "CopyFiles@2" skip if it is a PR build?
And lastly, how can I uplift the YAML pipeline above to enable build validation for all PRs that have a target branch of master?
Whilst doing my own research I know you can do this via ClickOps, but I am keen to learn how to implement it via YAML.
thanks
How can I uplift the YAML pipeline above to make the "PushToContainerRegistry" skip if it is a PR build?
How can I uplift ci-build-template.yaml to make the "SonarQubePrepare@4" and "CopyFiles@2" skip if it is a PR build?
You just need to use a condition on the task. For example:
pool:
  name: Azure Pipelines

steps:
- script: |
    echo Write your commands here
    echo Hello world
    echo $(Build.Reason)
  displayName: 'Command Line Script'
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
The above definition will skip the step if a pull request triggered the pipeline.
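The same kind of condition can also be placed on a whole stage, which is what skipping "PushToContainerRegistry" for PR builds calls for; a sketch based on the pipeline in the question:

```yaml
- stage: PushToContainerRegistry
  dependsOn: Build
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
  jobs: ...
```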
Please refer to these documents:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=yaml#using-the-trigger-type-in-conditions
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#build-variables-devops-services
And lastly, how can I uplift the YAML pipeline above to enable build validation for all PRs that have a target branch of master?
You can use this expression in the condition:
eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master')
If your repo is an Azure DevOps Git repo, then you just need to add a branch policy:
https://learn.microsoft.com/en-us/azure/devops/repos/git/branch-policies?view=azure-devops&tabs=browser#configure-branch-policies

Azure DevOps - run task only if artifact exists from build

I have two pipelines: build and publish. The build pipeline can produce up to two artifacts, depending on the given parameters.
The publish pipeline is triggered automatically when the build pipeline completes. It then takes the published artifacts and deploys them. However, I want to run the publish tasks only if the particular artifacts exist from the build pipeline.
Right now, if an artifact does not exist, the "download" task fails.
Simplified to the important parts, with some secret info redacted:
resources:
  pipelines:
    - pipeline: buildDev  # Internal name of the source pipeline, used elsewhere
                          # within app-ci YAML, e.g. to reference published artifacts
      source: "Build"
      trigger:
        branches:
          - dev
          - feat/*

stages:
  - stage: publish
    displayName: "🚀🔥 Publish to Firebase"
    jobs:
      - job: publish_firebase_android
        displayName: "🔥🤖Publish Android to Firebase"
        steps:
          - download: buildDev
            artifact: android
          - download: buildDev
            artifact: changelog
          - task: DownloadSecureFile@1
            name: firebaseKey
            displayName: "Download Firebase key"
            inputs:
              secureFile: "<secure>.json"
          - script: <upload>
            displayName: "Deploy APK to Firebase"
            workingDirectory: "$(Pipeline.Workspace)/buildDev/android/"
      - job: publish_firebase_ios
        displayName: "🔥🍏Publish iOS to Firebase"
        steps:
          - download: buildDev
            artifact: iOS
          - download: buildDev
            artifact: changelog
          - task: DownloadSecureFile@1
            name: firebaseKey
            displayName: "Download Firebase key"
            inputs:
              secureFile: "<secure>.json"
          - script: <upload...>
            workingDirectory: "$(Pipeline.Workspace)/buildDev/iOS/"
            displayName: "Deploy IPA to Firebase"
I've tried to find a solution, but every solution I found only solves the problem within the same pipeline. Based on the MS docs I can't tell whether there is a predefined environment variable that points to the "pipeline resources". With such a variable I could run a script that checks for the presence of the artifact, sets a variable, and uses that variable as a condition on the steps.
I think you can use stage filters in the trigger. I don't know what structure your build pipeline has, but you can set up a stage that publishes the artifacts: execute that stage if there are artifacts to publish, otherwise skip it. You can do this using conditions. Here is a simple sample:
stages:
- stage: Build
  jobs:
  - job: build
    steps:
    ...
- stage: Artifact
  condition: ... # Set the condition based on your parameter
  jobs:
  - job: artifact
    steps:
    ...
Then use the stage filter in the publishing pipeline. If that stage executes successfully, the publish pipeline will run; otherwise it will not.
resources:
  pipelines:
  - pipeline: buildpipeline
    source: buildpipeline
    trigger:
      stages:
      - Artifact
Using variable groups is an option as well. You can use a variable group to pass a variable from one pipeline to another. Here are the detailed steps:
(1). Create a variable group in Pipelines/Library and add a new Variable. I will call this variable "var" later.
(2). In your build pipeline, update "var" based on your parameters:
variables:
- group: {group name}

steps:
- bash: |
    az pipelines variable-group variable update --group-id {id} --name var --value yes
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  condition: ...
Tip 1. If you don't know your variable group id, go to Pipelines/Library and select your variable group. You can find it in the URL: https://dev.azure.com/...&variableGroupId={id}&...
Tip 2. If you meet the error "You do not have permissions to perform this operation on the variable group.", go to Pipelines/Library and select your variable group. Click on "Security" and give "{pipeline name} Build Service" user the Administrator role.
Tip 3. Use your parameter in condition to decide whether to update var.
(3). In your publish pipeline, use var from the variable group in a condition:
condition: eq(variables['var'], 'yes')
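Putting step (3) together, a minimal sketch of the publish pipeline side, pulling in the same variable group and gating a step on var (the echo step is a placeholder for your real deploy step):

```yaml
variables:
- group: {group name}

steps:
- script: echo deploying
  displayName: 'Runs only when var is yes'
  condition: eq(variables['var'], 'yes')
```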

How do you copy azure repo folders to a folder on a VM in an Environment in a pipeline?

I have an Environment called 'Dev' that has a resource, which is a VM. As part of the 'Dev' pipeline I want to copy files from a specific folder on the develop branch of a specific repo to a specific folder on the VM that's on the Environment.
I've not worked with Environments before, or with YAML pipelines much, but I gather I need to use the CopyFiles@2 task.
So I've got an azure pipeline yaml file something like this:
variables:
  isDev: $[eq(variables['Build.SourceBranch'], 'refs/heads/develop')]

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files'
      inputs:
        contents: 'myFolder\**'
        Overwrite: true
        targetFolder: $(Build.ArtifactStagingDirectory)
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: $(Build.ArtifactStagingDirectory)
        artifactName: myArtifact
- stage: Deployment
  dependsOn: Build
  condition: and(succeeded(), eq(variables.isDev, true))
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    pool:
      vmImage: 'windows-latest'
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Foo Bar
The first question is: how do I get this to copy the files to a specific path on the Dev environment?
Is the PublishBuildArtifacts step really needed? I ask because I want this to copy files every time the pipeline runs, and not error if the artifact already exists.
It also feels a bit dirty to have to check that the branch is correct this way. Is there a better way to do it?
The deployment strategy you're using relies on specifying an agent pool, which means it doesn't run on the machines in the environment. If you use a strategy such as rolling, it will run the specified steps on those machines automatically, including any download steps to download artifacts.
Ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#deployment-strategies
You need to publish artifacts as part of the pipeline if you want them to be automatically available to downstream jobs. Each run gets a different set of artifacts, even if the actual artifact contents are the same.
That said, based on the YAML you posted, you probably don't need to. In fact, you don't need the "Build" stage at all: you could just add a checkout step during your rolling deployment, and the repo would be cloned on each of the target machines.
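A rough sketch of that suggestion, assuming the Dev environment from the question; the destination path on the VM is hypothetical:

```yaml
jobs:
- deployment: Deploy
  environment:
    name: Dev
    resourceType: VirtualMachine
  strategy:
    rolling:
      deploy:
        steps:
        # Runs on the environment's VMs, so the repo is cloned there
        - checkout: self
        - task: CopyFiles@2
          inputs:
            contents: 'myFolder\**'
            targetFolder: 'C:\MyDestination'  # hypothetical path on the VM
            Overwrite: true
```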
Ok, worked this out with help from this article: https://dev.to/kenakamu/azure-devops-yaml-release-pipeline-trigger-when-build-pipeline-completed-54d5.
I've taken the advice from Daniel Mann regarding the strategy being 'rolling'. I then split my pipeline into two pipelines: one for building the artifacts and one for releasing (copying) them.
If you want to download just particular folders instead of all the source files from the repository, you can try using the REST API "Items - Get" to download each folder individually.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path={path}&download=true&$format=zip&versionDescriptor.version={versionDescriptor.version}&resolveLfs=true&api-version=6.0
For example:
Say the repository contains a folder res/TestFolder01, like below.
Now, in the YAML pipeline, I just want to download the 'TestFolder01' folder from the main branch.
jobs:
- job: build
  . . .
  steps:
  - checkout: none  # Do not check out all the source files.
  - task: Bash@3
    displayName: 'Download particular folder'
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
    inputs:
      targetType: inline
      script: |
        curl -X GET \
          -o TestFolder01.zip \
          -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'
This will download the 'TestFolder01' folder as a ZIP file (TestFolder01.zip) into the current working directory. You can use the unzip command to decompress it.
[UPDATE]
If you want to download the particular folders in a deploy job that targets your VM environment: yes, the folders will be downloaded into the pipeline working directory on the VM.
You can think of a VM-type environment resource as a self-hosted agent installed on the VM. So, when your deploy job targets the VM environment resource, it runs on that self-hosted agent on the VM.
The pipeline working directory is under the directory where you installed the VM environment resource (self-hosted agent). Normally, you can use the variable $(Pipeline.Workspace) to get this path (see here).
stages:
- stage: Deployment
  jobs:
  - deployment: Deploy
    displayName: 'Deploy to Dev'
    environment: 'Dev.VM-01'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: Bash@3
            displayName: 'Download particular folder'
            env:
              SYSTEM_ACCESSTOKEN: $(System.AccessToken)
            inputs:
              targetType: inline
              script: |
                echo "Current working directory: $PWD"
                curl -X GET \
                  -o TestFolder01.zip \
                  -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'

azure devops release stage bash script

I defined the following stage in my Azure DevOps release:
steps:
- bash: |
    # Write your commands here
    echo 'Hello world'
    curl -X POST -H "Authorization: Bearer dapiXXXXXXXX" -d @conf/dbfs_api.json https://adb-YYYYYYYY.X.azuredatabricks.net/api/2.0/jobs/create > file.json
  displayName: 'Bash Script'
my repo has a folder called conf with the file dbfs_api.json inside it; unfortunately this file is not found during the deployment of this stage and I get the following error:
Couldn't read data from file "D:\a\r1\a/conf/dbfs_api.json", this makes an empty POST.
The release stages of an Azure Pipelines workflow don't perform a checkout by default. You can manually add a checkout task to the release stage to perform a checkout:
A deployment job doesn't automatically clone the source repo. You can checkout the source repo within your job with checkout: self. Deployment jobs only support one checkout step.
jobs:
- deployment:
  environment: 'prod'
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
Or you can create a pipeline artifact in the build stage and consume it in a later stage.
stages:
- stage: build
  jobs:
  - job:
    steps:
    - publish: '$(Build.ArtifactStagingDirectory)/scripts'
      artifact: drop
- stage: test
  dependsOn: build
  jobs:
  - job:
    steps:
    - download: current
      artifact: drop
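A later step in the test stage can then reference the downloaded files; by default, a download step places the artifact under $(Pipeline.Workspace)/&lt;artifact name&gt;. A minimal sketch:

```yaml
- script: ls $(Pipeline.Workspace)/drop
  displayName: 'List downloaded artifact'
```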
See:
Use Artifacts across stages
Deployment jobs

How to trigger pipelines in GitLab CI

I have the problem that I want to trigger another pipeline (B) in another project (B) only when the deploy job in pipeline (A) is finished. But my configuration starts the second pipeline as soon as the deploy job in pipeline (A) starts. How can I make the second pipeline trigger only after the deploy job in pipeline (A) in project (A) has finished?
Here is my gitlab-ci.yml
workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH'

before_script:
  - gem install bundler
  - bundle install

pages:
  stage: deploy
  script:
    - bundle exec jekyll build -d public
  artifacts:
    paths:
      - public
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'

staging:
  variables:
    ENVIRONMENT: staging
  stage: build
  trigger: example/example

test:
  stage: test
  script:
    - bundle exec jekyll build -d test
  artifacts:
    paths:
      - test
  rules:
    - if: '$CI_COMMIT_BRANCH != "master"'
You don't declare the stage order, so the GitLab pipeline doesn't know what order is expected.
At the beginning of your .gitlab-ci.yml file, add something like this (or whatever order you want):
stages:
  - deploy
  - test
  - build

# rest of your file...
Alternatively, you can use needs to define dependencies between jobs.
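For example, a minimal self-contained sketch with hypothetical job names, where needs lets deploy_b start as soon as build_a finishes rather than waiting for the whole previous stage:

```yaml
stages:
  - build
  - deploy

build_a:
  stage: build
  script:
    - echo "building"

deploy_b:
  stage: deploy
  needs: ["build_a"]  # hypothetical job names; starts when build_a finishes
  script:
    - echo "deploying"
```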