How to trigger pipelines in GitLab CI - triggers

I have the problem that I want to trigger another pipeline (B) in another project (B), but only when the deploy job in pipeline (A) is finished. However, my configuration starts the second pipeline as soon as the deploy job in pipeline (A) starts. How can I make sure the second pipeline is triggered only when the deploy job in pipeline (A) in project (A) is finished?
Here is my .gitlab-ci.yml:
workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH'

before_script:
  - gem install bundler
  - bundle install

pages:
  stage: deploy
  script:
    - bundle exec jekyll build -d public
  artifacts:
    paths:
      - public
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'

staging:
  variables:
    ENVIRONMENT: staging
  stage: build
  trigger: example/example

test:
  stage: test
  script:
    - bundle exec jekyll build -d test
  artifacts:
    paths:
      - test
  rules:
    - if: '$CI_COMMIT_BRANCH != "master"'

You don't declare a stage order, so the GitLab pipeline doesn't know which order is expected.
At the beginning of your .gitlab-ci.yml file, add something like this (or whatever order you want):
stages:
  - deploy
  - test
  - build

# rest of your file...
Alternatively, you can use needs to define relationships between jobs.
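With the stage order above (deploy runs before build), a minimal sketch of the needs variant might look like this, so the trigger job waits for the pages job specifically rather than for a whole stage (job names taken from the question's config):

staging:
  stage: build
  needs:
    - pages              # start only after the pages job has finished
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'   # match the pages job's rules so the needed job always exists
  variables:
    ENVIRONMENT: staging
  trigger: example/example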

Related

azure devops release stage bash script

I defined the following stage in my Azure DevOps release:
steps:
- bash: |
    # Write your commands here
    echo 'Hello world'
    curl -X POST -H "Authorization: Bearer dapiXXXXXXXX" -d @conf/dbfs_api.json https://adb-YYYYYYYY.X.azuredatabricks.net/api/2.0/jobs/create > file.json
  displayName: 'Bash Script'
My repo has a folder called conf with the file dbfs_api.json inside of it. Unfortunately this file is not found during the deployment of this stage and I get the following error:
Couldn't read data from file "D:\a\r1\a/conf/dbfs_api.json", this makes an empty POST.
The release stages of an Azure Pipelines workflow don't perform a checkout by default. You can manually add a checkout task to the release stage to perform a checkout:
A deployment job doesn't automatically clone the source repo. You can checkout the source repo within your job with checkout: self. Deployment jobs only support one checkout step.
jobs:
- deployment:
  environment: 'prod'
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
Or you can create a pipeline artifact in the build stage and consume the artifact in a later stage.
stages:
- stage: build
  jobs:
  - job:
    steps:
    - publish: '$(Build.ArtifactStagingDirectory)/scripts'
      artifact: drop
- stage: test
  dependsOn: build
  jobs:
  - job:
    steps:
    - download: current
      artifact: drop
See:
Use Artifacts across stages
Deployment jobs

Azure DevOps : YAML continuation trigger starting some pipelines and not others - how to investigate this issue?

I have four YAML "release" pipelines where I use the same YAML syntax to define a continuation trigger. Here is the YAML definition for the trigger.
resources:
  pipelines:
  - pipeline: Build # Name of the pipeline resource
    source: BuildPipeline # Name of the pipeline as registered with Azure DevOps
    trigger: true
I'm not really sure about this syntax where I don't specify any branch, but everything was working fine until recently. More recently I updated two of the YAML release pipelines, and now they are not getting triggered when the build pipeline completes. All pipelines work fine if executed manually.
All release pipelines have the same YAML for the continuation trigger definition (see above) and have the same branch set for "Default branch for manual and scheduled builds".
I don't know how to investigate why some of the release pipelines are not triggered (are there any logs available somewhere?). I don't see them executed and failing; they simply are not being triggered. How do I investigate this issue?
For your question about investigating the logs - you can see what pipeline runs were created, but unfortunately you can't see what wasn't. So far as Azure DevOps is concerned, if "nothing occurred" to set off a trigger, then there's nothing to log.
As for the pipelines themselves not triggering: from the pipeline editor, check the trigger settings to ensure that nothing is set there, since UI and YAML settings tend to cancel one another out.
Finally, if you want to specify a branch, you can use some combination of the following options:
resources:
  pipelines:
  - pipeline: Build # Name of the pipeline resource
    source: BuildPipeline # Name of the pipeline as registered with Azure DevOps
    trigger:
      branches:
        include: # branch names which will trigger a build
        exclude: # branch names which will not
      tags:
        include: # tag names which will trigger a build
        exclude: # tag names which will not
      paths:
        include: # file paths which must match to trigger a build
        exclude: # file paths which will not trigger a build
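For instance, a sketch of a filter that fires only for runs of BuildPipeline from main or release branches (the branch names here are hypothetical) could look like:

resources:
  pipelines:
  - pipeline: Build
    source: BuildPipeline
    trigger:
      branches:
        include:
        - main
        - release/*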
I believe I found the issue, and it's the removal of the following statements from my deploy pipelines:
pool:
  vmImage: windows-2019
I removed these statements because I transformed all jobs into deployment jobs, as follows:
- deployment: MyDeployJob
  displayName: 'bla bla bla'
  environment:
    name: ${{ parameters.AzureDevopsEnv }}
    resourceType: VirtualMachine
    resourceName: ${{ parameters.AzureDevopsVM }}
The pipelines with no pool statement run perfectly well if started manually but, I'm convinced, fail to start when triggered via the pipeline completion trigger. I do not understand this behavior, but I placed the pool statement back in all deploy pipelines and all are now getting triggered as the build pipeline completes.
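For reference, a sketch of the shape the fixed deployment jobs would take, simply recombining the two snippets above (per the behavior described, the pool block is restored alongside the deployment job):

- deployment: MyDeployJob
  displayName: 'bla bla bla'
  pool:
    vmImage: windows-2019 # restoring this made the completion trigger fire again
  environment:
    name: ${{ parameters.AzureDevopsEnv }}
    resourceType: VirtualMachine
    resourceName: ${{ parameters.AzureDevopsVM }}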
I found that when defining the resource pipeline (trigger) in a template that you extend in the depending pipeline, there are two things that can prevent builds from being triggered:
There are syntax errors in the template (or the parent .yaml)
The depending pipeline needs to be updated before Azure DevOps realizes you made edits to the template it extends
This worked for me:
template.yaml
stages:
- stage: SomeBuildStage
  displayName: Build The Project
  jobs:
  - job: SomeJob
    displayName: Build NuGet package from Project
    pool:
      name: My Self-hosted Agent Pool # Using pool here works fine for me, contrary to @whatever's answer
    steps:
    - pwsh: |
        echo "This template can be extended by multiple pipelines in order to define a trigger only once."

# I still use CI triggers as well here (optional)
trigger:
  branches:
    include:
    - '*'

# This is where the triggering pipeline is defined
resources:
  pipelines:
  - pipeline: trigger-all-builds # This can be any string
    source: trigger-all-builds # This is the name defined in the Azure DevOps GUI
    trigger: true
depending-pipeline.yaml
extends:
  template: template.yaml

# I still use CI triggers as well here (optional)
trigger:
  paths:
    include:
    - some/subfolder
triggering-pipeline.yaml
stages:
- stage: TriggerAllBuilds
  displayName: Trigger all package builds
  jobs:
  - job: TriggerAllBuilds
    displayName: Trigger all builds
    pool:
      name: My Self-hosted Agent Pool
    steps:
    - pwsh: |
        echo "Geronimooo!"
      displayName: Geronimo

trigger: none
pr: none

Using Lerna.js and Azure Devops Pipeline

I'm studying Azure DevOps pipelines for a new project. I'm totally new to DevOps.
In this project, I want to use Lerna.js to manage my monorepo and my packages.
Considering that I have those three projects inside my monorepo:
Package 1 (1.0.1)
Package 2 (1.0.0)
Package 3 (1.0.3)
And I want to create a new TAG, which will bump Package 3 to (1.0.4). How can I trigger an Azure Pipeline just for Package 3? Is there any guide?
I watched one talk about monorepos with Lerna.js and I'm trying to figure out if Azure Pipelines has a feature similar to what Sail CI does. In the example we have this approach:
tasks:
  build-package-1:
    image: sailci/demo
    when:
      paths:
      - "packages/package-1/**/*"
The company I'm working at is using Azure DevOps, so it would be awesome to know if I can have that feature there.
The closest thing to that is, as @Simon Ness wrote, multiple pipelines with path filters. Additionally, if your packages have a similar structure and require the same steps to create/test/publish a package, you should consider templates.
The concept for handling packages the way you described can be similar to the steps below.
template.yaml
parameters:
- name: workingDir
  type: string
  default: package-1

steps:
- script: npm install
  workingDirectory: ${{ parameters.workingDir }}
- script: yarn install
  workingDirectory: ${{ parameters.workingDir }}
- script: npm run compile
  workingDirectory: ${{ parameters.workingDir }}
then the pipeline for Package-1 may look like this:
trigger:
  branches:
    include:
    - master
    - releases/*
  paths:
    include:
    - Package-1/*

steps:
- template: templates/template.yaml
  parameters:
    workingDir: Package-1
and for Package-2:
trigger:
  branches:
    include:
    - master
    - releases/*
  paths:
    include:
    - Package-2/*

steps:
- template: templates/template.yaml
  parameters:
    workingDir: Package-2
EDIT
For the tag part, all you need to do is change the trigger section:
trigger:
  branches:
    include:
    - master
    - refs/tags/*
and when you create a tag and push it:
git tag release-05
git push origin --tags
your pipeline will start.
However, the trigger works like an OR condition, so the pipeline will start for any change on master or for any new tag. So if you tag another branch (not master), the pipeline will also start.
This is why you may need to check whether your source branch is the one from the trigger section:
trigger:
  branches:
    include:
    - master
    - refs/tags/*

stages:
- stage: A
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/master')
  jobs:
  - job: JA
    steps:
    - script: |
        echo "This is job Foo."
The above pipeline will run for:
a change on the master branch
any tag pushed to the server, but stage A runs only for changes pushed to the master branch
If you're happy to define a separate pipeline for each package, take a look at paths in CI triggers.
Tags can be used as a trigger by including refs/tags/* in the trigger section.

I want Azure Pipeline to build equally named branch as the triggering pipeline has just built

Problem
I have two repositories A and B in one project, each with their own pipeline A-CI and B-CI. The repos are Azure Repos Git (so not external ones). I got it working to trigger pipeline B-CI whenever A-CI has completed. But if A-CI got triggered by a commit to branch develop, then B-CI is triggered to build master, although B also has a develop branch.
I want to build a new release of B for the dev environment whenever a new dev version of A has been built.
Is it possible to let a pipeline resource trigger the B-CI pipeline to build the branch with the same name as the branch which was just built by the pipeline resource? It would be fine for me if it fell back to master in case a matching branch is not available in B.
This scenario does work, however, if A-CI and B-CI both refer to different pipeline YAMLs of the same repository.
Pipeline YAMLs
A-CI
trigger:
- '*'

stages:
- stage: Build
  jobs:
  - job: BuildJob
    pool:
      name: 'MyBuildPool'
    steps:
    - powershell: |
        Write-Host "Building A"
B-CI
resources:
  pipelines:
  - pipeline: Pipeline_A
    source: 'A-CI'
    trigger:
      branches:
      - master
      - develop
      - feature/*

trigger:
- '*'

stages:
- stage: Build
  jobs:
  - job: BuildJob
    pool:
      name: MyBuildPool
    steps:
    - powershell: |
        Write-Host $(Build.SourceBranch) # is always refs/heads/master
        Write-Host $(Build.Reason)       # is ResourceTrigger
Background Info
The main idea behind that is that A contains the IaC project, and whenever the infrastructure of the project changes, all apps should be deployed too.
I do not want to put the IaC into the app repo because we have multiple apps, so I would have to split the IaC code into several chunks.
And then I would probably still have the same problem, because some resources, like Azure KeyVault, are shared among the apps, so A would still include the common stuff used by all apps, and changes to it would require re-deployments of all apps.
Please check pipeline triggers:
If the triggering pipeline and the triggered pipeline use the same repository, then both the pipelines will run using the same commit when one triggers the other. This is helpful if your first pipeline builds the code, and the second pipeline tests it.
However, if the two pipelines use different repositories, then the triggered pipeline will use the latest version of the code from its default branch.
In this case, since master is the default branch of repository B, $(Build.SourceBranch) is always refs/heads/master.
As a workaround:
You can create a new YAML pipeline for repository B. You can use similar content to the YAML file for B-CI; you only need to change it to:
resources:
  pipelines:
  - pipeline: Pipeline_A
    source: 'A-CI'
    trigger:
      branches:
      - develop
When we create the new YAML file, it's always placed in the master branch. For me, I created a file with the same name in the dev branch and copied the same content into it. Then I deleted the new YAML file in the master branch; now when the dev branch of the A-CI pipeline is built, the dev branch of repo B is used.
I think I have found a nicely working workaround, as the built-in pipeline triggers do not address our specific problem (though I can't say whether we have an odd approach and there are better ways).
What I am doing now is to use the Azure CLI DevOps extension, based on this docs entry, and trigger the pipelines manually.
Pipeline YAMLs
A-CI
trigger:
- '*'

stages:
- stage: Build
  displayName: Build something and create a pipeline artifact
  jobs:
  - job: BuildJob
    pool:
      name: 'MyBuildPool'
    steps:
    - powershell: |
        Write-Host "Building A"
        # steps to publish artifact omitted
- stage: TriggerAppPipelines
  displayName: Trigger App Pipeline
  jobs:
  - job: TriggerAppPipelinesJob
    displayName: Trigger App Pipeline
    steps:
    - bash: az extension list | grep azure-devops
      displayName: 'Ensure Azure CLI DevOps extension is installed'
    - bash: |
        echo ${AZURE_DEVOPS_CLI_PAT} | az devops login
        az devops configure --defaults organization=https://dev.azure.com/MyOrg project="MyProject" --use-git-aliases true
      displayName: 'Login Azure CLI'
      env:
        AZURE_DEVOPS_CLI_PAT: $(System.AccessToken)
    # By passing the build Id of this A-CI run, I can use that in B-CI to download the matching artifact of A-CI.
    # If there is no matching branch, then the command fails.
    - bash: |
        az pipelines run --branch $(Build.SourceBranch) --name "B-CI" --variables a_Version="$(Build.BuildId)" -o none
      displayName: 'Trigger pipeline'
B-CI
trigger:
- '*'

stages:
- stage: Build
  jobs:
  - job: BuildJob
    pool:
      name: MyBuildPool
    steps:
    - powershell: |
        Write-Host $(Build.SourceBranch) # is the same as the triggering A-CI branch
        Write-Host $(Build.Reason)       # B-CI is triggered manually, but the user is Project Collection Build Service, so automated runs can be distinguished
As B-CI is triggered manually now, there is no need for a resource node anymore.
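As a sketch of the consuming side (the definition ID and artifact name below are hypothetical), B-CI could use the passed a_Version variable with the DownloadPipelineArtifact task to fetch the matching A-CI artifact:

- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: 'MyProject'
    pipeline: 42              # hypothetical definition ID of A-CI
    runVersion: 'specific'
    runId: $(a_Version)       # the A-CI build ID passed via az pipelines run
    artifact: 'drop'          # hypothetical artifact name
    path: '$(Pipeline.Workspace)/a-ci'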

Gitlab yaml to azure pipelines yaml file

I'm trying to convert, but after reading some sources I'm not sure how I can convert this part of YAML code that works with GitLab CI to Azure Pipelines YAML:
build:
  stage: build
  script:
    - npm run build
  artifacts:
    paths:
      - dist
  only:
    - master

deploy:
  stage: deploy
  script:
    - npm i -g netlify-cli
    - netlify deploy --site $NETLIFY_SITE_ID --auth $NETLIFY_AUTH_TOKEN --prod
  dependencies:
    - build
  only:
    - master
In particular, I want to set the artifact path in the build stage and then somehow use that in the deploy stage.
Here is how it looks now in my azure-pipelines YAML:
- script: |
    npm run build
  displayName: 'Build'
- script: |
    npm i -g netlify-cli
    netlify deploy --site $NETLIFY_SITE_ID --auth $NETLIFY_AUTH_TOKEN --prod
  displayName: 'Deploy'
See the sample below:
variables:
- name: netlify.site.id
  value: {value}
- name: netlify.auth.token
  value: {token}

trigger:
- master

pool:
  vmImage: 'vs2017-win2016'

stages:
- stage: Build
  jobs:
  - job: ARM
    steps:
    - script: npm -version
    - publish: $(System.DefaultWorkingDirectory)
      artifact: dist
- stage: Deploy
  dependsOn: Build
  condition: succeeded()
  jobs:
  - job: APP
    steps:
    - bash: |
        npm i -g netlify-cli
        netlify deploy --site $(netlify.site.id) --auth $(netlify.auth.token) --prod
Tip 1: If the values of netlify.auth.token and netlify.site.id are very private for you and you do not want them public in the YAML, you can store them in a variable group and then change the variables part to:
variables:
- group: {group name}
See this doc.
Tip 2: For the stage dependency, you can use the dependsOn keyword in VSTS YAML to achieve the dependency. See this.
Tip 3: In VSTS, you must specify stages, jobs and steps, which are used as the entry points for the server to compile the respective parts (see the skeleton below).
A stage is a collection of related jobs.
A job is a collection of steps to be run by an agent or on the server.
Steps are a linear sequence of operations that make up a job.
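A minimal skeleton showing that hierarchy (names are placeholders):

stages:
- stage: MyStage              # a stage groups related jobs
  jobs:
  - job: MyJob                # a job runs on one agent (or the server)
    steps:
    - script: echo "one step" # steps run in sequence within the job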
Tip 4: To publish artifacts in VSTS with YAML, there are two different formats. One is what I showed above; the publish keyword is a shortcut for the Publish Pipeline Artifact task.
For the other format, see Publishing artifacts.
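For reference, a sketch of the long-form equivalent of the publish shortcut above, using the Publish Pipeline Artifact task (paths reuse the sample's values):

steps:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(System.DefaultWorkingDirectory)'
    artifactName: 'dist'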