How to prevent triggering a GitLab pipeline on new tag creation

I have a repository that I am using as a template for semantic release:
release.yml
workflow:
  rules:
    - if: $CI_COMMIT_TAG
      when: never
    - if: $CI_COMMIT_BRANCH == "test"
      when: always
    - if: $CI_COMMIT_BRANCH == "main"
      when: always

.release:
  image: docker-images/semantic-release-test:v0.2.2
  variables:
    GITLAB_TOKEN: $GITLAB_ACCESS_TOKEN
  script:
    - npx semantic-release --debug
and I am referencing it in another project
.gitlab-ci.yml
stages:
  - release

include:
  - project: templates/semantic-release-test
    file:
      - release.yml

docker_release:
  stage: release
  extends: .release
The problem is that a second pipeline is still being created after the script pushes a new tag. I did try implementing the logic directly in the .gitlab-ci.yml without the template and it works fine, but when I use the include keyword a new pipeline is triggered regardless.
I have tried many other variations of adding rules to the end of the job, to the .gitlab-ci.yml, and to the release.yml, but no luck.
Any ideas on why that is happening?

I came across the following in the documentation after reading your question earlier today:
To pass information about the upstream pipeline using predefined CI/CD variables, use interpolation. Save the predefined variable as a new job variable in the trigger job, which is passed to the downstream pipeline.
If I understood it correctly, it should be something like this in your case:
release.yml
workflow:
  rules:
    - if: $PARENT_TAG
      when: never
    - if: $PARENT_BRANCH == "test"
      when: always
    - if: $PARENT_BRANCH == "main"
      when: always

.release:
  image: docker-images/semantic-release-test:v0.2.2
  variables:
    GITLAB_TOKEN: $GITLAB_ACCESS_TOKEN
  script:
    - npx semantic-release --debug
.gitlab-ci.yml
stages:
  - release

include:
  - project: templates/semantic-release-test
    file:
      - release.yml

docker_release:
  stage: release
  variables:
    PARENT_BRANCH: $CI_COMMIT_BRANCH
    PARENT_TAG: $CI_COMMIT_TAG
  extends: .release
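For reference, the documentation passage quoted above is describing a trigger job that starts a downstream (multi-project or child) pipeline. A minimal sketch of that documented pattern, with a hypothetical downstream project path, would look roughly like this:
trigger_downstream:
  stage: release
  variables:
    # Predefined variables saved as job variables; trigger-job variables are
    # forwarded to the downstream pipeline and can be checked in its workflow rules.
    PARENT_TAG: $CI_COMMIT_TAG
    PARENT_BRANCH: $CI_COMMIT_BRANCH
  trigger:
    project: templates/semantic-release-downstream   # hypothetical project path
    branch: main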
Edit: Later Idea
Though, if you can live with the limitation of only disabling it in the parent pipeline, you could also do something like this:
stages:
  - release

include:
  - project: templates/semantic-release-test
    file:
      - release.yml

docker_release:
  stage: release
  extends: .release
  rules:
    - if: $CI_COMMIT_TAG
      when: never
    - if: $CI_COMMIT_BRANCH == "test"
      when: always
    - if: $CI_COMMIT_BRANCH == "main"
      when: always

Related

GitHub Actions: Trigger workflow on PR approval AND when base branch is main AND a .csv file is updated

I was trying the following code, but it does not handle all 3 conditions, for the following reasons:
base_ref only works on pull_request/push events, not on pull_request_review
the action dorny/paths-filter@v2.2.1 only works with pull_request/push events
on:
  pull_request_review:
    types: [submitted]
    branches:
      - main
jobs:
  myJob:
    name: myJob
    if: github.event.review.state == 'approved' && startsWith(github.base_ref, 'main/')
    runs-on: [self-hosted, prd]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2.2.0
      - name: Check if *.csv is modified
        uses: dorny/paths-filter@v2.2.1
        id: changes
        with:
          filters: |
            csv:
              - 'data/*.csv'
      - name: Run process bmv script
        if: ${{ steps.changes.outputs.csv == 'true' }}
        run: |
          echo "-Started running script-"
Can anyone suggest how I can handle all 3 conditions: PR approval, base branch is main, and only a .csv file is modified?
Finally I got the answer by doing it Matteo's way and also tweaking the existing code.
Tweaks: updated dorny/paths-filter@v2.2.1 to dorny/paths-filter@v2.10.2, as this version also supports events other than push/pull_request.
on:
  pull_request_review:
    types: [submitted]
jobs:
  myJob:
    name: myJob
    if: startsWith(github.event.pull_request.base.ref, 'main') && (github.event.review.state == 'approved')
    runs-on: [self-hosted, prd]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2.2.0
      - name: Check if *.csv is modified
        uses: dorny/paths-filter@v2.10.2
        id: changes
        with:
          filters: |
            csv:
              - 'data/*.csv'
      - name: Run process bmv script
        if: ${{ steps.changes.outputs.csv == 'true' }}
        run: |
          echo "-Started running script-"
Thanks for all the help. Above is the complete working solution, which checks all 3 conditions, i.e. the workflow runs only when the PR is approved, the PR's base branch is main, and only a .csv file inside the data folder is modified.

"Configuring the trigger failed, edit and save the pipeline again" with no noticeable error and no further details

I have run into an odd problem after converting a bunch of my YAML pipelines to use templates for holding job logic as well as for defining my pipeline variables. The pipelines run perfectly fine, however I get a "Some recent issues detected related to pipeline trigger." warning at the top of the pipeline summary page, and viewing details only states: "Configuring the trigger failed, edit and save the pipeline again."
The odd part here is that the pipeline works completely fine, including triggers. Nothing is broken and no further details are given about the supposed issue. I currently have YAML triggers overridden for the pipeline, but I did also define the same trigger in the YAML to see if that would help (it did not).
I'm looking for any ideas on what might be causing this or how I might be able to further troubleshoot it given the complete lack of detail that the error/warning provides. It's causing a lot of confusion among developers who think there might be a problem with their builds as a result of the warning.
Here is the main pipeline. The build repository is a shared repository for holding code that is used across multiple repos in the build system. dev.yaml contains dev-environment-specific variable values. shared.yaml holds conditionally set variables based on the branch the pipeline is running on.
name: ProductName_$(BranchNameLower)_dev_$(MajorVersion)_$(MinorVersion)_$(BuildVersion)_$(Build.BuildId)

resources:
  repositories:
    - repository: self
    - repository: build
      type: git
      name: Build
      ref: master

# This trigger isn't used yet, but we want it defined for later.
trigger:
  batch: true
  branches:
    include:
      - 'dev'

variables:
  - template: YAML/variables/shared.yaml@build
  - template: YAML/variables/dev.yaml@build

jobs:
  - template: ProductNameDevJob.yaml
    parameters:
      pipelinePool: ${{ variables.PipelinePool }}
      validRef: ${{ variables.ValidRef }}
Then this is the start of the actual job yaml. It provides a reusable definition of the job that can be used in more than one over-arching pipeline:
parameters:
  - name: dependsOn
    type: object
    default: {}
  - name: pipelinePool
    default: ''
  - name: validRef
    default: ''
  - name: noCI
    type: boolean
    default: false
  - name: updateBeforeRun
    type: boolean
    default: false

jobs:
  - job: Build_ProductName
    displayName: 'Build ProductName'
    pool:
      name: ${{ parameters.pipelinePool }}
      demands:
        - msbuild
        - visualstudio
    dependsOn:
      - ${{ each dependsOnThis in parameters.dependsOn }}:
          - ${{ dependsOnThis }}
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], variables['ValidRef']))
    steps:
      # ** step logic here
Finally, we have the variable YAML which conditionally sets pipeline variables based on what we are building:
variables:
  - ${{ if or(eq(variables['Build.SourceBranch'], 'refs/heads/dev'), eq(variables['Build.SourceBranch'], 'refs/heads/users/ahenderson/azure_devops_build')) }}:
      - name: BranchName
        value: Dev
  # ** Continue with rest of pipeline variables and settings of each value for each different context.
You can check my post here: Azure DevOps pipeline trigger issue message not going away
As I can see in your YAML file, you are using this branch: 'refs/heads/users/ahenderson/azure_devops_build'.
I think some YAML files you are referring to are missing from the branch defined as the default in your build. Switch to your branch there.
I think I may have figured out the problem. It appears to be related to the use of conditionals in the variable setup. While the variables will be set in any valid trigger configuration, the proper values do not seem to be used during validation, and that may have been causing the problem. Switching my conditional variables to first set a default value and then replace the value conditionally seems to have fixed the problem.
It would be nice if Microsoft gave a more useful error message here, something to the effect of the values not being found for a given variable, but adding defaults does seem to have fixed the problem.
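A minimal sketch of what I mean, assuming a later definition of the same variable name overrides the earlier one once the template expressions are expanded (the variable name and values are just illustrative):
variables:
  # Unconditional default so validation always finds a value.
  - name: BranchName
    value: Main
  # Conditionally override the default for specific branches.
  - ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/dev') }}:
      - name: BranchName
        value: Dev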
In our case it was because the path to the YAML file started with a slash: /builds/build.yaml
Removing the slash fixed the error: builds/build.yaml
In my case I had replaced the trigger in the YAML file, so the pipeline no longer knew where to start.
# ASP.NET
# Build and test ASP.NET projects.
# Add steps that publish symbols, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/apps/aspnet/build-aspnet-4
trigger:
  - "main" # <-- this branch name was wrong
For me, this was causing the problem...
The following pipeline.yaml causes the error "Configuring the trigger failed, edit and save the pipeline again". It has to do with the environment name name: FeatureVMs.${{ variables.resourceName }}; if I replace ${{ variables.resourceName }} with something else, e.g. FeatureVMs.develop, the error does not occur. The strange thing is, if I save the pipeline once with all the triggers I want and a valid environment FeatureVMs.develop, it saves the triggers; if I then change it to what I actually want, a dynamic environment resource selection FeatureVMs.${{ variables.resourceName }}, the error occurs, but in Azure DevOps the pipeline works as I expect. So the workaround is to save it once without the variable and with the triggers you want, then save it again with the variable and live with the error at the top of the pipeline.
This causes the error
trigger: none

variables:
  - name: resourceName
    value: $(Build.SourceBranchName)
  - name: sourcePipeline
    value: vetsxl-ci

resources:
  pipelines:
    - pipeline: vetsxl-ci
      source: vetsxl-ci
      trigger:
        branches:
          include:
            - develop
            - feature/F*
            - release/*
            - review/*
            - demo/*
            - hotfix/H*
            - tests/*
            - test/*

stages:
  - stage: Deploy
    displayName: 'Deploy'
    condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
    jobs:
      - deployment: DeployVM
        displayName: 'Deploy to develop VM'
        environment:
          name: FeatureVMs.${{ variables.resourceName }}
        strategy:
          rolling:
            deploy:
              steps:
                - template: deploy.yml
                  parameters:
                    sourcePipeline: ${{ variables.sourcePipeline }}
This works without any errors.
trigger: none

variables:
  - name: resourceName
    value: $(Build.SourceBranchName)
  - name: sourcePipeline
    value: vetsxl-ci

resources:
  pipelines:
    - pipeline: vetsxl-ci
      source: vetsxl-ci
      trigger:
        branches:
          include:
            - develop
            - feature/F*
            - release/*
            - review/*
            - demo/*
            - hotfix/H*
            - tests/*
            - test/*

stages:
  - stage: Deploy
    displayName: 'Deploy'
    condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
    jobs:
      - deployment: DeployVM
        displayName: 'Deploy to develop VM'
        environment:
          name: FeatureVMs.develop
        strategy:
          rolling:
            deploy:
              steps:
                - template: deploy.yml
                  parameters:
                    sourcePipeline: ${{ variables.sourcePipeline }}

Is it possible to set a condition based on System.PullRequest.TargetBranch for a stage in a pipeline template?

I have a solution where a git branch is directly related to an environment (this has to be this way, so please do not discuss whether this is good or bad, I know it is not best practice).
We have the option to run a verification deployment (including automatic tests) towards an environment, without actually deploying the solution to the environment. Because of this, I would like to set up a pipeline that runs this verification for an environment, whenever a pull request is opened towards that environment's branch. Moreover, I am using a template for the majority of the pipeline. The actual pipeline in the main repository is just a tiny solution that points towards the template pipeline in another repository. This template, in turn, has stages for each respective environment.
I have, in the main pipeline, successfully added a solution that identifies the current branch, which for pull requests should be the target branch:
variables:
  - name: currentBranch
    ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
      value: $(System.PullRequest.TargetBranch)
    ${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
      value: $(Build.SourceBranch)
I would like to send this variable currentBranch down to the template through a parameter, as my template pipeline has different stages depending on the branch. My solution was to use the pipeline like this:
extends:
  template: <template-reference>
  parameters:
    branch: $(currentBranch)
...and then for a stage in my pipeline do this:
- stage: TestAndDeployBranchName
  condition: eq('${{ parameters.branch }}', 'refs/heads/branchName')
  jobs:
    - job1... etc.
Basically, the stage should run if the current branch is either "branchName" or (for pull requests) when the target branch is "branchName", which comes from the "branch" parameter that is sent to the template.
However, I see here that System.PullRequest.TargetBranch is not available for templates, and further here that the parameters are not available for templates (the variable is empty) when the template is expanded. Thus my pipeline does not work as expected (the condition does not trigger when it should, i.e. when there is a match on the branch name).
Is there any way that I can use System.PullRequest.TargetBranch in a condition within a template, or should I look for another solution?
After investigating this further I concluded that what I am trying to do is not possible.
In short, System.PullRequest.TargetBranch (and, I assume, at least some other variables within System.PullRequest) is not available at compile time for templates, which is when conditions are evaluated. Thus, using these variables in a condition in a template is not possible.
As my goal was to have certain steps run for pull requests only, based on the target branch of the pull request, I solved this by creating duplicate pipelines. Each pipeline is the same and references the same template, except that the input parameter for the template is different. I then added each "PR pipeline" to run as part of the branch policy of each respective branch where this was applicable.
This works great, but it requires me to create a new pipeline if I have the same requirement for another branch. Moreover, I have to maintain each PR pipeline separately (which can be both good and bad).
Not an ideal solution, but it works.
Reference PR pipeline:
trigger: none # no trigger as PR triggers are set by branch policies

# This references the template repository to reuse the basic pipeline
resources:
  repositories:
    - repository: <template repo>
      type: git # "git" means Azure DevOps repository
      name: <template name> # Syntax: <project>/<repo>
      ref: refs/heads/master # Grab the latest pipeline template from the master branch

stages:
  - stage: VerifyPullRequest
    condition: |
      and(
        not(failed()),
        not(canceled()),
        eq(variables['Build.Reason'], 'PullRequest')
      )
    displayName: 'Verify Pull Request'
    jobs:
      - template: <template reference> # Template reference
        parameters:
          image: <image>
          targetBranch: <targetBranch> # Adjust this to match each respective relevant branch
The targetBranch parameter is then used in the relevant places in the template to run PR verification, roughly as sketched below.
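For illustration only, a minimal sketch of how the template might consume that parameter (the job and step here are hypothetical, not the actual template):
parameters:
  - name: image
    type: string
  - name: targetBranch
    type: string

jobs:
  - job: VerifyPullRequest
    pool:
      vmImage: ${{ parameters.image }}
    steps:
      # The hard-coded per-pipeline target branch drives the verification logic.
      - script: echo "Running PR verification against ${{ parameters.targetBranch }}"
        displayName: 'Verify against target branch'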
Example of branch policy:
(Set this up for each relevant branch)
Picture of branch policy set up
After checking your script, we find that we cannot use the following in the variables section:
variables:
  - name: currentBranch
    ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
      value: $(System.PullRequest.TargetBranch)
    ${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
      value: $(Build.SourceBranch)
The variables block will duplicate the second value onto the first one, and this causes your issue.
So, on my side, I created a workaround and hope this will help you. Here is my main YAML:
parameters:
  - name: custom_agent
    displayName: Use Custom Agent
    type: boolean
    default: true
  - name: image
    type: string
    default: default

resources:
  repositories:
    - repository: templates
      type: git
      name: Tech-Talk/template

trigger: none

pool:
  vmImage: windows-latest
  # vmImage: ubuntu-20.04

stages:
  - stage: A
    jobs:
      - job: A1
        steps:
          - task: PowerShell@2
            name: printvar
            inputs:
              targetType: 'inline'
              script: |
                If("$(Build.Reason)" -eq "PullRequest"){
                  Write-Host "##vso[task.setvariable variable=currentBranch;isOutput=true]$(System.PullRequest.TargetBranch)"
                }
                else{
                  Write-Host "##vso[task.setvariable variable=currentBranch;isOutput=true]$(Build.SourceBranch)"
                }

  - stage: B
    condition: eq(dependencies.A.outputs['A1.printvar.currentBranch'], 'refs/heads/master')
    dependsOn: A
    jobs:
      - job: B1
        variables:
          varFromA: $[ stageDependencies.A.A1.outputs['printvar.currentBranch'] ]
        steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                # Write your PowerShell commands here.
                Write-Host "$(varFromA)"
          - template: temp.yaml@templates
            parameters:
              branchName: $(varFromA)
              agent_pool_name: ''
              db_resource_path: $(System.DefaultWorkingDirectory)
Please note:
If we use this, we need to modify your temp YAML. We need to move the condition to the main YAML so that the temp YAML is left with only the steps, roughly as sketched below.
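A minimal sketch of what the trimmed-down temp.yaml could look like under that assumption (the parameter names mirror the ones passed in above; the step is just a placeholder):
parameters:
  - name: branchName
    type: string
    default: ''
  - name: agent_pool_name
    type: string
    default: ''
  - name: db_resource_path
    type: string
    default: ''

steps:
  # Only steps remain here; any branch-based condition now lives in the main YAML.
  - script: echo "Running template steps for branch ${{ parameters.branchName }}"
    displayName: 'Template step placeholder'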

How to trigger pipelines in GitLab CI

I have the problem that I want to trigger another pipeline (B) in another project (B) only when the deploy job in pipeline (A) is finished. But my configuration starts the second pipeline as soon as the deploy job in pipeline (A) starts. How can I make the second pipeline trigger only when the deploy job in pipeline (A) in project (A) is finished?
Here is my .gitlab-ci.yml:
workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH'

before_script:
  - gem install bundler
  - bundle install

pages:
  stage: deploy
  script:
    - bundle exec jekyll build -d public
  artifacts:
    paths:
      - public
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'

staging:
  variables:
    ENVIRONMENT: staging
  stage: build
  trigger: example/example

test:
  stage: test
  script:
    - bundle exec jekyll build -d test
  artifacts:
    paths:
      - test
  rules:
    - if: '$CI_COMMIT_BRANCH != "master"'
You don't declare the stage order, so the GitLab pipeline doesn't know which order is expected.
At the beginning of the .gitlab-ci.yml file, add something like this (or whatever order you want):
stages:
  - deploy
  - test
  - build

# rest of your file...
Alternatively, you can use needs to build relationships between jobs, as sketched below.
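For instance, a minimal sketch based on the jobs from the question, assuming the intent is to hold the cross-project trigger until the pages (deploy) job has finished, and that both jobs are present in the same pipeline run:
staging:
  variables:
    ENVIRONMENT: staging
  stage: deploy          # moved to the same stage as pages so the needs relation is valid
  needs: ["pages"]       # start this trigger job only after the pages job has finished
  trigger: example/example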

Using Lerna.js and Azure Devops Pipeline

I'm studying Azure DevOps pipelines for a new project. I'm totally new to DevOps.
In this project, I want to use lerna.js to manage my monorepo and my packages.
Considering that I have those three projects inside my monorepo:
Package 1 (1.0.1)
Package 2 (1.0.0)
Package 3 (1.0.3)
And I want to create a new TAG, which will bump Package 3 to 1.0.4. How can I trigger an Azure Pipeline just for Package 3? Is there any guide?
I watched a talk about monorepos with lerna.js and I'm trying to figure out whether Azure Pipelines has a feature similar to what Sail CI does. In the example we have this approach:
tasks:
  build-package-1:
    image: sailci/demo
    when:
      paths:
        - "packages/package-1/**/*"
The company I'm working at is using Azure DevOps, so it would be awesome to know if I can have that feature there.
The closest thing to that is what Simon Ness wrote: multiple pipelines with path filters. Additionally, if your packages have a similar structure and require the same steps to create/test/publish a package, you should consider templates.
The concept for handling packages as you described can follow steps similar to the ones below.
template.yaml
parameters:
  - name: workingDir
    type: string
    default: package-1

steps:
  - script: npm install
    workingDirectory: ${{ parameters.workingDir }}
  - script: yarn install
    workingDirectory: ${{ parameters.workingDir }}
  - script: npm run compile
    workingDirectory: ${{ parameters.workingDir }}
Then the pipeline for Package-1 may look like this:
trigger:
  branches:
    include:
      - master
      - releases/*
  paths:
    include:
      - Package-1/*

steps:
  - template: templates/template.yaml
    parameters:
      workingDir: Package-1
and for Package-2:
trigger:
  branches:
    include:
      - master
      - releases/*
  paths:
    include:
      - Package-2/*

steps:
  - template: templates/template.yaml
    parameters:
      workingDir: Package-2
EDIT
For the tag part, all you need to do is change the trigger section:
trigger:
  branches:
    include:
      - master
      - refs/tags/*
and when you create a tag and push it:
git tag release-05
git push origin --tags
your pipeline will start.
However, the trigger works like an OR condition, so the pipeline will start for any change on master or for any new tag. This means that if you tag another branch (not master), the pipeline will also start.
This is why you may need to check whether your source branch is the one from the trigger section:
trigger:
  branches:
    include:
      - master
      - refs/tags/*

stages:
  - stage: A
    condition: eq(variables['Build.SourceBranch'], 'refs/heads/master')
    jobs:
      - job: JA
        steps:
          - script: |
              echo "This is job Foo."
The above pipeline will run for:
a change on the master branch
any tag pushed to the server; however, it runs stage A only if you push the tag to the master branch
If you're happy to define a separate pipeline for each package, take a look at paths in CI triggers.
Tags can be used as a trigger by including the branch - refs/tags/* in the triggers section.