Azure DevOps Pipeline Trigger on tag and specific branch - azure-devops

I have exhausted all the options here and it's only Tuesday.
I have a repository with 3 branches:
development
staging
master
I am trying to set the triggers to behave differently based on the branch. What I mean is this:
development = trigger on everything (commits, PRs and tags)
staging = trigger only on tags created on this branch
master = same as staging
I have the 3 pipelines as follows:
Development branch:
trigger:
  branches:
    include:
      - development
Staging branch:
trigger:
  batch: true
  tags:
    include:
      - 'v*'
  branches:
    include:
      - staging
    exclude:
      - '*'
Master branch:
trigger:
  batch: true
  tags:
    include:
      - 'v*'
  branches:
    include:
      - main
    exclude:
      - '*'
Everything works just fine when I push a normal commit on development: only the development pipeline triggers. But if I create a tag on development, all the pipelines trigger (staging and master as well). As far as I understand, the tag and branch filters are conditions that fire the trigger as soon as either one of them matches. I don't understand why Microsoft does it this way when the pipelines reside in different branches, but I am looking for a workaround for this limitation. I thought about excluding the branches, but that would get out of control easily.
Any advice or a better solution, please?
Thank you so much for any help you can provide.

From this doc, it seems you cannot combine tag filters with branch filters that include file paths.
However, includes are processed first, and then excludes are removed from that list.
Therefore, you could try writing the tags out as explicit includes and excludes in the trigger section to see whether that meets your requirement.
Create a new repo with three empty branches: stage, main, dev.
In dev, include tags in the trigger if you would like the pipeline to be triggered by tags on the dev branch.
For example, 2.yml in the dev branch:
trigger:
  branches:
    include:
      - dev
  tags:
    include:
      - v*
azure-pipelines.yml in main branch:
trigger:
  batch: true
  branches:
    include:
      - main
      - refs/tags/v*
    exclude:
      - dev
      - stage
  tags:
    include:
      - v*
    exclude:
      - dev/*
      - stage/*
  paths:
    exclude:
      - /*
And 1.yml in the stage branch:
trigger:
  batch: true
  branches:
    include:
      - stage
      - refs/tags/v*
    exclude:
      - dev
      - main
  tags:
    include:
      - v*
    exclude:
      - dev/*
      - main/*
  paths:
    exclude:
      - /*
Here are the test results from my side:
a commit on dev triggers the 2.yml pipeline
adding a v* tag on dev triggers the 2.yml pipeline
a commit on stage doesn't trigger the 1.yml pipeline
adding a v* tag on stage triggers the 1.yml pipeline
a commit on main triggers the azure-pipelines.yml pipeline
adding a v* tag on main triggers the azure-pipelines.yml pipeline
Please try it and see whether this is what you want.
You could also consider the following two methods:
Alternative 1: As your first step, conditionally abort the build as "cancelled" if the branch name is not what you expect. You could see this for more details.
Alternative 2: Add a condition like eq(variables['Build.SourceBranchName'], 'master') or contains(variables['Build.SourceBranch'], 'refs/heads/main') to every single step in your pipeline. This is tedious, but simple (see the sketch below). You could see the doc for more details.
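As a minimal sketch of Alternative 2 (the script steps and display names here are placeholders, not part of the original pipelines):
steps:
- script: echo "This step runs on every branch"
  displayName: Always runs
- script: echo "This step only runs for main"
  displayName: Main-only step
  # skip this step unless the run was triggered from refs/heads/main
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))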

Related

Trigger different jobs depending on pull request type

I'm trying to reduce the number of files I have for my workflows from 4 to 1. With that, my on section looks like this:
on:
  pull_request:
    types: [opened, synchronize, closed]
  push:
    branches: [master]
I know it's possible to use if in workflows, but looking at the documentation I didn't find which parameters I should use to trigger the correct jobs when:
A pull request is opened
A pull request is closed
A push is made to an existing pull request
A push is made to a branch
The piece of documentation that touches on how to use if is this one.
You can use it at the job level. Consider the following example to execute a job only when the PR has been closed and merged (not just closed):
name: Build Your App
on:
  pull_request:
    types: [ closed ]
jobs:
  build:
    # this job will only run if the PR has been merged, not just 'closed'
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
        with:
          fetch-depth: '0'
And you can also use it at the step level, to execute the step or not based on the ${{ expression }} evaluation, as the documentation shows.
Based on your ask, I would use the information in the github.event.* payload. To do that, I usually re-create the conditions and triggers in a test repository and print the payload to the console; then I know what to look for in each kind of event. It's like debugging the events. This is the documentation for that.
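For example, a minimal sketch of such a debugging workflow (the workflow, job, and step names are placeholders):
name: Debug event payload
on:
  pull_request:
    types: [opened, synchronize, closed]
  push:
jobs:
  dump:
    runs-on: ubuntu-latest
    steps:
      # print the full event payload so you can see which fields to test in your if: conditions
      - name: Dump github.event
        env:
          EVENT_JSON: ${{ toJSON(github.event) }}
        run: echo "$EVENT_JSON"
Once you see the payload for each trigger, you can pick fields such as github.event.action or github.event.pull_request.merged for your job-level conditions.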

how to define workflow to run based on two push rules

Is there a way to define 2 push rules in the same workflow file, or a workaround?
How do I combine the rules below into a single workflow file?
Run when any file is pushed on a non-master branch:
on:
  push:
    branches-ignore:
      - 'master'
    paths:
      - 'path-to-package/**'
Run only when a particular file (package.json) is pushed on the master branch:
on:
  push:
    branches:
      - 'master'
    paths:
      - 'path-to-package/package.json'
Your specific request doesn't appear to be supported by the syntax.
According to the workflow syntax for GitHub Actions documentation, there is no way to define two separate push trigger configurations in one workflow file; the branch and path filters you set apply to the push trigger as a whole.
GitHub allows free users to open support requests, so you could always make a feature request at support.github.com/contact.
The closest workaround I know of at the moment would be something like the workflow below, using a conditional inside your jobs.
on:
  push:
    paths:
      - 'path-to-package/package.json'
jobs:
  build_pom:
    runs-on: ubuntu-latest
    steps:
      - run: echo 'this is master'
        if: github.ref == 'refs/heads/master'

Azure DevOps pipelines resource trigger not working

I have an Azure DevOps pipeline (Pipeline1) that should be triggered when another (Pipeline2) completes. To that end I have implemented a pipelines resource as described in the documentation -
Trigger one pipeline after another
YAML schema reference
However, it's simply not working. In reality, Pipeline2 will be triggered when a new PR is created, or manually. I've tested creating a new PR, updating a PR several times, and several manual runs, but no matter what I do, Pipeline1 will not trigger.
I've tried two of the examples defined in the YAML schema reference, and, reading further into the Trigger one pipeline after another document, I've also tried prefixing the all-branches wildcard with refs/heads/.
What must I do to get this working?
What I've tried
Without any branches explicitly defined -
resources:
  pipelines:
    - pipeline: pipeline2
      source: Pipeline2
      trigger: true
With all branches explicitly defined -
resources:
  pipelines:
    - pipeline: pipeline2
      source: Pipeline2
      trigger:
        branches:
          - "*"
Prefixed the all branches wildcard with refs/heads/ -
resources:
  pipelines:
    - pipeline: pipeline2
      source: Pipeline2
      trigger:
        branches:
          - refs/heads/*
Update
It seems that, sadly, the pipelines resource does not work for PRs. Why that's the case, I couldn't tell you.
After some further investigation I stumbled across the Incoming Webhook Service Connection in a sprint update. That update is from six months ago, and at the time of writing nothing has been added to the YAML schema reference.
However, it turns out that this feature just doesn't work, full stop, and even if it did, it looks like it would only trigger the default branch of a pipeline, which is no good for us (and probably no good for most use cases).
I did eventually find some documentation on GitHub from a year ago, but unfortunately this only seems to confirm that the Incoming Webhook Service Connection is of no use to us in this case.
This answer solved it for me.
I had a main and a dev branch, and the target pipeline's YAML file was not yet pushed up to main. The "Default branch for manual and scheduled builds" setting in the target pipeline must contain the source file in order for the pipeline to be triggered automatically. (The version of the pipeline that actually runs will be the one from the branch that triggered the original pipeline, as long as they are in the same project.) I changed the value to dev and that solved it.
You can change this setting by going to Edit/Triggers/Yaml/Get sources:
This works for me:
resources:
  pipelines:
    - pipeline: build_pipeline
      source: kmadof.devops-manual (14)
      branch: master
      trigger:
        branches:
          - '*'

steps:
  - bash: env | sort
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        echo 'Hello world'
So in your case I would try this:
resources:
  pipelines:
    - pipeline: pipeline2
      source: Pipeline2
      branch: master
      trigger:
        branches:
          - '*'
Pipeline2 is triggered by a PR, so the source branch (Build.SourceBranch) that triggers the pipeline run is the PR merge branch (refs/pull/{PR_ID}/merge).
I also tested the 3 ways you posted above, and only the first way works as expected.
According to my further investigation, it seems that the branch filters on the pipeline resource trigger only apply to the repository branches you can see on the 'Repos/Branches' page, which all have the 'refs/heads/' prefix. The PR merge branch (refs/pull/{PR_ID}/merge) does not seem to be included.
In your case, the first way should work. You need to check the following things:
Make sure you have specified the correct pipeline name of Pipeline2 in the 'source' key.
Check whether Pipeline1 and Pipeline2 are in the same project. If not, make sure you have used the 'project' key to specify the project that Pipeline2 is in (as sketched below), and make sure the projects are in the same organization.
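For the cross-project case, a minimal sketch (OtherProject is a placeholder for the project that actually contains Pipeline2):
resources:
  pipelines:
    - pipeline: pipeline2    # local alias used within this pipeline
      project: OtherProject  # placeholder: the project that contains Pipeline2
      source: Pipeline2      # name of the pipeline that triggers this one
      trigger: true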

Have a unique check-run for github actions workflow

I'm trying to enforce labelling PRs using enforce-label-action.
name: Enforce PR label
on:
  pull_request:
    types: [labeled, unlabeled, opened, edited]
jobs:
  enforce-label:
    runs-on: ubuntu-latest
    steps:
      - uses: yogevbd/enforce-label-action@master
        with:
          REQUIRED_LABELS_ANY: "bug,enhancement,feature"
The problem is that each time the PR is labeled, a new check-run gets created and the old ones still have a failing status, which causes the check-suite to show: Some checks were not successful.
Is it possible for GitHub Actions to discard the old check-runs when a workflow check with the same name is triggered?
This is now fixed; it was a bug on GitHub's end.

Triggering tasks on Semver change: triggers jobs out of order

Here's what I'm trying to achieve:
I have a project with a build job for a binary release. The binary takes a while to cross-compile for each platform, so I only want the release build to be done when a release is tagged, but I want the local-native version to build and the tests to run for each checked-in version.
Based on the flight-school demo... so far, my pipeline configuration looks like this:
resources:
- name: flight-school
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
- name: flight-school-version
  type: semver
  source:
    driver: git
    uri: https://github.com/nbering/flight-school
    branch: master
    file: version

jobs:
- name: test-app
  plan:
  - get: flight-school
    trigger: true
  - task: tests
    file: flight-school/build.yml
- name: release-build
  plan:
  - aggregate:
    - get: flight-school-version
      trigger: true
    - get: flight-school
      passed: [test-app]
  - task: release-build
    file: flight-school/ci/release.yml
This produces a pipeline in the web UI with the two jobs wired up as configured above.
The problem is that when I update the "release" file in the git repository, the semver resource "flight-school-version" can check before the git resource "flight-school", causing the release build to run against the git version from the previous check-in.
I'd like a way to work around this so that the release build appears as a separate task, but only triggers when the version is bumped.
Some things I've thought of so far
Create a separate git resource with a tag_filter set so that it only runs when a semver tag has been pushed to master
Pro: Jobs only run when a tag is pushed
Con: Has the same disconnected-inheritance problem for tests as the semver-based example above
Add the conditional check for a semver tag (or change diff on a file) using the git history in the checkout as part of the build script
Pro: Will do basically what I want without too much wrestling with Concourse
Con: Can't see the difference in the UI without actually reading the build output
Con: Difficult to compose with other tasks and resource types to do something with the binary release
Manually trigger release build
Pro: Simple to set up
Con: Requires manual intervention.
Use the API to trigger a paused build step on completion of tests when a version change is detected
Con: Haven't seen any examples of others doing this, seems really complicated.
I haven't found a way to trigger a task when both the git resource and semver resource change.
I'm looking for either an answer to solve the concurrency problem in my above example, or an alternative pattern that would produce a similar release workflow.
Summary
Here's what I came up with for a solution, based on suggestions from the Concourse CI Slack channel.
I added a parallel "release" track, which filters on tags resembling semantic version numbers. The two tracks share task configuration files and build scripts.
Tag Filtering
The git resource supports a tag_filter option. From the README:
tag_filter: Optional. If specified, the resource will only detect commits that have a tag matching the expression that have been made against the branch. Patterns are glob(7) compatible (as in, bash compatible).
I used a simple glob pattern to match my semver tags (like v0.0.1):
v[0-9]*
At first I tried an "extglob" pattern, matching semantic versions exactly, like this:
v+([0-9]).+([0-9]).+([0-9])?(\-+([-A-Za-z0-9.]))?(\++([-A-Za-z0-9.]))
That didn't work, because the git resource isn't using the extglob shell option.
The end result is a resource that looks like this:
resources:
- name: flight-school-release
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
    tag_filter: 'v[0-9]*'
Re-Using Task Definitions
The next challenge I faced was avoiding rewriting my test definition file for the release track. I would otherwise have to do that because all the file paths use the resource name, and I now have one resource for release and one for development. My solution is to override the resource with an option on the get step.
jobs:
- name: test-app-release
  plan:
  - get: flight-school
    resource: flight-school-release
    trigger: true
  - task: tests
    file: flight-school/build.yml
The build.yml above is the standard example from the flight-school tutorial.
Putting It All Together
My complete pipeline config looks like this:
resources:
- name: flight-school-master
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
- name: flight-school-release
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
    tag_filter: 'v[0-9]*'

jobs:
- name: test-app-dev
  plan:
  - get: flight-school
    resource: flight-school-master
    trigger: true
  - task: tests
    file: flight-school/build.yml
- name: test-app-release
  plan:
  - get: flight-school
    resource: flight-school-release
    trigger: true
  - task: tests
    file: flight-school/build.yml
- name: build-release
  plan:
  - get: flight-school
    resource: flight-school-release
    trigger: true
    passed: [test-app-release]
  - task: release-build
    file: flight-school/ci/release.yml
In my opinion you should manually click the release-build button, and let everything else be automated. I'm assuming you are manually bumping your version number, but it seems better to move that manual intervention to releasing.
What I would do is have a put at the end of release-build that bumps your minor version. Something like:
- put: flight-school-version
  params:
    bump: minor
That way you will always be on the correct version: once you release 0.0.1, you are done with it forever, and you can only go forward.
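For illustration, a minimal sketch of the release-build job from the question with that put appended (this assumes the flight-school-version semver resource from the original pipeline is still defined):
- name: release-build
  plan:
  - aggregate:
    - get: flight-school-version
      trigger: true
    - get: flight-school
      passed: [test-app]
  - task: release-build
    file: flight-school/ci/release.yml
  # bump the minor version once the release build succeeds
  - put: flight-school-version
    params:
      bump: minor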