Triggering tasks on Semver change: triggers jobs out of order - Concourse

Here's what I'm trying to achieve:
I have a project with a build job for a binary release. The binary takes a while to cross-compile for each platform, so I only want the release build to be done when a release is tagged, but I want the local-native version to build and the tests to run for every checked-in version.
Based on the flight-school demo, so far my pipeline configuration looks like this:
resources:
- name: flight-school
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
- name: flight-school-version
  type: semver
  source:
    driver: git
    uri: https://github.com/nbering/flight-school
    branch: master
    file: version
jobs:
- name: test-app
  plan:
  - get: flight-school
    trigger: true
  - task: tests
    file: flight-school/build.yml
- name: release-build
  plan:
  - aggregate:
    - get: flight-school-version
      trigger: true
    - get: flight-school
      passed: [test-app]
  - task: release-build
    file: flight-school/ci/release.yml
This produces a pipeline in the Web UI that looks like this:
The problem is that when I update the "release" file in the git repository, the semver resource ("flight-school-version") can check before the git resource ("flight-school"), causing the release build to run against the git version from the previous check-in.
I'd like a way to work around this so that the release build appears as a separate task, but only triggers when the version is bumped.
Some things I've thought of so far:
Create a separate git resource with a tag_filter set so that it only runs when a semver tag has been pushed to master
Pro: Jobs only run when tag is pushed
Con: Has the same disconnected-inheritance problem for tests as the semver-based example above
Add the conditional check for a semver tag (or change diff on a file) using the git history in the checkout as part of the build script
Pro: Will do basically what I want without too much wrestling with Concourse
Con: Can't see the difference in the UI without actually reading the build output
Con: Difficult to compose with other tasks and resource types to do something with the binary release
Manually trigger release build
Pro: Simple to set up
Con: Requires manual intervention.
Use the API to trigger a paused build step on completion of tests when a version change is detected
Con: Haven't seen any examples of others doing this, seems really complicated.
I haven't found a way to trigger a task when both the git resource and semver resource change.
I'm looking for either an answer to solve the concurrency problem in my above example, or an alternative pattern that would produce a similar release workflow.

Summary
Here's what I came up with for a solution, based on suggestions from the Concourse CI slack channel.
I added a parallel "release" track that filters on tags resembling semantic version numbers. The two tracks share task configuration files and build scripts.
Tag Filtering
The git resource supports a tag_filter option. From the README:
tag_filter: Optional. If specified, the resource will only detect commits that have a tag matching the expression that have been made against the branch. Patterns are glob(7) compatible (as in, bash compatible).
I used a simple glob pattern to match my semver tags (like v0.0.1):
v[0-9]*
At first I tried an "extglob" pattern, matching semantic versions exactly, like this:
v+([0-9]).+([0-9]).+([0-9])?(\-+([-A-Za-z0-9.]))?(\++([-A-Za-z0-9.]))
That didn't work, because the git resource isn't using the extglob shell option.
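The behaviour of the two patterns can be checked with a quick bash experiment (a sketch for illustration; `matches_simple` is a helper name made up here, not part of the git resource):

```shell
# Plain glob semantics, as the git resource's tag_filter uses them:
# "v", one digit, then anything.
matches_simple() {
  case "$1" in
    v[0-9]*) return 0 ;;
    *)       return 1 ;;
  esac
}

matches_simple "v0.0.1"       && echo "v0.0.1 matches"
matches_simple "v1.2.3-rc.1"  && echo "v1.2.3-rc.1 matches"
matches_simple "version-next" || echo "version-next does not match"
# The extglob pattern from above would additionally require
# `shopt -s extglob` in bash, which the git resource does not enable.
```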
The end result is a resource that looks like this:
resources:
- name: flight-school-release
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
    tag_filter: 'v[0-9]*'
Re-Using Task Definitions
The next challenge I faced was avoiding rewriting my test definition file for the release track. I would have had to do this because all the file paths use the resource name, and I now have one resource for release and one for development. My solution is to override the resource with the resource option on the get step.
jobs:
- name: test-app-release
  plan:
  - get: flight-school
    resource: flight-school-release
    trigger: true
  - task: tests
    file: flight-school/build.yml
The build.yml above is the standard example from the flight-school tutorial.
Putting It All Together
My resulting pipeline looks like this:
My complete pipeline config looks like this:
resources:
- name: flight-school-master
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
- name: flight-school-release
  type: git
  source:
    uri: https://github.com/nbering/flight-school
    branch: master
    tag_filter: 'v[0-9]*'
jobs:
- name: test-app-dev
  plan:
  - get: flight-school
    resource: flight-school-master
    trigger: true
  - task: tests
    file: flight-school/build.yml
- name: test-app-release
  plan:
  - get: flight-school
    resource: flight-school-release
    trigger: true
  - task: tests
    file: flight-school/build.yml
- name: build-release
  plan:
  - get: flight-school
    resource: flight-school-release
    trigger: true
    passed: [test-app-release]
  - task: release-build
    file: flight-school/ci/release.yml

In my opinion you should manually click the release-build button, and let everything else be automated. I'm assuming you are manually bumping your version number; it seems better to move that manual intervention to releasing.
What I would do is add a put at the end of release-build that bumps your minor version. Something like:
- put: flight-school-version
  params:
    bump: minor
That way you will always be on the correct version: once you release 0.0.1, you are done with it forever; you can only go forward.
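Put together with the question's original pipeline, that job might look roughly like this (a sketch; it assumes the flight-school-version semver resource is still defined, and deliberately omits trigger: true so the job only runs when clicked):

```yaml
jobs:
- name: release-build
  plan:
  - get: flight-school
    passed: [test-app]
  - task: release-build
    file: flight-school/ci/release.yml
  - put: flight-school-version
    params:
      bump: minor
```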

Related

Azure Devops Pipeline Trigger on tag and specific Branch

I've exhausted all the options here and it's only Tuesday.
I have a repository with 3 branches:
development
staging
master
I am trying to set the triggers to behave differently based on the branch. What I mean is this:
development = trigger on everything (commit, PR, and tags)
staging = trigger only on tags created on this branch
master = same as staging
I have the 3 pipelines as follows:
Development branch:
trigger:
  branches:
    include:
    - development
Staging Branch:
trigger:
  batch: true
  tags:
    include:
    - 'v*'
  branches:
    include:
    - staging
    exclude:
    - '*'
Master Branch
trigger:
  batch: true
  tags:
    include:
    - 'v*'
  branches:
    include:
    - main
    exclude:
    - '*'
Everything works just fine when I push a normal commit on development; only development triggers. But if I create a tag on development, all the pipelines trigger (staging and master). As far as I understand, the tags and branches filters are conditions such that as long as one of them is true, the pipeline triggers. I don't understand why Microsoft is doing it this way if the pipelines reside in different branches, but I am looking for a way around this limitation. I thought of excluding the branches, but that would get out of control easily.
Any advice or a better solution please?
Thank you so much for any help you can provide me with
From this doc, it seems you cannot combine tag filters with branch filters that include file paths.
But consider that includes are processed first, and then excludes are removed from that list.
Therefore, you could try writing the tags explicitly as includes and excludes in the trigger to see whether that meets your requirement:
Create a new repo with three empty branches: stage, main, dev.
In dev, you should include tags in the trigger if you would like it to be triggered by tags in the dev branch.
For example, 2.yml in the dev branch:
trigger:
  branches:
    include:
    - dev
  tags:
    include:
    - v*
azure-pipelines.yml in main branch:
trigger:
  batch: true
  branches:
    include:
    - main
    - refs/tags/v*
    exclude:
    - dev
    - stage
  tags:
    include:
    - v*
    exclude:
    - dev/*
    - stage/*
  paths:
    exclude:
    - /*
and 1.yml in stage branch:
trigger:
  batch: true
  branches:
    include:
    - stage
    - refs/tags/v*
    exclude:
    - dev
    - main
  tags:
    include:
    - v*
    exclude:
    - dev/*
    - main/*
  paths:
    exclude:
    - /*
Then here are the results from my tests:
A commit on dev triggers the 2.yml pipeline
Adding a tag v* on dev triggers the 2.yml pipeline
A commit on stage doesn't trigger the 1.yml pipeline
Adding a tag v* on dev triggers the 1.yml pipeline
A commit on main triggers the azure-pipelines.yml pipeline
Adding a tag v* on main triggers the azure-pipelines.yml pipeline
Please kindly see and try whether it is what you want.
You could also consider the following two methods:
Alternative 1: As your first step, conditionally abort the build as "cancelled" if the branch name is not what you expect. You could see this for more details.
Alternative 2: Add a condition like eq(variables['Build.SourceBranchname'], 'master') or contains(variables['build.sourceBranch'], 'refs/heads/main') to every single step in your pipeline. This is tedious, but simple. You could see the doc for more details.
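For Alternative 2, such a per-step condition might look roughly like this (a sketch; the script and display names are placeholders):

```yaml
steps:
- script: ./build.sh
  displayName: Build
  # Only run this step for pushes to main.
  condition: and(succeeded(), eq(variables['Build.SourceBranchName'], 'main'))
```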

Github actions: Post comment to PR workflow that triggered the current workflow

I have two workflows, the first one runs a build script and generates an artifact.
The first one is triggered when a pull request is created like this:
name: build
on:
pull_request:
types: [opened, edited, ready_for_review, reopened]
The second flow runs when the first is done, by using the workflow_run trigger like this:
on:
  workflow_run:
    workflows: ["build"]
    types:
      - "completed"
The second flow has to be separate and run after the first one. When done it is supposed to post a comment on the PR that triggered the first workflow, but I am unable to find out how.
According to the GitHub Actions docs, this is one of the typical use cases, as per this quote:
For example, if your pull_request workflow generates build artifacts, you can create
a new workflow that uses workflow_run to analyze the results and add a comment to the
original pull request.
But I can't seem to find out how. I can get the first workflow's id via the second workflow's context.payload.workflow_run.id, and workflow_run should also carry information about the pull request, but those fields are empty.
What am I doing wrong, and where can I find the necessary info to be able to comment on my created pull request?
You're not doing anything wrong; it's just that the pull request data from the first workflow is not present in the GitHub context of the second workflow.
To resolve your problem, you could send the pull request data you need from the first workflow to the second workflow.
There are different ways to do it, for example using a dispatch event (instead of a workflow run), or an artifact.
For the artifact, it would look like something as below:
In the FIRST workflow, you get the PR number from the github.event. Then you save that number into a file and upload it as an artifact.
- name: Save the PR number in an artifact
  shell: bash
  env:
    PULL_REQUEST_NUMBER: ${{ github.event.number }}
  run: echo $PULL_REQUEST_NUMBER > pull_request_number.txt
- name: Upload the PULL REQUEST number
  uses: actions/upload-artifact@v2
  with:
    name: pull_request_number
    path: ./pull_request_number.txt
In the SECOND workflow, you get the artifact and the pull request number from the FIRST workflow, using the following GitHub Actions:
- name: Download workflow artifact
  uses: dawidd6/action-download-artifact@v2.11.0
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    workflow: <first_workflow_name>.yml
    run_id: ${{ github.event.workflow_run.id }}
- name: Read the pull_request_number.txt file
  id: pull_request_number_reader
  uses: juliangruber/read-file-action@v1.0.0
  with:
    path: ./pull_request_number/pull_request_number.txt
- name: Step to add comment on PR
  [...]
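As one possibility for that last, elided step, actions/github-script could post the comment, reusing the number read in the previous step (a sketch, not part of the original answer):

```yaml
- name: Add comment on the PR
  uses: actions/github-script@v6
  with:
    script: |
      // PR number comes from the read-file step's "content" output.
      await github.rest.issues.createComment({
        owner: context.repo.owner,
        repo: context.repo.repo,
        issue_number: Number('${{ steps.pull_request_number_reader.outputs.content }}'),
        body: 'Build finished.'
      });
```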

Azure DevOps pipelines resource trigger not working

I have an Azure DevOps pipeline (Pipeline1) that should be triggered when another (Pipeline2) completes. To that end I have implemented a pipelines resource as described in the documentation -
Trigger one pipeline after another
YAML schema reference
However, it's simply not working. In reality Pipeline2 will be triggered when a new PR is created or manually. I've tested creating a new PR, updating a PR several times, and several manual runs, but no matter what I do Pipeline1 will not trigger.
I've tried two of the examples as defined in the YAML schema reference, and reading further into the Trigger one pipeline after another document, I've tried to prefix the all branches wildcard with refs/heads/.
What must I do to get this working?
What I've tried
Without any branches explicitly defined -
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    trigger: true
With all branches explicitly defined -
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    trigger:
      branches:
      - "*"
Prefixed the all branches wildcard with refs/heads/ -
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    trigger:
      branches:
      - refs/heads/*
Update
It seems that, sadly, the pipelines resource does not work on PRs. Why that's the case, I couldn't tell you.
After some further investigation I stumbled across the Incoming Webhook Service Connection in a sprint update. This update is from six months ago and at the time of writing nothing has been added to the YAML schema reference.
However, it turns out that this feature just doesn't work full stop, and even if it did it looks like it will only trigger the default branch of a pipeline, which is no good for us (and probably no good for most use cases).
I did eventually find some documentation on GitHub from a year ago, but unfortunately this only seems to confirm that the Incoming Webhook Service Connection is of no use to us in this case.
This answer solved it for me.
I had a main and dev branch, and the target pipeline yaml file was not yet pushed up to main. The "Default branch for manual and scheduled builds" in the target pipeline must contain the source file in order for the pipeline to be triggered automatically. (The version of the pipeline that will actually be triggered will be the branch that triggered the original pipeline, as long as they are in the same project.) I changed the value to dev and that solved it.
You can change this setting by going to Edit/Triggers/Yaml/Get sources:
This works for me:
resources:
  pipelines:
  - pipeline: build_pipeline
    source: kmadof.devops-manual (14)
    branch: master
    trigger:
      branches:
      - '*'

steps:
- bash: env | sort
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo 'Hello world'
So in your case I would try this:
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    branch: master
    trigger:
      branches:
      - '*'
The Pipeline2 is triggered by PR, so the source branch (Build.SourceBranch) that triggers the pipeline run is the PR merge branch (refs/pull/{PR_ID}/merge).
I also have tested with the 3 ways you posted above, and only the first way can work as expected.
According to my further investigation, it seems that the branch filters on the pipeline resource trigger only apply to the repository branches that you can see on the 'Repos/Branches' page. These branches share the prefix 'refs/heads/'. The PR merge branch (refs/pull/{PR_ID}/merge) does not seem to be included.
In your case, the first way should work. You need to check the following things:
Make sure you have specified the correct pipeline name of Pipeline2 to the 'source' key.
Check whether Pipeline1 and Pipeline2 are in the same project. If not, make sure you have used the 'project' key to specify the correct project where Pipeline2 is in, and make sure the projects are in the same organization.
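For the cross-project case mentioned above, the resource might be sketched like this (OtherProject is a placeholder name):

```yaml
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    project: OtherProject  # project that contains Pipeline2
    trigger: true
```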

How to prevent triggering an Azure DevOps build pipeline based on commit tags?

I am using Azure pipelines with a Github-based project. I have set up a build pipeline that is triggered only by tagged commits, to keep it separate from automatic daily builds that happen at every commit.
I would like to exclude tagged commits from triggering the daily build pipeline. What is the correct way to do so in a yaml script?
Here is what I did, without success.
According to Azure documentation at this page, to my understanding excluding tags should be possible with something like:
trigger:
  tags:
    exclude:
    - projectname_v*
However, this does not work; it just prevents the build pipeline from running on any commit, be it tagged or not.
I have also tried:
trigger:
  tags:
    include:
    - *
    exclude:
    - projectname_v*
but this is apparently not supported, as it produces error:
/azure-pipelines.yml: (Line: 12, Col: 7, Idx: 220) - (Line: 12, Col: 8, Idx: 221): While scanning an anchor or alias, did not find expected alphabetic or numeric character.
I have also tried the alternative syntax proposed on the doc page:
trigger:
  branches:
    exclude:
      refs/tags/{projectname_v*}
as well as variants with/without braces and wildcards, but all fail with "unexpected value" or "Input string was not in a correct format" errors.
Edit 2019-12-10
After reading wallas-tg's answer below, I have tried the following in the daily build pipeline:
trigger:
  branches:
    include:
    - '*'
    exclude:
    - 'refs/tags/*'
This works, but does not do what I would like:
Pushing only a tag triggers the correct pipeline and not the one for daily builds
Pushing a commit without tags triggers the daily build pipeline
Pushing a tagged commit triggers both pipelines: the daily build pipeline gets triggered by the commit, and the other one by the tag; my desired behavior in this case would be that the daily build pipeline is not triggered.
@acasta69, I think I found a solution for your issue. I've been doing just the opposite scenario: build only feature branches and exclude everything else.
For this purpose, use this yml snippet in your azure-pipelines.yml:
resources:
  repositories:
  - repository: myNetProject
    type: GitHub
    connection: myGitHubConnection
    source: wkrea/DockerHubImages

trigger:
  batch: true
  branches:
    include:
    - releases/*
    exclude:
    - '*'
  paths:
    exclude:
    - README.md
I was able to build on DevOps from it.
If this answer was useful for you, let me know by commenting and rating my answer, so it's easier to find the next time anyone needs help, because the DevOps Pipelines documentation is really unclear and confusing at the moment :'(
Here you can see the checks for my last commit on the releases branch:
The syntax for build pipeline triggers is documented on this page.
Regarding what is exposed in the question, a couple of details are worth highlighting:
There is a default implicit trigger that includes all branches and is overwritten by any user-defined trigger. Thus, it is not possible to specify a trigger that only excludes something: doing that would end up in nothing being included, and the trigger would never fire.
This explains why the first code snippet shown in the question does not trigger anything.
When you specify a trigger, it replaces the default implicit trigger, and only pushes to branches that are explicitly configured to be included will trigger a pipeline. Includes are processed first, and then excludes are removed from that list. If you specify an exclude but don't specify any includes, nothing will trigger.
The default implicit trigger looks like this (note the comment in last line, which explains the error produced by the second code snippet in the question):
trigger:
  branches:
    include:
    - '*' # must quote since "*" is a YAML reserved character; we want a string
Summarizing, a correct way to exclude tagged commits from triggering the pipeline should be the one shown in the edited part of the question:
trigger:
  branches:
    include:
    - '*'
    exclude:
    - 'refs/tags/*'
Or, which is equivalent:
trigger:
  branches:
    include:
    - '*'
  tags:
    exclude:
    - '*'
However, this does not obtain the desired effect. The following happens instead:
Pushing a commit without tags triggers the pipeline
Pushing only a tag does not trigger the pipeline
Pushing a tagged commit still triggers the pipeline
A final feedback received from Azure DevOps support clarifies that there is no way at the moment to obtain the desired behaviour:
Basically there is no way right now to prevent builds from being triggered if the tags are committed along with the branch changes and the CI on branch changes are enabled on the pipeline. There are couple of options you can use to prevent triggering the build on new tags:
Check in the tags separately from the branch changes.
Add "[skip ci]" to your commit message to skip triggering the build on that particular commit.
None of the two options fit well to the original request. #1 is already working as intended, i.e. it does not trigger the build unless tags are explicitly included in triggers, which is only a part of the original request. #2 has the desired behaviour, but it requires a specific commit text and would force any user of the pipeline to be aware of its internals.
The workaround that I found in the meantime, as mentioned in a comment, was to use only one pipeline, that is always triggered by any commit, whether tagged or not, and detect the use of tags with dedicated scripts to activate specific pipeline steps when required.
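That single-pipeline workaround can be sketched roughly as follows (step names and the tag pattern are illustrative; the logging command sets a variable for later steps to test):

```yaml
steps:
- bash: |
    # Flag the build as a release when HEAD carries a matching tag.
    if git describe --exact-match --tags HEAD 2>/dev/null | grep -q '^projectname_v'; then
      echo "##vso[task.setvariable variable=isRelease]true"
    else
      echo "##vso[task.setvariable variable=isRelease]false"
    fi
  displayName: Detect release tag
- script: ./release.sh
  displayName: Release-only steps
  condition: eq(variables['isRelease'], 'true')
```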

How to parameterise concourse task files

I'm pretty impressed by the power and simplicity of Concourse. Since my pipelines keep growing I decided to move the tasks to separate files. One of the tasks use a custom Docker image from our own private registry. So, in that task file I have:
image_resource:
  type: docker-image
  source:
    repository: docker.mycomp.com:443/app-builder
    tag: latest
    username: {{dckr-user}}
    password: {{dckr-pass}}
When I do a set-pipeline, I pass the --load-vars-from argument to load credentials etc. from a separate file.
Now here's my problem: I notice that the vars in my pipeline files are replaced with the actual, correct values, but once the task runs, the aforementioned {{dckr-user}} and {{dckr-pass}} are not replaced.
How do I achieve this?
In addition to what was provided in this answer
If specifically you are looking to use private images in a task, you can do the following in your pipeline.yml:
resources:
- name: some-private-image
  type: docker
  params:
    repository: ...
    username: {{my-username}}
    password: {{my-password}}
jobs:
- name: foo
  plan:
  - get: some-private-image
  - task: some-task
    image: some-private-image
Because this is your pipeline, you can use --load-vars-from, which will first get your image as a resource and then use it for the subsequent task.
You can also see this article on pre-fetching ruby gems in test containers on Concourse
The only downside to this is you cannot use this technique when running a fly execute.
As of concourse v3.3.0, you can set up Credential Management in order to use variables from one of the supported credential managers which are currently Vault, Credhub, Amazon SSM, and Amazon Secrets Manager. So you don't have to separate your task files partially in the pipeline.yml anymore. The values you set in the Vault will be also accessible from the task.yml files.
And since v3.2.0 {{foo}} is deprecated in favor of ((foo)).
Using the Credential Manager you can parameterize:
source under resources in a pipeline
source under resource_types in a pipeline
webhook_token under resources in a pipeline
image_resource.source under image_resource in a task config
params in a pipeline
params in a task config
For setting up vault with concourse you can refer to:
https://concourse-ci.org/creds.html
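With a credential manager in place, the task file from the question could then reference the secrets directly, using the newer double-parentheses syntax (a sketch of the same snippet):

```yaml
image_resource:
  type: docker-image
  source:
    repository: docker.mycomp.com:443/app-builder
    tag: latest
    username: ((dckr-user))
    password: ((dckr-pass))
```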
You can always define tasks in a pipeline.yml...
For example:
jobs:
- name: dotpersecond
  plan:
  - task: dotpersecond
    config:
      image_resource:
        type: docker-image
        source:
          repository: docker.mycomp.com:443/app-builder
          tag: latest
          username: {{dckr-user}}
          password: {{dckr-pass}}
      run:
        path: sh
        args:
        - "-c"
        - |
          for i in `seq 1000`; do echo hi; sleep 2; done