How to eliminate the warning "Tags set for the trigger didn't match the pipeline" in Azure DevOps?

I have two Azure DevOps pipelines set up so that the completion of Pipeline One triggers Pipeline Two. It works fine, but it generates an unnecessary error message.
All recent Pipeline Two builds are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29
Any trigger issues are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29&view=triggerIssues
It appears that every run of Pipeline One -> Pipeline Two adds this warning to the Trigger Issues page: "Tags set for the trigger didn't match the pipeline". It's only a warning, not an error, and Pipeline Two executes successfully. But how can I eliminate this warning message?
The pipeline resource is specified in Pipeline Two as follows:
resources:
  pipelines:
  - pipeline: pipeline-one
    source: mycompany.pipeline-one
    # project: myproject # optional - only required if first pipeline is in a different project
    trigger:
      enabled: true
      branches:
        include:
        - master
        - develop
        - release_*
No tags are specified, because tags are not used.
I have reviewed the following documentation without finding an answer. I may have missed something in the docs.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema#define-a-pipelines-resource
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-pipelines-pipeline?view=azure-pipelines

Related

Azure DevOps pipelines resource trigger not working

I have an Azure DevOps pipeline (Pipeline1) that should be triggered when another (Pipeline2) completes. To that end I have implemented a pipelines resource as described in the documentation -
Trigger one pipeline after another
YAML schema reference
However, it's simply not working. In reality Pipeline2 will be triggered when a new PR is created or manually. I've tested creating a new PR, updating a PR several times, and several manual runs, but no matter what I do Pipeline1 will not trigger.
I've tried two of the examples as defined in the YAML schema reference, and reading further into the Trigger one pipeline after another document, I've tried to prefix the all branches wildcard with refs/heads/.
What must I do to get this working?
What I've tried
Without any branches explicitly defined -
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    trigger: true
With all branches explicitly defined -
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    trigger:
      branches:
      - "*"
Prefixed the all branches wildcard with refs/heads/ -
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    trigger:
      branches:
      - refs/heads/*
Update
It seems that sadly the pipelines resource does not work on PRs. Why that's the case, I couldn't tell you.
After some further investigation I stumbled across the Incoming Webhook Service Connection in a sprint update. This update is from six months ago and at the time of writing nothing has been added to the YAML schema reference.
However, it turns out that this feature just doesn't work full stop, and even if it did it looks like it will only trigger the default branch of a pipeline, which is no good for us (and probably no good for most use cases).
I did eventually find some documentation on GitHub from a year ago, but unfortunately this only seems to confirm that the Incoming Webhook Service Connection is of no use to us in this case.
This answer solved it for me.
I had a main and dev branch, and the target pipeline yaml file was not yet pushed up to main. The "Default branch for manual and scheduled builds" in the target pipeline must contain the source file in order for the pipeline to be triggered automatically. (The version of the pipeline that will actually be triggered will be the branch that triggered the original pipeline, as long as they are in the same project.) I changed the value to dev and that solved it.
You can change this setting by going to Edit/Triggers/Yaml/Get sources:
This works for me:
resources:
  pipelines:
  - pipeline: build_pipeline
    source: kmadof.devops-manual (14)
    branch: master
    trigger:
      branches:
      - '*'

steps:
- bash: env | sort
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo 'Hello world'
So in your case I would try this:
resources:
  pipelines:
  - pipeline: pipeline2
    source: Pipeline2
    branch: master
    trigger:
      branches:
      - '*'
Pipeline2 is triggered by PR, so the source branch (Build.SourceBranch) that triggers the pipeline run is the PR merge branch (refs/pull/{PR_ID}/merge).
I have also tested with the 3 ways you posted above, and only the first way works as expected.
According to my further investigation, it seems that the branch filters on the pipeline resource trigger are only available for the repository branches that you can see on the 'Repos/Branches' page. These branches share the prefix 'refs/heads/'. The PR merge branch (refs/pull/{PR_ID}/merge) does not seem to be included.
In your case, the first way should work. You need to check the following things:
Make sure you have specified the correct pipeline name of Pipeline2 to the 'source' key.
Check whether Pipeline1 and Pipeline2 are in the same project. If not, make sure you have used the 'project' key to specify the correct project where Pipeline2 is in, and make sure the projects are in the same organization.
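For the cross-project case, the resource might look something like the following sketch (the project name here is purely illustrative):

```yaml
resources:
  pipelines:
  - pipeline: pipeline2        # local alias used elsewhere in this YAML
    project: OtherProject      # hypothetical project that contains Pipeline2
    source: Pipeline2          # definition name exactly as it appears in the UI
    trigger: true              # the "first way" above, which is the one that works
```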

Specify runtime parameter in a pipeline task

We have a requirement to somehow pass a dynamic runtime parameter to a pipeline task.
For example, the parameter APPROVAL below would be different for each run of the task.
This APPROVAL parameter holds the change and release number so that the task can tag the Terraform resources it creates for audit purposes.
I've been searching the web for a while with no luck in finding a solution. Is this possible in a Concourse pipeline, and is it best practice?
- task: plan-terraform
  file: ci/concourse-jobs/pipelines/tasks/terraform/plan-terraform.yaml
  params:
    ENV: dev
    APPROVAL: test
    CHANNEL: Developement
    GITLAB_KEY: ((gitlab_key))
    REGION: eu-west-2
    TF_FOLDER: terraform/squid
  input_mapping:
    ci: ci
    tf: squid
  output_mapping:
    plan: plan
  tags:
  - dev
From https://concourse-ci.org/tasks.html:
ideally tasks are pure functions: given the same set of inputs, it should either always succeed with the same outputs or always fail.
A dynamic parameter would break that contract and produce different outputs from the same set of inputs. Could you possibly make APPROVAL an input? Then you'd maintain your build traceability. If it's a (file) input, you could then load it into a variable:
APPROVAL=$(cat <filename>)
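A minimal sketch of that idea as a Concourse task config, assuming an upstream step provides an input named `approval` containing a file `approval.txt` (all names here are illustrative):

```yaml
# hypothetical task config: APPROVAL arrives as a file input, not a static param
platform: linux
image_resource:
  type: registry-image
  source: {repository: alpine}
inputs:
- name: approval               # assumed input providing approval/approval.txt
run:
  path: sh
  args:
  - -ec
  - |
    APPROVAL=$(cat approval/approval.txt)
    echo "Tagging Terraform resources with approval: ${APPROVAL}"
```

Because the approval number is now an input rather than an out-of-band parameter, two runs with the same inputs remain reproducible, which preserves the "tasks as pure functions" contract quoted above.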

Download artifact from Azure DevOps Pipeline grandparent Pipeline

Given 3 Azure DevOps Pipelines (more may exist), as follows:
1. Build, Unit Test, Publish Artifacts
2. Deploy Staging, Integration Test
3. Deploy Production, Smoke Test
How can I ensure Pipeline 3 downloads the specific artifacts published in Pipeline 1?
The challenge as I see it is that the task DownloadPipelineArtifact@2 only offers a means to do this if the artifact came from the immediately preceding pipeline. By using the following pipeline task:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: '$(System.TeamProjectId)'
    definition: 1
    specificBuildWithTriggering: true
    buildVersionToDownload: 'latest'
    artifactName: 'example.zip'
This works fine for a parent "triggering pipeline", but not a grandparent. Instead it returns the error message:
Artifact example.zip was not found for build nnn.
where nnn is the run ID of the immediate predecessor, as though I had specified pipelineId: $(Build.TriggeredBy.BuildId). Effectively, Pipeline 3 attempts to retrieve the Pipeline 1 artifact from Pipeline 2. It would be nice if that definition: 1 line did something, but alas, it seems to do nothing when specificBuildWithTriggering: true is set.
Note that buildType: 'latest' isn't safe; it appears it permits publishing an untested artifact, if emitted from Pipeline 1 while Pipeline 2 is running.
There may be no way to accomplish this with DownloadPipelineArtifact@2. It's hard to be sure because the documentation doesn't have much detail. Perhaps there's another reasonable way to accomplish this... I suppose publishing another copy of the artifact at each of the intervening pipelines, even the ones that don't use it, is one way, but not very reasonable. We could eliminate the ugly aspect of creating copies of the binaries by instead publishing an artifact with the BuildId recorded in it, but we'd still have to retrieve it and republish it from every pipeline.
If there is a way to identify the original CI trigger, e.g. find the hash of the initiating Git commit, I could use that to name and refer to the artifacts. Does Build.SourceVersion remain constant between triggered builds? Any other "initiating ID" would work equally well.
You are welcome to comment on the example pipeline scenario, as I'm actually currently using it, but it isn't the point of my question. I think this problem is broadly applicable, as it will apply when building dependent packages, or for any other reasons for which "Triggers" are useful.
An MS representative suggested using REST for this. For example:
HTTP GET https://dev.azure.com/ORGNAME/PROJECTGUID/_apis/build/Builds/2536

{
  "id": 2536,
  "definition": {
    "id": 17
  },
  "triggeredByBuild": {
    "id": 2535,
    "definition": {
      "id": 10
    }
  }
}
By walking the parents, one could find the ancestor with the desired definition ID (e.g. 10). Then its run ID (e.g. 2535) could be used to download the artifact.
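That walk could be sketched as an inline script step like the one below. This is only an illustration: the definition ID 10 is the example value from above, the variable name ancestorRunId is made up, jq is assumed to be on the agent, and the job must be allowed to use System.AccessToken.

```yaml
# hypothetical sketch: follow triggeredByBuild links until we reach definition 10
- bash: |
    set -eu
    build_id="$(Build.TriggeredBy.BuildId)"
    ancestor=""
    while [ -n "$build_id" ]; do
      json=$(curl -sf -H "Authorization: Bearer $(System.AccessToken)" \
        "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/${build_id}?api-version=6.0")
      if [ "$(echo "$json" | jq -r '.definition.id')" = "10" ]; then
        ancestor="$build_id"    # found the ancestor run we want
        break
      fi
      build_id=$(echo "$json" | jq -r '.triggeredByBuild.id // empty')
    done
    echo "##vso[task.setvariable variable=ancestorRunId]$ancestor"
  displayName: Find ancestor run for definition 10
```

The resulting ancestorRunId could then be passed to DownloadPipelineArtifact@2 instead of relying on specificBuildWithTriggering.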
@merlin-liang-msft suggested a similar process for a different requirement from @sschmeck, and their answer has accompanying code.
There are extensions that allow you to do this, but the official solution is to use a multi-stage pipeline rather than 3 independent pipelines.
One way is using release pipelines (you can't code/edit them in YAML), but you can use the same artifacts through the whole deployment.
Release pipeline
You can also specify the required triggers to start a deployment on:
Approval and triggers
Alternatively, there exist multi-stage pipelines, which are in preview (https://devblogs.microsoft.com/devops/whats-new-with-azure-pipelines/).
You can access them by enabling them in your preview features.
Why don't you output a pipeline artifact with meta info and concatenate these down the chain of pipelines, like:
Grandparent > meta about pipe
Parent > meta about pipe and grandparent meta
Etc.
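That chaining idea might look roughly like this in each intermediate pipeline (the resource alias, artifact name, and file name are all made up for illustration):

```yaml
# hypothetical: re-publish the originating run's meta so descendants can find it
steps:
- download: parent-pipeline            # assumed pipeline resource alias for the parent
  artifact: meta
- bash: cat "$(Pipeline.Workspace)/parent-pipeline/meta/run-id.txt"
  displayName: Show originating run ID
- publish: $(Pipeline.Workspace)/parent-pipeline/meta
  artifact: meta                       # pass the same meta artifact down the chain
```

Each pipeline only forwards a small metadata artifact, so no binaries are duplicated; the final pipeline reads the recorded run ID and downloads the real artifact directly from the grandparent run.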

How to prevent triggering an Azure DevOps build pipeline based on commit tags?

I am using Azure pipelines with a Github-based project. I have set up a build pipeline that is triggered only by tagged commits, to keep it separate from automatic daily builds that happen at every commit.
I would like to exclude tagged commits from triggering the daily build pipeline. What is the correct way to do so in a yaml script?
Here is what I did, without success.
According to Azure documentation at this page, to my understanding excluding tags should be possible with something like:
trigger:
  tags:
    exclude:
    - projectname_v*
However, this does not work; it just prevents the build pipeline from running at any commit, be it tagged or not.
I have also tried:
trigger:
  tags:
    include:
    - *
    exclude:
    - projectname_v*
but this is apparently not supported, as it produces error:
/azure-pipelines.yml: (Line: 12, Col: 7, Idx: 220) - (Line: 12, Col: 8, Idx: 221): While scanning an anchor or alias, did not find expected alphabetic or numeric character.
I have also tried the alternative syntax proposed on the doc page:
trigger:
  branches:
    exclude:
      refs/tags/{projectname_v*}
as well as variants with/without braces and wildcards, but all fail with "unexpected value" or "Input string was not in a correct format" errors.
Edit 2019-12-10
After reading wallas-tg's answer below, I have tried the following in the daily build pipeline:
trigger:
  branches:
    include:
    - '*'
    exclude:
    - 'refs/tags/*'
This works, but does not do what I would like:
Pushing only a tag triggers the correct pipeline and not the one for daily builds
Pushing a commit without tags triggers the daily build pipeline
Pushing a tagged commit triggers both pipelines: the daily build pipeline gets triggered by the commit, and the other one by the tag; my desired behavior in this case would be that the daily build pipeline is not triggered.
@acasta69
I think I found a solution for your issue. I've been doing just the opposite scenario: build only feature branches and exclude anything else.
For this purpose, use this YAML snippet in azure-pipelines.yml:
resources:
  repositories:
  - repository: myNetProject
    type: GitHub
    connection: myGitHubConnection
    source: wkrea/DockerHubImages

trigger:
  batch: true
  branches:
    include:
    - releases/*
    exclude:
    - '*'
  paths:
    exclude:
    - README.md
I was able to build on DevOps with this configuration.
If this answer was useful for you, let me know by commenting and rating it, so it's easier to find next time anyone needs help, because the DevOps Pipelines documentation is really unclear and confusing at the moment :'(
Here you can see the checks for my last commit on the releases branch.
The syntax for build pipeline triggers is documented on this page.
Regarding what is exposed in the question, a couple of details are worth highlighting:
There is a default implicit trigger that includes all branches and is overwritten by any user-defined trigger. Thus, it is not possible to specify a trigger that only excludes something: doing that would end up in nothing being included, and the trigger would never fire.
This explains why the first code snippet shown in the question does not trigger anything.
When you specify a trigger, it replaces the default implicit trigger, and only pushes to branches that are explicitly configured to be included will trigger a pipeline. Includes are processed first, and then excludes are removed from that list. If you specify an exclude but don't specify any includes, nothing will trigger.
The default implicit trigger looks like this (note the comment in last line, which explains the error produced by the second code snippet in the question):
trigger:
  branches:
    include:
    - '*' # must quote since "*" is a YAML reserved character; we want a string
Summarizing, a correct way to exclude tagged commits from triggering the pipeline should be the one shown in the edited part of the question:
trigger:
  branches:
    include:
    - '*'
    exclude:
    - 'refs/tags/*'
Or, which is equivalent:
trigger:
  branches:
    include:
    - '*'
  tags:
    exclude:
    - '*'
However, this does not obtain the desired effect. The following happens instead:
Pushing a commit without tags triggers the pipeline
Pushing only a tag does not trigger the pipeline
Pushing a tagged commit still triggers the pipeline
A final feedback received from Azure DevOps support clarifies that there is no way at the moment to obtain the desired behaviour:
Basically there is no way right now to prevent builds from being triggered if the tags are committed along with the branch changes and the CI on branch changes are enabled on the pipeline. There are couple of options you can use to prevent triggering the build on new tags:
Check-in the tags separately than the branch changes.
Add "[skip ci]" to your commit message to skip triggering the build on that particular commit.
Neither of the two options fits the original request well. #1 is already working as intended, i.e. it does not trigger the build unless tags are explicitly included in triggers, which addresses only part of the original request. #2 has the desired behaviour, but it requires a specific commit message and would force any user of the pipeline to be aware of its internals.
The workaround that I found in the meantime, as mentioned in a comment, was to use only one pipeline, that is always triggered by any commit, whether tagged or not, and detect the use of tags with dedicated scripts to activate specific pipeline steps when required.
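That single-pipeline workaround can be sketched with step conditions on Build.SourceBranch (the step contents here are placeholders):

```yaml
# hypothetical sketch: one pipeline, steps gated on whether a tag triggered the run
steps:
- script: echo "daily build steps"
  condition: not(startsWith(variables['Build.SourceBranch'], 'refs/tags/'))
- script: echo "tag-specific release steps"
  condition: startsWith(variables['Build.SourceBranch'], 'refs/tags/')
```

When a push is a tag, Build.SourceBranch is set to refs/tags/{tag}, so only the second step runs; for ordinary branch pushes only the first step runs.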

Concourse resource is not used error message

I have a concourse pipeline that bumps a semver, publishes a release to a GitHub-release resource and publishes a message using a slack-notification resource. All is fine until I try to start using on_failure: and on_success: steps.
I moved the slack put to on_success without issue. But when I try to move the GitHub-release put to on_success, set-pipeline returns the error:
resource 'github-release' is not used
I tried putting it in both on_failure and on_success but I still get the message.
Is there a way to only publish this release when the build is good?
The on_success and on_failure hooks only run a single step. If you want to run multiple steps, you have to use one of the block steps, such as do or aggregate.
For example:
on_success:
  do:
  - put: slack-notification
  - put: github-release