I have a CI pipeline setup for release and debug builds:
trigger:
  batch: true
  branches:
    include:
    - "master"
    - "main"
    - "feature/*"
    - "hotfix/*"

strategy:
  matrix:
    'Release':
      buildConfiguration: 'Release'
    'Debug':
      buildConfiguration: 'Debug'
Both are run regardless of errors.
I want to change this behaviour so that when one job fails, the other job also stops, saving me build minutes.
Is this possible?
There is no built-in method that automatically cancels all in-progress jobs when any matrix job fails.
As a workaround, you can try the following:
Add a script task (such as PowerShell or Bash) as the last step of the matrix job.
Set the script task to run when any of the previous tasks has failed (condition: failed()).
In the script task, call the REST API "Builds - Update Build" to cancel the current build.
This way, when any task in the job fails, the script task runs and calls the REST API to cancel the whole build.
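The cancel step might look like the sketch below. This is an illustration, not a verified recipe: the API version and the use of the predefined System.AccessToken as a bearer token are assumptions to check against your organization's setup.

```yaml
steps:
# ... your build/test tasks here ...

# Runs only when a previous task in this job failed; cancels the entire run
# via the "Builds - Update Build" REST API (PATCH with status "cancelling").
- pwsh: |
    $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)?api-version=6.0"
    Invoke-RestMethod -Uri $url -Method Patch `
      -Body '{"status":"cancelling"}' `
      -ContentType 'application/json' `
      -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
  displayName: 'Cancel whole build on failure'
  condition: failed()
```

Because the step sits in every matrix job, whichever job fails first issues the cancel for the whole run.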
Of course, if you would prefer a built-in option (for example, jobs.job.strategy.fail-fast), I recommend reporting a feature request on Developer Community. That lets you interact with the appropriate engineering team directly and makes it easier for them to collect and categorize suggestions.
Related
I have two Azure DevOps pipelines set up so that the completion of Pipeline One triggers Pipeline Two. It works fine, but it generates an unnecessary error message.
All recent Pipeline Two builds are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29
Any trigger issues are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29&view=triggerIssues
It appears that every run of Pipeline One -> Pipeline Two adds this warning to the Trigger Issues page: "Tags set for the trigger didn't match the pipeline". It's only a warning, not an error, and Pipeline Two executes successfully. But how can I eliminate this warning message?
The pipeline resource is specified in Pipeline Two as follows:
resources:
  pipelines:
  - pipeline: pipeline-one
    source: mycompany.pipeline-one
    # project: myproject # optional - only required if first pipeline is in a different project
    trigger:
      enabled: true
      branches:
        include:
        - master
        - develop
        - release_*
No tags are specified, because tags are not used.
I have reviewed the following documentation without finding an answer, though I may have missed something:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema#define-a-pipelines-resource
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-pipelines-pipeline?view=azure-pipelines
Our CI setup is currently looking like this on GitHub:
Usually, first check is finishing much sooner than second check. It can succeed or fail. Is it possible (and if so - how) to "break early" and terminate remaining actions as soon as some action fails?
You can do this easily, but only within a single workflow; it does not work across multiple workflows.
strategy:
  matrix:
    os: [ubuntu-latest, macos-latest, windows-latest]
  fail-fast: true
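For context, a minimal complete workflow might look like the sketch below (the file path, job name, and test script are hypothetical). Note that fail-fast defaults to true for matrix jobs, so it only needs to be stated explicitly if it was previously set to false:

```yaml
# .github/workflows/ci.yml (hypothetical path)
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: true   # cancel remaining matrix jobs as soon as one fails
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
      - uses: actions/checkout@v4
      - run: ./run-tests.sh   # hypothetical test script
```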
Is it possible in azure-pipelines.yml to define multi-value runtime parameters, so that when you run the build you have to input a value?
parameters:
- name: image
  displayName: Pool Image
  type: string
  default: ubuntu-latest
  values:
  - windows-latest
  - vs2017-win2016
  - ubuntu-latest
Upon clicking Run in Azure DevOps, you would be presented with a dropdown and select which option you require.
Based on your selection, the build would then only run certain steps or tasks.
I am not sure when it was added, but drop down parameters are now available:
parameters:
- name: env
  displayName: Environment
  type: string
  values:
  - dev
  - prod
  - test
  - train
  default: train
will provide me with a drop down of dev, prod, etc., prepopulated with the value train.
Moreover, it will render as a drop down with 4 values or more, and as radio buttons with 3 or fewer. For instance,
- name: department
  displayName: Business Department
  type: string
  values:
  - AI
  - BI
  - Marketing
  default: AI
will create radio buttons with AI selected by default. Note that the YAML is identical between the two, except for having 4 values in the first and 3 in the second.
Dropdown parameters are not yet supported in Azure DevOps pipelines.
As a workaround, you can create a variable with all the possible values and enable Settable at queue time. The detailed steps are below:
Edit your YAML pipeline, click the 3 dots in the top right corner and choose Triggers.
Go to the Variables tab, create a variable and check Settable at queue time.
Then when you queue your pipeline, you will be allowed to set the value of this variable.
After setting up the above, you also need to add conditions to your tasks.
In the example below, the script task runs only when the Environment variable equals prod and all previous steps succeeded.
steps:
- script: echo "run this step when Environment is prod"
  condition: and(succeeded(), eq(variables['Environment'], 'prod'))
Please check here for more information about conditions and expressions.
You can also submit a feature request to the Developer Community (click Suggest a feature and choose Azure DevOps); hopefully Microsoft will consider implementing this in the future.
We have a requirement to somehow pass a dynamic runtime parameter to a pipeline task.
For example, the APPROVAL parameter below would be different for each run of the task.
This APPROVAL parameter holds the change and release number, so that the task can tag it on the Terraform resources created, for audit purposes.
I've been searching the web for a while with no luck in finding a solution. Is this possible in a Concourse pipeline, or is there a best practice for it?
- task: plan-terraform
  file: ci/concourse-jobs/pipelines/tasks/terraform/plan-terraform.yaml
  params:
    ENV: dev
    APPROVAL: test
    CHANNEL: Developement
    GITLAB_KEY: ((gitlab_key))
    REGION: eu-west-2
    TF_FOLDER: terraform/squid
  input_mapping:
    ci: ci
    tf: squid
  output_mapping:
    plan: plan
  tags:
  - dev
From https://concourse-ci.org/tasks.html:
ideally tasks are pure functions: given the same set of inputs, it should either always succeed with the same outputs or always fail.
A dynamic parameter would break that contract and produce different outputs from the same set of inputs. Could you possibly make APPROVAL an input? Then you'd maintain your build traceability. If it's a (file) input, you could then load it into a variable:
APPROVAL=$(cat <filename>)
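One way that input could look is sketched below. Everything here is hypothetical (the resource name, repository URI, and file layout are placeholders): a versioned resource carries the approval number, is mapped into the task as an input, and the task script reads it from a file.

```yaml
# Hypothetical: a resource that versions the approval/release number,
# so the same inputs always produce the same outputs.
resources:
- name: approval-info
  type: git            # or s3, etc. - anything that versions the value
  source:
    uri: https://gitlab.example.com/team/approvals.git

jobs:
- name: plan
  plan:
  - get: ci
  - get: approval-info
  - task: plan-terraform
    file: ci/concourse-jobs/pipelines/tasks/terraform/plan-terraform.yaml
    input_mapping:
      approval: approval-info   # exposed to the task as the "approval" input
    params:
      ENV: dev
```

Inside the task script, the value would then be read from the mapped input, e.g. APPROVAL=$(cat approval/APPROVAL) (file name hypothetical).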
Given 3 Azure DevOps Pipelines (more may exist), as follows:
Build, Unit Test, Publish Artifacts
Deploy Staging, Integration Test
Deploy Production, Smoke Test
How can I ensure Pipeline 3 downloads the specific artifacts published in Pipeline 1?
The challenge as I see it is that the task DownloadPipelineArtifact@2 only offers a means to do this if the artifact came from the immediately preceding pipeline. By using the following pipeline task:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: '$(System.TeamProjectId)'
    definition: 1
    specificBuildWithTriggering: true
    buildVersionToDownload: 'latest'
    artifactName: 'example.zip'
This works fine for a parent "triggering pipeline", but not a grandparent. Instead it returns the error message:
Artifact example.zip was not found for build nnn.
where nnn is the run ID of the immediate predecessor, as though I had specified pipelineId: $(Build.TriggeredBy.BuildId). Effectively, Pipeline 3 attempts to retrieve the Pipeline 1 artifact from Pipeline 2. It would be nice if that definition: 1 line did something, but alas, it seems to do nothing when specificBuildWithTriggering: true is set.
Note that buildType: 'latest' isn't safe; it appears it permits publishing an untested artifact, if emitted from Pipeline 1 while Pipeline 2 is running.
There may be no way to accomplish this with DownloadPipelineArtifact@2; it's hard to be sure because the documentation doesn't have much detail. Perhaps there's another reasonable way. I suppose publishing another copy of the artifact at each of the intervening pipelines, even the ones that don't use it, is one option, but not a very reasonable one. We could avoid duplicating the binaries by instead publishing an artifact with the BuildId recorded in it, but we'd still have to retrieve and republish it from every pipeline.
If there is a way to identify the original CI trigger, e.g. find the hash of the initiating GIT commit, I could use that to name and refer to the artifacts. Does Build.SourceVersion remain constant between triggered builds? Any other "Initiating ID" would work equally well.
You are welcome to comment on the example pipeline scenario, as I'm actually currently using it, but it isn't the point of my question. I think this problem is broadly applicable, as it will apply when building dependent packages, or for any other reasons for which "Triggers" are useful.
An MS representative suggested using REST for this. For example:
HTTP GET https://dev.azure.com/ORGNAME/PROJECTGUID/_apis/build/Builds/2536

{
  "id": 2536,
  "definition": {
    "id": 17
  },
  "triggeredByBuild": {
    "id": 2535,
    "definition": {
      "id": 10
    }
  }
}
By walking the parents, one could find the ancestor with the desired definition ID (e.g. 10). Then its run ID (e.g. 2535) could be used to download the artifact.
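That walk could be scripted as a pipeline step along these lines. This is a sketch under assumptions: the target definition ID (10) is a placeholder, jq is assumed to be on the agent, and System.AccessToken must be available to the step.

```yaml
steps:
- bash: |
    # Follow triggeredByBuild links until we reach the run
    # whose definition id matches the target.
    build_id="$(Build.BuildId)"
    target_def=10   # placeholder: the ancestor pipeline's definition id
    while [ -n "$build_id" ]; do
      resp=$(curl -s -H "Authorization: Bearer $(System.AccessToken)" \
        "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$build_id?api-version=6.0")
      def_id=$(echo "$resp" | jq -r '.definition.id')
      if [ "$def_id" = "$target_def" ]; then
        echo "##vso[task.setvariable variable=ancestorRunId]$build_id"
        break
      fi
      build_id=$(echo "$resp" | jq -r '.triggeredByBuild.id // empty')
    done
  displayName: 'Find ancestor run of target definition'
```

The resulting $(ancestorRunId) variable could then be fed to a download task that accepts a specific run ID.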
@merlin-liang-msft suggested a similar process for a different requirement from @sschmeck, and their answer has accompanying code.
There are extensions that allow you to do this, but the official solution is to use a multi-stage pipeline rather than 3 independent pipelines.
One way is to use release pipelines (you can't code/edit them in YAML), which let you use the same artifacts throughout the whole deployment.
Release pipeline
You can also specify required triggers to start deployment on
Approval and triggers
Alternatively, there are multi-stage pipelines, which are in preview (https://devblogs.microsoft.com/devops/whats-new-with-azure-pipelines/).
You can access them by enabling them in your preview features.
Why not output a pipeline artifact with meta info from each pipeline and concatenate these down the chain, like:
Grandparent > meta about itself
Parent > meta about itself and the grandparent's meta
Etc.
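A sketch of that idea follows; the artifact name, file name, and the triggeringDefinitionId variable are all hypothetical. Each pipeline downloads the meta artifact from its triggering run, appends its own run ID, and republishes it, so any descendant can recover the original run ID without copying the binaries.

```yaml
steps:
# Download meta from the triggering pipeline (omit this in the first pipeline).
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: '$(System.TeamProject)'
    definition: $(triggeringDefinitionId)   # hypothetical variable
    specificBuildWithTriggering: true
    artifactName: 'meta'
    targetPath: '$(Pipeline.Workspace)/meta'

# Append this run's ID and republish the accumulated meta file.
- bash: echo "$(Build.DefinitionName)=$(Build.BuildId)" >> "$(Pipeline.Workspace)/meta/runs.txt"
  displayName: 'Record this run in the meta file'

- publish: '$(Pipeline.Workspace)/meta'
  artifact: 'meta'
```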