Azure DevOps Build with Parameters - azure-devops

Is it possible in azure-pipelines.yml to define multi-value runtime parameters, so that when you run the build you have to input some values?
parameters:
- name: image
  displayName: Pool Image
  type: string
  default: ubuntu-latest
  values:
  - windows-latest
  - vs2017-win2016
  - ubuntu-latest
Upon clicking Run in Azure DevOps, you are presented with a dropdown and select the option you require.
Based on your selection, the build should then run only certain steps or tasks.

I am not sure when it was added, but drop down parameters are now available:
parameters:
- name: env
  displayName: Environment
  type: string
  values:
  - dev
  - prod
  - test
  - train
  default: train
will provide me with a drop down of dev, prod, etc., prepopulated with the value train.
Moreover, it will be a drop down with 4 values or more, and radio buttons with 3 or fewer. For instance,
- name: department
  displayName: Business Department
  type: string
  values:
  - AI
  - BI
  - Marketing
  default: AI
will create radio buttons with AI selected by default. Note that the YAML is identical between the two, with the exception of 4 values in the first and 3 in the second.
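To cover the second half of the question (running only certain steps based on the selection), a compile-time if expression over the parameter can be used. A minimal sketch, reusing the image parameter from the question:

```yaml
# Steps are inserted at template-expansion time based on the selected value.
steps:
- ${{ if eq(parameters.image, 'windows-latest') }}:
  - script: echo "Running Windows-specific steps"
- ${{ if eq(parameters.image, 'ubuntu-latest') }}:
  - script: echo "Running Ubuntu-specific steps"
```

Because parameters are resolved before the run starts, steps for non-selected values are removed entirely rather than shown as skipped.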

Dropdown parameters are not yet supported in Azure DevOps pipelines.
There is a workaround: create a variable with all the possible values and enable Settable at queue time. The detailed steps are below:
Edit your YAML pipeline, click the 3 dots in the top right corner, and choose Triggers.
Go to the Variables tab, create a variable, and check Settable at queue time.
Then when you queue your pipeline, you will be allowed to set the value for this variable.
After you have set up the above steps, you also need to add conditions to your tasks.
In the example below, the script task runs only when the Environment variable equals prod and all previous steps have succeeded.
steps:
- script: echo "run this step when Environment is prod"
  condition: and(succeeded(), eq(variables['Environment'], 'prod'))
Please check here for more information about Conditions and Expressions.
You can also submit a feature request to Microsoft on Developer Community (click Suggest a feature and choose Azure DevOps); hopefully they will consider implementing this feature in the future.

Related

Azure DevOps: How to eliminate the warning "Tags set for the trigger didn't match the pipeline" in Azure DevOps?

I have two Azure DevOps pipelines set up so that the completion of Pipeline One triggers Pipeline Two. It works fine, but it generates an unnecessary error message.
All recent Pipeline Two builds are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29
Any trigger issues are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29&view=triggerIssues
It appears that every run of Pipeline One -> Pipeline Two adds this warning to the Trigger Issues page: "Tags set for the trigger didn't match the pipeline". It's only a warning, not an error, and Pipeline Two executes successfully. But how can I eliminate this warning message?
The pipeline resource is specified in Pipeline Two as follows:
resources:
  pipelines:
  - pipeline: pipeline-one
    source: mycompany.pipeline-one
    # project: myproject # optional - only required if first pipeline is in a different project
    trigger:
      enabled: true
      branches:
        include:
        - master
        - develop
        - release_*
No tags are specified, because tags are not used.
I have reviewed the following documentation without finding an answer. I may have missed something in the docs.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema#define-a-pipelines-resource
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-pipelines-pipeline?view=azure-pipelines

How to stop matrix builds on the first error

I have a CI pipeline setup for release and debug builds:
trigger:
  batch: true
  branches:
    include:
    - "master"
    - "main"
    - "feature/*"
    - "hotfix/*"
strategy:
  matrix:
    'Release':
      buildConfiguration: 'Release'
    'Debug':
      buildConfiguration: 'Debug'
Both run regardless of errors.
I want to change this behaviour so that when one job fails, the other job also stops, saving me build minutes.
Is this possible?
There is no built-in method that automatically cancels all in-progress jobs when any matrix job fails.
As a workaround, you can try the following method:
Add a script task (such as PowerShell or Bash) as the last step of the matrix job.
Set the script task to run when any of the previous tasks has failed (condition: failed()).
In the script task, call the REST API "Builds - Update Build" to cancel the current build.
This way, when any task in the job fails, the script task runs and the REST API call cancels the whole build.
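A sketch of such a cancel step, assuming the job can use System.AccessToken and the build service account has permission to stop builds:

```yaml
# Runs only when a previous step failed; PATCHes the current build's
# status to "cancelling" via the Builds - Update Build REST API.
- powershell: |
    $url = "$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)?api-version=6.0"
    $body = '{"status":"cancelling"}'
    Invoke-RestMethod -Uri $url -Method Patch -Body $body `
      -ContentType "application/json" `
      -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
  condition: failed()
  displayName: Cancel entire build on failure
```

The predefined variables (Build.BuildId, System.TeamProject, etc.) are expanded by Azure DevOps before the script runs, so no extra configuration is needed beyond exposing System.AccessToken to the step.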
Of course, if you really want an easy built-in method (for example, a jobs.job.strategy.fail-fast option), I recommend reporting a feature request on Developer Community. That lets you interact with the appropriate engineering team and makes it easier for them to collect and categorize suggestions.

How do I set a github branch protection rule based on the success or failure of an entire github actions workflow?

I'm trying to set a github branch protection rule based on the success or failure of a github actions workflow.
You can see the workflow here:
https://github.com/apostrophecms/apostrophe/blob/main/.github/workflows/main.yml
The workflow passes, and I even have a working badge for it, but I am unable to set a branch protection rule requiring that it pass as a status check.
I can set a branch protection rule based on any one of the individual builds in the matrix, but I don't want to set all of them individually and keep track of that as my matrix rule changes.
As you can see from the screenshots, I am unable to pick "build", the name of the job (although I can pick any of the sub-builds), and I am also unable to pick "tests", the name of the workflow as a whole (it does not change if I use an uppercase t).
What am I missing? Thanks for your help!
Screenshot one: I can pick a sub-build but not the entire build job.
Screenshot two: I can't pick the name of the overall "Tests" workflow at all.
There's a trick: add one job to the workflow that collects all the matrix jobs into a single check:
tests:
  runs-on: ubuntu-latest
  needs: build
  if: always()
  steps:
  - name: All tests ok
    if: ${{ !(contains(needs.*.result, 'failure')) }}
    run: exit 0
  - name: Some tests failed
    if: ${{ contains(needs.*.result, 'failure') }}
    run: exit 1
if: always() is obligatory to collect failed jobs; otherwise the PR will never get a proper status check. Also, this is an additional job you pay for (if you are on a paid plan).
In this case, you have a single job with a matrix. That means you'll end up with 9 possibilities (3 Node options × 3 MongoDB options). Each of those is considered a separate status check and can be enabled or disabled as mandatory individually. This is so that you can add new options without making them mandatory up front.
If you want every one of those jobs to pass, then you need to choose every one of the 9 jobs and mark them as required.
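For illustration, a matrix along these lines (the axis names and versions here are assumed, not taken from the actual workflow) is what produces 9 separate status checks, one per combination:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node: [14, 16, 18]
        mongodb: ['4.2', '4.4', '5.0']
    steps:
    - run: echo "node ${{ matrix.node }}, mongodb ${{ matrix.mongodb }}"
```

Each combination then appears as its own check, e.g. "build (14, 4.2)", in the branch protection settings, which is why the parent job name alone cannot be selected.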

Azure Pipelines parameter value from variable template

We would like to deploy components of our application to developer's local machines and want it to be easy enough for our co-workers to use and easy enough for us to maintain. These are virtual machines with a certain naming convention, for instance: VM001, VM002, and so on.
I can define these machines, and use the value later on in the pipeline, in a parameter in YAML like this:
parameters:
- name: stage
  displayName: Stage
  type: string
  values:
  - VM001
  - VM002
  - And so on...
I then only have to maintain one stage, because the only thing that really differs is the stage name:
stages:
- stage: ${{ parameters.stage }}
  displayName: Deploy on ${{ parameters.stage }}
  jobs:
  ...
The idea behind defining the machines in the parameters like this is that developers can choose their virtual machine from the 'Stage' dropdown when they want to deploy to their own virtual machine. By setting the value of the parameter to the virtual machine, the stage is named and the correct library groups will also be linked up to the deployment (each developer has their own library groups where we store variables such as accounts and secrets).
However, we have multiple components that we deploy through multiple pipelines. So each component gets its own YAML pipeline and for each pipeline we will have to enter and maintain the same list of virtual machines.
We already use variable and job templates for reusability. I want to find a way to create a template with the list of machines and pass it to the parameter value. This way, we only need to maintain one template so whenever someone new joins the team or someone leaves, we only need to update one file instead of updating all the pipelines.
I've tried to pass the template to the parameter value using an expression like this:
variables:
- name: VirtualMachinesList
  value: VirtualMachinesList.yml
parameters:
- name: stage
  displayName: Stage
  type: string
  values:
  - ${{ variables.VirtualMachinesList }}
The VirtualMachinesList.yml looks like this:
variables:
- name: VM001
  value: VM001
- name: VM002
  value: VM002
- And so on...
This gives the following error when I try to run the pipeline:
A template expression is not allowed in this context
I've also tried changing the parameter type to object. This results in a text field with a list of all the virtual machines and you can select the ones you don't want to deploy to and remove them. This isn't very user-friendly and also very error-prone, so not a very desirable solution.
Is there a way to pass the list of virtual machines to the parameter value from a single location, so that developers can choose their own virtual machine to deploy to?
I know you want to maintain the list of virtual machines in one place and also keep the ability for developers to choose their VM from the dropdown. But I am afraid this cannot be done currently: runtime parameters do not support templates yet. You can submit a user voice regarding this issue.
Currently you can keep only one of the two: either maintain the VMs in one place, or let developers choose their VM from the dropdown.
1. To maintain the virtual machines in one place, you can define a variable template to hold them and have the developer type in the VM to deploy to. See below:
Define an empty runtime parameter for the developer to type in:
parameters:
- name: vm
  type: string
  default: ''
Define the variable template to hold the VMs:
#variable.yml template
variables:
  vm1: vm1
  vm2: vm2
  ...
Then in the pipeline, define a variable that refers to the vm variable from the variables template. See below:
variables:
- template: variables.yml
- name: vmname
  value: $[variables.${{parameters.vm}}]
steps:
- powershell: echo $(vmname)
2. For developers to have the convenience of choosing their VM from a dropdown, you have to define these machine parameters in every pipeline.
You're really close. You'll want to update how you're consuming your variable template to:
variables:
- template: variable-template.yml
Here's a working example (assuming both the variable template and consuming pipeline are within the same directory of a repository):
variable-template.yml:
variables:
- name: VM001
  value: VM001
- name: VM002
  value: VM002
example-pipeline.yml:
name: Stackoverflow-Example-Variables
trigger: none
variables:
- template: variable-template.yml
stages:
- stage: StageA
  displayName: "Stage A"
  jobs:
  - job: output_message_job
    displayName: "Output Message Job"
    pool:
      vmImage: "ubuntu-latest"
    steps:
    - powershell: |
        Write-Host "Root Variable: $(VM001), $(VM002)"
For reference, here's the MS documentation on variable template usage:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#variable-reuse

Specify runtime parameter in a pipeline task

We have a requirement to somehow pass a dynamic runtime parameter to a pipeline task.
For example, the APPROVAL parameter below would be different for each run of the task.
This APPROVAL parameter carries the change and release number so that the task can tag it on the Terraform resources it creates, for audit purposes.
I've been searching the web for a while, but with no luck in finding a solution. Is this possible in a Concourse pipeline, and what is best practice?
- task: plan-terraform
  file: ci/concourse-jobs/pipelines/tasks/terraform/plan-terraform.yaml
  params:
    ENV: dev
    APPROVAL: test
    CHANNEL: Developement
    GITLAB_KEY: ((gitlab_key))
    REGION: eu-west-2
    TF_FOLDER: terraform/squid
  input_mapping:
    ci: ci
    tf: squid
  output_mapping:
    plan: plan
  tags:
  - dev
From https://concourse-ci.org/tasks.html:
ideally tasks are pure functions: given the same set of inputs, it should either always succeed with the same outputs or always fail.
A dynamic parameter would break that contract and produce different outputs from the same set of inputs. Could you possibly make APPROVAL an input? Then you'd maintain your build traceability. If it's a (file) input, you could then load it into a variable:
APPROVAL=$(cat <filename>)
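As a sketch of that file-input approach, the task config could declare an input and read the value in its run script. The input name approval and the file name number are hypothetical, not taken from the original pipeline, and the image_resource is omitted for brevity:

```yaml
# hypothetical plan-terraform task config treating APPROVAL as a file input
platform: linux
inputs:
- name: approval        # an input whose fetched contents include the file "number"
run:
  path: sh
  args:
  - -exc
  - |
    APPROVAL=$(cat approval/number)
    echo "Tagging terraform resources with approval $APPROVAL"
```

The pipeline would then map whatever resource carries the change/release number onto the approval input via input_mapping, keeping the task a pure function of its inputs.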