conditional ManualValidation step in Azure DevOps pipeline - azure-devops

Using the Azure DevOps ManualValidation task, can the job run conditionally, based on variables defined earlier in the pipeline?
The task accepts an enabled parameter, but it seems this must be hard-coded to true or false.
- stage: Approve_${{ targetPath.stageName }}_${{ parameters.planEnvironment }}
  jobs:
  - job: waitForValidation
    displayName: Wait for external validation
    pool: server
    timeoutInMinutes: ${{ parameters.timeoutInMinutes }}
    steps:
    - task: ManualValidation@0
      environment: development ## environment not accepted here
      #enabled: $[destroy] ## Unexpected value '$[destroy]'
      #enabled: $(destroy) ## fails - syntax error (does not like this to be a var)
      # manually setting true/false works
      #enabled: true
      #enabled: false
      inputs:
        notifyUsers: |
          alert@test.com
        instructions: 'Please validate the build configuration and resume'
        onTimeout: 'reject'

The enabled control option is a boolean that determines whether or not the step runs; it defaults to true. If you want to use the enabled control option to condition the ManualValidation step, you could use the following syntax:
variables:
- name: destroy
  value: true
jobs:
- job: waitForValidation
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320 # job times out in 3 days
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440 # task times out in 1 day
    enabled: ${{ variables.destroy }}
    inputs:
      notifyUsers: |
        test@test.com
        example@example.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'resume'
Otherwise, you can specify conditions on the steps, as @Shayki Abramczyk mentioned.

You can use a custom condition like in every task:
- task: ManualIntervention@8
  inputs:
    instructions: 'test'
  condition: and(succeeded(), eq(variables['VariableName'], 'VariableValue'))
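Since the question asks about variables defined earlier in the pipeline, such a condition can also consume an output variable published by a previous job. The sketch below uses hypothetical job and variable names (SetVars, destroy); the logging-command and dependency-mapping syntax is standard Azure Pipelines, but treat it as an unverified sketch rather than a tested pipeline:

```yaml
jobs:
- job: SetVars
  steps:
  # publish an output variable at runtime via a logging command
  - bash: echo "##vso[task.setvariable variable=destroy;isOutput=true]true"
    name: vars
- job: waitForValidation
  dependsOn: SetVars
  pool: server
  variables:
    # map the output variable from the previous job into this one
    destroy: $[ dependencies.SetVars.outputs['vars.destroy'] ]
  steps:
  - task: ManualValidation@0
    # runtime condition evaluated against the mapped variable
    condition: and(succeeded(), eq(variables['destroy'], 'true'))
    inputs:
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'reject'
```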

Another option is to use template parameter conditionals to make the stage, job, or task go away when you don't want it there. That is, when the preprocessing is done on the pipeline code, use conditionals to avoid inserting the item into the expanded yaml file that is actually run when the pipeline runs.
- ${{ if ne(parameters.EnvironmentName, 'Dev') }}:
  - stage: Approve_${{ targetPath.stageName }}_${{ parameters.planEnvironment }}
    jobs:
    - job: waitForValidation
      displayName: Wait for external validation
      pool: server
      timeoutInMinutes: ${{ parameters.timeoutInMinutes }}
      steps:
      - task: ManualValidation@0
        environment: development ## environment not accepted here
        inputs:
          notifyUsers: |
            alert@test.com
          instructions: 'Please validate the build configuration and resume'
          onTimeout: 'reject'
The example above uses a pipeline template parameter named EnvironmentName. You can also use some variables, as explained in Azure DevOps's Predefined variables documentation (see the "Available in templates?" table column).
Note that any subsequent stages that refer to this stage (or job, if you put the condition on the job) via the dependsOn functionality will also need to make that dependsOn disappear with the same logic.
More info at the Conditional Insertion official documentation.
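The dependsOn caveat above can be sketched as follows (stage names here are simplified and hypothetical); the downstream stage repeats the same conditional insertion inside its dependsOn list so the reference disappears together with the approval stage:

```yaml
- stage: Deploy
  dependsOn:
  - Build
  # this entry vanishes from the expanded YAML under the same condition
  # that removes the approval stage itself
  - ${{ if ne(parameters.EnvironmentName, 'Dev') }}:
    - Approve
  jobs:
  - job: DeployJob
    steps:
    - script: echo Deploying
```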

Related

ADO YAML Templates - how to handle small number of platform-dependent tasks?

I am in the process of porting some existing Classic ADO pipelines to YAML. There are two separate Classic pipelines for Windows and Linux, but, because I have been able to switch most scripting to bash, I am close to having a common cross-platform YAML pipeline.
Nevertheless, I still have a few platform-dependent tasks interspersed between the majority platform-independent tasks. Of these few tasks, some only need to run on Windows and don't exist for Linux, and the remainder exist in two platform-specific versions of the tasks - one using bash and the other batch or PowerShell.
My hope was to make the bulk of the script into a template with an isWindows parameter, and to use this parameter to control the platform-dependent parts. This is roughly what I have, but it is not working:
trigger: none
pool:
  name: BuildPool
  demands:
  - Agent.OS -equals Windows_NT
extends:
  template: common-template.yml
  parameters:
    isWindows: true
Then common-template.yml itself. Note that the approach here, using condition, does not work. Although I have omitted most of the cross-platform tasks, these form the majority of the pipeline - there are only a few tasks that need special handling.
parameters:
- name: isWindows
  type: boolean
jobs:
- job: MyJob
  steps:
  - checkout: none
    clean: true
  # Simple example of cross-platform script task
  - bash: |
      env
    displayName: Print Environment
  # ... more cross platform tasks
  # Windows only task
  - task: CmdLine@2
    condition: eq('${{ parameters.isWindows }}', 'true')
    inputs:
      filename: scripts\windows_only.bat
  # ... more cross platform tasks
  # Task with specialization for each platform
  # WINDOWS version
  - task: CmdLine@2
    condition: eq('${{ parameters.isWindows }}', 'true')
    inputs:
      filename: scripts\task_a.bat
  # LINUX version
  - task: Bash@3
    condition: eq('${{ parameters.isWindows }}', 'false')
    inputs:
      filePath: scripts/task_a.sh
  # ... more cross platform tasks
The issue is that when I try to run with a Linux agent I get this error:
No agent found in pool <pool name> satisfies both of the following demands: Agent.OS, Cmd. All demands: Agent.OS -equals Linux, Cmd, Agent.Version ...
I assume this is because CmdLine tasks are present, even though they are "turned off" via a condition. I assume the dependency on the task is probably determined before the condition is ever evaluated.
Is what I am trying to do possible? Is there another approach? I am not very experienced with ADO and this is the first time I have tried anything with templates so I am possibly missing something straightforward.
You can use PowerShell steps instead of batch/bash (PowerShell can be installed on both Windows and Linux).
You can also remove the demands and just use the predefined Agent.OS variable in your conditions for tasks which require specific OS:
- powershell: 'scripts/windows_only.ps1'
  condition: eq(variables['Agent.OS'], 'Windows_NT')
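A fuller sketch of this approach, assuming the script paths from the question and that PowerShell/bash equivalents of the scripts exist: plain script steps add no extra agent demands, so the same step list parses cleanly on both Windows and Linux agents, and only the steps whose runtime condition matches actually run.

```yaml
steps:
# runs only on Windows agents
- powershell: 'scripts/windows_only.ps1'
  displayName: Windows-only step
  condition: eq(variables['Agent.OS'], 'Windows_NT')
# platform-specific variants of the same logical task
- powershell: 'scripts/task_a.ps1'
  displayName: Task A (Windows)
  condition: eq(variables['Agent.OS'], 'Windows_NT')
- bash: 'scripts/task_a.sh'
  displayName: Task A (Linux)
  condition: eq(variables['Agent.OS'], 'Linux')
```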
After digging into the ADO docs a bit, I discovered that what I needed was called Conditional Insertion:
parameters:
- name: isWindows
  type: boolean
jobs:
- job: MyJob
  ...
  # Windows only task
  - ${{ if parameters.isWindows }}:
    - task: CmdLine@2
      inputs:
        filename: scripts\windows_only.bat
  # Task with specialization for each platform
  - ${{ if parameters.isWindows }}:
    # WINDOWS version
    - task: CmdLine@2
      inputs:
        filename: scripts\task_a.bat
  - ${{ else }}:
    # LINUX version
    - task: Bash@3
      inputs:
        filePath: scripts/task_a.sh
  ...
There were a few tricky things that might be worth highlighting:
The "conditions" act as items in the YAML list of tasks. Hence there is a need to prefix with - .
The actual task that is protected by the condition is then indented a further level with respect to the condition line.
Don't forget the colon at the end of the condition.
The syntax I showed above doesn't actually work for me - I got an error about using else. It turned out that the else syntax is a feature of the 2022 release of ADO and we are stuck on the 2020 release. So in my case I had to introduce inverted tests: ${{ if not(parameters.isWindows) }}:
I got quite confused about how to test for true values. Various examples in the documentation, when talking about expressions in the condition field of a task, use syntax like condition: eq(variables.isSomeCondition, 'true'). Note the comparison against a string value. I initially copied this into the inclusion expressions, but found that both ${{ if eq(parameters.isWindows, 'true') }}: and ${{ if eq(parameters.isWindows, 'false') }}: triggered when the parameter itself was true. Clearly, the strings 'true' and 'false' both evaluate to a boolean true in this context. It's not that this doesn't make sense; it is the inconsistency with the documented examples of the condition syntax that caught me out.
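A minimal sketch of the pitfall described above (parameter name taken from the question):

```yaml
parameters:
- name: isWindows
  type: boolean
  default: false
steps:
# inserted only when the boolean parameter is true
- ${{ if eq(parameters.isWindows, true) }}:
  - script: echo Windows branch
# by contrast, as observed above, both of the following triggered when
# the parameter was true, because the non-empty strings 'true' and
# 'false' each coerce to boolean true in this context:
# - ${{ if eq(parameters.isWindows, 'true') }}:
# - ${{ if eq(parameters.isWindows, 'false') }}:
```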

Azure DevOps pipeline stage - stop execution until specific users approve

I have a stage which runs this job (not exactly the same, but the same concept). The issue is that while it notifies those users, anyone with the right permissions is able to approve this stage, and I want to limit approval to specific users only. Is there another task that can achieve this?
jobs:
- job: waitForValidation
  dependsOn: 'previousJobName'
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320 # job times out in 3 days
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440 # task times out in 1 day
    inputs:
      notifyUsers: |
        test@test.com
        example@example.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'reject'
Something similar to Release Pipelines 'Pre-deployment approvals'
During my test, I found that you could add a manual approval by using an environment in a deployment job.
You could reference the environment like the sample code below.
jobs:
- deployment: Deploy
  displayName: Deploy
  environment: $(environmentName)
  pool:
    vmImage:
  strategy:
    runOnce:
      deploy:
        steps:
        - task: xxx
And in your target environment, add the approval checks (under the environment's Approvals and checks settings).

How to use variable group as a runtime parameter in azure devops yml

I would like to pass the variable group as a runtime parameter, so that whenever I run the pipeline it allows me to provide the variable group name as input, and the pipeline proceeds based on that value.
When clicking the Run button there is also a variables section; I want to accept the variable group name from there.
Pipeline.yml:
stages:
- stage: VMBackupandValidate
  displayName: 'VM Backup and Validate using RSV'
  jobs:
  - job: VMBackupValidate
    displayName: 'Azure VM Backup'
    steps:
    - task: AzurePowerShell@5
      inputs:
        azureSubscription: $(azure_sc)
        ScriptType: 'FilePath'
        ScriptPath: 'pipelines/automation/scripts/vmbackup.ps1'
        ScriptArguments: '-ResourceGroupName $(ResourceGroupName) -Storagetype $(Storagetype) -SourceVMname $(SourceVMname) -RSVaultname $(RSVaultname) -Location $(Location) -WorkLoadType $(WorkLoadType) -Policyname $(Policyname) -Verbose'
        azurePowerShellVersion: 'LatestVersion'
        pwsh: true
Based on the communication with the OP in the comments:
I suggest using a parameter with a default value. Before you hit Run, it will ask you for input if you want other values; a condition then selects the right variable based on the input.
Here is a minified sample of the pipeline:
parameters:
- name: environment
  displayName: Deploy Environment
  type: string
  default: TEST
  values:
  - TEST
  - PROD
trigger:
- 'none'
variables:
- name: environment
  ${{ if contains(parameters.environment, 'TEST') }}:
    value: TEST
  ${{ if contains(parameters.environment, 'PROD') }}:
    value: PROD
stages:
- stage: TEST
  displayName: Build
  condition: ${{ eq(variables.environment, 'TEST') }}
  jobs:
  - job:
    pool:
      vmImage: 'ubuntu-20.04'
    steps:
    - script: |
        echo ${{ variables.environment }}
      displayName: 'Print environment info'
You can extend the logic, or replace it with other values and consume it in code later. You can create multiple stages with conditions as well as shown.
Let's say you have two variable groups named prod and test. You could use the pipeline below:
trigger:
- main
parameters:
- name: environment
  displayName: Where to deploy?
  type: string
  default: test
  values:
  - prod
  - test
pool:
  vmImage: ubuntu-latest
variables:
- group: ${{ parameters.environment }}
steps:
- script: |
    echo $(ENV)
    echo $(VERSION)
  displayName: Step 1 - print version and environment
- script: pwd ENV ${{ parameters.environment }}
  displayName: Step 2 - print parameter
You should define ENV and VERSION values in both variable groups.
Your stage should stay as-is. In your case, you will delete the steps I provided and use only the first part of the pipeline.
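Putting the two parts together, a sketch of the combined pipeline could look like the following; it assumes the variable groups prod and test exist and contain azure_sc and the other variables the script references:

```yaml
parameters:
- name: environment
  displayName: Where to deploy?
  type: string
  default: test
  values:
  - prod
  - test
variables:
# the runtime parameter selects which variable group is loaded
- group: ${{ parameters.environment }}
stages:
- stage: VMBackupandValidate
  displayName: 'VM Backup and Validate using RSV'
  jobs:
  - job: VMBackupValidate
    steps:
    - task: AzurePowerShell@5
      inputs:
        # resolved from the selected variable group at runtime
        azureSubscription: $(azure_sc)
        ScriptType: 'FilePath'
        ScriptPath: 'pipelines/automation/scripts/vmbackup.ps1'
        azurePowerShellVersion: 'LatestVersion'
        pwsh: true
```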
Adding a reference article.
https://blog.geralexgr.com/azure/deploy-between-different-environments-with-variable-groups-azure-devops?msclkid=002b01eab02f11ec8dffa95dc3a34094

Unable to download secure files conditionally in Azure Pipelines

Question
I am using the DownloadSecureFile@1 task to download secure files.
The issue occurs when, in the Library's secure files section in Azure DevOps, only file_A.txt exists.
The script works fine when both files exist.
In my case, user A will only need file_A.txt, and user B will only need file_B.txt.
Is this an expected behavior? Any possible workarounds to fulfill the use-case?
Error Message:
There was a resource authorization issue: "The pipeline is not valid. Job Job: Step fileB input secureFile references secure file file_B.txt which could not be found. The secure file does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz."
Code:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt
pool:
  vmImage: ubuntu-latest
steps:
- task: DownloadSecureFile@1
  displayName: Download File A
  condition: eq('${{ parameters.file_name }}', 'file_A.txt')
  name: fileA
  inputs:
    secureFile: 'file_A.txt'
- task: DownloadSecureFile@1
  displayName: Download file B
  condition: eq('${{ parameters.file_name }}', 'file_B.txt')
  name: fileB
  inputs:
    secureFile: 'file_B.txt'
Is this an expected behavior?
Yes, this is expected behavior. To turn a pipeline into a run, Azure Pipelines goes through several steps in this order:
1. First, expand templates and evaluate template expressions.
2. Next, evaluate dependencies at the stage level to pick the first stage(s) to run.
3. For each stage selected to run, two things happen:
   - All resources used in all jobs are gathered up and validated for authorization to run.
   - Evaluate dependencies at the job level to pick the first job(s) to run.
4. For each job selected to run, expand multi-configs (strategy: matrix or strategy: parallel in YAML) into multiple runtime jobs.
5. For each runtime job, evaluate conditions to decide whether that job is eligible to run.
6. Request an agent for each eligible runtime job.
So, your secure files will be downloaded before conditions are evaluated. Please refer to the document about the Pipeline run sequence. As a workaround, you can refer to the sample shared by @danielorn.
Instead of using the condition on the tasks you can surround the step with an if-statement as described in use parameters to determine what steps run
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt
pool:
  vmImage: ubuntu-latest
steps:
- ${{ if eq(parameters.file_name, 'file_A.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download File A
    name: fileA
    inputs:
      secureFile: 'file_A.txt'
- ${{ if eq(parameters.file_name, 'file_B.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download file B
    name: fileB
    inputs:
      secureFile: 'file_B.txt'
However, if every user needs exactly one file, a common (and cleaner) option would be to provide the name of the needed file as a parameter. If a secure file is not needed (i.e. the parameter is the default empty string), the step can be excluded using an if statement:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt
pool:
  vmImage: ubuntu-latest
steps:
- ${{ if ne(parameters.file_name, '') }}:
  - task: DownloadSecureFile@1
    displayName: Download Secure File
    name: secureFileDownload
    inputs:
      secureFile: '${{ parameters.file_name }}'

Azure DevOps YAML pipeline manual intervention job run in parallel with other job

I want the jobs to run one after another, with the first job controlling the execution of the following one.
As there are currently no approvals available in the YAML pipeline for deployments outside Kubernetes, I'm using Manual Intervention to stop the job from running. But apparently it doesn't stop the job before it; instead, it stops the upcoming stage. What am I doing wrong? I would expect some notification on the intervention, but it fails immediately and doesn't stop the next job at all.
This is the part of the code for the Deploy STG stage, where parameters.interventionEnabled is set to true:
jobs:
- job: RunOnServer
  displayName: 'Reject or resume'
  pool: server
  continueOnError: false
  steps:
  - task: ManualIntervention@8
    displayName: 'Manual Intervention'
    timeoutInMinutes: 0
    enabled: ${{ parameters.interventionEnabled }}
    inputs:
      instructions: 'reject or resume'
- job: Deploy
  displayName: ${{ parameters.name }}
  pool:
    name: ${{ parameters.agentPoolName }}
  steps:
  - checkout: none # skip checking out the default repository resource
  - task: DownloadPipelineArtifact@2
    displayName: Download NPM build artifact
    inputs:
      artifact: ${{ parameters.artifactName }}
      buildType: 'current'
      targetPath: ${{ parameters.artifactPath }}
Hey Andree, ManualIntervention@8 is not supported in YAML. It is roadmapped for 2020 Q2.
I think the route you want to go down is to use approvals with generic environment types.
So you define a deployment job and environment in your yaml like so
- deployment: DeploymentHostedContext
  displayName: Runs in Hosted Pool
  pool:
    vmImage: 'Ubuntu-16.04'
  # creates an environment if it doesn't exist
  environment: 'Dev'
  strategy:
    runOnce:
      deploy:
        steps:
        - bash: |
            echo This multiline script always runs in Bash.
            echo Even on Windows machines!
And you use the GUI to protect the Environment:
1. Navigate to Pipelines -> Environments.
2. Select the Environment (you can pre-create them).
3. Then add an Approval.
There are some drawbacks when compared to classic release definitions and being able to manual trigger to stages. You may not want every artifact to be a candidate for each stage, and if you don't approve the environment it will eventually timeout and report failure. Other good discussion in the comments here.
This is now available:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/manual-validation?view=azure-devops&tabs=yaml
- task: ManualValidation@0
  timeoutInMinutes: 1440 # task times out in 1 day
  inputs:
    notifyUsers: |
      test@test.com
      example@example.com
    instructions: 'Please validate the build configuration and resume'
    onTimeout: 'resume'