Conditionally copying folders in Azure Pipeline

When we perform uploads using our production pipeline I'd like to be able to conditionally upload the assets folder of our application. This is because it takes significantly more time to upload the contents of that folder and it rarely has any changes to it anyway.
I've written the "CopyFilesOverSSH" task as follows...
# Node.js with Angular
trigger:
- production

pool:
  default

steps:
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'offigo-production-server'
    sourceFolder: '$(Agent.BuildDirectory)/s/dist/offigo-v2/'
    ${{ if eq('$(UploadAssets)', 0) }}:
      contents: |
        !browser/assets/**
    ${{ if eq('$(UploadAssets)', 1) }}:
      contents: |
        !browser/assets/static-pages/**
        !browser/assets/page-metadata/**
    targetFolder: '/var/www/docker/DocumentRoot/offigo/frontend'
    readyTimeout: '20000'
  continueOnError: true
However, when the pipeline runs it completely ignores the rules either way and just uploads the entire contents of the assets folder. I can't work out why it is not working correctly; some help would be greatly appreciated...

If UploadAssets is a variable name, you should use this syntax:
${{ if eq(variables['UploadAssets'], 0) }}:

Going out on a limb, I'm assuming your future deployment steps, if using YAML pipelines, will also have this kind of condition on what and how to deploy.
I would recommend creating two templates. The template would have all the steps outlined, and if steps are being reused I would recommend templating them so they are only defined once.
The issue here is how the variable is being declared. $() is resolved at the macro level. To confirm, download the logs by clicking the ellipsis on a job that has completed:
I'd guess you'd rather need it at runtime, so $[variables.UploadAssets].
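For reference, a minimal sketch contrasting the three syntaxes (the UploadAssets variable is defined inline here purely for illustration):

variables:
  UploadAssets: 1                                 # illustrative static value
  # runtime expression: must be the entire right-hand side, evaluated when the job runs
  resolvedAtRuntime: $[ variables.UploadAssets ]

steps:
# macro syntax: substituted just before the task executes
- script: echo "$(UploadAssets)"
# template expression: expanded when the YAML is compiled, before any queue-time values exist
- script: echo "${{ variables.UploadAssets }}"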
Alternatively, when editing in the browser you can click "Download full YAML" to see it as well, though this won't have any variables passed in at runtime.
Feel free to ask any more questions in the comments as this stuff is never as straightforward.

Thanks Krzysztof Madej, I tried using
${{ if eq(variables['UploadAssets'], 0) }}:
But it still didn't resolve the issue; the variable just seemed to be undefined. Using the link DreadedFrost provided about templates, I ended up adding a parameter to the pipeline like so...
# Node.js with Angular
trigger:
- production

pool:
  default

parameters:
- name: uploadAssets # Upload assets folder?; required
  type: boolean # data type of the parameter; required
  default: true

steps:
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'offigo-production-server'
    sourceFolder: '$(Agent.BuildDirectory)/s/dist/offigo-v2/'
    ${{ if eq(parameters.uploadAssets, false) }}:
      contents: |
        !browser/assets/**
    ${{ if eq(parameters.uploadAssets, true) }}:
      contents: |
        !browser/assets/static-pages/**
        !browser/assets/page-metadata/**
    targetFolder: '/var/www/docker/DocumentRoot/offigo/frontend'
    readyTimeout: '20000'
  continueOnError: true
This then adds a checkbox when I run the pipeline, so I can set whether or not to upload the folder.
I doubt this is the best way of doing it, but it resolves the issue for now; I'll have a proper look at implementing templates in the future.
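For anyone heading down the template route, a rough sketch of how the same parameter could be forwarded into a steps template (the template file name and layout here are assumptions, not the actual setup):

# azure-pipelines.yml (sketch)
parameters:
- name: uploadAssets
  type: boolean
  default: true

steps:
- template: templates/copy-dist.yml   # hypothetical template file
  parameters:
    uploadAssets: ${{ parameters.uploadAssets }}

The template itself would declare the same boolean parameter and contain the CopyFilesOverSSH@0 task with the ${{ if eq(parameters.uploadAssets, ...) }} blocks shown above, unchanged.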

Related

ADO YAML Templates - how to handle small number of platform-dependent tasks?

I am in the process of porting some existing Classic ADO pipelines to YAML. There are two separate Classic pipelines for Windows and Linux, but, because I have been able to switch most scripting to bash, I am close to having a common cross-platform YAML pipeline.
Nevertheless, I still have a few platform-dependent tasks interspersed between the majority platform-independent tasks. Of these few tasks, some only need to run on Windows and don't exist for Linux, and the remainder exist in two platform-specific versions of the tasks - one using bash and the other batch or PowerShell.
My hope was to make the bulk of the script into a template with an isWindows parameter, and to use this parameter to control the platform-dependent parts. This is roughly what I have, but it is not working:
trigger: none

pool:
  name: BuildPool
  demands:
  - Agent.OS -equals Windows_NT

extends:
  template: common-template.yml
  parameters:
    isWindows: true
Then common-template.yml itself. Note that the approach here, using condition, does not work. Although I have omitted most of the cross-platform tasks, these form the majority of the pipeline - there are only a few tasks that need special handling.
parameters:
- name: isWindows
  type: boolean

jobs:
- job: MyJob
  steps:
  - checkout: none
    clean: true

  # Simple example of cross-platform script task
  - bash: |
      env
    displayName: Print Environment

  # ... more cross platform tasks

  # Windows only task
  - task: CmdLine@2
    condition: eq('${{ parameters.isWindows }}', 'true')
    inputs:
      filename: scripts\windows_only.bat

  # ... more cross platform tasks

  # Task with specialization for each platform
  # WINDOWS version
  - task: CmdLine@2
    condition: eq('${{ parameters.isWindows }}', 'true')
    inputs:
      filename: scripts\task_a.bat
  # LINUX version
  - task: Bash@3
    condition: eq('${{ parameters.isWindows }}', 'false')
    inputs:
      filePath: scripts/task_a.sh

  # ... more cross platform tasks
The issue is that when I try to run with a Linux agent I get this error:
No agent found in pool <pool name> satisfies both of the following demands: Agent.OS, Cmd. All demands: Agent.OS -equals Linux, Cmd, Agent.Version ...
I assume this is because CmdLine tasks are present, even though they are "turned off" via a condition. I assume the dependency on the task is probably determined before the condition is ever evaluated.
Is what I am trying to do possible? Is there another approach? I am not very experienced with ADO and this is the first time I have tried anything with templates so I am possibly missing something straightforward.
You can use PowerShell steps instead of batch/bash (PowerShell can be installed on both Windows and Linux).
You can also remove the demands and just use the predefined Agent.OS variable in your conditions for tasks which require specific OS:
- powershell: 'scripts/windows_only.ps1'
  condition: eq(variables['Agent.OS'], 'Windows_NT')
After digging into the ADO docs a bit, I discovered that what I needed was called Conditional Insertion:
parameters:
- name: isWindows
  type: boolean

jobs:
- job: MyJob
  ...

  # Windows only task
  - ${{ if parameters.isWindows }}:
    - task: CmdLine@2
      inputs:
        filename: scripts\windows_only.bat

  # Task with specialization for each platform
  - ${{ if parameters.isWindows }}:
    # WINDOWS version
    - task: CmdLine@2
      inputs:
        filename: scripts\task_a.bat
  - ${{ else }}:
    # LINUX version
    - task: Bash@3
      inputs:
        filePath: scripts/task_a.sh
  ...
There were a few tricky things that might be worth highlighting:
The "conditions" act as items in the YAML list of tasks. Hence there is a need to prefix with - .
The actual task that is protected by the condition is then indented a further level with respect to the condition line.
Don't forget the colon at the end of the condition.
The syntax I showed above doesn't actually work for me - I got an error about using else. It turned out that the else syntax is a feature of the 2022 release of ADO and we are stuck on the 2020 release. So in my case I had to introduce inverted tests: ${{ if not(parameters.isWindows) }}: (a sketch of this inverted form follows this list).
I got quite confused about how to test for true values. Various examples in the documentation, when talking about expressions in the condition field of a task, use syntax like: condition: eq(variables.isSomeCondition, 'true'). Note the comparison against a string value. I initially copied this in the inclusion expressions but found that both ${{ if eq(parameters.isWindows, 'true') }}: and ${{ if eq(parameters.isWindows, 'false') }}: triggered when the parameter itself was true. Clearly, the strings 'true' and 'false' evaluate to a boolean true in this context. It's not that this doesn't make sense - it is the inconsistency with the documented examples of the condition syntax that caught me out.
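A sketch of the inverted form mentioned above, for anyone else stuck on the 2020 release (same tasks as in the question):

# Task with specialization for each platform, without ${{ else }}
- ${{ if parameters.isWindows }}:
  # WINDOWS version
  - task: CmdLine@2
    inputs:
      filename: scripts\task_a.bat
- ${{ if not(parameters.isWindows) }}:
  # LINUX version
  - task: Bash@3
    inputs:
      filePath: scripts/task_a.sh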

Restricting Azure Pipeline decorators

I wanted to restrict an Azure Pipelines decorator from injecting into all the pipelines. As we know, once a decorator is installed it is applied to all the projects in the organization. How can I restrict it to certain projects?
I used the Projects API to get my project id, and then I added the project id to my JSON template as a targettask and targetid.
"properties": {
"template": "custom-postjob-decorator.yml",
"targetid": "9146bcde-ef7b-492d-8289-b3dce5f750b0"
}
It didn't work. Do we need to provide a condition in the decorator.yaml for restricting it to certain projects in the organization?
I have a few more questions.
Now I don't want to run my decorator on Project B, and I define it like below:
steps:
- ${{ if in(variables['System.TeamProject'], 'ProjectB') }}:
  - task: ADOSecurityScanner@1 # injected decorator
I ran this and it is still injecting the decorator into ProjectB.
As you said, I have created two scenarios.
To skip if the task already exists: it works like a charm, but it did not show any messages about the skipping part. How can we tell the decorator is skipped in the pipeline?
steps:
- ${{ if eq(variables['Build.task'], 'ADOSecurityScanner@1') }}:
  - task: CmdLine@2
    displayName: 'Skipping the injector'
    inputs:
      script: |
        echo "##vso[task.logissue type=error]This Step '$(Build.task)' is not injected. You dont need this in pipeline"
        exit 1
I have used the condition from the Microsoft documentation, similar to the one you mentioned above.
It did not skip the injector with just the conditional expression; when I added the task (PowerShell) to both the decorator and the pipeline YAML, it skipped the decorator by matching the tasks in both YAML files.
Does it show or log any info if the decorator is skipped in the pipeline?
I observed it does display differently when the decorator is skipped, showing me the false statement.
Do we need to define anything in the pipeline YAML file as well?
I ran with the conditions you provided above and somehow the decorator is still getting injected into ProjectB. Can you let me know where I am going wrong?
Here is my basic azure-pipeline.yaml file pasted below.
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
Be sure to read these docs:
Expressions
Specify Conditions
Simplest solution is to add a condition to the pipeline decorator:
steps:
- ${{ if not(eq(variables['azure-pipelines-tfvc-fixparallel-decorator.skip'], 'true')) }}:
  - task: tf-vc-fixparallel-log@2
The decorator can use the pre-defined variables, including System.TeamProject or System.TeamProjectId:
System.TeamProject The name of the project that contains this build.
System.TeamProjectId The ID of the project that this build belongs to.
For example:
steps:
- ${{ if in(variables['System.TeamProject'], 'ProjectA', 'ProjectB', 'ProjectC') }}:
  - task: tf-vc-fixparallel-log@2
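If you would rather pin this to the project GUID you retrieved from the Projects API, the same pattern should work with System.TeamProjectId (a sketch reusing the GUID from the question):

steps:
- ${{ if eq(variables['System.TeamProjectId'], '9146bcde-ef7b-492d-8289-b3dce5f750b0') }}:
  - task: tf-vc-fixparallel-log@2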
You will need to include the steps: if you aren't already. For some reason returning an empty steps[] is fine, but returning nothing will result in a failing decorator.
When you want to exclude a list of projects, simply inverse the condition. So to exclude 'ProjectB':
steps:
- ${{ if not(in(variables['System.TeamProject'], 'ProjectB')) }}:
  - task: ADOSecurityScanner@1 # injected decorator
Your statement above explicitly inserts into ProjectB, nowhere else.
With regards to skipping the injector: the ${{ if ... }}: above completely removes the steps indented below it when they don't need to run. They will not appear at all.
You can also put a condition on the task itself, in that case it will show up in the timeline, but as skipped:
steps:
- task: ADOSecurityScanner@1 # injected decorator
  condition: not(in(variables['System.TeamProject'], 'ProjectB'))
You can combine the condition with a variable by adding it to the expression:
steps:
- ${{ if not(or(in(variables['System.TeamProject'], 'ProjectB'),
                eq(variables['skipInjector'], 'true'))) }}:
  - task: ADOSecurityScanner@1 # injected decorator
As you can see you can and(..., ...) and or(..., ...) these things together to form expressions as complex as you need.
You can see the evaluation of your decorator in the top level log panel of your pipeline job. You need to expand the Job preparation parameters section. And make sure you queue the pipeline with diagnostics turned on.
When you used a condition on the task within the decorator, expand the log for the task to see the condition evaluation:
Looking at the screenshot you posted, two versions of the decorator are included, one with conditions and one without. Make sure you uninstall/disable that other version.

Azure Devops yml pipeline if else condition with variables

I am trying to use if/else conditions in an Azure DevOps YAML pipeline with variable groups, following the latest Azure DevOps YAML pipeline syntax.
Following is the sample code for the if/else condition in my scenario; test is a variable inside the my-global variable group.
variables:
- group: my-global
- name: fileName
  ${{ if eq(variables['test'], 'true') }}:
    value: 'product.js'
  ${{ elseif eq(variables['test'], false) }}:
    value: 'productCost.js'

jobs:
- job:
  steps:
  - bash:
      echo test variable value $(fileName)
When the above code is executed, the echo statement shows no value for fileName, i.e. it is empty, meaning neither of the if/else conditions was executed. However, when I test the if/else logic with the following condition:
- name: fileName
  ${{ if eq('true', 'true') }}:
    value: 'product.js'
fileName did echo the correct value, i.e. product.js. So my conclusion is that I am not able to refer to the variables from the variable group correctly. Any suggestion will be helpful and appreciated.
Thanks!
Unfortunately there is no ternary operator in Azure DevOps Pipelines. And it seems unlikely considering the state of https://github.com/microsoft/azure-pipelines-yaml/issues/256 and https://github.com/microsoft/azure-pipelines-yaml/issues/278. So for the time being the only choices are :
conditional insertion: it works with parameters, and should work with variables according to the documentation (but it is difficult to use properly),
or the hacks you can find in this Stack Overflow question.
Another work-around has been posted by Simon Alling on GitHub (https://github.com/microsoft/azure-pipelines-yaml/issues/256#issuecomment-1077684972) :
format(
  replace(replace(condition, True, '{0}'), False, '{1}'),
  valueIfTrue,
  valueIfFalse
)
It is similar to the solution provided by Tejas Nagchandi, but I find it a little bit better because the syntax looks closer to what it would be if there was a ternary operator.
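For illustration, a sketch of that pattern used to set a variable at template-expansion time; the parameter name and file names are lifted from the question and would need adapting:

parameters:
- name: test
  type: boolean
  default: true

variables:
  # expands to 'product.js' when the parameter is true, 'productCost.js' otherwise
  fileName: ${{ format(replace(replace(eq(parameters.test, true), True, '{0}'), False, '{1}'), 'product.js', 'productCost.js') }}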
I was able to achieve the goal using a somewhat dirty workaround, but I do agree that using parameters would be a much better way unless ternary operators become available for Azure DevOps YAML pipelines.
The issue is that ${{ if condition }}: is a compile-time expression, so the variables from a variable group are not available.
I was able to use runtime expressions $[<expression>]
Reference: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops
My pipeline:
trigger:
- none

variables:
- group: Temp-group-for-testing
- name: fileName
  # runtime expression: if the 'test' variable equals 'True', the inner replace yields 'value1';
  # otherwise the string 'True' survives and the outer replace turns it into 'value2'
  value: $[replace(replace('True',eq(variables['test'], 'True'), 'value1'),'True','value2')]

stages:
- stage: test
  jobs:
  - job: testvar
    continueOnError: false
    steps:
    - bash: echo $(fileName)
      displayName: "echo variable"
Results are available on github
After detailed investigation I realized that if/else doesn't work with variables in Azure DevOps YAML pipelines; it only works with parameters. However, the solution posted by @Tejas Nagchandi is a workaround and might be able to accomplish the same if/else logic, setting the variable value with replace commands. Hats off to TN.
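For completeness, a sketch of the parameter-based version that this conclusion points to (parameters are resolved at compile time, so conditional insertion behaves as expected; ${{ else }} needs a recent Azure DevOps version, otherwise use an inverted ${{ if not(...) }}):

parameters:
- name: test
  type: boolean
  default: true

variables:
- name: fileName
  ${{ if eq(parameters.test, true) }}:
    value: 'product.js'
  ${{ else }}:
    value: 'productCost.js'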

Is it possible to set a condition based on System.PullRequest.TargetBranch for a stage in a pipeline template?

I have a solution where a git branch is directly related to an environment (this has to be this way, so please do not discuss whether this is good or bad, I know it is not best practice).
We have the option to run a verification deployment (including automatic tests) towards an environment, without actually deploying the solution to the environment. Because of this, I would like to set up a pipeline that runs this verification for an environment, whenever a pull request is opened towards that environment's branch. Moreover, I am using a template for the majority of the pipeline. The actual pipeline in the main repository is just a tiny solution that points towards the template pipeline in another repository. This template, in turn, has stages for each respective environment.
I have, in the main pipeline, successfully added a solution that identifies the current branch, which for pull requests should be the target branch:
variables:
- name: currentBranch
  ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
    value: $(System.PullRequest.TargetBranch)
  ${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
    value: $(Build.SourceBranch)
I would like to send this variable currentBranch down to the template through a parameter, as my template pipeline has different stages depending on the branch. My solution was to use the pipeline like this:
extends:
  template: <template-reference>
  parameters:
    branch: $(currentBranch)
...and then for a stage in my pipeline do this:
- stage: TestAndDeployBranchName
  condition: eq('${{ parameters.branch }}', 'refs/heads/branchName')
  jobs:
  - job1... etc.
Basically, the stage should run if the current branch is either "branchName", or (for pull requests) when the target branch is "branchName", which comes from the "branch" parameters that is sent to the template.
However, I see here that System.PullRequest.TargetBranch is not available for templates, and further here that the parameters are not available for templates (the variable is empty) when the template is expanded. Thus my pipeline does not work as expected (the condition does not trigger when it should, i.e. when there is a match on the branch name).
Is there any way that I can use System.PullRequest.TargetBranch in a condition within a template, or should I look for another solution?
After investigating this further I concluded that what I am trying to do is not possible.
In short, System.PullRequest.TargetBranch (and, I assume, at least some other variables within System.PullRequest) is not available at compile time for templates, which is when conditions are evaluated. Thus, using these variables in a condition in a template is not possible.
As my goal was to have certain steps run for pull requests only, based on the target branch of the pull request, I solved this by creating duplicate pipelines. Each pipeline is the same and references the same template, except that the input parameter for the template is different. I then added each "PR pipeline" to run as part of the branch policy of each respective branch where this was applicable.
This works great; however, it requires me to create a new pipeline if I have the same requirement for another branch. Moreover, I have to maintain each PR pipeline separately (which can be both good and bad).
Not an ideal solution, but it works.
Reference PR pipeline:
trigger: none # no trigger as PR triggers are set by branch policies

# This references the template repository to reuse the basic pipeline
resources:
  repositories:
  - repository: <template repo>
    type: git # "git" means azure devops repository
    name: <template name> # Syntax: <project>/<repo>
    ref: refs/heads/master # Grab latest pipeline template from the master branch

stages:
- stage: VerifyPullRequest
  condition: |
    and(
      not(failed()),
      not(canceled()),
      eq(variables['Build.Reason'], 'PullRequest')
    )
  displayName: 'Verify Pull Request'
  jobs:
  - template: <template reference> # Template reference
    parameters:
      image: <image>
      targetBranch: <targetBranch> # Adjust this to match each respective relevant branch
The targetBranch parameter is then used in relevant places in the template to run PR verification.
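Inside the template, gating the verification steps on that parameter might look roughly like this (a sketch; the job name is illustrative and the branch name is the one from the earlier condition example):

parameters:
- name: targetBranch
  type: string
- name: image
  type: string

jobs:
- job: VerifyAgainstBranch
  steps:
  - ${{ if eq(parameters.targetBranch, 'refs/heads/branchName') }}:
    - script: echo "Running PR verification against ${{ parameters.targetBranch }}"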
Example of branch policy:
(Set this up for each relevant branch)
Picture of branch policy set up
After checking your script, we find we can not use the
variables:
- name: currentBranch
  ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
    value: $(System.PullRequest.TargetBranch)
  ${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
    value: $(Build.SourceBranch)
in the variables. The variables will duplicate the second value onto the first one; this will cause your issue.
So, on my side, I created a workaround and hope this will help you. Here is my main YAML:
parameters:
- name: custom_agent
  displayName: Use Custom Agent
  type: boolean
  default: true
- name: image
  type: string
  default: default

resources:
  repositories:
  - repository: templates
    type: git
    name: Tech-Talk/template

trigger: none

pool:
  vmImage: windows-latest
  # vmImage: ubuntu-20.04

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - task: PowerShell@2
      name: printvar
      inputs:
        targetType: 'inline'
        script: |
          If("$(Build.Reason)" -eq "PullRequest"){
            Write-Host "##vso[task.setvariable variable=currentBranch;isOutput=true]$(System.PullRequest.TargetBranch)"
          }
          else{
            Write-Host "##vso[task.setvariable variable=currentBranch;isOutput=true]$(Build.SourceBranch)"
          }

- stage: B
  condition: eq(dependencies.A.outputs['A1.printvar.currentBranch'], 'refs/heads/master')
  dependsOn: A
  jobs:
  - job: B1
    variables:
      varFromA: $[ stageDependencies.A.A1.outputs['printvar.currentBranch'] ]
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # Write your PowerShell commands here.
          Write-Host "$(varFromA)"

    - template: temp.yaml@templates
      parameters:
        branchName: $(varFromA)
        agent_pool_name: ''
        db_resource_path: $(System.DefaultWorkingDirectory)
Please note:
If we use this, we need to modify your temp YAML.
We need to move the condition to the main YAML so that only steps are left in the temp YAML.
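A minimal sketch of what a steps-only temp.yaml might then look like (parameter names are taken from the template call above; the step itself is illustrative):

# temp.yaml in the Tech-Talk/template repository (sketch)
parameters:
- name: branchName
  type: string
- name: agent_pool_name
  type: string
  default: ''
- name: db_resource_path
  type: string

steps:
- powershell: |
    # branchName expands to the literal $(varFromA) at compile time and is resolved by the agent at runtime
    Write-Host "Branch: ${{ parameters.branchName }}"
    Write-Host "Resource path: ${{ parameters.db_resource_path }}"
  displayName: 'Example template step'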

Azure build pipelines - using the 'DownloadSecureFile' task within a template

I have an Azure DevOps project with a couple of YAML build pipelines that share a common template for some of their tasks.
I now want to add a DownloadSecureFile task to that template, but I can't find a way to get it to work.
The following snippet results in an error when added to the template, but works fine in the parent pipeline definition (Assuming I also replace the ${{ xx }} syntax for the variable names with the $(xx) version):
- task: DownloadSecureFile@1
  name: securezip
  displayName: Download latest files
  inputs:
    secureFile: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    retryCount: 5

- task: ExtractFiles@1
  displayName: Extract files
  inputs:
    archiveFilePatterns: ${{ variables.securezip.secureFilePath }}
    destinationFolder: '${{ parameters.sourcesDir }}\secure\'
    cleanDestinationFolder: true
The error occurs on the 'Extract File' step and is Input required: archiveFilePatterns, so it looks like it's just not finding the variable.
As a workaround, I could move the download task to the parent pipeline scripts and pass the file path as a parameter. However, that means duplicating the task, which seems like a bit of a hack.
Variables in dollar-double-curly-brackets are resolved at template expansion time. They are not the output of tasks.
Output variables from tasks are referenced by dollar-single-parentheses and they don't need to start with the word "variables."
So I believe the line you're looking for is like this, and it isn't affected by the template mechanism.
archiveFilePatterns: $(securezip.secureFilePath)
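Putting that together, the relevant part of the template from the question would become something like this (a sketch keeping the original inputs):

- task: DownloadSecureFile@1
  name: securezip
  displayName: Download latest files
  inputs:
    secureFile: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    retryCount: 5

- task: ExtractFiles@1
  displayName: Extract files
  inputs:
    # the task's output variable is a runtime value, so use the macro syntax
    archiveFilePatterns: $(securezip.secureFilePath)
    destinationFolder: '${{ parameters.sourcesDir }}\secure\'
    cleanDestinationFolder: true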