Azure build pipelines - using the 'DownloadSecureFile' task within a template

I have an Azure DevOps project with a couple of YAML build pipelines that share a common template for some of their tasks.
I now want to add a DownloadSecureFile task to that template, but I can't find a way to get it to work.
The following snippet results in an error when added to the template, but works fine in the parent pipeline definition (assuming I also replace the ${{ xx }} syntax for the variable names with the $(xx) version):
- task: DownloadSecureFile@1
  name: securezip
  displayName: Download latest files
  inputs:
    secureFile: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    retryCount: 5

- task: ExtractFiles@1
  displayName: Extract files
  inputs:
    archiveFilePatterns: ${{ variables.securezip.secureFilePath }}
    destinationFolder: '${{ parameters.sourcesDir }}\secure\'
    cleanDestinationFolder: true
The error occurs on the 'Extract files' step and is Input required: archiveFilePatterns, so it looks like the variable is simply not being found.
As a workaround, I could move the download task into each parent pipeline and pass the file path to the template as a parameter. However, that means duplicating the task, which seems like a bit of a hack.

Variables in ${{ }} (template expression) syntax are resolved at template expansion time; they can never hold the output of a task.
Output variables from tasks are referenced with $( ) (macro) syntax, and they don't need to start with the word "variables".
So I believe the line you're looking for is the following, and it isn't affected by the template mechanism:
archiveFilePatterns: $(securezip.secureFilePath)
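In context, the template snippet becomes the following (a minimal sketch; task names and inputs are exactly as in the question, only the archiveFilePatterns line changes):

- task: DownloadSecureFile@1
  name: securezip
  displayName: Download latest files
  inputs:
    secureFile: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    retryCount: 5

- task: ExtractFiles@1
  displayName: Extract files
  inputs:
    # macro syntax: resolved at runtime, after DownloadSecureFile
    # has populated its secureFilePath output variable
    archiveFilePatterns: $(securezip.secureFilePath)
    destinationFolder: '${{ parameters.sourcesDir }}\secure\'
    cleanDestinationFolder: true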

Restricting Azure Pipeline decorators

I want to restrict an Azure Pipelines decorator from being injected into all pipelines. As we know, once a decorator is installed it is applied to every project in the organization; how can I restrict it to certain projects?
I used the Projects API to get my project ID, and then added the project ID to my JSON template as targettask and targetid:
"properties": {
"template": "custom-postjob-decorator.yml",
"targetid": "9146bcde-ef7b-492d-8289-b3dce5f750b0"
}
It didn't work. Do we need to provide a condition in the decorator YAML to restrict it to certain projects in the organization?
I have a few more questions.
Now I don't want to run my decorator on Project B, and I defined it like this:
steps:
- ${{ if in(variables['System.TeamProject'], 'ProjectB') }}:
  - task: ADOSecurityScanner@1 # injected decorator
I ran this and it is still injecting the decorator into ProjectB.
As you suggested, I have created the two scenarios.
Skipping when the task already exists works like a charm, but it did not show any message about the skipping. How can we tell the decorator was skipped in the pipeline?
steps:
- ${{ if eq(variables['Build.task'], 'ADOSecurityScanner@1') }}:
  - task: CmdLine@2
    displayName: 'Skipping the injector'
    inputs:
      script: |
        echo "##vso[task.logissue type=error]This Step '$(Build.task)' is not injected. You don't need this in the pipeline"
        exit 1
I used the condition from the Microsoft documentation, similar to the one you mentioned above.
It did not skip the injector with just the conditional command; only when I added the (PowerShell) task to both the decorator and the pipeline YAML did it skip the decorator, by matching the tasks in both files.
Does it show or log any info when the decorator is skipped in the pipeline? I observed that it does display differently when the decorator is skipped, showing me the false statement. Do we need to define anything in the pipeline YAML file as well?
I ran with the conditions you provided above, and somehow the decorator is still getting injected into ProjectB. Can you let me know where I am going wrong?
Here is my basic azure-pipelines.yaml file:
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
Be sure to read these docs:
Expressions
Specify Conditions
The simplest solution is to add a condition to the pipeline decorator:
steps:
- ${{ if not(eq(variables['azure-pipelines-tfvc-fixparallel-decorator.skip'], 'true')) }}:
  - task: tf-vc-fixparallel-log@2
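A pipeline that consumes this decorator can then opt out by defining that variable; a minimal sketch of the consumer side (the variable name comes straight from the condition above):

variables:
  azure-pipelines-tfvc-fixparallel-decorator.skip: 'true'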
The decorator can use the pre-defined variables, including System.TeamProject or System.TeamProjectId:
System.TeamProject - the name of the project that contains this build.
System.TeamProjectId - the ID of the project that this build belongs to.
For example:
steps:
- ${{ if in(variables['System.TeamProject'], 'ProjectA', 'ProjectB', 'ProjectC') }}:
  - task: tf-vc-fixparallel-log@2
You will need to include the steps: key if you aren't already. For some reason returning an empty steps[] is fine, but returning nothing at all will result in a failing decorator.
When you want to exclude a list of projects, simply invert the condition. So to exclude 'ProjectB':
steps:
- ${{ if not(in(variables['System.TeamProject'], 'ProjectB')) }}:
  - task: ADOSecurityScanner@1 # injected decorator
Your statement above explicitly injects into ProjectB, and nowhere else.
With regards to skipping the injector: the ${{ if ... }}: above completely removes the steps indented below it when they don't need to run; they will not appear at all.
You can also put a condition on the task itself. In that case it will show up in the timeline, but as skipped:
steps:
- task: ADOSecurityScanner@1 # injected decorator
  condition: not(in(variables['System.TeamProject'], 'ProjectB'))
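For your other scenario, skipping when the task already exists in the target pipeline, the decorator documentation checks the job's step list with containsValue; a hedged sketch, with the scanner task's GUID left as a placeholder you would need to fill in:

steps:
- ${{ if not(containsValue(job.steps.*.task.id, '<guid-of-the-ADOSecurityScanner-task>')) }}:
  - task: ADOSecurityScanner@1 # injected only when not already present in the job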
You can also combine the condition with a variable by adding it to the expression:
steps:
- ${{ if not(or(in(variables['System.TeamProject'], 'ProjectB'),
                eq(variables['skipInjector'], 'true'))) }}:
  - task: ADOSecurityScanner@1 # injected decorator
As you can see, you can and(..., ...) and or(..., ...) these things together to form expressions as complex as you need.
You can see the evaluation of your decorator in the top-level log panel of your pipeline job; you need to expand the Job preparation parameters section, and make sure you queue the pipeline with diagnostics turned on.
When you use a condition on the task within the decorator, expand the log for the task to see the condition evaluation.
Looking at the screenshot you posted, two versions of the decorator are included: one with conditions and one without. Make sure you uninstall or disable that other version.

Azure Pipelines - Output variable from python script file to pipeline variable

I've tried several articles and threads from Stack Overflow but can't seem to get anywhere. I am trying to take a variable from a .py file which is called in a YAML step, and set that variable globally so it can be used in later steps.
In my .py file I have:
print(f'##vso[task.setvariable variable=AMLPipelineId;isOutput=true]{pipelineId}')
Then in my YAML pipeline step I have:
- task: AzurePowerShell@5
  displayName: 'Run AML Pipeline'
  name: AmlPipeline
  inputs:
    azureSubscription: '$(azureSubscription)'
    ScriptType: 'InlineScript'
    azurePowerShellVersion: 'LatestVersion'
    pwsh: true
    Inline: |
      $username = "$(ARM_CLIENT_ID)"
      $password = "$(ARM_CLIENT_SECRET)"
      $tenantId = "$(ARM_TENANT_ID)"
      python $(Pipeline.Workspace)/AML_Pipeline/build_aml_pipeline.py --wsName $(wsName) --resourceGroup $(ResourceGroupName) --subscriptionId $(subId)
      $MLPipelineId = $AmlPipeline.AMLPipelineId
But it seems like this variable is empty. I know there are other ways of using task.setvariable; my latest attempt was something like print('##vso[task.setvariable variable=version;]%s' % (version)).
The approach I followed: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch
You don't need isOutput=true - that's only needed for referencing variables between different jobs or stages.
"You cannot use the variable in the step that it is defined" - split that script into two steps: one that runs your .py file, and a second one that uses the newly defined variable.
I used print('##vso[task.setvariable variable=<Variable-in-Pipeline>;]' + <output-variable>), where Variable-in-Pipeline is the name to use in the Azure DevOps pipeline; it should be added to the pipeline variables as an empty string.
A very minimal example for everyone struggling with this - the documentation is kind of lacking here for my taste. As @qbik said, don't set and use the variable in the same step; make them separate steps.
set_variable.py
if __name__ == '__main__':
    # set name of the variable
    name = 'COLOR'
    # set value of the variable
    value = 'red'
    # set variable
    print(f'##vso[task.setvariable variable={name};]{value}')
azure-pipelines.yml
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '3.9'
  displayName: 'Use Python 3.9'

# run the script to set the variable
- task: PythonScript@0
  inputs:
    scriptSource: filePath
    scriptPath: set_variable.py

# now you can use the variable in the next step
- bash: echo my favorite color is: $(COLOR)
Now you can do all kinds of useful things in Python, then set the variables and reference them in the following steps. In my case I have to extract specific package version numbers from a JSON/YAML file, based on an id set earlier in the pipeline, and pass that information as args for a docker build. Hope that helps other people stumbling across this answer looking for a minimal working example :)

Conditionally copying folders in Azure Pipeline

When we perform uploads using our production pipeline, I'd like to be able to conditionally upload the assets folder of our application. This is because it takes significantly more time to upload the contents of that folder, and it rarely has any changes anyway.
I've written the CopyFilesOverSSH task as follows...
# Node.js with Angular
trigger:
- production

pool:
  default

steps:
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'offigo-production-server'
    sourceFolder: '$(Agent.BuildDirectory)/s/dist/offigo-v2/'
    ${{ if eq('$(UploadAssets)', 0) }}:
      contents: |
        !browser/assets/**
    ${{ if eq('$(UploadAssets)', 1) }}:
      contents: |
        !browser/assets/static-pages/**
        !browser/assets/page-metadata/**
    targetFolder: '/var/www/docker/DocumentRoot/offigo/frontend'
    readyTimeout: '20000'
  continueOnError: true
However, when the pipeline runs it completely ignores the rules either way and just uploads the entire contents of the assets folder. I can't work out why it is not working correctly; some help would be greatly appreciated...
If UploadAssets is a variable name, you should use this syntax:
${{ if eq(variables['UploadAssets'], 0) }}:
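Note that ${{ if ... }} is evaluated at template expansion time, before any steps run, so this only works when UploadAssets is already defined at that point - for example in the YAML variables block. A minimal sketch, assuming that declaration:

variables:
  UploadAssets: 0 # must exist at template expansion time

steps:
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'offigo-production-server'
    sourceFolder: '$(Agent.BuildDirectory)/s/dist/offigo-v2/'
    # conditionally insert the 'contents' input
    ${{ if eq(variables['UploadAssets'], 0) }}:
      contents: |
        !browser/assets/**
    targetFolder: '/var/www/docker/DocumentRoot/offigo/frontend'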
Going out on a limb and assuming your future deployment steps (if using YAML pipelines) will also have conditions on what and how to deploy, I would recommend creating two templates. Each template would have all the steps outlined, and any steps that are reused should be templated as well so they are only defined once.
The issue here is how the variable is being declared: $() is macro syntax. To confirm, download the logs by clicking the ellipses on a completed job. I'd guess you'd rather need it at runtime, so $[variables.UploadAssets].
Alternatively, when editing in the browser you can download the full YAML file to see it as well, though this won't have any variables passed in at runtime.
Feel free to ask more questions in the comments, as this stuff is never as straightforward as it looks.
Thanks Krzysztof Madej, I tried using
${{ if eq(variables['UploadAssets'], 0) }}:
but it still didn't resolve the issue; the variable just seemed to be undefined. Using the link DreadedFrost provided about templates, I ended up adding a parameter to the pipeline like so...
# Node.js with Angular
trigger:
- production

pool:
  default

parameters:
- name: uploadAssets # Upload assets folder?; required
  type: boolean # data type of the parameter; required
  default: true

steps:
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'offigo-production-server'
    sourceFolder: '$(Agent.BuildDirectory)/s/dist/offigo-v2/'
    ${{ if eq(parameters.uploadAssets, false) }}:
      contents: |
        !browser/assets/**
    ${{ if eq(parameters.uploadAssets, true) }}:
      contents: |
        !browser/assets/static-pages/**
        !browser/assets/page-metadata/**
    targetFolder: '/var/www/docker/DocumentRoot/offigo/frontend'
    readyTimeout: '20000'
  continueOnError: true
This then adds a checkbox when I run the pipeline, so I can set whether or not to upload the folder.
I doubt this is the best way of doing it, but it resolves the issue for now; I'll have a proper look at implementing templates in the future.

ADO gulp task - pass previous task's output variable as argument

I have been trying this for quite some time now and am not able to figure out how to proceed. My requirement is to dynamically calculate a variable in a Bash task and then, in the next step, use it as a parameter to the gulp task. Below are the two tasks from my build pipeline (lines removed for simplicity):
- task: Bash@3
  name: SetVariableValue
  inputs:
    targetType: inline
    script: |
      # removed
      myvariableValue=$(do something & calculate here, assume value will be 'abc')
      # Set to an output variable
      echo "##vso[task.setvariable variable=myVar;isOutput=true]$myvariableValue"

- task: gulp@0
  displayName: Publish front-end
  inputs:
    gulpFile: $(Build.SourcesDirectory)/../gulpfile.js
    targets: publish
    arguments: >-
      --buildId $(Build.BuildId) --buildNumber $(Build.BuildNumber) --sourceBranch $(Build.SourceBranch) --var $(myVar)
    gulpjs: >-
      $(Build.SourcesDirectory)/../node_modules/gulp/bin/gulp.js
    enableCodeCoverage: false
I am using Linux machines to build Angular packages, and the gulp publish command packages our code depending on the said variable.
With the above steps, all the other parameters like BuildId, BuildNumber, and SourceBranch are passed correctly, but the var parameter is passed literally as $(myVar) rather than as abc.
Can you please help me with what I am missing here? I tried multiple things like --var $(Build.myVar), but was not able to make it work.
Thanks
Sanjay
Oh man, all I needed was to correct the name of the step and refer to it without mistyping :(
--var $(SetVariableValue.myVar) did the trick. Thanks.
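For completeness, the gulp task from the question with just that reference corrected; because the Bash step sets myVar with isOutput=true, later steps in the same job must prefix it with the step name:

- task: gulp@0
  displayName: Publish front-end
  inputs:
    gulpFile: $(Build.SourcesDirectory)/../gulpfile.js
    targets: publish
    arguments: >-
      --buildId $(Build.BuildId) --buildNumber $(Build.BuildNumber) --sourceBranch $(Build.SourceBranch) --var $(SetVariableValue.myVar)
    gulpjs: >-
      $(Build.SourcesDirectory)/../node_modules/gulp/bin/gulp.js
    enableCodeCoverage: false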

Azure YAML Get variable from a job run in a previous stage

I am creating a YAML pipeline in Azure DevOps that consists of two stages.
The first stage (Prerequisites) is responsible for reading the git commit and creating a comma-separated variable containing the list of services affected by the commit.
The second stage (Build) is responsible for building and unit testing the project. This stage consists of many templates, one for each service. In the template script, the job checks whether the relevant service is in the variable created in the previous stage. If the job finds the service, it continues to build and test it; if it cannot find the service, it skips that job.
Run.yml:
stages:
- stage: Prerequisites
  jobs:
  - job: SetBuildQueue
    steps:
    - task: powershell@2
      name: SetBuildQueue
      displayName: 'Set.Build.Queue'
      inputs:
        targetType: inline
        script: |
          ## ... PowerShell script to get changes - working as expected
          Write-Host "Build Queue Auto: $global:buildQueueVariable"
          Write-Host "##vso[task.setvariable variable=buildQueue;isOutput=true]$global:buildQueueVariable"

- stage: Build
  jobs:
  - job: StageInitialization
  - template: Build.yml
    parameters:
      projectName: Service001
      projectLocation: src/Service001
  - template: Build.yml
    parameters:
      projectName: Service002
      projectLocation: src/Service002
Build.yml:
parameters:
  projectName: ''
  projectLocation: ''

jobs:
- job:
  displayName: '${{ parameters.projectName }} - Build'
  dependsOn: SetBuildQueue
  continueOnError: true
  condition: and(succeeded(), contains(dependencies.SetBuildQueue.outputs['SetBuildQueue.buildQueue'], '${{ parameters.projectName }}'))
  steps:
  - task: NuGetToolInstaller@1
    displayName: 'Install Nuget'
Issue:
When the first stage runs, it creates a variable called buildQueue, which is populated as seen in the console output of the PowerShell script task:
Service001 Changed
Build Queue Auto: Service001;
However, when it gets to stage two and tries to run the build template, the condition check returns the following output:
Started: Today at 12:05 PM
Duration: 16m 7s
Evaluating: and(succeeded(), contains(dependencies['SetBuildQueue']['outputs']['SetBuildQueue.buildQueue'], 'STARS.API.Customer.Assessment'))
Expanded: and(True, contains(Null, 'service001'))
Result: False
So my question is: how do I set the dependsOn and condition to get the information from the previous stage?
It's because you want to access the variable in a different stage from the one where you defined it. Currently that's impossible: each stage is a new instance with a fresh agent.
In this blog you can find a workaround that involves writing the variable to disk and then passing it as a file, leveraging pipeline artifacts.
To pass the variable FOO from a job to another one in a different stage:
1. Create a folder that will contain all the variables you want to pass; any folder could work, but something like mkdir -p $(Pipeline.Workspace)/variables might be a good idea.
2. Write the contents of the variable to a file, for example echo "$FOO" > $(Pipeline.Workspace)/variables/FOO. Even though the name could be anything you'd like, giving the file the same name as the variable might be a good idea.
3. Publish the $(Pipeline.Workspace)/variables folder as a pipeline artifact named variables.
4. In the second stage, download the variables pipeline artifact.
5. Read each file into a variable, for example FOO=$(cat $(Pipeline.Workspace)/variables/FOO).
6. Expose the variable in the current job, just like in the first example: echo "##vso[task.setvariable variable=FOO]$FOO".
You can then access the variable by expanding it within Azure Pipelines ($(FOO)) or use it as an environment variable inside a bash script ($FOO). A minimal sketch of the two stages follows below.
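Here is a compact sketch of that workaround (stage and job names and the value 'abc' are illustrative):

stages:
- stage: One
  jobs:
  - job: ProduceVar
    steps:
    - bash: |
        # write the variable's value to a file inside the artifact folder
        mkdir -p $(Pipeline.Workspace)/variables
        echo "abc" > $(Pipeline.Workspace)/variables/FOO
    - publish: $(Pipeline.Workspace)/variables
      artifact: variables

- stage: Two
  jobs:
  - job: ConsumeVar
    steps:
    # downloads the artifact back to $(Pipeline.Workspace)/variables
    - download: current
      artifact: variables
    - bash: |
        # read the file back and re-expose it as a job-scoped variable
        FOO=$(cat $(Pipeline.Workspace)/variables/FOO)
        echo "##vso[task.setvariable variable=FOO]$FOO"
    - bash: echo "FOO is $(FOO)"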