I'm creating a docker build and push task in an Azure Devops Build pipeline YAML file, and I'd like to tag the image with the combination of two variables, the project name and the build number, so the tag would be service_22 (service is the project, 22 is the build number).
How do I join two variables together, e.g. $(variable)_$(variable2)?
- task: Docker@2
  inputs:
    containerRegistry: 'Azure'
    repository: 'jdmcontainers'
    command: 'buildAndPush'
    Dockerfile: 'Dockerfile'
    tags: |
      $(Build.BuildId)
      $(imageName)
That's the current file; the tags are added as individual, separate tags.
For those who come here and are actually looking for a way to combine two strings/variables in the job variable declaration, you can use the format() expression (docs link):
- job: Build
  variables:
    job_variable: $[format('Hey-{0}', variables['Build.Reason'])]
Hope that helps someone; at least I searched for this for hours and found nothing. ^^
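As a usage illustration (the script step below is hypothetical, not part of the original answer), the combined value can then be read at runtime with macro syntax:

- job: Build
  variables:
    job_variable: $[format('Hey-{0}', variables['Build.Reason'])]
  steps:
  # Prints e.g. "Hey-IndividualCI", depending on the value of Build.Reason
  - script: echo $(job_variable)
    displayName: 'Print combined variable'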
Try with the below format:
steps:
- task: Docker@2
  displayName: buildAndPush
  inputs:
    xxxx
    tags: '$(System.TeamProject)_$(Build.BuildId)'
$(System.TeamProject) is a predefined variable that returns the current project name.
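For reference, applied to the task from the question it might look like this (registry, repository, and Dockerfile values are taken from the original snippet):

- task: Docker@2
  inputs:
    containerRegistry: 'Azure'
    repository: 'jdmcontainers'
    command: 'buildAndPush'
    Dockerfile: 'Dockerfile'
    tags: |
      $(System.TeamProject)_$(Build.BuildId)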
Try this; it works in Azure Pipelines YAML:
parameters:
- name: String1
  displayName: StringName
  type: string
  default: 'Hello '

variables:
- name: String2
  value: 'World'
- name: StringConcat
  value: ${{parameters.String1}}${{variables.String2}}
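A minimal usage sketch (the echo step is illustrative, not part of the original answer); it should print "Hello World":

steps:
# StringConcat is resolved at compile time from the two template expressions above
- script: echo "$(StringConcat)"
  displayName: 'Print concatenated value'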
I have a variable group defined in Pipelines > Library > Variable groups, called 'template-variable-group'.
All I am trying to accomplish here is to pass the value of the variable my-temp-var (in this case, as you can see, its value is my-template-value) to a template from the YAML file.
I have a YAML-based pipeline as follows.
variables:
- group: template-variable-group

name: $(date:yyyyMMdd)$(rev:.r)

stages:
- stage: Build
  jobs:
  - job: buildWebApp
    displayName: Build Release pipeline for Discount Service on Master branch
    steps:
    - script: |
        echo Here we go
      displayName: 'Command Line Script'
    - template: template.yaml
      parameters:
        variableToTemplate: ${{variables['my-temp-var']}}
And the template.yaml file is as follows.
parameters:
  variableToTemplate:

steps:
- task: CmdLine@2
  inputs:
    script: |
      echo Hello Hello Hello Hello Hello Hello
      echo ${{ parameters.variableToTemplate }}
  displayName: 'Run a two-line script'
I am not able to do that.
As you can see, the value is not reaching the template. What am I missing?
I saw this SO answer, but it did not seem to help.
You can just pass it like a regular variable:
$(my-temp-var)
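Applied to the pipeline in the question, the template call would look something like this (but note the caveat below: $(my-temp-var) is passed through as literal text at template expansion time and is only resolved where macro syntax is supported at runtime, such as inside the script):

- template: template.yaml
  parameters:
    variableToTemplate: $(my-temp-var)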
Passing it this way, as $(my-temp-var), will only serve for printing the value somewhere. If you try to use it as a parameter to something, say an SSH connection, the passed value will not work. I am still exploring why, but it does not work.
Template:
parameters:
- name: PathPrefix
  displayName: 'Path prefix'
  type: string
  default: ''

steps:
- task: DotNetCoreCLI@2
  displayName: 'dotnet restore'
  inputs:
    command: restore
    projects: ${{parameters.PathPrefix}}**/$(Build.DefinitionName).sln
Pipeline:
resources:
  repositories:
  - repository: devops
    name: foo/devops
    type: git
    ref: master

trigger:
  branches:
    include:
    - refs/heads/*

jobs:
- job: Job_1
  displayName: Agent job 1
  pool:
    vmImage: windows-latest
  steps:
  - checkout: self
  - template: azure/pipelines/pipeline.yaml@devops
    parameters:
      PathPrefix: $(Build.DefinitionName)
Run error during the restore step:
##[error]No files matched the search pattern.
Even setting verbosityRestore: detailed on the restore step doesn't give me any more information.
If I don't set PathPrefix, it seems to use the default empty string and find the solution file (in some cases). However, if I do set the prefix, which is needed in some repos, it can't find the file. I've tried various ways of referencing the parameter within the template (${{}}, $(), $[] and others) and different ways of specifying it within the pipeline, including hard-coding the path (though I want to use the variable instead), but nothing works.
I thought maybe variables would work instead, so I also tried specifying variables in the pipeline and using them in the template, but that results in the same error. Defining the variable in the template gave me a compilation error for the template (unexpected token 'variable' or something similar).
Look at what you're passing and mentally expand the results.
${{parameters.PathPrefix}}**/$(Build.DefinitionName).sln
If Build.DefinitionName is foo, and you pass that in as PathPrefix, then what you get is:
foo**/foo.sln
It looks like you want an extra forward slash in there, so you get foo/**/foo.sln.
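A sketch of one possible fix is to pass the trailing slash from the pipeline side (alternatively, the slash could be added between ${{parameters.PathPrefix}} and ** inside the template):

- template: azure/pipelines/pipeline.yaml@devops
  parameters:
    PathPrefix: '$(Build.DefinitionName)/'

With that, the pattern expands to foo/**/foo.sln as intended.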
Question
I am using the DownloadSecureFile@1 task to download secure files.
The issue occurs when, in Azure DevOps, only file_A.txt exists in the Library's secure files section.
The script works fine when both files exist.
In my case, a user A will only need file_A.txt, user B will only need file_B.txt.
Is this an expected behavior? Any possible workarounds to fulfill the use-case?
Error Message:
There was a resource authorization issue: "The pipeline is not valid. Job Job: Step fileB input secureFile references secure file file_B.txt which could not be found. The secure file does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz."
Code:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt

pool:
  vmImage: ubuntu-latest

steps:
- task: DownloadSecureFile@1
  displayName: Download File A
  condition: eq('${{ parameters.file_name }}', 'file_A.txt')
  name: fileA
  inputs:
    secureFile: 'file_A.txt'
- task: DownloadSecureFile@1
  displayName: Download file B
  condition: eq('${{ parameters.file_name }}', 'file_B.txt')
  name: fileB
  inputs:
    secureFile: 'file_B.txt'
Is this an expected behavior?
Yes, this is expected behavior. To turn a pipeline into a run, Azure Pipelines goes through several steps in this order:
First, expand templates and evaluate template expressions.
Next, evaluate dependencies at the stage level to pick the first stage(s) to run.
For each stage selected to run, two things happen: all resources used in all jobs are gathered up and validated for authorization to run, and dependencies at the job level are evaluated to pick the first job(s) to run.
For each job selected to run, expand multi-configs (strategy: matrix or strategy: parallel in YAML) into multiple runtime jobs.
For each runtime job, evaluate conditions to decide whether that job is eligible to run.
Request an agent for each eligible runtime job.
So your secure files are gathered and validated for authorization before conditions are evaluated. Please refer to the documentation about the pipeline run sequence. As a workaround, you can refer to the sample shared by @danielorn.
Instead of using the condition on the tasks, you can surround the step with an if-statement as described in "use parameters to determine what steps run":
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt

pool:
  vmImage: ubuntu-latest

steps:
- ${{ if eq(parameters.file_name, 'file_A.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download File A
    name: fileA
    inputs:
      secureFile: 'file_A.txt'
- ${{ if eq(parameters.file_name, 'file_B.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download file B
    name: fileB
    inputs:
      secureFile: 'file_B.txt'
However, if every user needs exactly one file, a common (and cleaner) option would be to provide the name of the file needed as a parameter. If a secure file is not needed (i.e. the parameter is the default empty string), the step can be excluded using an if statement:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt

pool:
  vmImage: ubuntu-latest

steps:
- ${{ if ne(parameters.file_name, '') }}:
  - task: DownloadSecureFile@1
    displayName: Download Secure File
    name: secureFileDownload
    inputs:
      secureFile: '${{ parameters.file_name }}'
Normal (non-template) jobs in Azure DevOps yaml support inter-job variable passing as follows:
jobs:
- job: A
  steps:
  - script: "echo ##vso[task.setvariable variable=skipsubsequent;isOutput=true]false"
    name: printvar
- job: B
  condition: and(succeeded(), ne(dependencies.A.outputs['printvar.skipsubsequent'], 'true'))
  dependsOn: A
  steps:
  - script: echo hello from B
How do I do something similar in the following, given that templates don't support the dependsOn syntax? I need to get an output from the first template and pass it as 'environmentSlice' to the second template.
- stage: Deploy
  displayName: Deploy stage
  jobs:
  - template: build-templates/get-environment-slice.yml@templates
    parameters:
      configFileLocation: 'config/config.json'
  - template: build-templates/node-app-deploy.yml@templates
    parameters:
      # Build agent VM image name
      vmImageName: $(Common.BuildVmImage)
      environmentPrefix: 'Dev'
      environmentSlice: '-$(dependencies.GetEnvironmentSlice.outputs['getEnvironmentSlice.environmentSlice'])'
The reason I want the separation between the two templates is that the second one is a deployment template, and I would like input from the first template when naming the environment in the second template. I.e. the initial part of node-app-deploy.yml (the 2nd template) is:
jobs:
- deployment: Deploy
  displayName: Deploy
  # Because we use the environmentSlice to name the environment, we have to have it passed in rather than
  # extracting it from the config file in steps below
  environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}
Update:
The accepted solution does allow you to pass variables between separate templates, but won't work for my particular use case. I wanted to be able to name the 'environment' section of the 2nd template dynamically, i.e. environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}, but this can only be named statically since templates are compiled on pipeline startup.
The downside of the solution is that it introduces a hidden coupling between the templates. I would have preferred the calling pipeline to orchestrate the parameter passing between templates.
You can apply dependsOn and dependency output variables to jobs defined in templates.
See the sample below.
To make the sample clearer, there are two template files: azure-pipelines-1.yml and azure-pipeline-1-copy.yml.
In azure-pipelines-1.yml, specify the environment value as an output variable:
parameters:
  environment: ''

jobs:
- job: preDeploy
  variables:
    EnvironmentName: preDeploy-${{ parameters.environment }}
  steps:
  - checkout: none
  - pwsh: |
      echo "##vso[task.setvariable variable=EnvironmentName;isOutput=true]$($env:ENVIRONMENTNAME)"
    name: outputVars
And then, in azure-pipeline-1-copy.yml, use a dependency to get this output variable:
jobs:
- job: deployment
  dependsOn: preDeploy
  variables:
    EnvironmentNameCopy: $[dependencies.preDeploy.outputs['outputVars.EnvironmentName']]
  steps:
  - checkout: none
  - pwsh: |
      Write-Host "$(EnvironmentNameCopy)"
    name: outputVars
Finally, in the YAML pipeline, you just need to pass the environment value:
stages:
- stage: deployQA
  jobs:
  - template: azure-pipelines-1.yml
    parameters:
      environment: FromTemplate1
  - template: azure-pipeline-1-copy.yml
Now you can see the value is retrieved successfully in the second template job.
It is possible to avoid the dependency in the called template. However, as the OP says, the environment name cannot be created dynamically.
Here is an example of the "calling" template, which firstly calls another template (devops-variables.yml) that sets some environment variables that we wish to consume in a later template (devops-callee.yml):
stages:
- stage: 'Caller_Stage'
  displayName: 'Caller Stage'
  jobs:
  - template: 'devops-variables.yml'
    parameters:
      InitialEnvironment: "Development"
  - template: 'devops-callee.yml'
    parameters:
      SomeParameter: $[dependencies.Variables_Job.outputs['Variables_Job.Variables.SomeParameter']]
In the devops-variables.yml file, I have this:
"##vso[task.setvariable variable=SomeParameter;isOutput=true;]Wibble"
Then, in the "devops-callee.yml", I just consume it something like this:
parameters:
- name: SomeParameter
  default: ''

jobs:
- deployment: 'Called_Job'
  condition: succeeded()
  displayName: 'Called Job'
  environment: "Development"
  pool:
    vmImage: 'windows-2019'
  dependsOn:
  - Variables_Job
  variables:
    SomeParameter: ${{parameters.SomeParameter}}
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none
        - task: AzureCLI@2
          condition: succeeded()
          displayName: 'An Echo Task'
          inputs:
            azureSubscription: "$(TheServiceConnection)"
            scriptType: pscore
            scriptLocation: inlineScript
            inlineScript: |
              echo "Before"
              echo "$(SomeParameter)"
              echo "After"
Output:
2021-04-10T09:22:29.6188535Z Before
2021-04-10T09:22:29.6196620Z Wibble
2021-04-10T09:22:29.6197124Z After
This way, the callee doesn't reference the caller. Unfortunately, setting the environment in the callee thus:
environment: "$(SomeParameter)"
doesn't work - you'll just get an environment with the literal characters '$(SomeParameter)'.
I've created a Task group to encapsulate some functionality.
If I use a regular build, I can add the task group through the normal wizard.
Unfortunately, I need to use the task group inside a YAML build, and I can't view the YAML of the "old" build to see how this should happen.
The things I've tried:
- task: TaskGroupName@1
  displayName: 'RunTests'
  inputs:
    TestConfiguration: 'some.xml'
    TestCaseFilter: $(TestCaseFilter)
    UnitTestFolders: $(UnitTestFolders)
According to the docs, task groups are not supported in YAML pipelines.
Instead, in that case you can use templates.
Documentation for templates: see here.
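For example, a minimal sketch of a step template that could stand in for the task group above, assuming a hypothetical file name run-tests-template.yml; the step body is a placeholder, since the contents of the original task group are not shown:

# run-tests-template.yml (hypothetical name)
parameters:
- name: TestConfiguration
  type: string
- name: TestCaseFilter
  type: string
  default: ''
- name: UnitTestFolders
  type: string
  default: ''

steps:
# Placeholder for whatever steps the task group actually contained
- script: echo Running tests with ${{ parameters.TestConfiguration }}, filter ${{ parameters.TestCaseFilter }}, folders ${{ parameters.UnitTestFolders }}
  displayName: 'RunTests (placeholder for task group steps)'

It would then be referenced from the pipeline with the same inputs as the attempted task group call:

steps:
- template: run-tests-template.yml
  parameters:
    TestConfiguration: 'some.xml'
    TestCaseFilter: $(TestCaseFilter)
    UnitTestFolders: $(UnitTestFolders)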
You can pass parameter 'objects' into a template YAML file to pretty much do what you want; the only tricky bit I found was having multiple properties per parameter 'object' instance and using the new template ${{ each }} expression to iterate over them.
Below is how I constructed my YAML files for this solution:
azure_pipelines.yml
pool:
  name: Hosted VS2017
  demands:
  - npm
  - msbuild
  - visualstudio
  - vstest

steps:
- template: azure_webapp_template.yml
  parameters:
    webapps:
    - name: Customer 1
      url: customer1.azurewebsites.net
    - name: Customer 2
      url: customer2.azurewebsites.net
    - name: Customer 3
      url: customer3.azurewebsites.net
    - name: Customer 4
      url: customer4.azurewebsites.net
As you can see above, we are creating a webapps object with some nested properties for each 'webapp'.
Then, in our template, we can iterate over each of the objects in the webapps parameter and expand the properties in our iterated tasks.
azure_webapp_template.yml
# Proving ability to loop over params a number of times
parameters:
- name: 'webapps'
  type: object
  default: {}

steps:
- ${{ each webapp in parameters.webapps }}:
  - task: PowerShell@2
    displayName: 'Task Group Test 1 ${{webapp.name}}'
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Name: ${{webapp.name}} with url ${{webapp.url}}"
      failOnStderr: true
      workingDirectory: '$(Build.SourcesDirectory)'
  - task: PowerShell@2
    displayName: 'Task Group Test 2 ${{webapp.name}}'
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Name: ${{webapp.name}} with url ${{webapp.url}}"
      failOnStderr: true
      workingDirectory: '$(Build.SourcesDirectory)'