I tried to find some information on the internet, but unfortunately I could not find anything.
I'm trying to pass pipeline parameters into a PowerShell script.
Pipeline below:
parameters:
- name: env
  displayName: Select Environment
  type: string
  default: development
stages:
- stage: test
  displayName: test var
  jobs:
  - job: PostgresSQL
    steps:
    - task: PowerShell@2
      inputs:
        filePath: '$(System.DefaultWorkingDirectory)/test.ps1'
        errorActionPreference: 'continue'
      enabled: true
I need to pass ${{ parameters.env }} to the PowerShell script.
I tried different ways of defining the parameter as a variable in PowerShell, but it does not work.
I would be very happy if anybody could help me and share relevant documentation.
Thanks all
The first approach is to pass arguments using the 'arguments' input, which the script receives through a PowerShell 'param' block:
filePath: xyz
arguments: -input1 ${{ parameters.env }}
documentation and example - https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/powershell-v2?view=azure-pipelines#call-powershell-script-with-multiple-arguments
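For example, a minimal sketch (the script path and the parameter name input1 are illustrative):

```yaml
steps:
- task: PowerShell@2
  inputs:
    filePath: '$(System.DefaultWorkingDirectory)/test.ps1'
    # test.ps1 should start with: param([string]$input1)
    arguments: -input1 ${{ parameters.env }}
```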
The second approach is to map the parameter to an environment variable that is provided to the script via the 'env' keyword:
env:
  input1: ${{ parameters.env }}
documentation - https://learn.microsoft.com/en-us/azure/devops/pipelines/process/tasks?view=azure-devops&tabs=yaml#environment-variables
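A minimal sketch, assuming the variable name input1 (environment variable names are upper-cased inside the script, so it is read as $env:INPUT1):

```yaml
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Selected environment: $env:INPUT1"
  env:
    input1: ${{ parameters.env }}
```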
Related
I am in the process of porting some existing Classic ADO pipelines to YAML. There are two separate Classic pipelines for Windows and Linux, but, because I have been able to switch most scripting to bash, I am close to having a common cross-platform YAML pipeline.
Nevertheless, I still have a few platform-dependent tasks interspersed between the majority platform-independent tasks. Of these few tasks, some only need to run on Windows and don't exist for Linux, and the remainder exist in two platform-specific versions of the tasks - one using bash and the other batch or PowerShell.
My hope was to make the bulk of the script into a template with an isWindows parameter, and to use this parameter to control the platform-dependent parts. This is roughly what I have, but it is not working:
trigger: none
pool:
  name: BuildPool
  demands:
  - Agent.OS -equals Windows_NT
extends:
  template: common-template.yml
  parameters:
    isWindows: true
Then common-template.yml itself. Note that the approach here, using condition, does not work. Although I have omitted most of the cross-platform tasks, these form the majority of the pipeline - there are only a few tasks that need special handling.
parameters:
- name: isWindows
  type: boolean
jobs:
- job: MyJob
  steps:
  - checkout: none
    clean: true
  # Simple example of cross-platform script task
  - bash: |
      env
    displayName: Print Environment
  # ... more cross platform tasks
  # Windows only task
  - task: CmdLine@2
    condition: eq('${{ parameters.isWindows }}', 'true')
    inputs:
      filename: scripts\windows_only.bat
  # ... more cross platform tasks
  # Task with specialization for each platform
  # WINDOWS version
  - task: CmdLine@2
    condition: eq('${{ parameters.isWindows }}', 'true')
    inputs:
      filename: scripts\task_a.bat
  # LINUX version
  - task: Bash@3
    condition: eq('${{ parameters.isWindows }}', 'false')
    inputs:
      filePath: scripts/task_a.sh
  # ... more cross platform tasks
The issue is that when I try to run with a Linux agent I get this error:
No agent found in pool <pool name> satisfies both of the following demands: Agent.OS, Cmd. All demands: Agent.OS -equals Linux, Cmd, Agent.Version ...
I assume this is because CmdLine tasks are present, even though they are "turned off" via a condition. I assume the dependency on the task is probably determined before the condition is ever evaluated.
Is what I am trying to do possible? Is there another approach? I am not very experienced with ADO and this is the first time I have tried anything with templates so I am possibly missing something straightforward.
You can use PowerShell steps instead of batch/bash (PowerShell can be installed on both Windows and Linux).
You can also remove the demands and just use the predefined Agent.OS variable in your conditions for tasks which require specific OS:
- powershell: 'scripts/windows_only.ps1'
  condition: eq(variables['Agent.OS'], 'Windows_NT')
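Putting these together, a sketch (script paths are illustrative) of a single job whose OS-specific steps are gated at runtime instead of via pool demands:

```yaml
jobs:
- job: CrossPlatform
  steps:
  - bash: env
    displayName: Print Environment
  # Skipped (rather than demanded) on non-Windows agents
  - powershell: 'scripts/windows_only.ps1'
    condition: eq(variables['Agent.OS'], 'Windows_NT')
  - bash: 'scripts/task_a.sh'
    condition: eq(variables['Agent.OS'], 'Linux')
```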
After digging into the ADO docs a bit, I discovered that what I needed was called Conditional Insertion:
parameters:
- name: isWindows
  type: boolean
jobs:
- job: MyJob
  ...
  # Windows only task
  - ${{ if parameters.isWindows }}:
    - task: CmdLine@2
      inputs:
        filename: scripts\windows_only.bat
  # Task with specialization for each platform
  - ${{ if parameters.isWindows }}:
    # WINDOWS version
    - task: CmdLine@2
      inputs:
        filename: scripts\task_a.bat
  - ${{ else }}:
    # LINUX version
    - task: Bash@3
      inputs:
        filePath: scripts/task_a.sh
  ...
There were a few tricky things that might be worth highlighting:
The "conditions" act as items in the YAML list of tasks, hence the need to prefix them with "- ".
The actual task that is protected by the condition is then indented a further level with respect to the condition line.
Don't forget the colon at the end of the condition.
The syntax I showed above doesn't actually work for me - I got an error about using else. It turned out that the else syntax is a feature of the 2022 release of ADO and we are stuck on the 2020 release. So in my case I had to introduce inverted tests: ${{ if not(parameters.isWindows) }}:
I got quite confused about how to test for true values. Various examples in the documentation, when talking about expressions in the condition field of a task, use syntax like: condition: eq(variables.isSomeCondition, 'true'). Note the comparison against a string value. I initially copied this in the inclusion expressions but found that both ${{ if eq(parameters.isWindows, 'true') }}: and ${{ if eq(parameters.isWindows, 'false') }}: triggered when the parameter itself was true. Clearly, the strings 'true' and 'false' evaluate to a boolean true in this context. It's not that this doesn't make sense - it is the inconsistency with the documented examples of the condition syntax that caught me out.
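In short, with a typed boolean parameter the safest forms are to use the parameter directly, or to compare against a boolean literal rather than the strings 'true'/'false':

```yaml
# Assumes a parameter declared with: name: isWindows, type: boolean
- ${{ if parameters.isWindows }}:
  - script: echo Windows branch
- ${{ if eq(parameters.isWindows, false) }}:
  - script: echo Linux branch
```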
I have set up a pipeline with variables users can enter using the UI like this:
(Screenshot: UI for user input of a variable called 'forceRelease')
I now want to use this variable in the pipeline yaml inside an if-statement like this:
jobs:
- job: Demo
  steps:
  - ${{ if eq(variables['forceRelease'], 'true') }}:
    ...some more stuff...
This doesn't work. I've tried different approaches but could not find the right syntax.
If I use the variable inside a condition, it works fine. Like this:
jobs:
- job: MAVEN_Build
  steps:
  - task: Bash@3
    condition: eq(variables['forceRelease'], 'true')
I also tried to map the variable inside the variables block to a new pipeline variable like this:
variables:
  isReleaseBranch: ${{ startsWith(variables['build.sourcebranch'],'refs/heads/pipelines-playground') }}
  isForceRelease: $(forceRelease)
The first variable using 'build.sourcebranch' works fine. My approach using forceRelease doesn't work :(
Any ideas would be appreciated!
Cheers,
Dirk
AFAIK this is working as intended. User-set variables are not expanded during parsing of the template.
You can read more on pipeline processing here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runs?view=azure-devops
You should instead use parameters.
parameters:
- name: "forceRelease"
  type: boolean
  default: false
- name: "someOtherParameter"
  type: string
  default: "someValue"
stages:
- ${{ if eq(parameters['forceRelease'], true) }}:
  - stage: build
    jobs:
    - job: bash_job
      steps:
      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: |
            # Write your commands here
Then, when you run the pipeline, you have the option to enable the forceRelease parameter or set the someOtherParameter string.
I am trying to find a way to define a variable group at stage level and then access it in below jobs through a template? How would I go about doing this?
# Template file getchangedfilesandvariables.yaml
parameters:
- name: "previouscommitid"
  type: string
steps:
- task: PowerShell@2
  displayName: 'Get the changed files'
  name: CommitIds
  inputs:
    targetType: 'filePath'
    filePath: '$(Build.SourcesDirectory)\AzureDevOpsPipelines\Get-COChangedfiles.ps1'
    arguments: >
      -old_commit_id ${{ parameters.previouscommitid }}
- task: PowerShell@2
  name: PassOutput
  displayName: 'Getting Variables for Packaging'
  inputs:
    targetType: 'filepath'
    filepath: '$(System.DefaultWorkingDirectory)\AzureDevOpsPipelines\Get-COADOvariables.ps1'
And below is my yaml file.
trigger: none
name: $(BuildID)
variables:
system.debug: true
CodeSigningCertThumbprint: "somethumbprint"
# Triggering builds on a branch itself.
${{ if startsWith(variables['Build.SourceBranch'], 'refs/heads/') }}:
branchName: $[ replace(variables['Build.SourceBranch'], 'refs/heads/', '') ]
# Triggering builds from a Pull Request.
${{ if startsWith(variables['Build.SourceBranch'], 'refs/pull/') }}:
branchName: $[ replace(variables['System.PullRequest.SourceBranch'], 'refs/heads/', '') ]
## it will create pipeline package and it will push it private or public feed artifacts
stages:
- stage: Stage1
variables:
- group: Cloudops
- name: oldcommitid
value: $[variables.lastcommitid]
jobs:
- job: IdentifyChangedFilesAndGetADOVariables
pool:
name: OnPrem
workspace:
clean: all # Ensure the agent's directories are wiped clean before building.
steps:
- powershell: |
[System.Version]$PlatformVersion = ((Get-Content "$(Build.SourcesDirectory)\AzureDevOpsPipelines\PlatformVersion.json") | ConvertFrom-Json).PlatformVersion
Write-Output "The repository's PlatformVersion is: $($PlatformVersion.ToString())"
$NewPackageVersion = New-Object -TypeName "System.Version" -ArgumentList #($PlatformVersion.Major, $PlatformVersion.Minor, $(Build.BuildId))
Write-Output "This run's package version is $($NewPackageVersion.ToString())"
echo "##vso[task.setvariable variable=NewPackageVersion]$($NewPackageVersion.ToString())"
echo "##vso[task.setvariable variable=commitidold;isOutput=true]$(oldcommitid)"
displayName: 'Define package version.'
name: commitidorpackageversion
errorActionPreference: stop
- template: getchangedfilesandvariables.yaml
parameters:
previouscommitid:
- $(commitidorpackageversion.commitidold)
# - $(oldcommitid)
I get an error at the second-to-last line of the code:
/AzureDevOpsPipelines/azure-pipelines.yml (Line: 49, Col: 13): The 'previouscommitid' parameter is not a valid String.
I tried different combinations but I am still getting the errors.
Any ideas?
Thanks for your response. I already had the variable group set up in my library; I was just not able to use it.
The way I achieved this was to create another template file and supply it to the variables section under my stage. After doing this, I was able to use the variables from my variable group in my successive jobs.
For more information you can review this doc : https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops&tabs=yaml
stagevariables.yaml
variables:
- group: Cloudops
azure-pipelines.yml
stages:
- stage: Stage1
  variables:
  - template: stagevariables.yaml
  jobs:
  - job: CheckwhichfeedsAreAvailable
In a YAML pipeline, you can't define a new variable group under the variables key.
There is no syntax available to create a new variable group when running a YAML pipeline.
Under the variables key, you can:
Define new variables with the specified values.
Override the existing variables with new values.
Reference the variables from the existing variable groups and variable templates.
So, if you want to use a variable group with some variables in the pipeline, you should manually define the variable group on the Pipelines > Library page, then reference it in the pipeline.
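For example, assuming a group named Cloudops has already been created under Pipelines > Library:

```yaml
variables:
- group: Cloudops       # existing variable group, referenced by name
- name: myLocalVariable # new variable defined inline
  value: someValue
```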
I am working with Azure pipeline templates. I would like the developer who kicks off a pipeline to either set a specific branch in a variable or leave it as $(Build.SourceBranch).
The reason is to pull down artifacts from different repositories/branches to combine.
So on my yml I added parameters (only showing 1 for simplicity)
parameters:
- name: source_branch
  displayName: Which Branch (e.g. refs/heads/foo)
  type: string
  default: $(Build.SourceBranch)
Then I call a template
- template: download_artifact.yml
  parameters:
    artifacts:
    - project: 'XXX'
      pipeline: 291
      artifact: 'artifcat'
      branch: ${{ parameters.source_branch }}
I use a template as there are approx 30 different artifacts to combine.
Within the template it downloads extracts and manipulates but I will simplify to only download.
parameters:
  artifacts: []
steps:
- ${{ each step in parameters.artifacts }}:
  - task: DownloadPipelineArtifact@2
    displayName: '${{ step.artifact }}'
    inputs:
      source: 'specific'
      project: ${{ step.project }}
      pipeline: ${{ step.pipeline }}
      runVersion: 'latestFromBranch'
      runBranch: ${{ step.branch }}
      artifact: ${{ step.artifact }}
      path: '$(Pipeline.Workspace)\${{ step.artifact }}'
So the end result is that the variable does not get resolved within the template. I think this is due to templates being expanded at queue time. Does anyone have any workarounds for this scenario?
Your reference to the source_branch parameter when calling the template from the pipeline needs to be a template expression:
- template: download_artifact.yml
  parameters:
    artifacts:
    - project: 'XXX'
      pipeline: 291
      artifact: 'artifcat'
      branch: ${{ parameters.source_branch }}
Also, if the $(Build.SourceBranch) doesn't work in your parameter declaration, you can try:
parameters:
- name: source_branch
  displayName: Which Branch (e.g. refs/heads/foo)
  type: string
  default: ${{ variables['Build.SourceBranch'] }}
The parameters in the template should be expanded at pipeline compile time, and according to my tests there seems to be no workaround available for your scenario.
Normal (non-template) jobs in Azure DevOps yaml support inter-job variable passing as follows:
jobs:
- job: A
  steps:
  - script: "echo ##vso[task.setvariable variable=skipsubsequent;isOutput=true]false"
    name: printvar
- job: B
  condition: and(succeeded(), ne(dependencies.A.outputs['printvar.skipsubsequent'], 'true'))
  dependsOn: A
  steps:
  - script: echo hello from B
How do I do something similar in the following, given that templates don't support the dependsOn syntax? I need to get an output from the first template and pass it as 'environmentSlice' to the second template.
- stage: Deploy
  displayName: Deploy stage
  jobs:
  - template: build-templates/get-environment-slice.yml@templates
    parameters:
      configFileLocation: 'config/config.json'
  - template: build-templates/node-app-deploy.yml@templates
    parameters:
      # Build agent VM image name
      vmImageName: $(Common.BuildVmImage)
      environmentPrefix: 'Dev'
      environmentSlice: '-$(dependencies.GetEnvironmentSlice.outputs['getEnvironmentSlice.environmentSlice'])'
The reason I want the separation between the two templates is the second one is a deployment template and I would like input from the first template in naming the environment in the second template. I.e. initial part of node-app-deploy.yml (2nd template) is:
jobs:
- deployment: Deploy
  displayName: Deploy
  # Because we use the environmentSlice to name the environment, we have to have it passed in rather than
  # extracting it from the config file in steps below
  environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}
Update:
The accepted solution does allow you to pass variables between separate templates, but won't work for my particular use case. I wanted to be able to name the 'environment' section of the 2nd template dynamically, i.e. environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}, but this can only be named statically since templates are compiled on pipeline startup.
The downside of the solution is that it introduces a hidden coupling between the templates. I would have preferred the calling pipeline to orchestrate the parameter passing between templates.
You can apply dependsOn and dependency output variables in templates.
See below sample:
To make sample more clear, here has 2 template files, one is azure-pipelines-1.yml, and another is azure-pipeline-1-copy.yml.
In azure-pipelines-1.yml, specify the environment value as output variable:
parameters:
  environment: ''
jobs:
- job: preDeploy
  variables:
    EnvironmentName: preDeploy-${{ parameters.environment }}
  steps:
  - checkout: none
  - pwsh: |
      echo "##vso[task.setvariable variable=EnvironmentName;isOutput=true]$($env:ENVIRONMENTNAME)"
    name: outputVars
And then, in azure-pipeline-1-copy.yml use dependency to get this output variable:
jobs:
- job: deployment
  dependsOn: preDeploy
  variables:
    EnvironmentNameCopy: $[dependencies.preDeploy.outputs['outputVars.EnvironmentName']]
  steps:
  - checkout: none
  - pwsh: |
      Write-Host "$(EnvironmentNameCopy)"
    name: outputVars
Finally, in the YAML pipeline, you just need to pass the environment value:
stages:
- stage: deployQA
  jobs:
  - template: azure-pipelines-1.yml
    parameters:
      environment: FromTemplate1
  - template: azure-pipeline-1-copy.yml
Now you can see the value retrieved successfully in the second template job.
It is possible to avoid the dependency in the called template. However, as the OP says, the environment name cannot be created dynamically.
Here is an example of the "calling" template, which firstly calls another template (devops-variables.yml) that sets some environment variables that we wish to consume in a later template (devops-callee.yml):
stages:
- stage: 'Caller_Stage'
  displayName: 'Caller Stage'
  jobs:
  - template: 'devops-variables.yml'
    parameters:
      InitialEnvironment: "Development"
  - template: 'devops-callee.yml'
    parameters:
      SomeParameter: $[dependencies.Variables_Job.outputs['Variables_Job.Variables.SomeParameter']]
In the devops-variables.yml file, I have this:
"##vso[task.setvariable variable=SomeParameter;isOutput=true;]Wibble"
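devops-variables.yml is not shown in full here; a minimal sketch consistent with the dependency reference above (the job name Variables_Job and step name Variables are inferred from that reference, and the InitialEnvironment parameter from the caller) could be:

```yaml
parameters:
- name: InitialEnvironment
  default: ''
jobs:
- job: Variables_Job
  displayName: 'Variables Job'
  steps:
  - pwsh: |
      # Expose SomeParameter as an output variable for downstream jobs
      Write-Host "##vso[task.setvariable variable=SomeParameter;isOutput=true;]Wibble"
    name: Variables
```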
Then, in the "devops-callee.yml", I just consume it something like this:
parameters:
- name: SomeParameter
  default: ''
jobs:
- deployment: 'Called_Job'
  condition: succeeded()
  displayName: 'Called Job'
  environment: "Development"
  pool:
    vmImage: 'windows-2019'
  dependsOn:
  - Variables_Job
  variables:
    SomeParameter: ${{ parameters.SomeParameter }}
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none
        - task: AzureCLI@2
          condition: succeeded()
          displayName: 'An Echo Task'
          inputs:
            azureSubscription: "$(TheServiceConnection)"
            scriptType: pscore
            scriptLocation: inlineScript
            inlineScript: |
              echo "Before"
              echo "$(SomeParameter)"
              echo "After"
Output:
2021-04-10T09:22:29.6188535Z Before
2021-04-10T09:22:29.6196620Z Wibble
2021-04-10T09:22:29.6197124Z After
This way, the callee doesn't reference the caller. Unfortunately, setting the environment in the callee thus:
environment: "$(SomeParameter)"
doesn't work - you'll just get an environment with the literal characters '$(SomeParameter)'.