I am trying to build an automatic Environment Approval with Invoke Azure Function. I need to pass a variable that is created in a previous build stage. I have added the variable to the function body, but the variable is not evaluated:
Task that creates the output variable (in previous stage)
- task: CmdLine@2
  inputs:
    script: |
      # Debug output (value visible in logs)
      pulumi stack output resourceGroupName
      echo "##vso[task.setvariable variable=resourceGroupName;isoutput=true]$(pulumi stack output resourceGroupName)"
    workingDirectory: 'infra'
Body for the Azure function:
{ "ResourceGroup": "$(resourceGroupName)" }
Log:
2020-03-26T15:57:01.2590351Z POST https://azure-function-url
Request body: {
"ResourceGroup": "$(resourceGroupName)"
}
Added the variable to the function body, but the variable is not evaluated
This is expected behavior. What you are doing here is sharing a variable across stages, which is not directly supported at the moment.
Output variables are only used to share values between steps and jobs, not between stages.
Workaround:
If you want to use the generated variable in the next stage, a workaround you can consider is writing the variable to a file and sharing it via a pipeline artifact.
Sample steps:
Here I will pass one variable named resourceGroupName to the next stage.
1) Create a folder which will contain the variables you want to pass
mkdir -p $(Pipeline.Workspace)/variables
2) Write the value of the variable to the file StageUsed:
echo "$resourceGroupName" > $(Pipeline.Workspace)/variables/StageUsed
3) In the next stage, add a job before your InvokeAzureFunction job and download the variables pipeline artifact that the previous stage published.
4) Read each file back into a variable:
resourceGroupName=$(cat $(Pipeline.Workspace)/variables/StageUsed)
5) Expose the variable as an output of the current job by setting the step's reference name to output:
echo "##vso[task.setvariable variable=resourceGroupName;isoutput=true]$resourceGroupName"
6) Now you can access the variable in your InvokeAzureFunction job via dependencies.secondjob.outputs['output.resourceGroupName'].
Sample Script:
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: FirstStage
  jobs:
  - job: firstjob
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - bash: |
        resourceGroupName="value"
        mkdir -p $(Pipeline.Workspace)/variables
        echo "$resourceGroupName" > $(Pipeline.Workspace)/variables/resourceGroupName
    - publish: $(Pipeline.Workspace)/variables
      artifact: variables

- stage: SecondStage
  jobs:
  - job: secondjob
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - download: current
      artifact: variables
    - bash: |
        resourceGroupName=$(cat $(Pipeline.Workspace)/variables/resourceGroupName)
        echo "##vso[task.setvariable variable=resourceGroupName;isoutput=true]$resourceGroupName"
      name: output
    - bash: |
        echo "$(output.resourceGroupName)"
  - job: ServerJob
    dependsOn: secondjob
    pool: server
    variables:
      resourceGroupName: $[dependencies.secondjob.outputs['output.resourceGroupName']]
    steps:
    - task: AzureFunction@1
      inputs:
        function: # your Azure function URL
        method: 'POST'
        body: '{"ResourceGroup": "$(resourceGroupName)"}'
        waitForCompletion: 'false'
I am trying to find a way to define a variable group at the stage level and then access it in the jobs below through a template. How would I go about doing this?
# Template file getchangedfilesandvariables.yaml
parameters:
- name: "previouscommitid"
  type: string

steps:
- task: PowerShell@2
  displayName: 'Get the changed files'
  name: CommitIds
  inputs:
    targetType: 'filePath'
    filePath: '$(Build.SourcesDirectory)\AzureDevOpsPipelines\Get-COChangedfiles.ps1'
    arguments: >
      -old_commit_id ${{ parameters.previouscommitid }}
- task: PowerShell@2
  name: PassOutput
  displayName: 'Getting Variables for Packaging'
  inputs:
    targetType: 'filepath'
    filepath: '$(System.DefaultWorkingDirectory)\AzureDevOpsPipelines\Get-COADOvariables.ps1'
And below is my yaml file.
trigger: none
name: $(BuildID)

variables:
  system.debug: true
  CodeSigningCertThumbprint: "somethumbprint"
  # Triggering builds on a branch itself.
  ${{ if startsWith(variables['Build.SourceBranch'], 'refs/heads/') }}:
    branchName: $[ replace(variables['Build.SourceBranch'], 'refs/heads/', '') ]
  # Triggering builds from a Pull Request.
  ${{ if startsWith(variables['Build.SourceBranch'], 'refs/pull/') }}:
    branchName: $[ replace(variables['System.PullRequest.SourceBranch'], 'refs/heads/', '') ]

## it will create pipeline package and it will push it private or public feed artifacts
stages:
- stage: Stage1
  variables:
  - group: Cloudops
  - name: oldcommitid
    value: $[variables.lastcommitid]
  jobs:
  - job: IdentifyChangedFilesAndGetADOVariables
    pool:
      name: OnPrem
    workspace:
      clean: all # Ensure the agent's directories are wiped clean before building.
    steps:
    - powershell: |
        [System.Version]$PlatformVersion = ((Get-Content "$(Build.SourcesDirectory)\AzureDevOpsPipelines\PlatformVersion.json") | ConvertFrom-Json).PlatformVersion
        Write-Output "The repository's PlatformVersion is: $($PlatformVersion.ToString())"
        $NewPackageVersion = New-Object -TypeName "System.Version" -ArgumentList @($PlatformVersion.Major, $PlatformVersion.Minor, $(Build.BuildId))
        Write-Output "This run's package version is $($NewPackageVersion.ToString())"
        echo "##vso[task.setvariable variable=NewPackageVersion]$($NewPackageVersion.ToString())"
        echo "##vso[task.setvariable variable=commitidold;isOutput=true]$(oldcommitid)"
      displayName: 'Define package version.'
      name: commitidorpackageversion
      errorActionPreference: stop
    - template: getchangedfilesandvariables.yaml
      parameters:
        previouscommitid:
        - $(commitidorpackageversion.commitidold)
        # - $(oldcommitid)
I get the following error at the second-to-last line of the code:
/AzureDevOpsPipelines/azure-pipelines.yml (Line: 49, Col: 13): The 'previouscommitid' parameter is not a valid String.
I tried different combinations but I am still getting the errors.
Any ideas?
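For context, the error message points at the YAML list syntax: previouscommitid is declared as a string parameter, but the template call passes it a sequence (note the leading dash). A minimal sketch of passing the output variable as a plain string instead, assuming the same step and template names as above:
- template: getchangedfilesandvariables.yaml
  parameters:
    # pass the value directly as a string, not as a YAML list item
    previouscommitid: $(commitidorpackageversion.commitidold)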
Thanks for your response. I already had the variable group set up in my library; I was just not able to use it.
The way I was able to achieve this: I created another template file and supplied it to the variables section under my stage. After doing this I was actually able to use the variables from my variable group in my successive jobs.
For more information you can review this doc: https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops&tabs=yaml
stagevariables.yaml
variables:
- group: Cloudops
azure-pipelines.yml
stages:
- stage: Stage1
  variables:
  - template: stagevariables.yaml
  jobs:
  - job: CheckwhichfeedsAreAvailable
In a YAML pipeline, you can't define a new variable group under the variables key.
There is currently no syntax for creating a new variable group while running a YAML pipeline.
Under the variables key, you can:
Define new variables with the specified values.
Override the existing variables with new values.
Reference the variables from the existing variable groups and variable templates.
So, if you want to use a variable group with some variables in the pipeline, you should manually define the variable group on the Pipelines > Library page, then reference it in the pipeline.
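As a rough illustration of those options, reusing the Cloudops group and the stagevariables.yaml template from above (buildConfiguration is just a made-up example), a variables block could look like this:
variables:
# define a new variable with a specified value
- name: buildConfiguration
  value: Release
# reference an existing variable group created on the Pipelines > Library page
- group: Cloudops
# reference a variable template file
- template: stagevariables.yaml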
Let's say I have the following stage and pipeline:
stages:
- stage: A
  lockBehavior: sequential
  jobs:
  - job: Job
    steps:
    - script: Hey!

lockBehavior: runLatest
stages:
- stage: A
  jobs:
  - job: Job
    steps:
    - script: Hey!
How would I print the lockBehavior setting in Azure DevOps when running the pipeline? I have tried printing out all variables using this code:
jobs:
- job: testing
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: 'env | sort'
But this does not work.
I checked the "lockBehavior" property here, but I haven't found a method to show its value in the pipeline log. However, there is a note: "If you do not specify a lockBehavior, it is assumed to be runLatest." In my view, it's not hard to know the value if you set it up yourself.
In my experience, you can print predefined variables or runtime parameters by following the official docs (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script).
So I guess this property may be added to the predefined variables in the future.
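For reference, a minimal sketch of printing a runtime parameter (which, unlike lockBehavior, is exposed to template expressions), based on the runtime-parameters doc linked above; the parameter name here is just an example:
parameters:
- name: environment
  type: string
  default: dev

jobs:
- job: printParams
  steps:
  # template expressions are expanded at compile time, so the value is baked into the script
  - script: echo "environment parameter is ${{ parameters.environment }}"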
I have the following yaml as used in an Azure DevOps pipeline (this is not the full pipeline - it's just a portion of yaml that is in a template):
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false
# - ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
- job: NewJob
  dependsOn: CheckExcludedWorkspaces
  variables:
    testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
  pool:
    name: DefaultWindows
  steps:
  - powershell: |
      Write-Host "Test var = $(testVar)"
    displayName: Test workspaces output
This works correctly in that the second job retrieves a variable from a PowerShell task in the previous job and outputs its value. The task in the second job prints a list of apps via the testVar variable. The output contains:
app1,app2,app3,app4 etc
I would like to take this to the next stage: I would like to create a loop of jobs that runs repeatedly over this application list. Something like:
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false
- ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
  - job: NewJob
    dependsOn: CheckExcludedWorkspaces
    variables:
      testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
    pool:
      name: DefaultWindows
    steps:
    - powershell: |
        Write-Host "Test var = ${{folder}}"
      displayName: Test workspaces output
This code gives me an error:
Unrecognized value: 'dependencies'. Located at position 1 within expression: dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList']
Is there a way I can use a PowerShell task output variable to create a list of jobs in a dependent job? The problem is that I don't know at design time what the list of applications will be (the pipeline should ideally find this out when it runs). The list of applications is based on the list of folders that are created within a repository, which changes over time.
In the current situation, you cannot use the 'each' keyword with variables. The 'each' keyword works on object-type template parameters, while a variable is always a string.
For more details, you can refer to the doc: Each keyword
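If the goal is to fan a job out over a list computed at runtime, one commonly used alternative (a sketch only, with illustrative job, step, and variable names) is to have the first job emit a JSON object and feed it to a matrix strategy, since a matrix can be set from a runtime expression:
jobs:
- job: GetWorkspaceList
  steps:
  # the output must be a JSON object mapping leg names to variable maps,
  # e.g. {'app1':{'folder':'app1'},'app2':{'folder':'app2'}}
  - powershell: |
      $legs = "{'app1':{'folder':'app1'},'app2':{'folder':'app2'}}"
      Write-Host "##vso[task.setvariable variable=legs;isOutput=true]$legs"
    name: mtrx
- job: ProcessWorkspace
  dependsOn: GetWorkspaceList
  strategy:
    matrix: $[ dependencies.GetWorkspaceList.outputs['mtrx.legs'] ]
  steps:
  - powershell: |
      Write-Host "Processing folder $(folder)"
Each matrix leg becomes its own job, and the variables defined for that leg (here, folder) are available via macro syntax.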
I am new to Azure Pipelines and am currently experimenting with passing variables to later jobs. Here is the current snippet, whereby I am trying to extract the project version from the pom file using the maven help:evaluate plugin. The pomVersion variable is populated but is not available in the same step or in later steps of the second job via the projectVersionDynamic variable.
stages:
- stage: FirstStage
  jobs:
  - job: FirstJob
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
          echo $pomVersion ##Prints semantic version 2.27.0-SNAPSHOT
          echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion'
          echo '##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0'
          echo $(Task1.projectVersionDynamic) ##Error message
          echo $projectVersionDynamic ##Is empty
      name: Task1
      displayName: Task1 in JobOne of FirstStage
  - job: SecondJob
    dependsOn: FirstJob
    variables:
      DynVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionDynamic'] ]
      StaVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionStatic'] ]
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          echo 'SecondJob'
          echo $(DynVar) ##Is empty
          echo $DynVar ##Is empty
          echo $(StaVar) ##Prints 2.27.0
          echo $StaVar ##Is empty
      displayName: Task in JobTwo of FirstStage
Observation: projectVersionDynamic value does not get populated and is not available in the same task or subsequent tasks within or in later jobs / stages. However, the static variable gets populated in projectVersionStatic without any issues.
Is it possible to set dynamic values for user-defined variables in azure pipelines or is it that I am doing something wrong? I see an example here under the Stages section where it seems to be working.
Variables in Azure Pipelines can be really tricky sometimes. The documentation isn't always crystal-clear on how it's supposed to work. Looking at your example, a couple of observations:
1) echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion' - your single quotes need to be double quotes to expand the value of $pomVersion into the echo statement
2) (and this is where things get fuzzy): The purpose of task.setvariable is to communicate values to other tasks or jobs. From the documentation: "This doesn't update the environment variables, but it does make the new variable available to downstream steps within the same job."
$variable syntax won't work because the task.setVariable doesn't inject into the running environment - rather, it's a signal to the pipeline to capture the output and store it away for later use.
$(variable) syntax won't work because it's expanded just before the job starts, so it's too late to capture here.
If you use my suggestion in point 1) about double-quoting the task.setVariable, you should see the value available in the second job.
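So, for the first job, the setvariable lines would become something like this (a sketch; only the quoting changes, everything else stays as in your snippet):
pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
echo $pomVersion
# double quotes so $pomVersion is expanded before the logging command is written
echo "##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion"
echo "##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0"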
For anybody with a use case where variables defined in one Azure Pipelines stage need to be used in another stage, this is how it can be done:
The following snippet of an Azure pipeline defines a variable and makes it available to a job in the same stage and to a job in another stage:
stages:
- stage: stage1
  jobs:
  - job: job1
    steps:
    - task: "AzureCLI@2"
      name: step1
      inputs:
        inlineScript: |
          my_variable=a-value
          echo "##vso[task.setvariable variable=var_name;isOutput=true]$my_variable"
  - job: job2
    dependsOn: job1 # ensures job1 completes first so its output can be mapped in
    variables:
      stage1_var: $[ dependencies.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage1_var)
- stage: stage2
  jobs:
  - job: job1
    variables:
      stage2_var: $[ stageDependencies.stage1.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage2_var)
I'm trying to generate release notes in an Azure Pipelines stage and push the note to an Azure Service Bus.
How do I expose the variable in a bash script then consume it in a subsequent job in the same stage?
I'm using a bash task to execute a git command and trying to export it as an environment variable which I want to use in the following job.
- stage: PubtoAzureServiceBus
  variables:
    COMMIT_MSG: "alsdkfgjdsfgjfd"
  jobs:
  - job: gitlog
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          # Write your commands here
          export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
          env | grep C
  - job:
    pool: server
    dependsOn: gitlog
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'Slack Release Notifications'
        messageBody: |
          {
            "channel":"XXXXXXXXXX",
            "username":"bot",
            "iconEmoji":"",
            "text":":airhorn: release :airhorn: \n`$(COMMIT_MSG)`"
          }
        signPayload: false
        waitForCompletion: false
You need to use logging command syntax and output variables, as shown here:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    variables:
      varFromA: $[ stageDependencies.A.A1.outputs['printvar.shouldrun'] ]
    steps:
    - script: echo $(varFromA) # this step uses the mapped-in variable
Please take a look here to check the documentation.
So you need to replace
export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
with a logging command that sets isOutput=true.
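For example, a sketch (a logging command captures only a single line, so the multi-line pretty format is collapsed here, and the step must be named so the other job can reference it; gitlogstep is just an illustrative name):
- job: gitlog
  steps:
  - bash: |
      COMMIT_MSG=`git log -1 --pretty=format:"%aN: %s"`
      echo "##vso[task.setvariable variable=COMMIT_MSG;isOutput=true]$COMMIT_MSG"
    name: gitlogstep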
and then map it into the dependent job like this:
jobs:
- job: A
  steps:
  - bash: |
      echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
    name: ProduceVar # because we're going to depend on it, we need to name the step
- job: B
  dependsOn: A
  variables:
    # map the output variable from A into this job
    varFromA: $[ dependencies.A.outputs['ProduceVar.shouldrun'] ]
  steps:
  - script: echo $(varFromA) # this step uses the mapped-in variable
since here you want to share a variable between jobs (not between stages, as shown in the first example).