Passing variable to non-dependent jobs in Azure DevOps Yaml Pipelines - azure-devops

I am trying to pass a variable from one job to multiple non-dependent jobs. It works for dependent jobs, but it is not working for non-dependent jobs. I have a YAML pipeline defined like this:
jobs:
- job: JobA
  steps:
  - task: Bash@3
    name: var_test
    inputs:
      targetType: 'inline'
      script: |
        myVar=demo-value
        echo "##vso[task.setvariable variable=myVarNew;isOutput=true]$myVar"
- job: JobB
  dependsOn: JobA
  variables:
  - name: newVar2
    value: $[ dependencies.JobA.outputs['var_test.myVarNew'] ]
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        echo "$(newVar2)"
- job: JobC
  dependsOn: JobB
  variables:
  - name: newVar2
    value: $[ dependencies.JobA.outputs['var_test.myVarNew'] ]
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        echo "$(newVar2)"
It works fine in JobB, but when I try the same thing in a different downstream job like JobC it doesn't work. I also can't re-set the variable in every job, because a manual validation (agentless) job sits in between. I want to use the same variable in different non-dependent jobs within the same stage. Please help. Thanks.

dependencies.X.outputs relies on X having been declared as a dependency. If it's not there, Azure Pipelines doesn't know to track it as one for that job (with the sole exception that a job implicitly depends on the preceding job unless dependsOn is set explicitly).
In your case, though, because JobB depends on JobA and JobC depends on JobB, the following is equivalent (even if a little redundant):
- job: JobC
  dependsOn:
  - JobA
  - JobB
  ...
It's redundant for dependency-graph purposes, but it satisfies the need for the dependency to be declared so that you can access its output variables.
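Putting it together, JobC could look like the sketch below; it just combines the question's own snippet with the extra dependsOn entry:
- job: JobC
  dependsOn:
  - JobA
  - JobB
  variables:
  - name: newVar2
    # resolvable now that JobA is declared as a dependency
    value: $[ dependencies.JobA.outputs['var_test.myVarNew'] ]
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        echo "$(newVar2)"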

Related

Set Azure DevOps pipeline variable to array of values

I am trying to run a set of tests across a set of ADO builds.
I can retrieve the list of ADO builds using PowerShell. However, once I have the list, I need to export that to an ADO variable and then iterate across the values.
I've seen how to export values from PowerShell to ADO using logging commands, but that appears to export the value as a string, not a list.
Is there a way to export variables so that I could iterate across them; e.g., using ${{ each foo in exportedVars }}?
First, regarding the usage you mentioned:
${{ each foo in exportedVars }}
This is a compile-time construct; it is expanded when the pipeline is compiled, so you can't use it to iterate over variables that are generated at runtime.
Second, the pipeline can output variables through the logging command, but variables set this way can only be strings. This is by design, and the documentation states it clearly:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-in-scripts
All variables set by this method are treated as strings.
It is not difficult to parse the string back into an array: just use the string type's built-in Split() method to split it and restore the values.
Here is an example:
trigger:
- none

# 1
stages:
- stage: s1
  displayName: setvars
  jobs:
  - job: testJob
    steps:
    - task: PowerShell@2
      name: setvar
      inputs:
        targetType: 'inline'
        script: |
          # logic here. For example you get the vars and put it into this format:
          $testvars = "testvar1,testvar2,testvar3"
          Write-Host "##vso[task.setvariable variable=outputvars;isOutput=true]$testvars"

# 2
- stage: s2
  displayName: getvars
  dependsOn: s1
  variables:
    vars: $[ stageDependencies.s1.testJob.outputs['setvar.outputvars'] ]
  jobs:
  - job:
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          $varsArr = "$(vars)".Split(',')
          foreach ($var in $varsArr)
          {
              Write-Host "$var`r`n"
          }
Result: the second stage splits the string and prints testvar1, testvar2, and testvar3, each on its own line.

How do I print out a pipeline's or a stage's parameter?

Let's say I have the following stage and pipeline:
stages:
- stage: A
  lockBehavior: sequential
  jobs:
  - job: Job
    steps:
    - script: Hey!

lockBehavior: runLatest
stages:
- stage: A
  jobs:
  - job: Job
    steps:
    - script: Hey!
How would I print out the lockBehavior parameter in Azure DevOps when running the pipeline? I have tried printing out all variables using this code:
jobs:
- job: testing
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: 'env | sort'
But this does not work.
I checked the lockBehavior keyword from here, but I haven't found a way to show its value in the pipeline log. However, there is a note: "If you do not specify a lockBehavior, it is assumed to be runLatest." In my view, it's not hard to know the value if you set it up yourself.
In my experience, you can print the predefined variables and runtime parameters (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script) by following the official docs.
So I guess this keyword will perhaps be added to the predefined variables in the future.
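As a side note, if the value you need to print is declared as a runtime parameter (rather than the lockBehavior keyword, which is not exposed as a variable), it can be echoed with compile-time template syntax. A minimal sketch with an illustrative parameter name:
parameters:
- name: myLockBehavior   # illustrative parameter, not the built-in lockBehavior keyword
  type: string
  default: runLatest

stages:
- stage: A
  jobs:
  - job: Job
    steps:
    # ${{ }} is expanded at compile time, so the chosen value appears in the script text
    - script: echo "myLockBehavior is ${{ parameters.myLockBehavior }}"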

Azure DevOps yaml: use a powershell task output parameter to generate a loop in dependent job

I have the following YAML as used in an Azure DevOps pipeline (this is not the full pipeline - it's just a portion of YAML that lives in a template):
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false
# - ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
- job: NewJob
  dependsOn: CheckExcludedWorkspaces
  variables:
    testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
  pool:
    name: DefaultWindows
  steps:
  - powershell: |
      Write-Host "Test var = $(testVar)"
    displayName: Test workspaces output
This works correctly in that the second job retrieves a variable from a PowerShell task in the previous job and outputs that variable's value. The task in the second job outputs a list of apps via the variable testVar. The output contains:
app1,app2,app3,app4 etc
I would like to take this to the next stage, which is to create a loop of jobs that runs repeatedly over this application list. Something like:
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false
- ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
  - job: NewJob
    dependsOn: CheckExcludedWorkspaces
    variables:
      testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
    pool:
      name: DefaultWindows
    steps:
    - powershell: |
        Write-Host "Test var = ${{folder}}"
      displayName: Test workspaces output
This code gives me an error:
Unrecognized value: 'dependencies'. Located at position 1 within expression: dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList']
Is there a way I can use a PowerShell task output variable to create a list of jobs in a dependent job? The problem is that I don't know at design time what the list of applications will be (the pipeline should ideally find this out when it runs). The list of applications is based on the list of folders that exist within a repository, which changes over time.
In the current situation, we cannot use the 'each' keyword with variables. The 'each' keyword works on object-typed parameters, which are expanded at compile time, whereas the variable is a runtime string.
For more details, you can refer to the doc: Each keyword
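A common workaround, given that jobs cannot be generated from a runtime variable, is to keep a single downstream job and loop over the comma-separated list at runtime inside a step, similar to the Split() approach shown earlier. A minimal sketch reusing the question's job and variable names:
- job: NewJob
  dependsOn: CheckExcludedWorkspaces
  variables:
    testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
  pool:
    name: DefaultWindows
  steps:
  - powershell: |
      # Split the comma-separated output variable and handle each workspace in turn
      $workspaces = "$(testVar)".Split(',')
      foreach ($ws in $workspaces) {
          Write-Host "Processing workspace: $ws"
      }
    displayName: Loop over workspaces at runtime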

Dynamic variables not available in other stages of azure pipeline

I am new to Azure Pipelines and am currently experimenting with passing variables to later jobs. Here is the current snippet, in which I am trying to extract the project version from the POM file using the maven help:evaluate plugin. The pomVersion variable is populated, but it is not available in the same step or in later steps of the second job via the projectVersionDynamic variable.
stages:
- stage: FirstStage
  jobs:
  - job: FirstJob
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
          echo $pomVersion ##Prints semantic version 2.27.0-SNAPSHOT
          echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion'
          echo '##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0'
          echo $(Task1.projectVersionDynamic) ##Error message
          echo $projectVersionDynamic ##Is empty
      name: Task1
      displayName: Task1 in JobOne of FirstStage
  - job: SecondJob
    dependsOn: FirstJob
    variables:
      DynVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionDynamic'] ]
      StaVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionStatic'] ]
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          echo 'SecondJob'
          echo $(DynVar) ##Is empty
          echo $DynVar ##Is empty
          echo $(StaVar) ##Prints 2.27.0
          echo $StaVar ##Is empty
      displayName: Task in JobTwo of FirstStage
Observation: the projectVersionDynamic value does not get populated and is not available in the same task, in subsequent tasks, or in later jobs/stages. However, the static value gets populated in projectVersionStatic without any issues.
Is it possible to set dynamic values for user-defined variables in Azure Pipelines, or am I doing something wrong? I see an example here under the Stages section where it seems to be working.
Variables in Azure Pipelines can be really tricky sometimes. The documentation isn't always crystal-clear on how they're supposed to work. Looking at your example, a couple of observations:
echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion' - your single quotes need to be double quotes to expand the value of $pomVersion into the echo statement
(and this is where things get fuzzy): The purpose of task.setvariable is to communicate values to other tasks or jobs. From the documentation: "This doesn't update the environment variables, but it does make the new variable available to downstream steps within the same job."
$variable syntax won't work because the task.setVariable doesn't inject into the running environment - rather, it's a signal to the pipeline to capture the output and store it away for later use.
$(variable) syntax won't work because it's expanded just before the job starts, so it's too late to capture here.
If you use my suggestion in point 1) about double-quoting the task.setVariable, you should see the value available in the second job.
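For reference, a corrected version of the first job's step with the double quotes applied (a minimal sketch; only the quoting of the logging commands changes):
- task: Bash@3
  name: Task1
  displayName: Task1 in JobOne of FirstStage
  inputs:
    targetType: 'inline'
    script: |
      pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
      # double quotes let the shell expand $pomVersion into the logging command
      echo "##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion"
      echo "##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0"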
For anybody with a use case where variables defined in one Azure Pipelines stage need to be used in another stage, this is how it can be done.
The following snippet of an Azure pipeline defines a variable and makes it available to a job in the same stage and to a job in another stage:
stages:
- stage: stage1
  jobs:
  - job: job1
    steps:
    - task: "AzureCLI@2"
      name: step1
      inputs:
        inlineScript: |
          my_variable=a-value
          echo "##vso[task.setvariable variable=var_name;isOutput=true]$my_variable"
  - job: job2
    variables:
      stage1_var: $[ dependencies.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage1_var)
- stage: stage2
  jobs:
  - job: job1
    variables:
      stage2_var: $[ stageDependencies.stage1.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage2_var)

need to set condition from variables in azure pipelines

I am setting up the variables at the global level, and the value will be provided through variables before running the pipeline. I am trying to set a condition using the variables but could not succeed. Below is my sample pipeline:
trigger:
- develop

variables:
  stagename: $(stagename)

stages:
- stage: deploy
  - task: download artifact
    condition: notin(variables($(stagename),(tst1,tst2)))
  - task: deploy2
    condition: notin(variables($(stagename),(tst1,tst2)))
The goal is to not run those two tasks if stagename equals tst1 or tst2; they should run in all other cases.
You can refer to the following sample and it worked on my side:
trigger:
- develop

stages:
- stage: deploy
  jobs:
  - job: Build
    steps:
    - task: PowerShell@2
      condition: notin(variables['stagename'],'tst1','tst2')
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "Hello World"
You do not need to set the variable at the pipeline root level. A variable set at the pipeline root level will override a variable set in the Pipeline settings UI. Here are the docs about defining variables and specifying conditions.
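If the intent is to skip both tasks together, the same condition can also be placed once at the job level instead of repeating it on every task. A minimal sketch (the two script steps are illustrative placeholders for the real download and deploy tasks):
stages:
- stage: deploy
  jobs:
  - job: Deploy
    # the whole job is skipped when stagename is tst1 or tst2
    condition: notin(variables['stagename'], 'tst1', 'tst2')
    steps:
    - powershell: Write-Host "download artifact"   # placeholder for the real download task
    - powershell: Write-Host "deploy2"   # placeholder for the real deploy task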