Create variables dynamically in azure pipeline - azure-devops

I'm trying to generate release notes in an Azure Pipelines stage and push the note to an Azure Service Bus.
How do I expose a variable in a Bash script and then consume it in a subsequent job in the same stage?
I'm using a Bash task to run a git command and trying to export the result as an environment variable, which I want to use in the following job.
- stage: PubtoAzureServiceBus
  variables:
    COMMIT_MSG: "alsdkfgjdsfgjfd"
  jobs:
  - job: gitlog
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          # Write your commands here
          export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
          env | grep C
  - job:
    pool: server
    dependsOn: gitlog
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'Slack Release Notifications'
        messageBody: |
          {
            "channel":"XXXXXXXXXX",
            "username":"bot",
            "iconEmoji":"",
            "text":":airhorn: release :airhorn: \n`$(COMMIT_MSG)`"
          }
        signPayload: false
        waitForCompletion: false

You need to use the logging command syntax and output variables, as shown here:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    variables:
      varFromA: $[ stageDependencies.A.A1.outputs['printvar.shouldrun'] ]
    steps:
    - script: echo $(varFromA) # this step uses the mapped-in variable
Please take a look at the documentation here.
So you need to replace
export COMMIT_MSG=$(git log -1 --pretty=format:"Author: %aN%n%nCommit: %H%n%nNotes:%n%n%B")
with a logging command that uses isOutput=true,
and then map it as here:
jobs:
- job: A
  steps:
  - bash: |
      echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
    name: ProduceVar  # because we're going to depend on it, we need to name the step
- job: B
  dependsOn: A
  variables:
    # map the output variable from A into this job
    varFromA: $[ dependencies.A.outputs['ProduceVar.shouldrun'] ]
  steps:
  - script: echo $(varFromA) # this step uses the mapped-in variable
since you want to share a variable between jobs (not between stages, as shown in the first example).
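A minimal local sketch of that replacement (the commit text is a stand-in; note that setvariable values are single-line, so a multi-line git log message would need its newlines percent-encoded, e.g. as %0D%0A, before being emitted):

```shell
# Stand-in for the real git log output; a multi-line message would need its
# newlines encoded (%0D%0A) before being passed to the logging command.
COMMIT_MSG="Author: Jane Doe  Commit: abc123"

# The agent parses this line from stdout and, because of isOutput=true,
# exposes it to dependent jobs as <stepName>.COMMIT_MSG.
echo "##vso[task.setvariable variable=COMMIT_MSG;isOutput=true]$COMMIT_MSG"
```

In the pipeline, the Bash step carrying this script must have a `name:` so the second job can map the variable in with `dependencies.gitlog.outputs[...]`.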

Related

How to use variable group as a runtime parameter in azure devops yml

I would like to pass the variable group as a runtime parameter, so that whenever I run the pipeline it allows me to provide the variable group name as input, and the pipeline proceeds based on that input value.
When we click the Run button there is also a variables section; I want to accept the variable group name from there.
Pipeline.yml:
stages:
- stage: VMBackupandValidate
  displayName: 'VM Backup and Validate using RSV'
  jobs:
  - job: VMBackupValidate
    displayName: 'Azure VM Backup'
    steps:
    - task: AzurePowerShell@5
      inputs:
        azureSubscription: $(azure_sc)
        ScriptType: 'FilePath'
        ScriptPath: 'pipelines/automation/scripts/vmbackup.ps1'
        ScriptArguments: '-ResourceGroupName $(ResourceGroupName) -Storagetype $(Storagetype) -SourceVMname $(SourceVMname) -RSVaultname $(RSVaultname) -Location $(Location) -WorkLoadType $(WorkLoadType) -Policyname $(Policyname) -Verbose'
        azurePowerShellVersion: 'LatestVersion'
        pwsh: true
Based on the comment discussion with the OP:
I suggest using a parameter with a default value. Before you hit Run, the pipeline will ask for input if you want values other than the default; a condition then selects the right variable based on the input.
Here is a minified sample of the pipeline:
parameters:
- name: environment
  displayName: Deploy Environment
  type: string
  default: TEST
  values:
  - TEST
  - PROD

trigger:
- 'none'

variables:
- name: environment
  ${{ if contains(parameters.environment, 'TEST') }}:
    value: TEST
  ${{ if contains(parameters.environment, 'PROD') }}:
    value: PROD

stages:
- stage: TEST
  displayName: Build
  condition: ${{ eq(variables.environment, 'TEST') }}
  jobs:
  - job:
    pool:
      vmImage: 'ubuntu-20.04'
    steps:
    - script: |
        echo ${{ variables.environment }}
      displayName: 'Print environment info'
You can extend the logic or replace it with other values and consume it in code later; you can also create multiple stages with conditions, as shown.
Let's say you have two variable groups named prod and test. You could use the pipeline below:
trigger:
- main

parameters:
- name: environment
  displayName: Where to deploy?
  type: string
  default: test
  values:
  - prod
  - test

pool:
  vmImage: ubuntu-latest

variables:
- group: ${{ parameters.environment }}

steps:
- script: |
    echo $(ENV)
    echo $(VERSION)
  displayName: Step 1 - print version and environment
- script: pwd ENV ${{ parameters.environment }}
  displayName: Step 2 - print parameter
You should define ENV and VERSION values in both variable groups.
Your stage should stay as is. In your case you will delete the steps I provided and use only the first part of the pipeline.
Adding a reference article.
https://blog.geralexgr.com/azure/deploy-between-different-environments-with-variable-groups-azure-devops?msclkid=002b01eab02f11ec8dffa95dc3a34094

Trigger a build pipeline in Azure DevOps when a specific template or build step is present in the .yaml file

I am trying to configure a pipeline in ADO YAML builds so that it starts only if specific conditions are satisfied. Below are the details:
I have shared templates in a separate project from where the CI pipeline exists (let's say project 'A').
The CI pipeline is present in a separate project (let's say project 'B').
Issue: I would like to define a condition in the CI pipeline (in project 'B') which starts the build after validating that specified templates from project 'A' are present in the YAML file as a build step.
Do you need to check whether the template files exist, or to compare some code in the YAML files?
In your current situation, we recommend you make your templates into template YAML files, and then use resources, stages and pipeline conditions.
Here is an example demo script:
resources:
  repositories:
  - repository: templates
    type: git
    name: Tech-Talk/template
    trigger:
      paths:
        include:
        - "temp.yaml"

trigger: none

# variables:
# - name: Test
#   value: TestGroup

pool:
  vmImage: ubuntu-latest

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - checkout: templates
    - bash: ls $(System.DefaultWorkingDirectory)
    - bash: |
        if [ -f temp.yaml ]; then
          # this is used to check if the file exists: if [ -f your-file-here ]
          echo "##vso[task.setvariable variable=FILEEXISTS;isOutput=true]true"
        else
          echo "##vso[task.setvariable variable=FILEEXISTS;isOutput=true]false"
        fi
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=FILEEXISTS;isOutput=true]true
      name: printvar
- stage: B
  condition: eq(dependencies.A.outputs['A1.printvar.FILEEXISTS'], 'true')
  dependsOn: A
  jobs:
  - job: B1
    steps:
    - script: echo hello from Stage B
    # - template: temp.yaml@templates
    #   parameters:
    #     agent_pool_name: ''
    #     db_resource_path: $(System.DefaultWorkingDirectory)
    #     variable_group: ${{ variables.Test }}
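The existence check inside the bash step can be exercised on its own; a minimal sketch (the helper name emit_file_exists is just for illustration):

```shell
# Prints the Azure Pipelines logging command with true/false depending on
# whether the given path is an existing regular file.
emit_file_exists() {
  if [ -f "$1" ]; then
    echo "##vso[task.setvariable variable=FILEEXISTS;isOutput=true]true"
  else
    echo "##vso[task.setvariable variable=FILEEXISTS;isOutput=true]false"
  fi
}

tmpfile=$(mktemp)            # a file that certainly exists
emit_file_exists "$tmpfile"
emit_file_exists "/no/such/temp.yaml"
rm -f "$tmpfile"
```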
Update:
And if you need to use a Windows agent, please refer to this task:
- task: PowerShell@2
  name: printvar
  inputs:
    targetType: 'inline'
    script: |
      $file = '$(System.DefaultWorkingDirectory)\temp.yaml'
      if ([System.IO.File]::Exists($file)) {
          Write-Host "##vso[task.setvariable variable=FILEEXISTS;isOutput=true]true"
      }
      else {
          Write-Host "##vso[task.setvariable variable=FILEEXISTS;isOutput=true]false"
      }

Dynamic variables not available in other stages of azure pipeline

I am new to Azure Pipelines and am currently experimenting with passing variables to later jobs. Here is the current snippet, whereby I am trying to extract the project version from the pom file using the maven-help-plugin's help:evaluate goal. The pomVersion variable is populated but is not available in the same step or in later steps of the second job via the projectVersionDynamic variable.
stages:
- stage: FirstStage
  jobs:
  - job: FirstJob
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          pomVersion=`mvn help:evaluate -Dexpression=project.version -q -DforceStdout`
          echo $pomVersion ##Prints semantic version 2.27.0-SNAPSHOT
          echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion'
          echo '##vso[task.setvariable variable=projectVersionStatic;isOutput=true]2.27.0'
          echo $(Task1.projectVersionDynamic) ##Error message
          echo $projectVersionDynamic ##Is empty
      name: Task1
      displayName: Task1 in JobOne of FirstStage
  - job: SecondJob
    dependsOn: FirstJob
    variables:
      DynVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionDynamic'] ]
      StaVar: $[ dependencies.FirstJob.outputs['Task1.projectVersionStatic'] ]
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: |
          echo 'SecondJob'
          echo $(DynVar) ##Is empty
          echo $DynVar ##Is empty
          echo $(StaVar) ##Prints 2.27.0
          echo $StaVar ##Is empty
      displayName: Task in JobTwo of FirstStage
Observation: the projectVersionDynamic value does not get populated and is not available in the same task or in subsequent tasks within or across later jobs/stages. However, the static value gets populated into projectVersionStatic without any issues.
Is it possible to set dynamic values for user-defined variables in Azure Pipelines, or am I doing something wrong? I see an example here, under the Stages section, where it seems to be working.
Variables in Azure Pipelines can be really tricky sometimes. The documentation isn't always crystal clear on how it's supposed to work. Looking at your example, a couple of observations:
echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion' - your single quotes need to be double quotes to expand the value of $pomVersion into the echo statement
(and this is where things get fuzzy): The purpose of task.setvariable is to communicate values to other tasks or jobs. From the documentation: "This doesn't update the environment variables, but it does make the new variable available to downstream steps within the same job."
$variable syntax won't work because the task.setVariable doesn't inject into the running environment - rather, it's a signal to the pipeline to capture the output and store it away for later use.
$(variable) syntax won't work because it's expanded just before the job starts, so it's too late to capture here.
If you use my suggestion in point 1) about double-quoting the task.setVariable, you should see the value available in the second job.
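The quoting difference in point 1) can be seen locally:

```shell
pomVersion="2.27.0-SNAPSHOT"

# Single quotes: $pomVersion is NOT expanded, so the agent would store the
# literal text "$pomVersion" instead of the version number.
single=$(echo '##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion')

# Double quotes: $pomVersion IS expanded before the agent parses the line.
double=$(echo "##vso[task.setvariable variable=projectVersionDynamic;isOutput=true]$pomVersion")

echo "$single"
echo "$double"
```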
For anybody with a use case where variables defined in one Azure Pipelines stage need to be used in another stage, this is how it can be done.
The following snippet of an Azure pipeline defines a variable and makes it available to a job in the same stage and to a job in another stage:
stages:
- stage: stage1
  jobs:
  - job: job1
    steps:
    - task: AzureCLI@2
      name: step1
      inputs:
        inlineScript: |
          my_variable=a-value
          echo "##vso[task.setvariable variable=var_name;isOutput=true]$my_variable"
  - job: job2
    variables:
      stage1_var: $[ dependencies.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage1_var)
- stage: stage2
  jobs:
  - job: job1
    variables:
      stage2_var: $[ stageDependencies.stage1.job1.outputs['step1.var_name'] ]
    steps:
    - bash: |
        echo "print variable value"
        echo $(stage2_var)

Access build pipeline output variable in Environment Approval / Check

I am trying to build an automatic environment approval/check with Invoke Azure Function. I need to pass a variable that was created in a previous build stage. I have added the variable to the function body, but the variable is not evaluated:
Task that creates the output variable (in previous stage)
- task: CmdLine@2
  inputs:
    script: |
      # Debug output (value visible in logs)
      pulumi stack output resourceGroupName
      echo "##vso[task.setvariable variable=resourceGroupName;isoutput=true]$(pulumi stack output resourceGroupName)"
    workingDirectory: 'infra'
Body for the Azure function:
{ "ResourceGroup": "$(resourceGroupName)" }
Log:
2020-03-26T15:57:01.2590351Z POST https://azure-function-url
Request body: {
"ResourceGroup": "$(resourceGroupName)"
}
Added the variable to the function body, but the variable is not
evaluated
This is expected behavior. What you are doing here is sharing a variable across stages, which is not directly supported yet. Output variables are only used to share values between steps, not between stages.
Workaround:
If you want to use the generated variable in the next stage, a workaround you can consider is writing the variable to a file and leveraging pipeline artifacts.
Sample steps:
Here I will pass one variable which name is resourceGroupName to next stage.
1) Create a folder which will contain the variables you want to pass
mkdir -p $(Pipeline.Workspace)/variables
2) Write the contents of the variable to file StageUsed:
echo "$resourceGroupName" > $(Pipeline.Workspace)/variables/StageUsed
3) In next stage, add one job before your InvokeAzureFunction job. Download the variables pipeline artifact that previous stage published.
4) Transfer each file into a variable:
resourceGroupName=$(cat $(Pipeline.Workspace)/variables/StageUsed)
5) Expose the variable in the current job, and set the step's reference name to output:
echo "##vso[task.setvariable variable=resourceGroupName;isoutput=true]$resourceGroupName"
6) Now, you can access the variable in your InvokeAzureFunction job by calling dependencies.secondjob.outputs['output.resourceGroupName']
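The file write/read round trip in steps 1), 2), 4) and 5) can be exercised locally; a minimal sketch, using a temp directory in place of $(Pipeline.Workspace):

```shell
# Stand-in for $(Pipeline.Workspace); in the pipeline this directory is
# published and downloaded as a pipeline artifact between the stages.
workspace=$(mktemp -d)

# Steps 1-2: write the variable's value to a file.
mkdir -p "$workspace/variables"
echo "my-resource-group" > "$workspace/variables/resourceGroupName"

# Steps 4-5: read the file back and re-expose it as an output variable.
resourceGroupName=$(cat "$workspace/variables/resourceGroupName")
echo "##vso[task.setvariable variable=resourceGroupName;isoutput=true]$resourceGroupName"
```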
Sample Script:
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: FirstStage
  jobs:
  - job: firstjob
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - bash: |
        resourceGroupName="value"
        mkdir -p $(Pipeline.Workspace)/variables
        echo "$resourceGroupName" > $(Pipeline.Workspace)/variables/resourceGroupName
    - publish: $(Pipeline.Workspace)/variables
      artifact: variables
- stage: SecondStage
  jobs:
  - job: secondjob
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - download: current
      artifact: variables
    - bash: |
        resourceGroupName=$(cat $(Pipeline.Workspace)/variables/resourceGroupName)
        echo "##vso[task.setvariable variable=resourceGroupName;isoutput=true]$resourceGroupName"
      name: output
    - bash: |
        echo "$(output.resourceGroupName)"
  - job: ServerJob
    dependsOn: secondjob
    pool: server
    variables:
      resourceGroupName: $[dependencies.secondjob.outputs['output.resourceGroupName']]
    steps:
    - task: AzureFunction@1
      inputs:
        function:
        method: 'POST'
        body: '$(sdf)'
        waitForCompletion: 'false'

Azure DevOps Conditional execution of Job that depends on a Job in another Stage

I have a pipeline.yaml that looks like this
pool:
  vmImage: image

stages:
- stage: A
  jobs:
  - job: a
    steps:
    - script: |
        echo "This is stage build"
        echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes"
      name: BuildStageRun
- stage: B
  jobs:
  - job: b
    steps: #do something in steps
  - job: c
    dependsOn: a
    condition: eq(dependencies.build.outputs['BuildStageRun.doThing'], 'Yes')
    steps:
    - script: echo "I am scripting"
So, there are 2 stages, A with one job a, and B with 2 jobs b and c. I would like to have job c executed only when job a has executed. I tried to do so by setting the variable doThing in job a to Yes and then check this variable in job c.
But I get an error:
Stage plan job c depends on unknown job a.
The variable definition and the definition of the condition were taken from the Azure documentation.
Do you have any suggestion on how to get this working?
While Shayki is correct that it is not supported, there is a workaround that I am currently using, with the help of this blog: https://medium.com/microsoftazure/how-to-pass-variables-in-azure-pipelines-yaml-tasks-5c81c5d31763
Basically you create your output as normal, and then publish the variables as pipeline artifacts. In the next stage, you read the artifact in the first job and use that to construct your conditionals, e.g.:
stages:
- stage: firststage
  jobs:
  - job: setup_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - powershell: |
        $ShouldBuildThing1 = $true
        # Write to normal output for other jobs in this stage
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
        # Write to file to publish later
        mkdir -p $(Pipeline.Workspace)/variables
        Write-Output "$ShouldBuildThing1" > $(Pipeline.Workspace)/variables/BuildThing1
      name: variablesStep
    # Publish the folder as pipeline artifact
    - publish: $(Pipeline.Workspace)/variables
      artifact: VariablesArtifact
  - job: build_something
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: setup_variables
    condition: eq(dependencies.setup_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.setup_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse:
        someInputArgs: $(BuildThing1)
- stage: secondstage
  jobs:
  - job: read_variables
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    # If you download all artifacts then the folder name will be the same as the artifact name under $(Pipeline.Workspace). Artifacts are also auto-downloaded on deployment jobs.
    - task: DownloadPipelineArtifact@2
      inputs:
        artifact: "VariablesArtifact"
        path: $(Pipeline.Workspace)/VariablesArtifact
    - powershell: |
        $ShouldBuildThing1 = $(Get-Content $(Pipeline.Workspace)/VariablesArtifact/BuildThing1)
        Write-Output "##vso[task.setvariable variable=BuildThing1;isOutput=true]$ShouldBuildThing1"
      name: variablesStep
  - job: secondjob
    pool:
      vmImage: 'Ubuntu-16.04'
    dependsOn: read_variables
    condition: eq(dependencies.read_variables.outputs['variablesStep.BuildThing1'], 'true')
    variables:
      BuildThing1: $[dependencies.read_variables.outputs['variablesStep.BuildThing1']]
    steps:
    - powershell: |
        Write-Host "Look at me I can Read $env:BuildThing1"
    - somethingElse:
        someInputArgs: $(BuildThing1)
Looks like a few options are available from Microsoft now.
First is job-to-job dependencies across stages
From Microsoft:
In this example, job B1 will run whether job A1 is successful or skipped. Job B2 will check the value of the output variable from job A1 to determine whether it should run.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: in(stageDependencies.A.A1.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    steps:
    - script: echo hello from Job B1
  - job: B2
    condition: eq(stageDependencies.A.A1.outputs['printvar.shouldrun'], 'true')
    steps:
    - script: echo hello from Job B2
Also, there's another option where you could consume output variables across stages.
From Microsoft site:
Stages can also use output variables from another stage. In this example, Stage B depends on a variable in Stage A.
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldrun;isOutput=true]true"
      # or on Windows:
      # - script: echo ##vso[task.setvariable variable=shouldrun;isOutput=true]true
      name: printvar
- stage: B
  condition: and(succeeded(), eq(dependencies.A.outputs['A1.printvar.shouldrun'], 'true'))
  dependsOn: A
  jobs:
  - job: B1
    steps:
    - script: echo hello from Stage B
It's because you can't make a job depend on a job from another stage; you can make stage B depend on stage A, or job c depend on job b.
You can't achieve your goal with YAML conditions, because you want to use a variable that you declared in the first stage, and the second stage doesn't know this variable; Azure DevOps doesn't support it yet:
You cannot currently specify that a stage run based on the value of an output variable set in a previous stage.
You can make stage B depend on A; since stage A has only one job, you depend stage B on stage A:
- stage: B
  dependsOn: A