In an Azure DevOps YAML build definition, how can we parameterize the name of a dependent template?

I have two templates (build-tsmain-project.yml and build-single-project.yml) which I call from another template like this:
steps:
- ${{ each project in parameters.projects }}:
  - ${{ if ne(project.disable, 'true') }}:
    - ${{ if eq(project.name, 'TSMain') }}:
      - template: build-tsmain-project.yml
        parameters:
          name: ${{ project.name }}
          shortName: ${{ project.shortName }}
          path: ${{ project.name }}.sln
          moreBuildArgs: ${{ parameters.restoreArgs }} ${{ parameters.moreBuildArgs }} ${{ project.moreBuildArgs }}
          testFilter: ${{ project.testFilter }}
          ${{ if parameters.withCoverage }}:
            moreTestArgs: ${{ parameters.coverageArgs }}
            publishTestResults: false
          noTest: ${{ project.noTest }}
    - ${{ if ne(project.name, 'TSMain') }}:
      - template: build-single-project.yml
        parameters:
          name: ${{ project.name }}
          shortName: ${{ project.shortName }}
          path: ${{ project.name }}.sln
          moreBuildArgs: ${{ parameters.restoreArgs }} ${{ parameters.moreBuildArgs }} ${{ project.moreBuildArgs }}
          testFilter: ${{ project.testFilter }}
          ${{ if parameters.withCoverage }}:
            moreTestArgs: ${{ parameters.coverageArgs }}
            publishTestResults: false
          noTest: ${{ project.noTest }}
That is a problem because it duplicates the parameter block - not DRY.
The only working solution that is DRY which I could find involves introducing a new template (build-single-project-or-tsmain.yml):
parameters:
- name: template_name
  type: string
- name: template_parameters
  type: object

steps:
- template: ${{ parameters.template_name }}
  parameters: ${{ parameters.template_parameters }}
This allows me to rewrite the original code like this:
steps:
- ${{ each project in parameters.projects }}:
  - ${{ if ne(project.disable, 'true') }}:
    - template: build-single-project-or-tsmain.yml
      parameters:
        ${{ if eq(project.name, 'TSMain') }}:
          template_name: build-tsmain-project.yml
        ${{ if ne(project.name, 'TSMain') }}:
          template_name: build-single-project.yml
        template_parameters:
          name: ${{ project.name }}
          shortName: ${{ project.shortName }}
          path: ${{ project.name }}.sln
          moreBuildArgs: ${{ parameters.restoreArgs }} ${{ parameters.moreBuildArgs }} ${{ project.moreBuildArgs }}
          testFilter: ${{ project.testFilter }}
          ${{ if parameters.withCoverage }}:
            moreTestArgs: ${{ parameters.coverageArgs }}
            publishTestResults: false
          noTest: ${{ project.noTest }}
Is there a way to avoid introducing a new template while still keeping the solution DRY?
We use Azure DevOps Server 2020 (on-prem).

Related

Injecting an input to a step provided in a stepList?

I have a pipeline template that takes a stepList:
parameters:
- name: applicationDeploySteps
  type: stepList
  default: []
And injects the stepList into the template:
- deployment: Deploy_App
  displayName: Deploy Application
  pool: ${{ variables.AgentPool }}
  environment: ${{ parameters.Stage }}
  variables:
  - name: ServiceConnection
    value: SomeServiceConnection
  strategy:
    runOnce:
      deploy:
        steps:
        - ${{ each step in parameters.applicationDeploySteps }}:
          - ${{ each pair in step }}:
              ${{ pair.key }}: ${{ pair.value }}
However, I'd like to provide an AzureCLI@2 step with the azureSubscription input sourced from a variable that is not accessible to the AzureCLI@2 step at template compilation time:
extends:
  template: main.yml
  parameters:
    applicationDeploySteps:
    - task: AzureCLI@2
      inputs:
        azureSubscription: $(ServiceConnection)
        addSpnToEnvironment: true
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          echo "do azurey things here"
The problem is in azureSubscription: $(ServiceConnection). Obviously, that variable can't resolve. So the solution I'm shooting for is to inject the azureSubscription value in the template. However, I can't find a way to effectively iterate over the values provided in the inputs block.
- ${{ each pair in step }}:
    ${{ pair.key }}: ${{ pair.value }}
will let me interrogate the type of the step. Trying to take it further just gives me a null reference exception when trying to queue the pipeline:
- ${{ each pair in step }}:
    ${{ if eq(pair.key, 'inputs') }}:
      - ${{ each input in pair.value }}:
          ${{ if eq(input.key, 'azureSubscription') }}:
            ${{ input.key }}: ${{ variables.ServiceConnection }}
          ${{ else }}:
            ${{ input.key }}: ${{ input.value }}
    ${{ else }}:
      ${{ pair.key }}: ${{ pair.value }}
That attempt gives me "Object reference not set to an instance of an object." with no corresponding line number. I'm guessing it's failing to iterate over pair.value, but I have no idea how to troubleshoot it further or get an idea of what I can and cannot iterate over. The documentation does not include more comprehensive examples, beyond checking whether, say, a step is a script task and blocking execution.
Note that this is similar, but not the scenario that I'm implementing.
I figured it out.
- ${{ each step in parameters.applicationDeploySteps }}:
  - ${{ each pair in step }}:
      ${{ if eq(pair.key, 'inputs') }}:
        inputs:
          ${{ each input in pair.value }}:
            ${{ if eq(input.key, 'azureSubscription') }}:
              ${{ input.key }}: ${{ variables.ServiceConnection }}
            ${{ else }}:
              ${{ input.key }}: ${{ input.value }}
      ${{ else }}:
        ${{ pair.key }}: ${{ pair.value }}
This document was useful for pointing me in the right direction, but honestly it was mostly trial-and-error.

How to get the value of parameter in condition azure pipelines yaml

Here's a section of my YAML pipeline:
parameters:
- name: appEnv
  default: None
  type: string
  values:
  - None
  - Dev
  - QA
  - Stage
  - Stage2
  - Prod

variables:
- name: envConfiguration
  ${{ if and(eq('${{parameters.appEnv}}', 'None'), startsWith(variables['Build.SourceBranchName'], 'release/')) }}:
    value: Prod
  ${{ elseif and(eq('${{parameters.appEnv}}', 'None'), startsWith(variables['Build.SourceBranchName'], 'hotfix/')) }}:
    value: Prod
  ${{ elseif and(eq('${{parameters.appEnv}}', 'None'), startsWith(variables['System.PullRequest.TargetBranch'], 'release/')) }}:
    value: Stage
  ${{ elseif and(eq('${{parameters.appEnv}}', 'None'), startsWith(variables['System.PullRequest.TargetBranch'], 'hotfix/')) }}:
    value: Stage2
  ${{ elseif eq('${{parameters.appEnv}}', 'None') }}:
    value: Dev
  ${{ else }}:
    value: ${{parameters.appEnv}}
- name: envFileName
  ${{ if eq(variables.envConfiguration, 'Dev') }}:
    value: .env.dev
  ${{ if eq(variables.envConfiguration, 'QA') }}:
    value: .env.qa
  ${{ if eq(variables.envConfiguration, 'Stage') }}:
    value: .env.stage
  ${{ if eq(variables.envConfiguration, 'Stage2') }}:
    value: .env.stage2
  ${{ if eq(variables.envConfiguration, 'Prod') }}:
    value: .env.prod
The problem is that the envConfiguration conditions are never honoured. Whenever I create a PR targeting a release/ branch, the value is always None. I suspect it is because of the difference between runtime and compile-time variables (not sure).
All I want to do is select the correct .env file based on the appEnv parameter and the source and target branches.
You are using incorrect syntax.
${{ if and(eq('${{parameters.appEnv}}', 'None'), startsWith(variables['Build.SourceBranchName'], 'release/')) }}:
should be
${{ if and(eq(parameters.appEnv, 'None'), startsWith(variables['Build.SourceBranchName'], 'release/')) }}:
As written, it's doing a literal comparison of the values ${{parameters.appEnv}} and None, which will, of course, always be false.
In general, refer to the documentation for examples of correct syntax.
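Applying the same correction to every condition, the envConfiguration block would look like this (a sketch; the branch checks and values are taken from the question):

```yaml
variables:
- name: envConfiguration
  # parameters.appEnv is referenced directly, not wrapped in a nested '${{ }}'
  ${{ if and(eq(parameters.appEnv, 'None'), startsWith(variables['Build.SourceBranchName'], 'release/')) }}:
    value: Prod
  ${{ elseif and(eq(parameters.appEnv, 'None'), startsWith(variables['Build.SourceBranchName'], 'hotfix/')) }}:
    value: Prod
  ${{ elseif and(eq(parameters.appEnv, 'None'), startsWith(variables['System.PullRequest.TargetBranch'], 'release/')) }}:
    value: Stage
  ${{ elseif and(eq(parameters.appEnv, 'None'), startsWith(variables['System.PullRequest.TargetBranch'], 'hotfix/')) }}:
    value: Stage2
  ${{ elseif eq(parameters.appEnv, 'None') }}:
    value: Dev
  ${{ else }}:
    value: ${{ parameters.appEnv }}
```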

YAML Extends Template - Unrecognized value: 'else'. Located at position 1 within expression: else

Expanding on my first question: I am now hitting a new error, "Unrecognized value: 'else'. Located at position 1 within expression: else", that I cannot figure out a solution to. The issue happens when I add the else condition within the foreach over job.steps.
Build YAML
resources:
  repositories:
  - repository: self
    type: git
    ref: refs/heads/Development
  - repository: AdoRestrictions
    type: git
    name: utl-yaml-templates
    ref: refs/heads/main

trigger: none

pool:
  name: PROD

extends:
  template: ADO_Stage_Restrictions_Dev.yml@AdoRestrictions
  parameters:
    stageObjs:
    - stage: 'BuildStage'
      displayName: 'Build Test'
      jobs:
      - job: 'BuildJob'
        displayName: 'Build'
        steps:
        - task: PowerShell@2
          displayName: 'PS Hello World'
          inputs:
            targetType: inline
            script: |
              Write-Host "Hello World"
Working Extends YAML
parameters:
- name: stageObjs
  type: stageList
  default: []

stages:
- ${{ each stage in parameters.stageObjs }}:
  - stage: ${{ stage.stage }}
    displayName: ${{ stage.displayName }}
    jobs:
    - ${{ each job in stage.jobs }}:
      - job: ${{ job.job }}
        displayName: ${{ job.displayName }}
        steps:
        - ${{ each step in job.steps }}:
          - ${{ if or(startsWith(step.task, 'PowerShell'), startsWith(step.task, 'CmdLine'), startsWith(step.task, 'Bash'), startsWith(step.task, 'ShellScript'), containsValue(step.task, 'Script'), containsValue(step.task, 'CLI'), containsValue(step.task, 'PowerShell')) }}:
            - task: PowerShell@2
              displayName: 'Unapproved - ${{ step.displayName }}'
              inputs:
                targetType: inline
                script: Write-Output "Unapproved Task - Scripting and CLI tasks are not approved for use in yaml pipelines"
Broken Extends YAML
parameters:
- name: stageObjs
  type: stageList
  default: []

stages:
- ${{ each stage in parameters.stageObjs }}:
  - stage: ${{ stage.stage }}
    displayName: ${{ stage.displayName }}
    jobs:
    - ${{ each job in stage.jobs }}:
      - job: ${{ job.job }}
        displayName: ${{ job.displayName }}
        steps:
        - ${{ each step in job.steps }}:
          - ${{ if or(startsWith(step.task, 'PowerShell'), startsWith(step.task, 'CmdLine'), startsWith(step.task, 'Bash'), startsWith(step.task, 'ShellScript'), containsValue(step.task, 'Script'), containsValue(step.task, 'CLI'), containsValue(step.task, 'PowerShell')) }}:
            - task: PowerShell@2
              displayName: 'Unapproved - ${{ step.displayName }}'
              inputs:
                targetType: inline
                script: Write-Output "Unapproved Task - Scripting and CLI tasks are not approved for use in yaml pipelines"
          - ${{ else }}:
            - ${{ step }}
Full Error Message
(Line: 22, Col: 13): Unrecognized value: 'else'. Located at position 1 within expression: else. (Line: 22, Col: 13): Expected at least one key-value pair in the mapping
Update - Working Solution
parameters:
- name: stageObjs
  type: stageList
  default: []

stages:
- ${{ each stage in parameters.stageObjs }}:
  - stage: ${{ stage.stage }}
    displayName: ${{ stage.displayName }}
    jobs:
    - ${{ each job in stage.jobs }}:
      - job: ${{ job.job }}
        displayName: ${{ job.displayName }}
        steps:
        - ${{ each step in job.steps }}:
          # Disallow any type of scripting task
          - ${{ if or(startsWith(step.task, 'PowerShell'), startsWith(step.task, 'CmdLine'), startsWith(step.task, 'Bash'), startsWith(step.task, 'ShellScript'), contains(step.task, 'Script'), contains(step.task, 'CLI'), contains(step.task, 'PowerShell')) }}:
            - task: PowerShell@2
              displayName: 'Unapproved Task - ${{ step.displayName }}'
              inputs:
                targetType: inline
                script: throw "Unapproved Task - Scripting and CLI tasks are not approved for use in yaml pipelines"
          # Not a scripting task; perform additional checks
          - ${{ if and(not(startsWith(step.task, 'PowerShell')), not(startsWith(step.task, 'CmdLine')), not(startsWith(step.task, 'Bash')), not(startsWith(step.task, 'ShellScript')), not(contains(step.task, 'Script')), not(contains(step.task, 'CLI')), not(contains(step.task, 'PowerShell'))) }}:
            # Validate the azure subscription provided for the environment
            - ${{ if contains(step.task, 'Azure') }}:
              - ${{ each pair in step }}:
                  ${{ if eq(pair.key, 'inputs') }}:
                    inputs:
                      ${{ each attribute in pair.value }}:
                        ${{ if contains(attribute.key, 'Subscription') }}:
                          ${{ if and(ne(attribute.value, 'sub-name1'), ne(attribute.value, 'sub-name2'), ne(attribute.value, 'sub-name3')) }}:
                            ${{ attribute.value }}: ''
                          ${{ if or(eq(attribute.value, 'sub-name1'), eq(attribute.value, 'sub-name2'), eq(attribute.value, 'sub-name3')) }}:
                            ${{ pair.key }}: ${{ pair.value }}
                  ${{ if ne(pair.key, 'inputs') }}:
                    ${{ if eq(pair.key, 'displayName') }}:
                      ${{ pair.key }}: 'Invalid Azure Subscription - ${{ pair.value }}'
                    ${{ if ne(pair.key, 'displayName') }}:
                      ${{ pair.key }}: ${{ pair.value }}
            # Allow all other tasks
            - ${{ if not(contains(step.task, 'Azure')) }}:
              - ${{ step }}

Passing a dictionary to a template in azure devops yaml

I want to run a loop in a template to download two artifacts with specific versions.
I have been trying to formulate a solution for this, but no luck yet. This is what I've come up with so far, but I don't think it's supported.
Can someone point me in the right direction if this is possible?
pipeline.yml
variables:
- template: project.variables.yml

jobs:
- job: 'Deploy'
  steps:
  - template: instantclient.template.yml
    parameters:
      artifacts:
        oracle-instantclient:
          package: 'oracle-instantclient'
          packageVersion: ${{ variables.oracle-instantclient }}
        oracle-data-access-components:
          package: 'oracle-data-access-components'
          packageVersion: ${{ variables.oracle-data-access-components }}
instantclient.template.yml
parameters:
- name: artifacts
  type: object
- name: feed
  default: ahitapplicationteam
- name: downloadDirectory
  default: deployment/s

steps:
- ${{ each artifact in parameters.artifacts }}:
  - template: artifacts.template.yml
    parameters:
      packageVersion: ${{ packageVersion }}
      feed: ${{ parameters.feed }}
      package: ${{ package }}
      downloadDirectory: ${{ parameters.downloadDirectory }}
artifacts.template.yml
parameters:
- name: packageVersion
- name: feed
- name: package
- name: downloadDirectory

steps:
- task: UniversalPackages@0
  displayName: 'Downloading | Feed: ${{ parameters.feed }} | Package: ${{ parameters.package }}' # | PackageVersion: ${{ parameters.packageVersion }}
  inputs:
    command: 'download'
    downloadDirectory: ${{ parameters.downloadDirectory }}
    vstsFeed: ${{ parameters.feed }}
    vstsFeedPackage: ${{ parameters.package }}
    vstsPackageVersion: ${{ parameters.packageVersion }}
    verbosity: 'Debug'
You're on the right track. You need to add the - character to each item in your object to convert it into an array. The object can be an array of simple strings or complex objects. As an object, you can access the properties of your objects in the each loop.
The use of ${{ variables.oracle-data-access-components }} assumes that the oracle-data-access-components variable is available at compile-time when the pipeline is initially processed.
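For those references to resolve, project.variables.yml would need to define the versions as plain (not runtime-set) variables so they exist when the template is expanded. A hypothetical sketch, with made-up version numbers:

```yaml
# project.variables.yml (hypothetical values; the variable names come from the question)
variables:
  oracle-instantclient: '19.3.0'
  oracle-data-access-components: '19.3.1'
```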
Whether you want to break it into 3 templates is a stylistic decision. I went with 2 templates to simplify readability, but a third template would provide you with some validation of required parameters.
pipeline.yml
variables:
- template: project.variables.yml

jobs:
- job: 'Deploy'
  steps:
  - template: instantclient.template.yml
    parameters:
      artifacts:
      - name: 'oracle-instantclient'
        version: ${{ variables.oracle-instantclient }}
      - name: 'oracle-data-access-components'
        version: ${{ variables.oracle-data-access-components }}
instantclient.template.yml
parameters:
# list of packages to download (name, version)
- name: artifacts
  type: object
# azure artifact feed name
- name: feed
  type: string
  default: 'ahitapplicationteam'
# download path for artifacts
- name: downloadDirectory
  type: string
  default: 'deployment/s'

steps:
# loop through the artifacts (name, version)
- ${{ each artifact in parameters.artifacts }}:
  # download the artifact
  - task: UniversalPackages@0
    displayName: 'Downloading | Feed: ${{ parameters.feed }} | Package: ${{ artifact.name }}' # | PackageVersion: ${{ artifact.version }}
    inputs:
      command: 'download'
      downloadDirectory: ${{ parameters.downloadDirectory }}
      vstsFeed: ${{ parameters.feed }}
      vstsFeedPackage: ${{ artifact.name }}
      vstsPackageVersion: ${{ artifact.version }}
      verbosity: 'Debug'

GitHub Actions reusable workflows currently do not support environments. Will my hack stop secrets from working?

I am using outputs on each job as a hack to let GitHub environments control whether my reusable workflow runs.
My only concern is "ENV_AWS_ACCESS_KEY_ID" and "ENV_AWS_SECRET_ACCESS_KEY". These secrets are environment specific. How does the reusable workflow know which secret I am passing in?
Is there a risk with the current setup that they could get overwritten if two environments ran at the same time?
name: Used to rollback docker containers
on:
  workflow_call:
    inputs:
      tag_to_identify_containers:
        description: The last known containers prior to deployment
        type: choice
        required: true
        options:
        - last-known-testing
        - last-known-integrate
        - last-known-production
      new_tag_to_apply_to_containers:
        type: choice
        required: true
        options:
        - testing-latest
        - integrate-latest
        - production-latest

jobs:
  rollback_on_testing:
    runs-on: ubuntu-latest
    name: Rollback on testing
    outputs:
      signal_deployment: ${{ steps.step_id.outputs.environment }}
    environment:
      name: test
      url: https://test.###/
    steps:
    - id: step_id
      run: echo "::set-output name=environment::test"

  retag_and_rollback_test:
    needs: rollback_on_testing
    if: needs.rollback_on_testing.outputs.signal_deployment == 'test'
    uses: ###/###/.github/workflows/container-tagger.yml@main
    with:
      tag_to_identify_containers: ${{ github.event.inputs.tag_to_identify_containers }}
      new_tag_to_apply_to_containers: ${{ github.event.inputs.new_tag_to_apply_to_containers }}
      aws-region: eu-west-2
      run_cron_and_cycle_containers: true
    secrets:
      AWS_ACCESS_KEY_ID: ${{ secrets.SHARED_AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.SHARED_AWS_SECRET_ACCESS_KEY }}
      ENV_AWS_ACCESS_KEY_ID: ${{ secrets.THIS_AWS_ACCESS_KEY_ID }}
      ENV_AWS_SECRET_ACCESS_KEY: ${{ secrets.THIS_AWS_SECRET_ACCESS_KEY }}

  rollback_on_integrate:
    runs-on: ubuntu-latest
    name: Rollback on Integrate
    outputs:
      signal_deployment: ${{ steps.step_id.outputs.environment }}
    environment:
      name: integrate
      url: https://integrate.###/
    steps:
    - id: step_id
      run: echo "::set-output name=environment::integrate"

  retag_and_rollback_integrate:
    needs: rollback_on_integrate
    if: needs.rollback_on_integrate.outputs.signal_deployment == 'integrate'
    uses: ###/###/.github/workflows/container-tagger.yml@main
    with:
      tag_to_identify_containers: ${{ github.event.inputs.tag_to_identify_containers }}
      new_tag_to_apply_to_containers: ${{ github.event.inputs.new_tag_to_apply_to_containers }}
      aws-region: eu-west-2
      run_cron_and_cycle_containers: true
    secrets:
      AWS_ACCESS_KEY_ID: ${{ secrets.SHARED_AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.SHARED_AWS_SECRET_ACCESS_KEY }}
      ENV_AWS_ACCESS_KEY_ID: ${{ secrets.THIS_AWS_ACCESS_KEY_ID }}
      ENV_AWS_SECRET_ACCESS_KEY: ${{ secrets.THIS_AWS_SECRET_ACCESS_KEY }}

  rollback_on_production:
    runs-on: ubuntu-latest
    name: Rollback on Production
    outputs:
      signal_deployment: ${{ steps.step_id.outputs.environment }}
    environment:
      name: production
      url: https://###/
    steps:
    - id: step_id
      run: echo "::set-output name=environment::production"

  retag_and_rollback_production:
    needs: rollback_on_integrate
    if: needs.rollback_on_integrate.outputs.signal_deployment == 'production'
    uses: ###/###/.github/workflows/container-tagger.yml@main
    with:
      tag_to_identify_containers: ${{ github.event.inputs.tag_to_identify_containers }}
      new_tag_to_apply_to_containers: ${{ github.event.inputs.new_tag_to_apply_to_containers }}
      aws-region: eu-west-2
      run_cron_and_cycle_containers: true
    secrets:
      AWS_ACCESS_KEY_ID: ${{ secrets.SHARED_AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.SHARED_AWS_SECRET_ACCESS_KEY }}
      ENV_AWS_ACCESS_KEY_ID: ${{ secrets.THIS_AWS_ACCESS_KEY_ID }}
      ENV_AWS_SECRET_ACCESS_KEY: ${{ secrets.THIS_AWS_SECRET_ACCESS_KEY }}
An idea would be to use a matrix for your GitHub reusable workflow.
name: Reusable workflow with matrix strategy
on:
  push:
jobs:
  ReuseableMatrixJobForDeployment:
    strategy:
      matrix:
        stage: [test, integration, production]
    uses: octocat/octo-repo/.github/workflows/deployment.yml@main
    with:
      environment: ${{ matrix.stage }}
      tag_to_identify_containers: ${{ github.event.inputs.tag_to_identify_containers }}
      new_tag_to_apply_to_containers: ${{ github.event.inputs.new_tag_to_apply_to_containers }}
      aws-region: eu-west-2
      run_cron_and_cycle_containers: true
    secrets: inherit
When GitHub runs the workflow, your reusable workflow should have the environment "name" set to:
jobs:
  rollback_on_testing:
    runs-on: ubuntu-latest
    name: Rollback on testing
    outputs:
      signal_deployment: ${{ steps.step_id.outputs.environment }}
    environment:
      name: ${{ inputs.environment }}
      url: https://test.###/
which should give you access to the environment's secrets via "secrets: inherit".
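For ${{ inputs.environment }} to resolve, the reusable workflow would also need to declare that input under workflow_call. A minimal sketch (the input name "environment" is an assumption matching the matrix example above):

```yaml
# reusable workflow, e.g. deployment.yml
on:
  workflow_call:
    inputs:
      environment:
        type: string
        required: true
```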