Is there any possibility to call AzDo templates with parameters? - azure-devops

I would like to run one of a few templates based on the initial value chosen for a parameter: as soon as the value is chosen, the corresponding template should be invoked, and it should then ask for the additional parameters required only by that template.
Let's say that in the main azure-pipelines.yml a user chooses dev; then a template is simply called. However, if the user chooses test, the template create-stack-tst-template.yml should be invoked and, along with that, it should prompt for the parameters needed by this template. Is that possible?
If not, is there any way to group all the parameters needed only for dev, and likewise for test, so that when an individual template is called, only the group of parameters necessary for that template to run is passed, and not the others?
Does any kind of segregation like that exist?
trigger:
- none
parameters:
- name: DeployToEnvType
  displayName: |
    Select the env type to be deployed
  type: string
  values:
  - dev
  - test
stages:
- ${{ if eq(parameters['DeployToEnvType'], 'dev') }}:
  - template: templates/create-stack-dev-template.yml
- ${{ if eq(parameters['DeployToEnvType'], 'test') }}:
  - template: templates/create-stack-tst-template.yml
    parameters:
    - name: ProjectName
      type: string
    - name: ImageSource
      type: string

it should prompt the parameters needed for this template. Is it possible?
This is not possible. You need to declare all the parameters up front and pass down only those which are needed by a particular template.
trigger:
- none
parameters:
- name: DeployToEnvType
  displayName: |
    Select the env type to be deployed
  type: string
  values:
  - dev
  - test
- name: ImageSource
  type: string
stages:
- ${{ if eq(parameters['DeployToEnvType'], 'dev') }}:
  - template: templates/create-stack-dev-template.yml
    parameters:
      ProjectName: projectA
      ImageSource: ${{ parameters.ImageSource }}
- ${{ if eq(parameters['DeployToEnvType'], 'test') }}:
  - template: templates/create-stack-tst-template.yml
    parameters:
      ProjectName: projectA
      ImageSource: ${{ parameters.ImageSource }}
If you need control at runtime, you need to create a corresponding runtime parameter and pass it down. If you want some values to be fixed, you can just put them inline.
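For completeness, here is a rough sketch of what the called template could declare so that the values passed above land somewhere. The real templates/create-stack-tst-template.yml is not shown in the question, so everything in it beyond the two parameter names is an assumption:
# templates/create-stack-tst-template.yml - hypothetical sketch; only the
# parameter names (ProjectName, ImageSource) are taken from the question
parameters:
- name: ProjectName
  type: string
- name: ImageSource
  type: string

stages:
- stage: create_stack_tst
  jobs:
  - job: create_stack
    steps:
    - script: echo "Creating test stack for ${{ parameters.ProjectName }} using image ${{ parameters.ImageSource }}"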

Related

For-Each an Object in Azure DevOps pipeline?

I am starting to write an application with microservices and want to have a build step in my pipeline that pushes the images. For this, at the moment, I have 3 services to push:
- stage: build_and_publish_containers
  displayName: 'Docker (:Dev Push)'
  jobs:
  - template: "docker/publish.yaml"
    parameters:
      appName: Authorization_Service
      projectPath: "Services/AuthorizationService"
      imageName: authorization-service
      imageTag: ${{ variables.imageTag }}
  - template: "docker/publish.yaml"
    parameters:
      appName: Registration_Service
      projectPath: "Services/RegistrationService"
      imageName: registration-service
      imageTag: ${{ variables.imageTag }}
  - template: "docker/publish.yaml"
    parameters:
      appName: Tennant_Service
      projectPath: "Services/TennantService"
      imageName: tennant-service
      imageTag: ${{ variables.imageTag }}
Even with only these 3 services (and I want to have many more) I have a lot of duplicated code here that I want to reduce.
I tried it with an array and an each function, but I have several pieces of information here (name / path / image name) and that could grow.
Is there a better way?
If this were a programming language I would have an array of a data model; is something like that possible in Azure DevOps?
Or maybe each piece of information could be saved in a JSON file (so 3 files at the moment, and growing) and Azure could read all the files and the information out of them?
You could check the example below to define nested loops over a complex object in Azure Pipelines. By the way, you could also look into the GitHub docs for more reference.
parameters:
- name: environmentObjects
  type: object
  default:
  - environmentName: 'dev'
    result: ['123']
  - environmentName: 'uat'
    result: ['223', '323']
pool:
  vmImage: ubuntu-latest
steps:
- ${{ each environmentObject in parameters.environmentObjects }}:
  - ${{ each result in environmentObject.result }}:
    - script: echo ${{ result }}
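Applied to the scenario in the question, the same pattern could look roughly like this. The services parameter and its field names are an assumption, while docker/publish.yaml and the values passed to it come from the question:
# azure-pipelines.yml - sketch only; the shape of the 'services' object is assumed
parameters:
- name: services
  type: object
  default:
  - appName: Authorization_Service
    projectPath: "Services/AuthorizationService"
    imageName: authorization-service
  - appName: Registration_Service
    projectPath: "Services/RegistrationService"
    imageName: registration-service
  - appName: Tennant_Service
    projectPath: "Services/TennantService"
    imageName: tennant-service

stages:
- stage: build_and_publish_containers
  displayName: 'Docker (:Dev Push)'
  jobs:
  # one template reference is generated per entry in the services object
  - ${{ each service in parameters.services }}:
    - template: "docker/publish.yaml"
      parameters:
        appName: ${{ service.appName }}
        projectPath: ${{ service.projectPath }}
        imageName: ${{ service.imageName }}
        imageTag: ${{ variables.imageTag }}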

How to create Azure Pipeline Template to run a jobList all on the same agent?

I am trying to make a pipeline template that takes a jobList as a parameter and runs all of the jobs, while ensuring that they run on the same agent every time. Basically, the approach I've been taking is to try to adapt this answer into a generic template format.
This is what I have so far; I've tried a lot of slight tweaks of it, with nothing passing the Validate check on the pipeline that calls it.
parameters:
- name: jobsToRun
  type: jobList
- name: pool
  type: string
  default: Default
- name: demands
  type: object
  default: []
jobs:
- job:
  steps:
  - script: echo "##vso[task.setvariable variable=agentName;isOutput=true;]$(Agent.Name)"
  pool:
    name: ${{ parameters.pool }}
    demands:
    - ${{ each demand in parameters.demands }}:
        ${{ demand }}
- ${{ each j in parameters.jobsToRun }}:
  - ${{ each pair in j }}:
      ${{ pair.key }}: ${{ pair.value }}
    pool:
      name: Default
      demands:
      - Agent.Name -equals $(agentName)
What am I doing wrong here? It seems like it should be possible if that answer I reference is correct, but it seems like I'm just a bit off.
The name is missing on the job; example below.
- job: 'test-Name'
Each set of steps needs an associated job, and the pool to run on is declared inside it:
jobsToRun:
- job: sample_job1
  displayName: "sample_job1"
  pool:
    name: "your_PoolName"
  steps:
  - script: |
      echo "Hi"
Regarding this bottom pool declaration:
pool:
  name: Default
  demands:
  - Agent.Name -equals $(agentName)
I am not sure, but I have tried this many times and I think this can't be declared separately from the job, since each individual job is passed as a parameter. The pool definition needs to be inside the job, or inside the job template if you are using templates.
example:
jobsToRun:
- job: output_message_job1
  displayName: "in pipe Output Message Job"
  pool:
    name: "your_PoolName"
  steps:
  - script: |
      echo "Hi"

How can I pass one of a variable group's values as a template parameter?

I've got a YAML pipeline that refers to some templates.
I have a variable group linked to this main YAML file and I want to pass one of its variables to a template.
It's simple when I want to "just use it", as in the example below:
stages:
- stage: Deployment
  variables:
  - group: My_group_variables
  jobs:
  - template: /templates/jobs/myJobTemplate.yml
    parameters:
      someParameter: $(variable_from_my_variable_group)
myJobTemplate.yml:
parameters:
- name: someParameter
  default: ''
jobs:
- job: Myjob
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: cat ${{ parameters.someParameter }}
It does not cooperate when I want to have parameter validation like:
parameters:
- name: environmentName
  type: string
  values:
  - Development
  - Test
  - UAT
  - Production
Or when I want to use a service connection name as a variable.
...
- task: KubernetesManifest@0
  displayName: Deploy to Kubernetes cluster
  inputs:
    action: 'deploy'
    kubernetesServiceConnection: ${{ parameters.kubernetesServiceConnection }}
    namespace: ${{ parameters.kubernetesNamespace }}
    manifests: ${{ variables.manifestFile }}
...
Does anyone know how I should use those variables with pre-validated parameters or service connections?
It's most probably an issue with when the values are resolved. Pre-validated parameters and service connection names are checked at compile time, while $() values are resolved at runtime.
I cannot use extends and variables in this template.
Maybe someone has a pattern for this kind of usage?
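A small illustration of that timing difference, just as a sketch: the env parameter name is made up, while My_group_variables and variable_from_my_variable_group come from the example above.
parameters:
- name: env
  type: string
  values: [Development, Test]

variables:
- group: My_group_variables

steps:
# ${{ parameters.env }} is substituted while the YAML is compiled, so it can
# feed inputs whose allowed values are validated at compile time
- script: echo "compile-time value is ${{ parameters.env }}"
# $(variable_from_my_variable_group) is only resolved when the job runs,
# which is too late for anything that is checked during compilation
- script: echo "runtime value is $(variable_from_my_variable_group)"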

Passing parameters through nested templates (or declaring IF conditions on variables)

I would like to be able to pass a pipeline parameter all the way through my YAML pipeline without having to define a parameter in each and every YAML file.
Essentially I have a main YAML file which calls a stage YAML, which has multiple nested jobs YAML files, which in turn call nested steps YAML files; essentially building up my pipeline, as I should, using templates: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops
Here's a tree list sample folder;
E:.
├───01_stage (many files per folder)
├───02_jobs (many files per folder)
├───03_steps (many files per folder)
└───...main pipeline files
Ideally I want to run an IF condition on checking out a repository, depending on whether the pipeline is PROD or NON-PROD. I am fine with defining this as a parameter, but I am also open to it being defined as a variable. As far as I'm aware, you can't use an IF condition on variables.
This is fine
- ${{ if eq(parameters.pester, true) }}: # or even as variables['pester']
  - name: pester
    value: yes
This is not fine
- ${{ if eq(variables.pester, true) }}: # or even as variables['pester']
  - name: pester
    value: yes
The condition I want to run is nested far down, below many templates, and it would be absolutely painful to have to re-code everything to conform to the parameter's value being declared and passed down in each file.
This is where I want it:
steps:
- ${{ if eq(parameters['masterTagged'], 'true') }}: # here
  - checkout: masterTagged
    displayName: Repo Tagged
- ${{ if ne(parameters['masterTagged'], 'true') }}: # here
  - checkout: self
    displayName: Repo Self
- template: /.pipelines/03_steps/ssh_install.yml
- template: /.pipelines/03_steps/tf_install.yml
  parameters:
    terraformVersion: ${{ parameters['terraformVersion'] }}
- ...many more templates
Here is my main YAML pipeline file:
parameters:
- name: artifactory_base
  type: boolean
  default: true
# ...many more params
- name: pester
  type: boolean
  default: true
- name: planDeploy
  type: boolean
  default: true
- name: useBackupAgent
  type: boolean
  default: false
- name: masterTagged # key param
  type: boolean
  default: true

name: Team2
pr: none

resources:
  repositories:
  - repository: masterTagged
    endpoint: nationwide-ccoe
    name: my-github-org/my-github-repo
    type: github
    ref: refs/tags/v2.0.3

trigger: none

variables:
- template: /.pipelines/config/sub-asdfasdf.config.yml
- template: /.pipelines/config/namingstd.config.yml
- ${{ if eq(parameters.artifactory_base, true) }}:
  - name: artifactory_base
    value: yes
# ...many more conditions
- ${{ if eq(parameters.pester, true) }}:
  - name: pester
    value: yes
- ${{ if eq(parameters.planDeploy, true) }}:
  - name: planDeploy
    value: yes

stages:
- template: /.pipelines/01_stage/lz_deploy.yml
  parameters:
    ${{ if eq(parameters.useBackupAgent, false) }}:
      pool:
        vmImage: Ubuntu 18.04
    ${{ if eq(parameters.useBackupAgent, true) }}:
      pool:
        name: backupAgents
    terraformVersion: $(TERRAFORM_VERSION)
Is it possible to set this masterTagged parameter and for it to filter all the way down without having to declare it each time?
Also, is it even possible to use variables instead of parameters in this manner (I understand that parameters expand before variables):
- ${{ if eq(variables.pester, true) }}: # or even as variables['pester']
- name: pester
value: yes
...if it is, have I been doing it wrong all this time?
Note:
I do understand that you can use a standard task condition on the checkout task (shown below); however, having a 'switch' on two tasks ruins the folder path of the checked-out repository. Even though we're only checking out one repository, it adds another folder level to $SYSTEM_DEFAULTWORKINGDIRECTORY. Doing it this way would require more re-coding of the current structure of my YAML pipelines.
- checkout: masterTagged
  condition: eq(variables['masterTagged'], 'true')
  displayName: Repo Tagged
- checkout: self
  condition: ne(variables['masterTagged'], 'true')
  displayName: Repo Self
If I could (but I know it's not possible, as seen from other people's requests), I would use a parameter or variable on the repository reference:
resources:
  repositories:
  - repository: masterTagged
    endpoint: nationwide-ccoe
    name: my-github-org/my-github-repo
    type: github
    ref: ${{ parameters.repoRef }} # here
Is it possible to set this masterTagged parameter and for it to filter all the way down without having to declare it each time?
No, because parameters are "scoped" to the file they are defined in. This is due to them being expanded when the pipeline is first compiled (see Pipeline run sequence).
You can use IF conditions on variables; however, you can't use template expressions (wrapped with ${{ }}) on variables inside templates, as the variables do not exist / have not been populated at the point of template expansion.
One option is just using the conditions on the checkout tasks as you suggested, and dealing with the extra folder level in the default working directory. I had to do something similar a while back; our solution was to copy the contents of the repo folder up a level into the default working directory.
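A rough sketch of that copy-up-a-level workaround is below, with the conditioned checkouts taken from the question's note; the folder discovery in the script is an assumption and would need to be adjusted to whatever folder the multi-repo checkout actually creates:
steps:
- checkout: masterTagged
  condition: eq(variables['masterTagged'], 'true')
- checkout: self
  condition: ne(variables['masterTagged'], 'true')
# With more than one repository resource, each checkout lands in its own
# subfolder; copy whichever repo was checked out up into the default
# working directory (assumes exactly one repo folder is present).
- bash: |
    cd "$(Build.SourcesDirectory)"
    repo_dir=$(find . -maxdepth 1 -mindepth 1 -type d | head -n 1)
    cp -R "${repo_dir}/." .
  displayName: Flatten checked-out repo into the working directory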
Your other option is to do the checkout in the top level pipeline file. This will allow you to template the checkout step/s using the parameter without having to pass it all the way through the files. This is the option I would suggest as you do not have to deal with the folder structure issues of the first option.
This would look something like this:
parameters:
- name: masterTagged
  default: true
  type: boolean

resources:
  repositories:
  - repository: masterTagged
    endpoint: nationwide-ccoe
    name: my-github-org/my-github-repo
    type: github
    ref: refs/tags/v2.0.3

steps:
- ${{ if eq(parameters.masterTagged, true) }}:
  - checkout: masterTagged
- ${{ if eq(parameters.masterTagged, false) }}:
  - checkout: self
- template: ./path/to/template.yml
I hope this answers your question.

Optional job templates in YAML Pipelines

Is it possible to optionally include templates based on some kind of template expression? Specifically, I want my top-level definition in azure-pipelines.yml to call out which build job templates to use in an included stage template:
azure-pipelines.yml :
stages:
- template: generic-build-stage.yml # Template reference
  parameters:
    # Example of optional build templates to use
    buildTypes: [SpecificBuildJobs1, SpecificBuildJobs3, SpecificBuildJobs4]
generic-build-stage.yml :
parameters:
  buildTypes: ???
stages:
- stage: generic_build
  jobs:
    ${{ }} # ???? What goes here to include the appropriate templates
    - template: ???
The template expression above would ideally expand to this:
jobs:
- template: specific-build-jobs1.yml
- template: specific-build-jobs3.yml
- template: specific-build-jobs4.yml
Edit: The "Iterative insertion" example in the docs seems to suggest that some form of dynamic, parse-time insertion is possible.
The following method worked to allow a top-level pipeline definition to consume a variable number of job sets at a lower level.
azure-pipelines.yml :
stages:
- template: generic-build-stage.yml # Template reference
  parameters:
    # Example of optional build templates to use
    buildTypes: [SpecificBuildJobs1, SpecificBuildJobs3, SpecificBuildJobs4]
generic-build-stage.yml :
parameters:
  buildTypes: [MissingBuildType] # Use this if buildTypes is not provided
stages:
- stage: build_stage
  jobs:
  # Note: VS Code extension for Pipelines (1.1574.4) will
  # say this is an "Unexpected property", but this works in ADO
  - ${{ if containsValue(parameters.buildTypes, 'MissingBuildType') }}:
    - template: build-stage-null.yml
  - ${{ if containsValue(parameters.buildTypes, 'SpecificBuildJobs1') }}:
    - template: specific-build-jobs1.yml
  - ${{ if containsValue(parameters.buildTypes, 'SpecificBuildJobs2') }}:
    - template: specific-build-jobs2.yml
  - ${{ if containsValue(parameters.buildTypes, 'SpecificBuildJobs3') }}:
    - template: specific-build-jobs3.yml
  - ${{ if containsValue(parameters.buildTypes, 'SpecificBuildJobs4') }}:
    - template: specific-build-jobs4.yml
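The snippet above references build-stage-null.yml without showing it; presumably it is just a placeholder jobs template, something along these lines (hypothetical content):
# build-stage-null.yml - hypothetical placeholder; the real file is not shown
jobs:
- job: no_build_type_selected
  steps:
  - script: echo "No valid build type was passed to generic-build-stage.yml"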
It seems to be impossible, as the template reference is resolved at parse time.
You may have to reference the stage template multiple times in the main pipeline, set the value of buildTypes to the specific job template name, and in generic-build-stage.yml use - template: ${{ parameters.buildTypes }}.yml to call the corresponding job template:
Azure-pipelines.yml:
stages:
- template: generic-build-stage.yml
  parameters:
    buildTypes: specific-build-jobs1
- template: generic-build-stage.yml
  parameters:
    buildTypes: specific-build-jobs3
generic-build-stage.yml
parameters:
  buildTypes: ""
stages:
- stage: generic_build
  jobs:
  - template: ${{ parameters.buildTypes }}.yml