Alternate solution to using variable in resources/repositories/repository/ref - azure-devops

I am trying to find an alternate solution to using a variable in resources/repositories/repository/ref, because using a variable there is technically not allowed.
resources:
  repositories:
  - repository: devops
    name: MyProjects/devops
    type: git
    ref: master
The workaround of doing a git clone of the external repository will not work for me because my dependency on that repository is for referencing the templates.
Example:
- template: Build/Templates/downloadFiles.yaml#devops
Does anyone have a solution? Thank you for reading my post!

Refer to this doc: Template Expressions in Repository Resource Definition
Now, you can use template expressions to choose the branch of a repository resource.
Azure DevOps now supports using a variable in a repository resource to set the ref. You can use a template expression such as ${{ variables.var }} to define the ref.
Here is an example:
variables:
  branchname: main

resources:
  repositories:
  - repository: devops
    name: 123/Repo90
    type: git
    ref: ${{ variables['branchname'] }}

pool:
  vmImage: ubuntu-latest

steps:
- template: test.yml#devops
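A runtime parameter should work the same way, since parameters are also resolved at template expansion time. A minimal sketch using the repository names from the question (the parameter name is illustrative):

parameters:
- name: branchname
  displayName: Branch of the devops repo to use
  type: string
  default: master

resources:
  repositories:
  - repository: devops
    name: MyProjects/devops
    type: git
    ref: ${{ parameters.branchname }}

steps:
- template: Build/Templates/downloadFiles.yaml#devops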

Related

Intermediate YAML template, resources repository's ref and parameters "Internal error reading the template. Expected a scalar, a sequence, or a mapping"

The Azure Pipelines docs state that we can use template expressions in the ref of a repository resource under resources.repositories.
It already works in a parent template, but using $(Build.SourceBranchName) instead of a template expression. In a child template, the template expression produces the error "Internal error reading the template. Expected a scalar, a sequence, or a mapping". BTW, if I use a template expression in the parent template, I get the same error there.
Alternatively, if I use $(Build.SourceBranchName), the child template throws the error "Could not get the latest source version for repository xxxx-dotnet-solution hosted on Azure Repos using ref $(Build.SourceBranchName)" (in this case, Azure Pipelines won't parse the variable).
I can't figure out how to fix this problem.
Here's the affected child template:
parameters:
- name: buildConfiguration
  type: string
  values: [Debug, Release]
- name: api
  type: string
- name: solutionLocation
  type: string

resources:
  repositories:
  - repository: xxxx-apis-parent-templates
    name: xxxx-dotnet-solution
    type: git
    ref: ${{ variables['Build.SourceBranchName'] }}

extends:
  template: .az-devops-cicd/templates/aks.yaml#xxxx-apis-parent-templates
  parameters:
    solutionLocation: ${{ parameters.solutionLocation }}
    projectRepository: xxxxxx-apis
    projectBaseDir: ${{ parameters.solutionLocation }}/${{ parameters.api }}/
    projectName: XXXX.${{ parameters.solutionLocation }}.Apis.${{ parameters.api }}
    buildConfiguration: ${{ parameters.buildConfiguration }}

Azure Devops Pipeline - Repository branch on trigger

I have a pipeline using a git repository:
resources:
  repositories:
  - repository: myrepo
    type: git
    name: src/myrepo
    ref: nameofbranch
    trigger:
      branches:
        include:
        - triggeringbranch
I want to be able to change the repo branch (nameofbranch could be a parameter, for manual runs), but when the pipeline is automatically triggered by changes on a branch (for example, changes on triggeringbranch), I'd of course like the pipeline to use that triggeringbranch.
How can I deal with this?
Can I use some condition to set the value of ref, using Build.SourceBranch if not empty, or nameofbranch otherwise?
Thank you
You can create a local variable and fill it using the conditions. After that, you can use the variable as input for "ref":
parameters:
- name: "branchName"
  default: ""

variables:
- name: "branchName"
  ${{ if eq(parameters.branchName, '') }}:
    value: $(Build.SourceBranch)
  ${{ else }}:
    value: ${{ parameters.branchName }}

resources:
  repositories:
  - repository: myrepo
    type: git
    name: src/myrepo
    ref: $(branchName)
    trigger:
      branches:
        include:
        - triggeringbranch

How to use a shared file from an azure devops repo within an azure devops pipeline

The issue is that I'm not able to load just a normal file.
The below code works for extending an Azure pipeline file, but I can't use a .runsettings file from the same repository within my vstest step, which is in the extended template. Any ideas how I can share the .runsettings file?
resources:
  repositories:
  - repository: service
    type: git
    name: proj/service
    ref: feature/myfeature

extends:
  template: service-template1.0.yml#service
You need to add a checkout step:
resources:
  repositories:
  - repository: service
    type: git
    name: proj/service
    ref: feature/myfeature

extends:
  template: service-template1.0.yml#service
  parameters:
    repoName: self
and then in the template:
# File: simple-param.yml
parameters:
- name: repoName
  type: string

steps:
- checkout: ${{ parameters.repoName }}
......
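If the .runsettings file lives in the service repository, that repository also needs to be checked out in the template; with multiple checkout steps, each repository is placed in its own folder under $(Build.SourcesDirectory), typically named after the repository. A minimal sketch of the template's steps, assuming a hypothetical path Test/shared.runsettings inside the service repo:

parameters:
- name: repoName
  type: string

steps:
- checkout: ${{ parameters.repoName }}
- checkout: service
# hypothetical location of the shared .runsettings file inside the service repo
- task: VSTest@2
  inputs:
    testAssemblyVer2: '**/*Tests.dll'
    runSettingsFile: '$(Build.SourcesDirectory)/service/Test/shared.runsettings'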

Passing parameters through nested templates (or declaring IF conditions on variables)

I would like to be able to pass a pipeline parameter all the way through my YAML pipeline without having to define a parameter in each and every YAML file.
Essentially I have a main YAML file which calls a stage YAML, that has multiple nested jobs YAML, which in turn calls nested steps YAML; essentially building up my pipeline as I should using templates: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops
Here's a tree list sample folder;
E:.
├───01_stage (many files per folder)
├───02_jobs (many files per folder)
├───03_steps (many files per folder)
└───...main pipeline files
Ideally I want to run an IF condition on checking out a repository, depending upon the pipeline being PROD or NON-PROD. I am fine with defining this as a parameter, but I am also open to it being defined as a variable. As far as I'm aware, you can't use an IF condition on variables.
This is fine
- ${{ if eq(parameters.pester, true) }}: # or even as variables['pester']
  - name: pester
    value: yes
This is not fine
- ${{ if eq(variables.pester, true) }}: # or even as variables['pester']
  - name: pester
    value: yes
The condition I want this to run on is nested far below many templates, and it would be absolutely painful to have to re-code everything to conform to the parameter value being declared and passed down in each file.
This is where I want it:
steps:
- ${{ if eq(parameters['masterTagged'], 'true') }}: # here
  - checkout: masterTagged
    displayName: Repo Tagged
- ${{ if ne(parameters['masterTagged'], 'true') }}: # here
  - checkout: self
    displayName: Repo Self
- template: /.pipelines/03_steps/ssh_install.yml
- template: /.pipelines/03_steps/tf_install.yml
  parameters:
    terraformVersion: ${{ parameters['terraformVersion'] }}
- ...many more templates
Here is my main YAML pipeline file:
parameters:
- name: artifactory_base
  type: boolean
  default: true
# ...many more params
- name: pester
  type: boolean
  default: true
- name: planDeploy
  type: boolean
  default: true
- name: useBackupAgent
  type: boolean
  default: false
- name: masterTagged # key param
  type: boolean
  default: true

name: Team2
pr: none

resources:
  repositories:
  - repository: masterTagged
    endpoint: nationwide-ccoe
    name: my-github-org/my-github-repo
    type: github
    ref: refs/tags/v2.0.3

trigger: none

variables:
- template: /.pipelines/config/sub-asdfasdf.config.yml
- template: /.pipelines/config/namingstd.config.yml
- ${{ if eq(parameters.artifactory_base, true) }}:
  - name: artifactory_base
    value: yes
# ...many more conditions
- ${{ if eq(parameters.pester, true) }}:
  - name: pester
    value: yes
- ${{ if eq(parameters.planDeploy, true) }}:
  - name: planDeploy
    value: yes

stages:
- template: /.pipelines/01_stage/lz_deploy.yml
  parameters:
    ${{ if eq(parameters.useBackupAgent, false) }}:
      pool:
        vmImage: Ubuntu 18.04
    ${{ if eq(parameters.useBackupAgent, true) }}:
      pool:
        name: backupAgents
    terraformVersion: $(TERRAFORM_VERSION)
Is it possible to set this masterTagged parameter and for it to filter all the way down without having to declare it each time?
Also, is it even possible to use variables instead of parameters in this manner (I understand that parameters expand before variables):
- ${{ if eq(variables.pester, true) }}: # or even as variables['pester']
  - name: pester
    value: yes
...if it is, have I been doing it wrong all this time?
Note:
I do understand that you can use a standard task condition on the checkout task (shown below); however, having a 'switch' on two tasks ruins the folder path of the checked-out repository. Even though we're only checking out one repository, it adds another folder level to the $SYSTEM_DEFAULTWORKINGDIRECTORY. Doing it this way would require more re-coding of the current structure of my YAML pipelines.
- checkout: masterTagged
  condition: eq(variables['masterTagged'], 'true')
  displayName: Repo Tagged
- checkout: self
  condition: ne(variables['masterTagged'], 'true')
  displayName: Repo Self
If I could, but I know it's not possible (as seen in other people's requests), I would use a parameter or variable on the repository reference:
resources:
  repositories:
  - repository: masterTagged
    endpoint: nationwide-ccoe
    name: my-github-org/my-github-repo
    type: github
    ref: ${{ parameters.repoRef }} # here
Is it possible to set this masterTagged parameter and for it to filter all the way down without having to declare it each time?
No, because parameters are "scoped" to the file they are defined in. This is due to them being expanded when the pipeline is first compiled (see Pipeline run sequence).
You can use IF conditions on variables; however, you can't use template expressions (wrapped with ${{ }}) on variables inside templates, as the variables do not exist/have not been populated at the point of template expansion.
One option is just using the conditions on the checkout tasks as you suggested, and dealing with the extra folder level in the default working directory. I had to do something similar a while back; our solution was to copy the contents of the repo folder up a level into the default working directory.
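A minimal sketch of that first option, assuming the checked-out GitHub repo lands in a folder named my-github-repo under the sources directory (the folder name and the copy step are illustrative, not the exact code we used):

steps:
- checkout: masterTagged
  condition: eq(variables['masterTagged'], 'true')
  displayName: Repo Tagged
- checkout: self
  condition: ne(variables['masterTagged'], 'true')
  displayName: Repo Self
# copy the tagged repo's contents up a level into the default working directory
# (mirror this step for the self checkout; the folder name is an assumption)
- script: cp -R "$(Build.SourcesDirectory)/my-github-repo/." "$(System.DefaultWorkingDirectory)/"
  displayName: Flatten checked-out repo into default working directory
  condition: eq(variables['masterTagged'], 'true')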
Your other option is to do the checkout in the top level pipeline file. This will allow you to template the checkout step/s using the parameter without having to pass it all the way through the files. This is the option I would suggest as you do not have to deal with the folder structure issues of the first option.
This would look something like this:
parameters:
- name: masterTagged
  default: true
  type: boolean

resources:
  repositories:
  - repository: masterTagged
    endpoint: nationwide-ccoe
    name: my-github-org/my-github-repo
    type: github
    ref: refs/tags/v2.0.3

steps:
- ${{ if eq(parameters.masterTagged, true) }}:
  - checkout: masterTagged
- ${{ if eq(parameters.masterTagged, false) }}:
  - checkout: self
- template: ./path/to/template.yml
I hope this answers your question.

azure devops yaml pipeline with template does not checkout referenced repository

I am using the new Azure DevOps YAML multi-stage pipeline functionality.
I've got an Azure DevOps YAML pipeline file for which I want to use templates. I would like the pipeline to check out self and another repository.
For some reason, the self repo is checked out when this runs, but the pipelines repo is not, and therefore the job fails (because some of the file dependencies it requires are not there).
Here is an excerpt from my template:
resources:
  repositories:
  - repository: self
  - repository: pipelines
    name: vstsproject/pipelines
    type: git
    source: pipelines

variables:
  # Container registry service connection established during pipeline creation
  imageRepository: 'vstsprojectweb'
  dockerfilePath: '$(Build.SourcesDirectory)/src/Dockerfile.CI'
  BuildConfiguration: 'Release'
  tag: '$(Build.BuildId)'

stages:
- stage: 'PRD'
  jobs:
  - template: update-connection-string-db.yml#pipelines
    parameters:
      resourceGroup: 'application-DEV'
      DBSearchString: '###dbservername###'
What is it that I am doing wrong?
I have referred to this microsoft documentation.
You don't need to reference the linked repo in the resources (i.e. self), and if it is the only repo, then it is checked out by default in jobs (but not in deployment jobs). If you have additional repos, then you need to check them out manually (with - checkout: <name_of_repo>).
So just do (PS: Cleaned up a little, assumed that the repo is in the same project):
resources:
  repositories:
  - repository: pipelines
    name: pipelines

variables:
  # Container registry service connection established during pipeline creation
  imageRepository: 'vstsprojectweb'
  dockerfilePath: '$(Build.SourcesDirectory)/src/Dockerfile.CI'
  BuildConfiguration: 'Release'
  tag: '$(Build.BuildId)'

stages:
- stage: 'PRD'
  jobs:
  - checkout: self
  - checkout: pipelines
  - template: update-connection-string-db.yml#pipelines
    parameters:
      resourceGroup: 'application-DEV'
      DBSearchString: '###dbservername###'
I ended up putting everything into the same repo and then checking out self in a job.
That worked for me.
jobs:
- job: dbconnectionstring
  displayName: 'db connection string'
  pool: Windows
  steps:
  - checkout: self
  - template: templates/update-connection-string-db.yml