Pass a file reference as a parameter to an inserted step from another repo in an Azure YAML deployment - azure-devops

I have a repository of YAML-based jobs that I'd like to reuse in a number of YAML scripts. The scripts that use the jobs, however, live in other repositories.
The reusable job takes a file path as an input parameter, and for some reason the file can't be found when the imported job is executed by the pipeline.
How do I reference the file in the parameter from the main job so it can be found when executing the imported job?
# MyMainScriptTemplate.yml that will be executed by the pipeline
trigger:
- master

resources:
  repositories:
  - repository: AzureTemplates
    type: git
    name: AzureTemplates

jobs:
- template: /FunctionApp/DeployFunctionApp.yml@AzureTemplates
  parameters:
    file: /Azure/Functions/template.json # This can be found when executing ...
# ReusableJobTemplate.yml defines a job that should be referenced from the main script
parameters:
- name: file
  type: string

jobs:
- job: DeployFunctionApp
  steps:
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: "Resource Group"
      azureResourceManagerConnection: "Dev"
      subscriptionId: "XYZ"
      action: "Create Or Update Resource Group"
      resourceGroupName: "XYZ"
      location: "West Europe"
      templateLocation: "Linked artifact"
      csmFile: ${{ parameters.file }}
      deploymentMode: "Incremental"
    displayName: "Run a one-line script"

Please check how multi-repo checkout behaves.
I would recommend two steps:
add a - checkout: AzureTemplates step before calling the template,
and change the path from /Azure/Functions/template.json to $(Agent.BuildDirectory)/AzureTemplates/Azure/Functions/template.json.
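For illustration, here is a minimal sketch of both files with those two changes applied. Note that checkout is itself a step, so it has to run inside the job that actually reads the file (i.e. in the reusable job template); also, the exact folder a repo is checked out to can vary with the agent version and the number of checkout steps in the job, so verify the path on your agent if the file is still not found:
# MyMainScriptTemplate.yml - sketch
resources:
  repositories:
  - repository: AzureTemplates
    type: git
    name: AzureTemplates

jobs:
- template: /FunctionApp/DeployFunctionApp.yml@AzureTemplates
  parameters:
    file: $(Agent.BuildDirectory)/AzureTemplates/Azure/Functions/template.json

# ReusableJobTemplate.yml - sketch; the added checkout step makes the
# AzureTemplates repo available on the agent before the task reads the file
parameters:
- name: file
  type: string

jobs:
- job: DeployFunctionApp
  steps:
  - checkout: AzureTemplates
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      # ... other inputs as in the question ...
      csmFile: ${{ parameters.file }}
      deploymentMode: "Incremental"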


Use an Azure DevOps pipeline variable for the resource repository name in azure-pipelines.yml

I have a shared Azure pipeline YAML definition whose purpose is to define one CodeAnalysis pipeline per repository.
How can I define the repository name dynamically?
I tried name: '$(projectName)', which leads to the error:
The repository $(projectName) in project 8ab9d22b-6819-483b-829d-******* could not be retrieved. Verify the name and credentials being used.
azure-pipelines.yml
resources:
  repositories:
  - repository: codeAnalysisRepo
    type: git
    name: shared/codeanalysis
  - repository: SourceRepo
    type: git
    name: '$(projectName)'

jobs:
- job: 'BackendCodeAnalysis'
  pool:
    name: '$(AgentPool)'
  steps:
  - checkout: SourceRepo
    clean: true
  - template: sonarqube_msbuild_prepare.yml@codeAnalysisRepo
    parameters:
      projectKey: '$(project)'
      projectName: '$(project)'
  - task: DotNetCoreCLI@2
    displayName: "build DestRepo"
    inputs:
      command: 'build'
      projects: '$(Build.Repository.LocalPath)/**/*.csproj'
      configuration: Release
  - template: sonarqube_execute.yml@codeAnalysisRepo
It works when I hardcode the name.
Currently, setting parameters and variables is not supported in resources -> repositories.
As a workaround, you can set this at the checkout step. Here is a sample: Check out multiple repositories in your pipeline - Azure Pipelines | Microsoft Docs. Please note that the repos should be in the same organization.
resources:
  repositories:
    - repository: Repo1
      type: git
      name: Artifacts/Repo1
  
jobs:
  - job: 'BackendCodeAnalysis'
    pool:
      vmImage: windows-latest
    steps:
      - checkout: git://$(projectName)
        clean: true
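Applied to the original pipeline, the workaround might look like the following sketch (the template repo stays hardcoded as a repository resource, while the source repo is checked out dynamically; this is an illustrative reconstruction, not a tested pipeline):
resources:
  repositories:
    - repository: codeAnalysisRepo
      type: git
      name: shared/codeanalysis

jobs:
  - job: 'BackendCodeAnalysis'
    pool:
      name: '$(AgentPool)'
    steps:
      # dynamic checkout by name; resolved at runtime, same organization only
      - checkout: git://$(projectName)
        clean: true
      - template: sonarqube_msbuild_prepare.yml@codeAnalysisRepo
        parameters:
          projectKey: '$(project)'
          projectName: '$(project)'
      - template: sonarqube_execute.yml@codeAnalysisRepo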
For your scenario, you could create a suggestion ticket via: https://developercommunity.visualstudio.com/report?space=21&entry=problem.

Inline PythonScript Azure Pipelines task in external file

I am developing an Azure task template, and I have a large .py file that I want to be executed in one step:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: inline
    script: |
      ... really long python code
Is it possible to store the code in another file, at the same level as the yml template, and consume it from there? Or what would be the best approach to keep the template clean?
I know that it's possible to use scriptSource:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'my_python.py'
    arguments: '${{ parameters.my_param }}'
But as the template is in another repository than the one the pipeline runs in, I don't think I can reach that my_python.py without downloading it with a wget, or cloning, or doing additional steps. Am I right?
Regards!
To use a template from another repo, you need to define a repository resource, like here:
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
  repositories:
  - repository: templates
    type: github
    name: Contoso/BuildTemplates

steps:
- template: common.yml@templates # Template reference
Once you have that, you just need to check out this repo:
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
  repositories:
  - repository: templates
    type: github
    name: Contoso/BuildTemplates

steps:
- checkout: self
- checkout: templates # this downloads the whole repo
- template: common.yml@templates # Template reference
Now you need to figure out where it is downloaded ;)
Multiple repositories: If you have multiple checkout steps in your job, your source code is checked out into directories named after the repositories as a subfolder of s in $(Agent.BuildDirectory). If $(Agent.BuildDirectory) is C:\agent\_work\1 and your repositories are named tools and code, your code is checked out to C:\agent\_work\1\s\tools and C:\agent\_work\1\s\code.
So if you have the script in a scripts folder in the templates repo, you will find it at $(Agent.BuildDirectory)\s\templates\scripts\script.py.
So then you can use it like this:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: 'filePath'
    scriptPath: '$(Agent.BuildDirectory)\s\templates\scripts\script.py'
    arguments: '${{ parameters.my_param }}'

Unable to download secure files conditionally in Azure Pipelines

Question
I am using the DownloadSecureFile@1 task to download secure files.
The issue occurs when, in Azure DevOps, in the Library's secure files section, only file_A.txt exists.
The script works fine when both files exist.
In my case, a user A will only need file_A.txt, user B will only need file_B.txt.
Is this an expected behavior? Any possible workarounds to fulfill the use-case?
Error Message:
There was a resource authorization issue: "The pipeline is not valid. Job Job: Step fileB input secureFile references secure file file_B.txt which could not be found. The secure file does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz."
Code:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt

pool:
  vmImage: ubuntu-latest

steps:
- task: DownloadSecureFile@1
  displayName: Download File A
  condition: eq('${{ parameters.file_name }}', 'file_A.txt')
  name: fileA
  inputs:
    secureFile: 'file_A.txt'
- task: DownloadSecureFile@1
  displayName: Download file B
  condition: eq('${{ parameters.file_name }}', 'file_B.txt')
  name: fileB
  inputs:
    secureFile: 'file_B.txt'
Is this an expected behavior?
Yes, this is expected behavior. To turn a pipeline into a run, Azure Pipelines goes through several steps in this order:
First, expand templates and evaluate template expressions.
Next, evaluate dependencies at the stage level to pick the first stage(s) to run.
For each stage selected to run, two things happen: all resources used in all jobs are gathered up and validated for authorization to run, and dependencies at the job level are evaluated to pick the first job(s) to run.
For each job selected to run, expand multi-configs (strategy: matrix or strategy: parallel in YAML) into multiple runtime jobs.
For each runtime job, evaluate conditions to decide whether that runtime job is eligible to run.
Request an agent for each eligible runtime job.
So, your secure file references are authorized and validated before conditions are evaluated. Please refer to the document about the pipeline run sequence. As a workaround, you can refer to the sample shared by @danielorn.
Instead of using the condition on the tasks, you can surround the step with an if-statement, as described in Use parameters to determine what steps run:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt

pool:
  vmImage: ubuntu-latest

steps:
- ${{ if eq(parameters.file_name, 'file_A.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download File A
    name: fileA
    inputs:
      secureFile: 'file_A.txt'
- ${{ if eq(parameters.file_name, 'file_B.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download file B
    name: fileB
    inputs:
      secureFile: 'file_B.txt'
However, if every user needs exactly one file, a common (and cleaner) option would be to provide the name of the needed file as a parameter. If a secure file is not needed (i.e. the parameter is left as the default empty string), the step can be excluded using an if statement:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - '' # allow the default empty value so the download can be skipped
  - file_A.txt
  - file_B.txt

pool:
  vmImage: ubuntu-latest

steps:
- ${{ if ne(parameters.file_name, '') }}:
  - task: DownloadSecureFile@1
    displayName: Download Secure File
    name: secureFileDownload
    inputs:
      secureFile: '${{ parameters.file_name }}'

Azure Devops - passing variables between job templates

Normal (non-template) jobs in Azure DevOps yaml support inter-job variable passing as follows:
jobs:
- job: A
  steps:
  - script: "echo ##vso[task.setvariable variable=skipsubsequent;isOutput=true]false"
    name: printvar
- job: B
  condition: and(succeeded(), ne(dependencies.A.outputs['printvar.skipsubsequent'], 'true'))
  dependsOn: A
  steps:
  - script: echo hello from B
How do I do something similar in the following, given that templates don't support the dependsOn syntax? I need to get an output from the first template and pass it as 'environmentSlice' to the second template.
- stage: Deploy
  displayName: Deploy stage
  jobs:
  - template: build-templates/get-environment-slice.yml@templates
    parameters:
      configFileLocation: 'config/config.json'
  - template: build-templates/node-app-deploy.yml@templates
    parameters:
      # Build agent VM image name
      vmImageName: $(Common.BuildVmImage)
      environmentPrefix: 'Dev'
      environmentSlice: '-$(dependencies.GetEnvironmentSlice.outputs['getEnvironmentSlice.environmentSlice'])'
The reason I want the separation between the two templates is that the second one is a deployment template, and I would like input from the first template for naming the environment in the second one. I.e. the initial part of node-app-deploy.yml (2nd template) is:
jobs:
- deployment: Deploy
  displayName: Deploy
  # Because we use the environmentSlice to name the environment, we have to have it passed in rather than
  # extracting it from the config file in steps below
  environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}
Update:
The accepted solution does allow you to pass variables between separate templates, but it won't work for my particular use case. I wanted to be able to name the 'environment' section of the 2nd template dynamically, i.e. environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}, but this can only be named statically, since templates are expanded when the pipeline starts up.
The downside of the solution is that it introduces a hidden coupling between the templates. I would have preferred the calling pipeline to orchestrate the parameter passing between the templates.
You can apply dependsOn and dependency variables to templates.
See the sample below.
To make the sample clearer, there are 2 template files: one is azure-pipelines-1.yml, and the other is azure-pipeline-1-copy.yml.
In azure-pipelines-1.yml, specify the environment value as output variable:
parameters:
  environment: ''
jobs:
- job: preDeploy
  variables:
    EnvironmentName: preDeploy-${{ parameters.environment }}
  steps:
  - checkout: none
  - pwsh: |
      echo "##vso[task.setvariable variable=EnvironmentName;isOutput=true]$($env:ENVIRONMENTNAME)"
    name: outputVars
And then, in azure-pipeline-1-copy.yml, use a dependency to get this output variable:
jobs:
- job: deployment
  dependsOn: preDeploy
  variables:
    EnvironmentNameCopy: $[dependencies.preDeploy.outputs['outputVars.EnvironmentName']]
  steps:
  - checkout: none
  - pwsh: |
      Write-Host "$(EnvironmentNameCopy)"
    name: outputVars
At last, in the YAML pipeline, you just need to pass the environment value:
stages:
  - stage: deployQA
    jobs:
    - template: azure-pipelines-1.yml
      parameters:
        environment: FromTemplate1
    - template: azure-pipeline-1-copy.yml
Now you can see that the value is retrieved successfully in the second template job.
It is possible to avoid the dependency in the called template. However, as the OP says, the environment name cannot be created dynamically.
Here is an example of the "calling" template, which firstly calls another template (devops-variables.yml) that sets some environment variables that we wish to consume in a later template (devops-callee.yml):
stages:
- stage: 'Caller_Stage'
  displayName: 'Caller Stage'
  jobs:
  - template: 'devops-variables.yml'
    parameters:
      InitialEnvironment: "Development"
  - template: 'devops-callee.yml'
    parameters:
      SomeParameter: $[dependencies.Variables_Job.outputs['Variables_Job.Variables.SomeParameter']]
In the devops-variables.yml file, I have this:
"##vso[task.setvariable variable=SomeParameter;isOutput=true;]Wibble"
Then, in devops-callee.yml, I just consume it, something like this:
parameters:
- name: SomeParameter
  default: ''

jobs:
- deployment: 'Called_Job'
  condition: succeeded()
  displayName: 'Called Job'
  environment: "Development"
  pool:
    vmImage: 'windows-2019'
  dependsOn:
  - Variables_Job
  variables:
    SomeParameter: ${{ parameters.SomeParameter }}
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none
        - task: AzureCLI@2
          condition: succeeded()
          displayName: 'An Echo Task'
          inputs:
            azureSubscription: "$(TheServiceConnection)"
            scriptType: pscore
            scriptLocation: inlineScript
            inlineScript: |
              echo "Before"
              echo "$(SomeParameter)"
              echo "After"
Output:
2021-04-10T09:22:29.6188535Z Before
2021-04-10T09:22:29.6196620Z Wibble
2021-04-10T09:22:29.6197124Z After
This way, the callee doesn't reference the caller. Unfortunately, setting the environment in the callee thus:
environment: "$(SomeParameter)"
doesn't work - you'll just get an environment with the literal characters '$(SomeParameter)'.

Using ARM Templates from external repository

I'm working with Azure multi-stage pipelines, using deployment jobs with templates in a separate repo. I'm currently starting to use ARM templates in my deployment process and want to run ARM templates that are located in a different repository as well. This is where I get a little stuck; any help/advice is appreciated.
To illustrate my setup:
Repo A -> Source code that has to be built and deployed to Azure
Repo B -> Azure pipeline templates (only consists of yml files)
Repo C -> ARM templates
So what I want to accomplish: A uses B uses C.
REPO A: Documentation build and release yml
resources:
  repositories:
  - repository: templates
    type: git
    name: <ACCOUNT>/Azure.Pipelines.Templates
    ref: refs/tags/2.2.40

stages:
- stage: Build
  jobs:
  - template: src/jobs/doc-build.yml@templates
- stage: DEV
  jobs:
  - template: src/deployment-jobs/doc.yml@templates
....
REPO B: Documentation deployment
parameters:
  webAppName: ''
  connectedServiceName: 'DEV'

jobs:
- deployment: doc_deploy
  pool:
    name: 'DOC'
  environment: 'doc'
  strategy:
    runOnce:
      deploy:
        steps:
        - template: ../deployment/arm-template.yml
          parameters:
            connectedServiceName: ${{ parameters.connectedServiceName }}
            resourceGroupName: 'documentation'
            templateFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation.jsonc
            paramFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation-params.json
            parameters: -name ${{ parameters.webAppName }}
...
REPO C: contains arm template + param file
The issue I'm facing is that I can't seem to get to the files of repo C. I tried adding another repository entry at multiple levels, but it does not seem to clone the dependent repo at all.
My current workaround/solution:
Use a PowerShell script to manually clone repo C and directly reference the files on disk.
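For reference, such a clone step might look like the sketch below (the <ORG>/<PROJECT> segments of the URL are placeholders, and authenticating with System.AccessToken assumes the build service identity has read access to the ARM templates repo):
steps:
- pwsh: |
    # Clone the ARM templates repo (Repo C) into the sources folder, so paths
    # like $(Build.SourcesDirectory)/Azure.ARM.Templates/... resolve on disk
    git -c http.extraheader="AUTHORIZATION: bearer $env:SYSTEM_ACCESSTOKEN" `
      clone https://dev.azure.com/<ORG>/<PROJECT>/_git/Azure.ARM.Templates "$(Build.SourcesDirectory)/Azure.ARM.Templates"
  displayName: 'Clone ARM templates repo'
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)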
Related github issue: https://github.com/microsoft/azure-pipelines-yaml/issues/103
I've also stumbled upon this issue, having to load ARM templates from another repo into the current build. What I did was set up a build on the ARM-template-containing repo, producing a build artifact with the following azure-pipelines.yml (this would be your repo C):
trigger:
- master

steps:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)/templates'
    ArtifactName: 'templates'
    publishLocation: 'Container'
Afterwards I could add the following step to the actual pipeline:
- task: DownloadPipelineArtifact@2
  displayName: 'Get ARM Templates'
  inputs:
    buildType: 'specific'
    project: '<YOUR-PROJECT-ID>'
    definition: '<ARM-BUILD-DEFINITION-ID>'
    buildVersionToDownload: 'latest'
    artifactName: 'templates' # must match the ArtifactName published above
    targetPath: '$(Pipeline.Workspace)/templates'
and I was able to access the files as follows:
- task: AzureResourceGroupDeployment@2
  displayName: 'Create Queues $(ResourceGroup.Name)'
  inputs:
    azureSubscription: '<YOUR-SUBSCRIPTION>'
    resourceGroupName: '$(ResourceGroup.Name)'
    location: '$(ResourceGroup.Location)'
    csmFile: '$(Pipeline.Workspace)/templates/servicebus.json'
For more information about the Download Pipeline Artifact task, check out the following link:
Download Pipeline Artifact task