Azure DevOps: Populating secure file references with job matrix variables

For context, I am trying to use an Azure build pipeline to build multiple flavors of an Android app. Each flavor has its own separate signing keystore, and all of those keystores are stored in my 'secure files' in the library.
However, when I try to dereference the $(Keystore) variable in the Android signing task, the task does not treat it as an existing variable and instead tries to locate a secure file literally named '$(Keystore)'.
Am I doing something wrong here? This seems like it should work.
A sanitized example looks like this:
# Android
# Build your Android project with Gradle.
# Add steps that test, sign, and distribute the APK, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/android
trigger:
- feat/ci-setup
pool:
  vmImage: 'macos-latest'
variables:
  ${{ if startsWith(variables['build.sourceBranch'], 'refs/heads/feat/') }}:
    Branch_Type: 'feature'
  ${{ if startsWith(variables['build.sourceBranch'], 'refs/heads/hotfix/') }}:
    Branch_Type: 'hotfix'
  ${{ if startsWith(variables['build.sourceBranch'], 'refs/heads/release/') }}:
    Branch_Type: 'release'
  ${{ if eq(variables['Branch_Type'], 'release') }}:
    Configuration: 'release'
    ConfigurationCC: 'Release'
  ${{ if ne(variables['Branch_Type'], 'release') }}:
    Configuration: 'debug'
    ConfigurationCC: 'Debug'
jobs:
- job: Build
  variables:
  - group: android_keystores
  strategy:
    maxParallel: 2
    matrix:
      Flavor_1:
        AppFlavor: '1'
        AppFlavorCC: '1'
        Keystore: 'flavor1.keystore'
        KeyAlias: 'flavor1'
        KeystorePass: '$(flavor1_storepass)'
        KeyPass: '$(flavor1_keypass)'
      Flavor_2:
        AppFlavor: '2'
        AppFlavorCC: '2'
        Keystore: 'flavor2.keystore'
        KeyAlias: 'flavor2'
        KeystorePass: '$(flavor2_storepass)'
        KeyPass: '$(flavor2_keypass)'
  steps:
  - task: Gradle@2
    inputs:
      workingDirectory: ''
      gradleWrapperFile: 'gradlew'
      gradleOptions: '-Xmx3072m'
      publishJUnitResults: false
      tasks: 'assemble$(AppFlavorCC)$(ConfigurationCC)'
  - task: AndroidSigning@3
    displayName: Signing .apk
    inputs:
      apkFiles: 'app/build/outputs/apk/$(AppFlavor)/$(Configuration)/*.apk'
      apksign: true
      apksignerKeystoreFile: '$(Keystore)'
      apksignerKeystorePassword: '$(KeystorePass)'
      apksignerKeystoreAlias: '$(KeyAlias)'
      apksignerKeyPassword: '$(KeyPass)'
      zipalign: true
  - task: Bash@3
    displayName: Move APK to Artifact Folder
    continueOnError: true
    inputs:
      targetType: 'inline'
      script: |
        mv \
          app/build/outputs/apk/$(AppFlavor)/$(Configuration)/*.apk \
          $(Build.ArtifactStagingDirectory)/$(ArtifactName)/
  - task: PublishBuildArtifacts@1
    displayName: Publish Build Artifacts
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'Blueprint-Build'
      publishLocation: 'Container'
But when the pipeline runs I am told this:
There was a resource authorization issue: "The pipeline is not valid. Job Build: Step AndroidSigning input keystoreFile references secure file $(Keystore) which could not be found. The secure file does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz."

This is a limitation of the task itself.
When we test the same task in the classic editor, we find that the Keystore file option cannot be entered manually; we can only select an existing secure file from the drop-down menu.
That is why the task does not recognize '$(Keystore)' as a variable and instead tries to locate a secure file literally named '$(Keystore)'.
To work around this, you could change the task version from 3 to 1, which supports manual input for the keystore file.
As another solution, you could also sign the *.apk from the command line:
Android apk signing: sign an unsigned apk using command line
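For example, here is a minimal sketch of the command-line route, assuming the keystore has already been pulled with DownloadSecureFile@1. The step name signingKeystore, the hardcoded secure file name, the *-unsigned.apk glob, and zipalign/apksigner being on the PATH are all illustrative assumptions; note that, as the related questions below discuss, the secureFile input generally cannot be populated from a runtime matrix variable:
- task: DownloadSecureFile@1
  name: signingKeystore            # hypothetical step name, referenced below via secureFilePath
  displayName: Download keystore
  inputs:
    secureFile: 'flavor1.keystore' # secure file name written out literally
- task: Bash@3
  displayName: Sign .apk from the command line
  inputs:
    targetType: 'inline'
    script: |
      set -e
      APK_DIR=app/build/outputs/apk/$(AppFlavor)/$(Configuration)
      # zipalign must run before apksigner; both ship with the Android build-tools
      zipalign -f 4 "$APK_DIR"/*-unsigned.apk "$APK_DIR/app-aligned.apk"
      apksigner sign \
        --ks "$(signingKeystore.secureFilePath)" \
        --ks-key-alias "$(KeyAlias)" \
        --ks-pass "pass:$(KeystorePass)" \
        --key-pass "pass:$(KeyPass)" \
        --out "$APK_DIR/app-signed.apk" \
        "$APK_DIR/app-aligned.apk"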

You're missing the step to download the secure file. Unlike variable groups, secure files must be explicitly downloaded before they can be accessed by name.
Add something similar to the example task below to your steps to pull the secure file. You can then reference the file through the download step's name, i.e. $(YOUR_SECUREFILE_NAME.secureFilePath):
- task: DownloadSecureFile@1
  displayName: "Download Keyfile 1"
  name: "YOUR_SECUREFILE_NAME"
  inputs:
    secureFile: keyfile1
- task: AndroidSigning@3
  displayName: Signing .apk
  inputs:
    apkFiles: 'app/build/outputs/apk/$(AppFlavor)/$(Configuration)/*.apk'
    apksign: true
    apksignerKeystoreFile: '$(YOUR_SECUREFILE_NAME.secureFilePath)'
    apksignerKeystorePassword: '$(KeystorePass)'
    apksignerKeystoreAlias: '$(KeyAlias)'
    apksignerKeyPassword: '$(KeyPass)'
    zipalign: true

Related

Matrix in Azure DevOps yaml pipeline: The pipeline is not valid

For our mobile app, I am trying to use a matrix to set different pipeline values for Debug and Release:
jobs:
- job: Job_1
  displayName: .Net MAUI Job
  strategy:
    maxParallel: 2
    matrix:
      Debug:
        BuildConfiguration: Debug
        ProvProfile: 'My_Testing_Profile.mobileprovision'
        CertSecureFile: 'ios_development.p12'
        CertPwd: $(IOSP12Password-testing)
      Release:
        BuildConfiguration: Release
        ProvProfile: 'My_Distribution_Profile.mobileprovision'
        CertSecureFile: 'ios_distribution.p12'
        CertPwd: $(IOSP12Password-distribution)
  ...
  - task: InstallAppleCertificate@2
    displayName: Install Apple Certificate
    inputs:
      certSecureFile: $(CertSecureFile)
      certPwd: $(CertPwd)
      setUpPartitionIdACLForPrivateKey: false
      deleteCert: false
      deleteCustomKeychain: false
  - task: InstallAppleProvisioningProfile@1
    displayName: Install Testing Apple Provisioning Profile
    inputs:
      provisioningProfileLocation: 'secureFiles'
      provProfileSecureFile: $(ProvProfile)
  ...
  - task: DotNetCoreCLI@2
    displayName: 'dotnet publish ($(BuildConfiguration))'
    inputs:
      command: 'publish'
      publishWebProjects: false
      projects: 'My_MobileApp.sln'
      arguments: '-f:net6.0-ios -c:$(BuildConfiguration) -r ios-arm64 /p:ArchiveOnBuild=true /p:EnableAssemblyILStripping=false'
      zipAfterPublish: false
      modifyOutputPath: false
IOSP12Password-testing and IOSP12Password-distribution are variables set in the pipeline.
I am getting the following error:
There was a resource authorization issue: "The pipeline is not valid.
Job Job_1: Step InstallAppleCertificate input certSecureFile references secure file $(CertSecureFile) which could not be found. The secure file does not exist or has not been authorized for use.
Job Job_1: Step InstallAppleProvisioningProfile input provProfileSecureFile references secure file $(ProvProfile) which could not be found. The secure file does not exist or has not been authorized for use.
I suspect that CertPwd is also wrong.
I don't understand why it is not working, given that there is no problem with BuildConfiguration at all.
Azure DevOps Services currently does not support using a variable to reference a secure file.
So if your profile and certificate files have already been added to the Library's secure files, you need to write the file names directly into your YAML instead of using variables.
If you do need this feature, you can report a feature request. That allows you to interact directly with the appropriate engineering team and makes it easier for them to collect and categorize your suggestions.
Update:
You could use condition syntax here to repeat the two tasks with different hardcoded values. If you need to choose the value at queue time, you could use parameters; it would look something like this:
parameters:
- name: CertSecureFile
  values:
  - ios_development.p12
  - ios_distribution.p12
- name: ProvProfile
  values:
  - My_Testing_Profile.mobileprovision
  - My_Distribution_Profile.mobileprovision
stages:
- stage: A
  jobs:
  - job: Job_1
    displayName: .Net MAUI Job
    strategy:
      maxParallel: 2
      matrix:
        Debug:
          BuildConfiguration: Debug
          ProvProfile: 'My_Testing_Profile.mobileprovision'
          CertSecureFile: 'ios_development.p12'
          CertPwd: $(IOSP12Password-testing)
        Release:
          BuildConfiguration: Release
          ProvProfile: 'My_Distribution_Profile.mobileprovision'
          CertSecureFile: 'ios_distribution.p12'
          CertPwd: $(IOSP12Password-distribution)
    steps:
    - task: InstallAppleCertificate@2
      condition: eq('${{ parameters.CertSecureFile }}', 'ios_development.p12')
      displayName: Install Apple Certificate
      inputs:
        certSecureFile: ios_development.p12
        certPwd: $(CertPwd)
        setUpPartitionIdACLForPrivateKey: false
        deleteCert: false
        deleteCustomKeychain: false
    - task: InstallAppleProvisioningProfile@1
      condition: eq('${{ parameters.ProvProfile }}', 'My_Testing_Profile.mobileprovision')
      displayName: Install Testing Apple Provisioning Profile
      inputs:
        provisioningProfileLocation: 'secureFiles'
        provProfileSecureFile: My_Testing_Profile.mobileprovision
    - task: InstallAppleCertificate@2
      condition: eq('${{ parameters.CertSecureFile }}', 'ios_distribution.p12')
      displayName: Install Apple Certificate
      inputs:
        certSecureFile: ios_distribution.p12
        certPwd: $(CertPwd)
        setUpPartitionIdACLForPrivateKey: false
        deleteCert: false
        deleteCustomKeychain: false
    - task: InstallAppleProvisioningProfile@1
      condition: eq('${{ parameters.ProvProfile }}', 'My_Distribution_Profile.mobileprovision')
      displayName: Install Distribution Apple Provisioning Profile
      inputs:
        provisioningProfileLocation: 'secureFiles'
        provProfileSecureFile: My_Distribution_Profile.mobileprovision
Update:
If you want to use templates with parameters, you could put the two repeated step sequences, with their fixed values, into two template YAML files in the same repo and branch (for example, template1.yaml and template2.yaml).
Then you could try something like this:
steps:
- ${{ if and(eq(parameters.CertSecureFile, 'ios_development.p12'), eq(parameters.ProvProfile, 'My_Testing_Profile.mobileprovision')) }}:
  - template: template1.yaml
- ${{ else }}:
  - template: template2.yaml
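For reference, template1.yaml could simply be a steps template holding the development-signing steps with the secure file names written out literally, something like the sketch below (template2.yaml would be the distribution counterpart; inputs are taken from the question and may need adjusting):
# template1.yaml (sketch)
steps:
- task: InstallAppleCertificate@2
  displayName: Install Apple Certificate
  inputs:
    certSecureFile: ios_development.p12
    certPwd: $(IOSP12Password-testing)
    setUpPartitionIdACLForPrivateKey: false
    deleteCert: false
    deleteCustomKeychain: false
- task: InstallAppleProvisioningProfile@1
  displayName: Install Testing Apple Provisioning Profile
  inputs:
    provisioningProfileLocation: 'secureFiles'
    provProfileSecureFile: My_Testing_Profile.mobileprovision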

Azure Pipeline Copy Secure file into build folder

I have a vite/svelte project which uses .env files for environment settings. I also have an Azure Pipeline with a secure file, .env.staging, which is on the .gitignore list of the associated repo. I'd like to download this secure file, copy it to my build directory, and then have its contents read when I run vite build --mode staging (well, npm run build:staging, which includes vite build...).
When run locally from my machine, npm run build:staging works as expected and reads the .env.staging file; however, it seems to get ignored when run in the pipeline. Am I doing anything wrong?
Here's my YAML:
trigger:
- main
pool:
  vmImage: 'ubuntu-latest'
steps:
- task: DownloadSecureFile@1
  name: "dotenvStaging"
  inputs:
    secureFile: '.env.staging'
  displayName: "Download .env.staging"
- task: NodeTool@0
  inputs:
    versionSpec: 14.15.4
  displayName: "Install Node.JS"
- task: CopyFiles@2
  inputs:
    contents: "$(Agent.TempDirectory)/.env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"
- script: npm install
  displayName: "npm install"
- script: npm run build:staging
  displayName: "npm run build:staging"
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'dist'
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    #replaceExistingArchive: true
    #verbose: # Optional
    #quiet: # Optional
  displayName: "Create archive"
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    ArtifactName: 'drop'
    publishLocation: 'Container'
  displayName: "Publish archive"
I'm not sure whether CopyFiles@2 is doing what I expect, as it just copies whatever files match the contents parameter, which could be zero files if I'm writing it wrong...
Another note: I also tried using $(dotenvStaging.secureFilePath) as the contents parameter, but that doesn't seem to do anything either.
Naturally, I figured it out as soon as I posted: I needed to update the CopyFiles step to specify sourceFolder; it clearly didn't like my absolute file path in contents.
- task: CopyFiles@2
  inputs:
    sourceFolder: "$(Agent.TempDirectory)"
    contents: ".env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"

Unable to download secure files conditionally in Azure Pipelines

Question
I am using the DownloadSecureFile@1 task to download secure files.
The issue occurs when only file_A.txt exists in the Library's secure files section in Azure DevOps.
The script works fine when both files exist.
In my case, user A will only need file_A.txt and user B will only need file_B.txt.
Is this an expected behavior? Any possible workarounds to fulfill the use-case?
Error Message:
There was a resource authorization issue: "The pipeline is not valid. Job Job: Step fileB input secureFile references secure file file_B.txt which could not be found. The secure file does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz."
Code:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt
pool:
  vmImage: ubuntu-latest
steps:
- task: DownloadSecureFile@1
  displayName: Download File A
  condition: eq('${{ parameters.file_name }}', 'file_A.txt')
  name: fileA
  inputs:
    secureFile: 'file_A.txt'
- task: DownloadSecureFile@1
  displayName: Download file B
  condition: eq('${{ parameters.file_name }}', 'file_B.txt')
  name: fileB
  inputs:
    secureFile: 'file_B.txt'
Is this an expected behavior?
Yes, this is expected behavior. To turn a pipeline into a run, Azure Pipelines goes through several steps in this order:
First, expand templates and evaluate template expressions.
Next, evaluate dependencies at the stage level to pick the first stage(s) to run.
For each stage selected to run, two things happen:
All resources used in all jobs are gathered up and validated for authorization to run.
Evaluate dependencies at the job level to pick the first job(s) to run.
For each job selected to run, expand multi-configs (strategy: matrix or strategy: parallel in YAML) into multiple runtime jobs.
For each runtime job, evaluate conditions to decide whether that job is eligible to run.
Request an agent for each eligible runtime job.
So your secure file references are gathered and validated for authorization before runtime conditions are evaluated. Please refer to the documentation on the pipeline run sequence. As a workaround, you can refer to the sample shared by @danielorn below.
Instead of using a runtime condition on the tasks, you can surround the step with a compile-time if-statement, as described in "Use parameters to determine what steps run":
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt
pool:
  vmImage: ubuntu-latest
steps:
- ${{ if eq(parameters.file_name, 'file_A.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download File A
    name: fileA
    inputs:
      secureFile: 'file_A.txt'
- ${{ if eq(parameters.file_name, 'file_B.txt') }}:
  - task: DownloadSecureFile@1
    displayName: Download file B
    name: fileB
    inputs:
      secureFile: 'file_B.txt'
However, if every user needs exactly one file, a common (and cleaner) option is to provide the name of the required file as a parameter. If no secure file is needed (i.e. the parameter is left at its empty default), the step can be excluded using an if-statement:
parameters:
- name: file_name
  type: string
  default: ''
  values:
  - file_A.txt
  - file_B.txt
pool:
  vmImage: ubuntu-latest
steps:
- ${{ if ne(parameters.file_name, '') }}:
  - task: DownloadSecureFile@1
    displayName: Download Secure File
    name: secureFileDownload
    inputs:
      secureFile: '${{ parameters.file_name }}'
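If a later step then needs the downloaded file, it can be guarded by the same compile-time condition and read the path from the task's secureFilePath output variable, for example (illustrative only):
- ${{ if ne(parameters.file_name, '') }}:
  - script: ls -l "$(secureFileDownload.secureFilePath)"
    displayName: Show downloaded secure file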

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline environments with Approvals and Checks to mandate this.
The issue is that, when using a deployment job to take an artifact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
trigger:
- master
# Don't run against PRs
pr: none
stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      displayName: Install node modules
    - script: |
        npm run build
      displayName: 'Build it'
    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true
  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'
          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"
          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'
          - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps with a PowerShell script like the one below to check where the well-known system variables point:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
      Write-Host "$(System.ArtifactsDirectory)"
You should simply publish the artifacts generated by the build to the drop folder.
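For example, the install job could publish the archive it produces with a step along these lines (a sketch; the artifact name 'drop' is arbitrary):
- task: PublishPipelineArtifact@1
  displayName: 'Publish build archive'
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    artifact: 'drop'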
Kindly check the official doc on artifact selection; it explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none
- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that in a deployment job the download happens automatically into $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify download: none in your steps.

How to convert a classic build job to a YAML build in Azure DevOps

We have a working classic build job in Azure DevOps with a self-hosted agent pool. But when we tried to convert this build job to YAML, no agents get assigned during execution and it just hangs. Could you please correct me if I am doing something wrong here?
Error
"All eligible agents are disabled or offline"
Below is the YAML file converted from the classic build agent job:
pool:
  name: MYpool
  demands: maven
# Your build pipeline references an undefined variable named 'Parameters.mavenPOMFile'. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab. See https://go.microsoft.com/fwlink/?linkid=865972
steps:
- task: Maven@3
  displayName: 'Maven pom.xml'
  inputs:
    mavenPomFile: '$(Parameters.mavenPOMFile)'
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
  inputs:
    SourceFolder: '$(system.defaultworkingdirectory)'
    Contents: '**/*.war'
    TargetFolder: '$(build.artifactstagingdirectory)'
  condition: succeededOrFailed()
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: Root'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: Root
  condition: succeededOrFailed()
- task: CopyFiles@2
  displayName: 'Copy wars to build directory'
  inputs:
    SourceFolder: '$(build.artifactstagingdirectory)/target'
    TargetFolder: '/home/myadmin/builds/$(build.buildnumber)'
- task: CopyFiles@2
  displayName: 'copying docker file to Build Directory'
  inputs:
    SourceFolder: Admin
    TargetFolder: '/home/myadmin/builds/$(build.buildnumber)'
- bash: |
    # Write your commands here
    mv /home/myadmin/builds/$(build.buildnumber)/mypack0.0.1.war /home/myadmin/builds/$(build.buildnumber)/ROOT.war
  displayName: 'Name war file Root.war'
- task: Docker@2
  displayName: 'Build the docker image'
  inputs:
    repository: 'mycontainerregistry.azurecr.io/myservice'
    command: build
    Dockerfile: '/home/myadmin/builds/$(build.buildnumber)/Dockerfile'
    tags: '$(Build.BuildNumber)-DEV'
- bash: |
    # Write your commands here
    docker login mycontainerregistry.azurecr.io
    docker push mycontainerregistry.azurecr.io/myservice:$(Build.BuildNumber)-DEV
  displayName: 'Push Docker Image'
- task: CopyFiles@2
  displayName: 'Copy Deployment file'
  inputs:
    SourceFolder: /home/myadmin/kubernetes
    TargetFolder: '/home/myadmin/builds/$(build.buildnumber)'
- task: qetza.replacetokens.replacetokens-task.replacetokens@3
  displayName: 'Replace image in deployment file'
  inputs:
    rootDirectory: '/home/myadmin/builds/$(build.buildnumber)'
    targetFiles: '**/*.yml'
In my previous answer, I said that after waiting nearly 20-30 minutes, the agent interface would show the message below.
In fact, this is a process that upgrades the agent to the latest version automatically.
Yes, when you use YAML with a private agent, the agent version must be the latest one, whether or not you add the demands.
In our system, the agent version is an implicit demand: the agent must be on the latest version when it is used from YAML.
If this is not satisfied, the job is blocked and the system eventually forces the agent upgrade process to run automatically.
So, to run the YAML pipeline on the private agent successfully, please upgrade the agent to the latest version manually.
Since the discussion between my colleague and me in this ticket is private to Microsoft, sorry that you cannot see that summary. I have taken screenshots of it, which you can refer to here: https://imgur.com/a/4OnzHp3
We are still looking into why the system shows such a confusing message as "All eligible agents are disabled or offline", and I am trying to contribute to making the message clearer, for example: "no agents meet demands: agent version xxx".