I'm working with Azure multi-stage pipelines, using deployment jobs with templates in a separate repo. I'm now starting to use ARM templates in my deployment process and want to run ARM templates that are located in a different repository as well. This is where I get a little stuck; any help/advice is appreciated.
To illustrate my setup:
Repo A -> Source code that has to be built and deployed to Azure
Repo B -> Azure pipeline templates (only consists of yml files)
Repo C -> ARM templates
So what I want to accomplish: A uses B uses C.
REPO A: Documentation build and release yml
resources:
  repositories:
  - repository: templates
    type: git
    name: <ACCOUNT>/Azure.Pipelines.Templates
    ref: refs/tags/2.2.40

stages:
- stage: Build
  jobs:
  - template: src/jobs/doc-build.yml@templates

- stage: DEV
  jobs:
  - template: src/deployment-jobs/doc.yml@templates
....
REPO B: Documentation deployment
parameters:
  webAppName: ''
  connectedServiceName: 'DEV'

jobs:
- deployment: doc_deploy
  pool:
    name: 'DOC'
  environment: 'doc'
  strategy:
    runOnce:
      deploy:
        steps:
        - template: ../deployment/arm-template.yml
          parameters:
            connectedServiceName: ${{ parameters.connectedServiceName }}
            resourceGroupName: 'documentation'
            templateFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation.jsonc
            paramFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation-params.json
            parameters: -name ${{ parameters.webAppName }}
...
REPO C: contains ARM template + param file
The issue I'm facing is that I can't seem to get to the files of repo C. I tried adding another repository entry at multiple levels, but it does not seem to clone the dependent repo at all.
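For what it's worth, declaring a repository resource only makes the repo available for template references; it is not cloned unless the job contains an explicit checkout step. A minimal sketch, assuming repo C is registered in repo A's pipeline under the hypothetical alias armtemplates:

resources:
  repositories:
  - repository: armtemplates            # hypothetical alias for repo C
    type: git
    name: <ACCOUNT>/Azure.ARM.Templates

# ...and inside the deploy steps of the deployment job in repo B's template:
steps:
- checkout: armtemplates                # without this, the repository resource is never cloned onto the agent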
My current workaround/solution:
Use a PowerShell script to manually clone repo C and directly reference the files on disk.
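For reference, that clone step looks roughly like this (a sketch, assuming both repos live in the same organization/project, hypothetical <ACCOUNT>/<PROJECT> placeholders, and that the build service account has read access to repo C):

- powershell: |
    # clone repo C next to the sources so the template paths above resolve
    git clone "https://build:$(System.AccessToken)@dev.azure.com/<ACCOUNT>/<PROJECT>/_git/Azure.ARM.Templates" "$(Build.SourcesDirectory)/Azure.ARM.Templates"
  displayName: 'Clone ARM templates repo'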
Related github issue: https://github.com/microsoft/azure-pipelines-yaml/issues/103
I've also stumbled upon this issue, having to load ARM templates from another repo into the current build. What I did was set up a build on the repo containing the ARM templates, producing a build artifact with the following azure-pipelines.yml (this would be your repo C):
trigger:
- master

steps:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)/templates'
    ArtifactName: 'templates'
    publishLocation: 'Container'
Afterwards I could add the following step to the actual pipeline:
- task: DownloadPipelineArtifact@2
  displayName: 'Get ARM Templates'
  inputs:
    buildType: 'specific'
    project: '<YOUR-PROJECT-ID>'
    definition: '<ARM-BUILD-DEFINITION-ID>'
    buildVersionToDownload: 'latest'
    artifactName: 'templates'
    targetPath: '$(Pipeline.Workspace)/templates'
and I was able to access the files as follows:
- task: AzureResourceGroupDeployment@2
  displayName: 'Create Queues $(ResourceGroup.Name)'
  inputs:
    azureSubscription: '<YOUR-SUBSCRIPTION>'
    resourceGroupName: '$(ResourceGroup.Name)'
    location: '$(ResourceGroup.Location)'
    csmFile: '$(Pipeline.Workspace)/templates/servicebus.json'
For more information, check out the documentation for the Download Pipeline Artifact task.
I have a shared Azure pipeline YAML definition with the purpose of defining one CodeAnalysis pipeline per repository.
How can I define the repository name dynamically?
I tried with name: '$(projectName)' which leads to the error:
The repository $(projectName) in project 8ab9d22b-6819-483b-829d-******* could not be retrieved. Verify the name and credentials being used.
azure-pipelines.yml
resources:
  repositories:
  - repository: codeAnalysisRepo
    type: git
    name: shared/codeanalysis
  - repository: SourceRepo
    type: git
    name: '$(projectName)'

jobs:
- job: 'BackendCodeAnalysis'
  pool:
    name: '$(AgentPool)'
  steps:
  - checkout: SourceRepo
    clean: true
  - template: sonarqube_msbuild_prepare.yml@codeAnalysisRepo
    parameters:
      projectKey: '$(project)'
      projectName: '$(project)'
  - task: DotNetCoreCLI@2
    displayName: "build DestRepo"
    inputs:
      command: 'build'
      projects: '$(Build.Repository.LocalPath)/**/*.csproj'
      configuration: Release
  - template: sonarqube_execute.yml@codeAnalysisRepo
It works when I hardcode the name.
Currently, setting parameters and variables is not supported in resources -> repositories.
As a workaround, you could set this at the checkout step. Here is a sample: Check out multiple repositories in your pipeline - Azure Pipelines | Microsoft Docs. Please note that the repos should be in the same organization.
resources:
  repositories:
  - repository: Repo1
    type: git
    name: Artifacts/Repo1

jobs:
- job: 'BackendCodeAnalysis'
  pool:
    vmImage: windows-latest
  steps:
  - checkout: git://$(projectName)
    clean: true
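The inline checkout syntax also accepts an explicit project and ref, for example (a sketch, assuming the repo lives in the current team project and you want a specific branch):

- checkout: git://$(System.TeamProject)/$(projectName)@refs/heads/develop
  clean: true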
For your request, you could create a suggestion ticket via: https://developercommunity.visualstudio.com/report?space=21&entry=problem.
I have created a pipeline using only YAML.
I have defined the deployment part like this:
- stage: AzureDevOpsStaging
  displayName: Deploy build artifacts to staging environment
  dependsOn: BuildSolution
  condition: succeeded('BuildSolution')
  jobs:
  - deployment: DeployArtifacts
    displayName: Deploy artifacts
    environment:
      name: AzureDevOpsStaging
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: drop
          - task: IISWebAppDeploymentOnMachineGroup@0
            displayName: Deploy artifacts to IIS
            inputs:
              webSiteName: 'mysite-staging'
              package: '$(Pipeline.Workspace)\drop\*.zip'
              xmlTransformation: true
When I run this I get:
##[warning]Unable to apply transformation for the given package. Verify the following.
##[warning]1. Whether the Transformation is already applied for the MSBuild generated package during build. If yes, remove the <DependentUpon> tag for each config in the csproj file and rebuild.
##[warning]2. Ensure that the config file and transformation files are present in the same folder inside the package.
Things that I've checked:
- Both Web.config and Web.AzureDevOpsStaging.config are in the zip/artifact.
- Name of the stage - the docs say the stage must have the same name as your transform config file, that is: Web.AzureDevOpsStaging.config.
- Name of the .config transform file - the name of the transform file is Web.AzureDevOpsStaging.config.
- Name of the environment - the docs don't say the name has to be the same as Web.ThisPart.config, but I still named the environment AzureDevOpsStaging just in case.
But again, doing all of the above results in the Web.config not being transformed.
I got it to work by using the File Transform task instead, which is referenced in the docs from the IIS Web App Deploy task:
- stage: AzureDevOpsStaging
  displayName: Deploy build artifacts to staging environment
  dependsOn: BuildSolution
  condition: succeeded('BuildSolution')
  jobs:
  - deployment: DeployArtifacts
    displayName: Deploy artifacts
    environment:
      name: AzureDevOpsStaging
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: drop
          - task: FileTransform@1
            inputs:
              folderPath: '$(Pipeline.Workspace)\drop\*.zip'
              enableXmlTransform: true
              xmlTransformationRules: '-transform **\*.AzureDevOpsStaging.config -xml **\*.config'
          - task: IISWebAppDeploymentOnMachineGroup@0
            displayName: Deploy artifacts to IIS
            inputs:
              webSiteName: 'mysite-staging'
              package: '$(Pipeline.Workspace)\drop\*.zip'
So can someone please explain to me how I am supposed to configure my YAML to get it to work using only the IISWebAppDeploymentOnMachineGroup@0 task?
And if this is not possible, am I using the FileTransform@1 task properly?
Also, I saw there is a FileTransform@2 version as well. That task didn't have one of the properties that @1 has, so I reverted to using v1 instead. But it would be great if someone has a bit more info on this newer version and whether it's going to deprecate @1 in the future.
Btw, I also got xmlTransformation: true to work with a classic release pipeline under the Releases tab in Azure DevOps using the UI. But again, I don't want to use the classic approach; I want to do everything in YAML.
And if this is not possible, am I using the FileTransform@1 task properly?
The answer is yes.
The FileTransform task is the one I use frequently.
When I use it in the YAML pipeline as you have set it up:
- task: FileTransform@1
  displayName: 'File Transform'
  inputs:
    folderPath: '$(Pipeline.Workspace)\drop\*.zip'
    enableXmlTransform: true
    xmlTransformationRules: '-transform **\*.UAT.config -xml **\*.config'
    fileType: xml
It works fine on my side.
In order to perform the transformation correctly, you need to ensure that the syntax in the config files is correct and that the specified directory is correct.
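As a side note, if the same template deploys to several environments, the transformation rule can be driven by a parameter instead of hardcoding the environment name (a sketch, assuming a hypothetical environment template parameter):

- task: FileTransform@1
  displayName: 'File Transform (${{ parameters.environment }})'
  inputs:
    folderPath: '$(Pipeline.Workspace)\drop\*.zip'
    enableXmlTransform: true
    xmlTransformationRules: '-transform **\*.${{ parameters.environment }}.config -xml **\*.config'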
I have 30 or so Java microservices that run off the same CI and CD templates, i.e. each of my microservices has a build pipeline as follows, and as shown below it runs automatically on a merge to master:
name: 0.0.$(Rev:r)

trigger:
- master

pr: none

variables:
- group: MyCompany

resources:
  repositories:
  - repository: templates
    type: git
    name: <id>/yaml-templates

stages:
- stage: Build
  jobs:
  - job: build
    displayName: Build
    steps:
    - template: my-templates/ci-build-template.yaml@templates

- stage: PushToContainerRegistry
  dependsOn: Build
  condition: succeeded()
  jobs:
  - job: PushToContainerRegistry
    displayName: PushToContainerRegistry
    steps:
    - template: my-templates/ci-template.yaml@templates
Where ci-build-template.yaml contains...
steps:
- checkout: self
  path: s
- task: PowerShell@2
- task: Gradle@2
  displayName: 'Gradle Build'
- task: SonarQubePrepare@4
  displayName: SonarQube Analysis
- task: CopyFiles@2
  displayName: Copy build/docker to Staging Directory
I would like to implement PR build validation whenever someone raises a PR to merge into master. In the PR build, only the Build stage should run, and from the build template only some of the tasks within ci-build-template.yaml should run.
A few questions for my learning:
How can I uplift the YAML pipeline above to make the "PushToContainerRegistry" stage skip if it is a PR build?
How can I uplift ci-build-template.yaml to make the "SonarQubePrepare@4" and "CopyFiles@2" tasks skip if it is a PR build?
And lastly, how can I uplift the YAML pipeline above to enable build validation for all PRs that have a target branch of master?
Whilst doing my own research I know you can do this via ClickOps, but I am keen on learning how to implement it via YAML.
Thanks
How can I uplift the YAML pipeline above to make the "PushToContainerRegistry" stage skip if it is a PR build?
How can I uplift ci-build-template.yaml to make the "SonarQubePrepare@4" and "CopyFiles@2" tasks skip if it is a PR build?
You just need to use a condition on the task:
For example,
pool:
  name: Azure Pipelines

steps:
- script: |
    echo Write your commands here
    echo Hello world
    echo $(Build.Reason)
  displayName: 'Command Line Script'
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
The definition above will skip the step if a pull request triggered the pipeline.
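The same condition can be hoisted to the stage level to skip PushToContainerRegistry entirely for PR builds (a sketch based on the pipeline above):

- stage: PushToContainerRegistry
  dependsOn: Build
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
  jobs:
  - job: PushToContainerRegistry
    displayName: PushToContainerRegistry
    steps:
    - template: my-templates/ci-template.yaml@templates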
Please refer to these documents:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=yaml#using-the-trigger-type-in-conditions
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#build-variables-devops-services
And lastly, how can I uplift the YAML pipeline above to enable build validation for all PRs that have a target branch of master?
You can use this expression in the condition:
eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master')
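For example, to run a job or stage only for PR builds that target master, the two variables can be combined in one condition (a sketch; the PrValidation job name is hypothetical):

- job: PrValidation
  condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))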
If you are using an Azure DevOps Git repo, then you just need to add a branch policy:
https://learn.microsoft.com/en-us/azure/devops/repos/git/branch-policies?view=azure-devops&tabs=browser#configure-branch-policies
I've a CD pipeline that builds a project (multiple times for different environments) and publishes/saves the ./dist directories as one stage. I can download each environment's build and run it locally as expected.
Each environment build is a stage that needs a manual approval. This is where I am getting lost. Each stage shows the correct artifact being pulled into the stage, BUT the AzureStaticWebApp@0 -> app_location input results in a "Could not detect this directory." error.
To recap:
After building the project and saving it as an artifact (which I can manually download and verify), I am unable to push that built code to Azure Static Web Apps as it cannot be found. I've tried any number of combinations to no effect. Any advice?
I'm using templates; here is the "Push Built Project to Azure Static Web Apps" template.
When this template runs, I can see jobs running and successfully pulling down the right artifact with this output:
Successfully downloaded artifacts to /home/vsts/work/1/
Finishing: Download Artifact
But the AzureStaticWebApp@0 task gives this error:
App Directory Location: '/home/vsts/work/1/DEV' is invalid. Could not detect this directory. Please verify your deployment configuration file reflects your repository structure.
parameters:
- name: environment
  default: development
  type: string
- name: variableGroup
  default: development-variables-group
  type: string

jobs:
- deployment:
  displayName: 'Deploy to'
  environment: ${{ parameters.environment }}
  variables:
  - group: ${{ parameters.variableGroup }}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - task: AzureStaticWebApp@0
          inputs:
            app_location: '$(Pipeline.Workspace)/DEV'
            api_location: 'api'
            output_location: 'dist'
            skip_app_build: true
          env:
            azure_static_web_apps_api_token: $(deployment-token)
EDIT
Does the task AzureStaticWebApp not have access to anything outside the project?
- deployment:
  displayName: 'Deploy to'
  environment: ${{ parameters.environment }}
  variables:
  - group: ${{ parameters.variableGroup }}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
          submodules: true
        # This step pulls down a compiled site, e.g. DEV/index.html, ./images, staticwebapp.config.json
        # It produces output like:
        #   Downloading DEV/index.html to /home/vsts/work/1/DEV/index.html
        #   Successfully downloaded artifacts to /home/vsts/work/1/
        - download: current
          artifact: DEV
        - task: AzureStaticWebApp@0
          inputs:
            # I've tried many different values for app_location but all return the not found error
            app_location: '/DEV'
            api_location: 'api'
            output_location: 'dist'
            skip_app_build: true
          env:
            azure_static_web_apps_api_token: $(deploymenttoken)
Solved -- well, I found a way to make it work.
- The build step creates the 'app/dist' directory and its content.
- Only the 'app/dist' folder is published as an artifact.
- When downloading the artifact you need to 'put it back' into the project; in this case DEV/ -> app/dist.
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: DEV
    path: ./app/dist # Put the build artifact back into the project
  displayName: "Download artifacts"

- task: AzureStaticWebApp@0
  inputs:
    app_location: 'app/dist'
    api_location: 'api'
    output_location: 'dist'
    skip_app_build: true
  env:
    azure_static_web_apps_api_token: $(deploymenttoken)
app_location specifies the root of your application code. The property should point to a location in your repo.
Check the documentation here: https://learn.microsoft.com/en-us/azure/static-web-apps/publish-devops
Also, https://github.com/Azure/static-web-apps/issues/5#issuecomment-855309544
There are two other solutions described in #552 that do not require adding the DownloadPipelineArtifact@2 step.
The solution that worked for me was to set workingDirectory to the pipeline's artifact location, and then set the app_location value relative to the workingDirectory.
For example, if your artifact was downloaded to $(Pipeline.Workspace)/MyBuild/drop
Your properties would be:
app_location: /drop
workingDirectory: $(Pipeline.Workspace)/MyBuild
yml snippet:
- task: AzureStaticWebApp@0
  displayName: Publish Static Web App
  inputs:
    app_location: /drop # The name of your artifact
    output_location: "" # Leave this empty
    skip_app_build: true
    azure_static_web_apps_api_token: $(deployment-token)
    # The path where your artifact was downloaded
    workingDirectory: $(Pipeline.Workspace)/MyBuild
This is really confusing because every other pipeline task accepts absolute paths, but AzureStaticWebApp@0 requires a relative path for app_location.
I am developing an Azure pipeline task template, and I have a large .py file that I want to be executed in one step:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: inline
    script: |
      ... really long python code
Is it possible to store the code in another file, at the same level as the yml template, and consume it from there? Or what would be the best approach to keep the template clean?
I know that it's possible to use scriptSource:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'my_python.py'
    arguments: '${{ parameters.my_param }}'
But as the template is in another repository than the one run in the pipeline, I don't think I can reach that my_python.py without downloading it with wget, cloning, or doing additional steps. Am I right?
Regards!
To use a template from another repo you need to define a repository resource, like here:
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
  repositories:
  - repository: templates
    type: github
    name: Contoso/BuildTemplates

steps:
- template: common.yml@templates # Template reference
Once you have that, you just need to check out this repo:
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
  repositories:
  - repository: templates
    type: github
    name: Contoso/BuildTemplates

steps:
- checkout: self
- checkout: templates # this downloads the whole repo
- template: common.yml@templates # Template reference
Now you need to figure out where it is downloaded ;)
Multiple repositories: If you have multiple checkout steps in your job, your source code is checked out into directories named after the repositories, as a subfolder of s in $(Agent.BuildDirectory). If $(Agent.BuildDirectory) is C:\agent\_work\1 and your repositories are named tools and code, your code is checked out to C:\agent\_work\1\s\tools and C:\agent\_work\1\s\code.
So if you have a script in the scripts folder of the templates repo, you will find it at $(Agent.BuildDirectory)\templates\scripts\script.py.
So then you can use it like this:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: 'filePath'
    scriptPath: '$(Agent.BuildDirectory)\templates\scripts\script.py'
    arguments: '${{ parameters.my_param }}'