YAML - Manage different branches within one location - azure-devops

I'm switching from Classic to YAML for one pipeline, and I would like to only have to update the YAML on the main branch, not on each release branch. It should still trigger off any branch, though.
trigger:
  branches:
    include:
    - refs/heads/main
    - refs/heads/*
  paths:
    include:
    - src
  batch: True
jobs:
- template: templates/code-analysis.yml
- job: Job_2
  displayName: Main Branch
  timeoutInMinutes: 90
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - checkout: self
    submodules: recursive
    fetchDepth: 100
  - template: templates/update-version.yml
  - task: WindowsMachineFileCopy@2
    displayName: Copy files
    continueOnError: True
    inputs:
      SourcePath: src
      MachineNames: server1,server2
      AdminUserName: $(AdminUserName)
      AdminPassword: $(AdminPassword)
      TargetPath: Websites\dev
      AdditionalArguments: $(RoboCopyAdditionalArguments)
  - template: templates/run-and-publish-tests.yml
  - task: PowerShell@2
    displayName: Generate Docs
- deployment: UpdateBeta
  displayName: Beta
  timeoutInMinutes: 90
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/release/22.02'))
  environment: "Beta"
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
          submodules: recursive
          fetchDepth: 100
        - template: templates/update-version.yml
        - task: WindowsMachineFileCopy@2
          displayName: Copy files from src for upcoming release
          continueOnError: True
          inputs:
            SourcePath: src
            MachineNames: server12,server22
            AdminUserName: $(AdminUserName)
            AdminPassword: $(AdminPassword)
            TargetPath: Websites\next
            AdditionalArguments: $(RoboCopyAdditionalArguments)
        - template: templates/run-and-publish-tests.yml
- deployment: UpdateBetaRelease
  displayName: Beta Release
  timeoutInMinutes: 90
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/release/22.01'))
  pool:
    name: Hospital Team
  environment: "Beta Release"
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
          submodules: recursive
          fetchDepth: 100
        - template: templates/update-version.yml
        - task: WindowsMachineFileCopy@2
          displayName: Copy files from src for release
          continueOnError: True
          inputs:
            SourcePath: src
            MachineNames: server111
            AdminUserName: $(AdminUserName)
            AdminPassword: $(AdminPassword)
            TargetPath: Websites\release
            AdditionalArguments: $(RoboCopyAdditionalArguments)
        - template: templates/run-and-publish-tests.yml
But as it stands right now, I must update each branch whenever the servers or arguments change, which adds more complexity than I want; I would prefer to manage all of the branches from the main branch YAML. Is there a better way?

If you want the YAML pipeline to run for a branch, the YAML file that defines the pipeline must exist in that branch.
Currently, there is no way to maintain the pipeline's YAML for multiple branches in only one branch.
I have tried the repository resources feature, and it cannot meet your demands.
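For context, the repository resources approach that falls short looks roughly like this (the repository and template names here are illustrative): shared steps can live in one branch of a templates repository, but the entry-point YAML must still exist in every branch you want the pipeline to trigger on.
resources:
  repositories:
  - repository: templates          # illustrative resource name
    type: git
    name: MyProject/pipeline-templates
    ref: refs/heads/main           # templates are always resolved from main
jobs:
# Shared job definitions come from the templates repo's main branch...
- template: jobs/deploy.yml@templates
# ...but this azure-pipelines.yml itself must still be present (and kept
# up to date) in main and in every release branch that should trigger.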

Related

Azure DevOps - How to ensure working directory

How can I ensure that all stages of my pipeline run in the same working directory?
I have a pipeline that looks like this:
resources:
  repositories:
  - repository: AzureRepoDatagovernance
    type: git
    name: DIF_data_governance
    ref: develop
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - terraform/DIF
variables:
- group: PRD_new_resources
- name: initial_deployment
  value: false
pool: $(agent_pool_name)
stages:
- stage: VariableCheck
  jobs:
  - job: VariableMerge
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - ${{ if eq(variables.initial_deployment, 'false') }}:
      - task: PythonScript@0
        inputs:
          scriptSource: filePath
          scriptPath: DIF-devops/config/dynamic_containers.py
          pythonInterpreter: /usr/bin/python3
          arguments: --automount-path $(System.DefaultWorkingDirectory)/DIF_data_governance/data_ingestion_framework/$(env)/AutoMount_Config.json --variables-path $(System.DefaultWorkingDirectory)/DIF-devops/terraform/DIF/DIF.tfvars.json
        displayName: "Adjust container names in variables.tf.json"
- stage: Plan
  jobs:
  - job: Plan
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform init
        terraform plan -out=outfile -var-file=DIF.tfvars.json
      displayName: "Plan infrastructure changes to $(terraform_folder_name) environment"
- stage: ManualCheck
  jobs:
  - job: ManualCheck
    pool: server
    steps:
    - task: ManualValidation@0
      timeoutInMinutes: 5
      displayName: "Validate the configuration changes"
- stage: Apply
  jobs:
  - job: Apply
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform apply -auto-approve "outfile"
      displayName: "Apply infrastructure changes to $(terraform_folder_name) environment"
How can I make sure that all 4 stages run inside the same working directory, so that I can check out just once and every stage has access to the work done by previous jobs? I know that my pipeline has some flaws that will need to be polished.
This is not possible. Each Azure DevOps stage has its own working directory, and each stage is treated as a separate agent job. Only the steps within a single job share that job's working directory.
If you need to pass code or artifacts between stages, you should use the native publish pipeline artifacts and download pipeline artifacts tasks.
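A minimal sketch of what that looks like between two of the stages above (the artifact name is illustrative):
stages:
- stage: Plan
  jobs:
  - job: Plan
    steps:
    - checkout: self
    # ... run terraform plan here ...
    - publish: $(System.DefaultWorkingDirectory)/terraform/DIF
      artifact: tf-workdir           # illustrative artifact name
- stage: Apply
  jobs:
  - job: Apply
    steps:
    - download: current
      artifact: tf-workdir
    # downloaded files land under $(Pipeline.Workspace)/tf-workdir
    - script: ls -lah $(Pipeline.Workspace)/tf-workdir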

Assistance with first YAML pipeline in Azure DevOps

I'm writing my first Terraform YAML pipeline in Azure DevOps. I define four repositories as resources, check them out, and run Terraform plan. The pipeline succeeds, but only the first Terraform repository runs, judging by what is printed to the screen from that repository's output.tf. The others don't generate any output, even when I define an output variable with a string value. Something like this:
output "rg_module_debug" { value = "rg module ran" }
Here is the pipeline code; any feedback on why the other code isn't running would be appreciated.
name: 'Naming Test'
trigger:
- None
pool:
  vmImage: ubuntu-latest
resources:
  repositories:
  - repository: VariablesRepo # identifier (A-Z, a-z, 0-9, and underscore)
    type: git # git refers to Azure Repos Git repos
    name: AzureTutorial/terraform-azurerm-variables-environment
    ref: main
  - repository: NameRepo # identifier (A-Z, a-z, 0-9, and underscore)
    type: git # git refers to Azure Repos Git repos
    name: AzureTutorial/terraform-azurerm-module-name
    ref: main
  - repository: NamingRepo # identifier (A-Z, a-z, 0-9, and underscore)
    type: git # git refers to Azure Repos Git repos
    name: AzureTutorial/terraform-azurerm-module-naming
    ref: main
  - repository: ResourceGrpRepo # identifier (A-Z, a-z, 0-9, and underscore)
    type: git # git refers to Azure Repos Git repos
    name: AzureTutorial/terraform-azurerm-module-resource_group
    ref: main
stages:
- stage: Install
  jobs:
  - job:
    timeoutInMinutes: 60 # how long to run the job before automatically cancelling
    cancelTimeoutInMinutes: 2 # how much time to give 'run always even if cancelled tasks' before stopping them
    steps:
    - checkout: self
    - checkout: VariablesRepo
    - checkout: NameRepo
    - checkout: NamingRepo
    - checkout: ResourceGrpRepo
    - task: TerraformInstaller@0
      displayName: 'install'
      inputs:
        terraformVersion: 'latest'
    - task: TerraformCLI@0
      displayName: 'terraform init'
      inputs:
        provider: 'azurerm'
        command: 'init'
        workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-module-name'
        #environmentServiceName: TutorialSvcCon
    - task: TerraformCLI@0
      displayName: 'terraform plan'
      inputs:
        provider: 'azurerm'
        command: 'plan'
        workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-module-name'
        #environmentServiceName: TutorialSvcCon
You are checking out multiple repositories, therefore, according to the documentation:
Multiple repositories:
If you have multiple checkout steps in your job, your source code is
checked out into directories named after the repositories, as
subfolders of s in $(Agent.BuildDirectory). If $(Agent.BuildDirectory) is
C:\agent\_work\1 and your repositories are named tools and code, your
code is checked out to C:\agent\_work\1\s\tools and
C:\agent\_work\1\s\code.
Thus, to achieve what you want, you'd have to create one TerraformCLI@0 task per checked-out repository.
Your stages code would therefore be:
...
stages:
- stage: Install
  jobs:
  - job:
    timeoutInMinutes: 60 # how long to run the job before automatically cancelling
    cancelTimeoutInMinutes: 2 # how much time to give 'run always even if cancelled tasks' before stopping them
    steps:
    - checkout: self
    - checkout: VariablesRepo
    - checkout: NameRepo
    - checkout: NamingRepo
    - checkout: ResourceGrpRepo
    - task: TerraformInstaller@0
      displayName: 'install'
      inputs:
        terraformVersion: 'latest'
    - task: TerraformCLI@0
      displayName: 'terraform init'
      inputs:
        provider: 'azurerm'
        command: 'init'
        workingDirectory: '$(System.DefaultWorkingDirectory)/NameRepo'
        #environmentServiceName: TutorialSvcCon
    - task: TerraformCLI@0
      displayName: 'terraform plan'
      inputs:
        provider: 'azurerm'
        command: 'plan'
        workingDirectory: '$(System.DefaultWorkingDirectory)/NameRepo'
        #environmentServiceName: TutorialSvcCon
If, for instance, you want to do the same operation for each repository, you add more tasks pointing to the repository name, as you correctly set up at the beginning:
- task: TerraformCLI@0
  displayName: 'terraform init'
  inputs:
    provider: 'azurerm'
    command: 'init'
    workingDirectory: '$(System.DefaultWorkingDirectory)/VariablesRepo'
    #environmentServiceName: TutorialSvcCon
- task: TerraformCLI@0
  displayName: 'terraform plan'
  inputs:
    provider: 'azurerm'
    command: 'plan'
    workingDirectory: '$(System.DefaultWorkingDirectory)/VariablesRepo'
    #environmentServiceName: TutorialSvcCon
And so forth.
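If the init/plan pair is identical for every repository, a template expression loop over an object parameter can generate the tasks instead of copying them by hand; a minimal sketch, assuming the checkout directories match the names used above:
parameters:
- name: repoDirs
  type: object
  default: [VariablesRepo, NameRepo, NamingRepo, ResourceGrpRepo]

steps:
# one init + plan pair is expanded per entry at template-compile time
- ${{ each dir in parameters.repoDirs }}:
  - task: TerraformCLI@0
    displayName: 'terraform init (${{ dir }})'
    inputs:
      provider: 'azurerm'
      command: 'init'
      workingDirectory: '$(System.DefaultWorkingDirectory)/${{ dir }}'
  - task: TerraformCLI@0
    displayName: 'terraform plan (${{ dir }})'
    inputs:
      provider: 'azurerm'
      command: 'plan'
      workingDirectory: '$(System.DefaultWorkingDirectory)/${{ dir }}'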

How to execute code from two different Azure repositories?

I am a newbie to Azure DevOps and want to execute code from two different repositories, performing a different operation on each repo. For example:
stages:
- stage: Data_Setup
  jobs:
  - job: Data_Setup # Want to perform this operation on repo1
    timeoutInMinutes: 120
    pool:
      vmImage: ubuntu-20.04
    continueOnError: true
    steps:
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports test aggregate
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)
- stage: Run_all_regression_tests # Want to perform this operation on repo2
  jobs:
  - job: Run_all_regression_tests
    timeoutInMinutes: 100
    pool:
      vmImage: ubuntu-20.04
    continueOnError: true
    steps:
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports createJar test aggregate
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)
You add multiple repositories with a resources: repositories: block, and when you want to use a specific repository you check it out in the steps using checkout, like below (copied from the Microsoft documentation):
resources:
  repositories:
  - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
    type: github
    endpoint: MyGitHubServiceConnection
    name: MyGitHubOrgOrUser/MyGitHubRepo
  - repository: MyBitbucketRepo
    type: bitbucket
    endpoint: MyBitbucketServiceConnection
    name: MyBitbucketOrgOrUser/MyBitbucketRepo
  - repository: MyAzureReposGitRepository # In a different organization
    endpoint: MyAzureReposGitServiceConnection
    type: git
    name: OtherProject/MyAzureReposGitRepo

trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- checkout: self
- checkout: MyGitHubRepo
- checkout: MyBitbucketRepo
- checkout: MyAzureReposGitRepository
- script: dir $(Build.SourcesDirectory)
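Applied to the question above, each stage would check out its own repository resource before running Gradle; a minimal sketch (the repository names are illustrative, and this assumes a single checkout step in a job lands directly in $(Build.SourcesDirectory)):
resources:
  repositories:
  - repository: repo1          # illustrative names
    type: git
    name: MyProject/repo1
  - repository: repo2
    type: git
    name: MyProject/repo2

stages:
- stage: Data_Setup
  jobs:
  - job: Data_Setup
    steps:
    - checkout: repo1          # only checkout in this job, so paths need no repo prefix
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports test aggregate
- stage: Run_all_regression_tests
  jobs:
  - job: Run_all_regression_tests
    steps:
    - checkout: repo2
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports createJar test aggregate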

How to read artifacts downloaded in one stage from the next consecutive stage in a YAML pipeline in Azure

As you can see in my YAML below, I have $(Build.SourcesDirectory), which holds artifacts that I download and copy into Azure Blob storage. In the next stage I would like to make use of the $(Build.SourcesDirectory) contents, but I get nothing if I reference $(Build.SourcesDirectory) in that stage. Why is that happening, and how can I fix it?
- task: PublishPipelineArtifact@1
  displayName: 'Publish PS'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)\flow_ps
    artifactName: 'Flow_Tools_PS'
- task: DownloadPipelineArtifact@2
  displayName: Download Flow CodeGen
  inputs:
    artifact: 'Flow_Tools_PS'
    path: '$(Build.SourcesDirectory)/Flow_Tools_PS'
- task: AzureFileCopy@2
  displayName: 'Publish Flow_Tools_PS to Blob'
  inputs:
    SourcePath: '$(Build.SourcesDirectory)/Flow_Tools_PS'
    azureSubscription: 'Azure CICD'
    Destination: AzureBlob
    storage: '$(BlobStorageAccount)'
    ContainerName: '$(BlobContainer)'
    BlobPrefix: '$(BlobPrefix)/$(DeploymentVersion)/Flow_Tools_PS'
    AdditionalArgumentsForBlobCopy: '/V /S'
    outputStorageUri: BlobUri
    outputStorageContainerSasToken: BlobSASToken

- stage: PublishFlowNotebooks
  dependsOn: Flow
  jobs:
  - job: DevOpsScripts
    pool:
      vmImage: 'windows-latest'
    environment: 'Flow'
    steps:
    - checkout: DevOpsScripts
    - powershell: |
        Get-ChildItem $(Build.SourcesDirectory) -Recurse
      name: DebugCheckout
      displayName: Debug script checkout
    - task: DownloadPipelineArtifact@2
      inputs:
        buildVersionToDownload: 'latest'
        targetPath: '$(Pipeline.Workspace)'
    - task: PowerShell@2
      displayName: 'PowerShell Script'
      inputs:
        targetType: filePath
        filePath: '$(System.DefaultWorkingDirectory)/ReleaseNoteScripts/UploadWithReleaseNotes.ps1'
If you want to share some directories between jobs, you should publish the content of that directory:
steps:
- publish: $(Build.SourcesDirectory)
  artifact: Flow
and then download this artifact in the next job:
steps:
- download: current
  artifact: Flow
Be aware that
By default, files are downloaded to $(Pipeline.Workspace)/. If an artifact name was not specified, a sub-directory will be created for each downloaded artifact.
You can read more about this here.
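Put together for the pipeline above, the second stage would read the files from the download location rather than from $(Build.SourcesDirectory); a minimal sketch using the names from the question:
- stage: PublishFlowNotebooks
  dependsOn: Flow
  jobs:
  - job: DevOpsScripts
    steps:
    - download: current
      artifact: Flow_Tools_PS
    # the shorthand download step places files under $(Pipeline.Workspace)/<artifact>
    - powershell: Get-ChildItem '$(Pipeline.Workspace)/Flow_Tools_PS' -Recurse
      displayName: Inspect downloaded artifact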
And if you want to check out multiple repositories, this is possible:
resources:
  repositories:
  - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
    type: github
    endpoint: MyGitHubServiceConnection
    name: MyGitHubOrgOrUser/MyGitHubRepo
  - repository: MyBitbucketRepo
    type: bitbucket
    endpoint: MyBitbucketServiceConnection
    name: MyBitbucketOrgOrUser/MyBitbucketRepo
  - repository: MyAzureReposGitRepository # In a different organization
    endpoint: MyAzureReposGitServiceConnection
    type: git
    name: OtherProject/MyAzureReposGitRepo

trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- checkout: self
- checkout: MyGitHubRepo
- checkout: MyBitbucketRepo
- checkout: MyAzureReposGitRepository
- script: dir $(Build.SourcesDirectory)

Azure Pipeline with multiple drop folders in YAML

I have created a YAML pipeline for Azure deployment. There are many templates, but I will only show the master pipeline to illustrate my issue.
Basically:
the first stage builds from the repository source;
the next stage is pre-deployment, followed by deployment.
The build drops the output files to a drop folder. During pre-deployment, some of these files go through transformations (replacing tokens with values according to the target environment).
The problem is that currently there is only one drop folder, so you can see the problem coming... If I deploy to DEV, the files are transformed using the DEV values. But then if I deploy to INT, the files are already transformed, and I end up deploying files with DEV values to INT.
It gets worse if the DEV and INT deployments run at the same time...
So how can I use a separate drop folder per environment? In pre-deployment, should I copy the drop folder to another location before transformation? In which case, how do I specify the new location in the deployment stages?
Here's the master pipeline for reference:
trigger:
- master

pool:
  name: Default
  demands:
  - npm
  - msbuild
  - visualstudio

stages:
- stage: build
  displayName: 'Build & Test stage'
  jobs:
  - template: templates/pipeline-build/master.yml
    parameters:
      buildConfiguration: 'Release'
      dropFolder: '\\srvbuild\DROP'
- stage: deployDev
  displayName: 'Deploy Dev Stage'
  dependsOn: build
  condition: succeeded()
  jobs:
  - deployment: deploymentjob
    displayName: deployment job
    environment: dev
    variables:
    - template: variables/dev.yml
    strategy:
      runOnce:
        preDeploy:
          steps:
          - template: templates/pipeline-predeploy/master.yml
        deploy:
          steps:
          - template: templates/pipeline-deploy/master.yml
- stage: deployInt
  displayName: 'Deploy Int Stage'
  dependsOn: build
  condition: succeeded()
  jobs:
  - deployment: deploymentjob
    displayName: deployment job
    environment: int
    variables:
    - template: variables/int.yml
    strategy:
      runOnce:
        preDeploy:
          steps:
          - template: templates/pipeline-predeploy/master.yml
        deploy:
          steps:
          - template: templates/pipeline-deploy/master.yml
As a workaround, you can publish the build artifact to a file share, and then download the build artifact through the Download Fileshare Artifacts task in each stage and transform it separately.
- task: PublishPipelineArtifact@1
  displayName: 'Publish Pipeline Artifact'
  inputs:
    artifact: drop
    publishLocation: filepath
    fileSharePath: '***'
Use this task to download fileshare artifacts:
- task: DownloadFileshareArtifacts@1
  inputs:
    filesharePath:
    artifactName:
    #itemPattern: '**' # Optional
    #downloadPath: '$(System.ArtifactsDirectory)'
    #parallelizationLimit: '8' # Optional
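To keep the DEV and INT transformations from touching the same files, each deployment stage can download its own copy to an environment-specific path before transforming it; a minimal sketch, reusing the file-share path and artifact name from above (the downloadPath value is illustrative):
- task: DownloadFileshareArtifacts@1
  displayName: 'Download drop for DEV'
  inputs:
    filesharePath: '\\srvbuild\DROP'
    artifactName: drop
    downloadPath: '$(Pipeline.Workspace)/drop-dev' # per-environment copy; transform tokens here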