I am a newbie to Azure DevOps and want to execute code from two different repositories and perform a different operation on each repo, for example:
stages:
- stage: Data_Setup
  jobs:
  - job: Data_Setup   # Want to perform this operation on repo1
    timeoutInMinutes: 120
    pool:
      vmImage: ubuntu-20.04
    continueOnError: true
    steps:
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports test aggregate
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)
- stage: Run_all_regression_tests   # Want to perform this operation on repo2
  jobs:
  - job: Run_all_regression_tests
    timeoutInMinutes: 100
    pool:
      vmImage: ubuntu-20.04
    continueOnError: true
    steps:
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports createJar test aggregate
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)
You add multiple repositories by declaring them under
resources:
  repositories:
and when you want to use a specific repository you check it out in the steps with "checkout", like below (copied from the Microsoft documentation):
resources:
  repositories:
  - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
    type: github
    endpoint: MyGitHubServiceConnection
    name: MyGitHubOrgOrUser/MyGitHubRepo
  - repository: MyBitbucketRepo
    type: bitbucket
    endpoint: MyBitbucketServiceConnection
    name: MyBitbucketOrgOrUser/MyBitbucketRepo
  - repository: MyAzureReposGitRepository # In a different organization
    endpoint: MyAzureReposGitServiceConnection
    type: git
    name: OtherProject/MyAzureReposGitRepo

trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- checkout: self
- checkout: MyGitHubRepo
- checkout: MyBitbucketRepo
- checkout: MyAzureReposGitRepository
- script: dir $(Build.SourcesDirectory)
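Applied to your example, a minimal sketch could look like the following. It assumes both repositories live in the same Azure DevOps project and uses the placeholder names YourProject, repo1 and repo2 — swap in your actual project and repository names:

resources:
  repositories:
  - repository: repo1          # placeholder: your first repository
    type: git
    name: YourProject/repo1
  - repository: repo2          # placeholder: your second repository
    type: git
    name: YourProject/repo2

stages:
- stage: Data_Setup
  jobs:
  - job: Data_Setup
    steps:
    - checkout: repo1          # this stage works on repo1 only
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports test aggregate
- stage: Run_all_regression_tests
  jobs:
  - job: Run_all_regression_tests
    steps:
    - checkout: repo2          # this stage works on repo2 only
    - task: Gradle@2
      inputs:
        gradleWrapperFile: gradlew
        tasks: cleanReports createJar test aggregate

With a single checkout step in a job, the sources land directly in $(Build.SourcesDirectory); if a job checks out multiple repositories, each one is placed in a sub-directory named after the repository.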
How can I ensure that all stages of my pipeline run in the same working directory?
I have a pipeline that looks like this:
resources:
  repositories:
  - repository: AzureRepoDatagovernance
    type: git
    name: DIF_data_governance
    ref: develop

trigger:
  branches:
    include:
    - main
  paths:
    include:
    - terraform/DIF

variables:
- group: PRD_new_resources
- name: initial_deployment
  value: false

pool: $(agent_pool_name)

stages:
- stage: VariableCheck
  jobs:
  - job: VariableMerge
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - ${{ if eq(variables.initial_deployment, 'false') }}:
      - task: PythonScript@0
        inputs:
          scriptSource: filePath
          scriptPath: DIF-devops/config/dynamic_containers.py
          pythonInterpreter: /usr/bin/python3
          arguments: --automount-path $(System.DefaultWorkingDirectory)/DIF_data_governance/data_ingestion_framework/$(env)/AutoMount_Config.json --variables-path $(System.DefaultWorkingDirectory)/DIF-devops/terraform/DIF/DIF.tfvars.json
        displayName: "Adjust container names in variables.tf.json"
- stage: Plan
  jobs:
  - job: Plan
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform init
        terraform plan -out=outfile -var-file=DIF.tfvars.json
      displayName: "Plan infrastructure changes to $(terraform_folder_name) environment"
- stage: ManualCheck
  jobs:
  - job: ManualCheck
    pool: server
    steps:
    - task: ManualValidation@0
      timeoutInMinutes: 5
      displayName: "Validate the configuration changes"
- stage: Apply
  jobs:
  - job: Apply
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform apply -auto-approve "outfile"
      displayName: "Apply infrastructure changes to $(terraform_folder_name) environment"
How can I make sure that all 4 stages use the same working directory, so I can check out just once and every stage has access to the work done by previous jobs? I know that my pipeline has some flaws that will need to be polished.
This is not possible. Each Azure DevOps stage runs as a separate agent job with its own working directory; only the steps within a single job share the same working directory.
If you need to pass code or artifacts between stages, you should use the native Publish Pipeline Artifacts and Download Pipeline Artifacts tasks.
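A minimal sketch of that pattern (stage and artifact names are illustrative):

stages:
- stage: Build
  jobs:
  - job: Build
    steps:
    - checkout: self
    # publish the checked-out sources so later stages can reuse them
    - publish: $(Build.SourcesDirectory)
      artifact: sources
- stage: Deploy
  dependsOn: Build
  jobs:
  - job: Deploy
    steps:
    # by default the artifact is downloaded to $(Pipeline.Workspace)/sources
    - download: current
      artifact: sources
    - script: ls -lah $(Pipeline.Workspace)/sources
      displayName: 'Inspect downloaded sources'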
I have a separate pipeline in Azure DevOps for Sonar analysis, which scans another repo in Azure DevOps; however, it only scans the master branch. I would like it to scan specific branches, ideally all feature branches or just our test branch.
I've tried a few things but have been unable to get this working. Here is part of my Sonar pipeline code (names and sensitive info removed/changed):
resources:
  repositories:
  - repository: platform-code
    type: git
    name: platform-code

#pipelines:
#  - pipeline: platform-build
#    source: platform-build
#    trigger:
#      branches:
#      - feature/awesomeaddition

#trigger:
#  branches:
#    include:
#    - master
#    - features/*

stages:
- stage: Sonarqube
  variables:
  - template: variables.yaml
  jobs:
  - job: SonarqubeScan
    steps:
    - checkout: self
    - checkout: platform-code
    - task: SonarQubePrepare@4
      displayName: 'SonarQube Prepare'
      inputs:
        SonarQube: ${{ variables.SonarQubeServiceConnectionName }}
        scannerMode: 'CLI'
        configMode: 'manual'
        cliProjectKey: ${{ variables.cliProjectKey }}
        cliProjectName: ${{ variables.cliProjectName }}
        cliSources: '$(Build.SourcesDirectory)/platform-build'
        extraProperties: |
          sonar.branch.name='feature/awesomeaddition'
I am switching from Classic over to YAML for one pipeline, and I would like to only have to update the YAML on the main branch and not the release branches. It should still trigger off any branch, though.
trigger:
  branches:
    include:
    - refs/heads/main
    - refs/heads/*
  paths:
    include:
    - src
  batch: True

jobs:
- template: templates/code-analysis.yml
- job: Job_2
  displayName: Main Branch
  timeoutInMinutes: 90
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - checkout: self
    submodules: recursive
    fetchDepth: 100
  - template: templates/update-version.yml
  - task: WindowsMachineFileCopy@2
    displayName: Copy files
    continueOnError: True
    inputs:
      SourcePath: src
      MachineNames: server1,server2
      AdminUserName: $(AdminUserName)
      AdminPassword: $(AdminPassword)
      TargetPath: Websites\dev
      AdditionalArguments: $(RoboCopyAdditionalArguments)
  - template: templates/run-and-publish-tests.yml
  - task: PowerShell@2
    displayName: Generate Docs
- deployment: UpdateBeta
  displayName: Beta
  timeoutInMinutes: 90
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/release/22.02'))
  environment: "Beta"
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
          submodules: recursive
          fetchDepth: 100
        - template: templates/update-version.yml
        - task: WindowsMachineFileCopy@2
          displayName: Copy files from src for upcoming release
          continueOnError: True
          inputs:
            SourcePath: src
            MachineNames: server12,server22
            AdminUserName: $(AdminUserName)
            AdminPassword: $(AdminPassword)
            TargetPath: Websites\next
            AdditionalArguments: $(RoboCopyAdditionalArguments)
        - template: templates/run-and-publish-tests.yml
- deployment: UpdateBetaRelease
  displayName: Beta Release
  timeoutInMinutes: 90
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/release/22.01'))
  pool:
    name: Hospital Team
  environment: "Beta Release"
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
          submodules: recursive
          fetchDepth: 100
        - template: templates/update-version.yml
        - task: WindowsMachineFileCopy@2
          displayName: Copy files from src for release
          continueOnError: True
          inputs:
            SourcePath: src
            MachineNames: server111
            AdminUserName: $(AdminUserName)
            AdminPassword: $(AdminPassword)
            TargetPath: Websites\release
            AdditionalArguments: $(RoboCopyAdditionalArguments)
        - template: templates/run-and-publish-tests.yml
But as it stands right now, I must update each branch whenever the servers or arguments change, which adds more complexity than I want; I would prefer to manage all of the branches from the main branch YAML. Is there a better way?
If you want the YAML pipeline to run for a branch, the YAML files of the pipeline definition must exist in that branch.
Currently, there is no way to maintain the pipeline's YAML files for multiple branches in only one branch.
I have tried the repository resources feature, and it cannot meet your demands.
I have a pipeline.yml stored in the repo buildresources, and that repo also has a resources.json file:
buildresources repo
  pipeline.yml
  resources.json
Now I want to pass resources.json in the following pipeline.yml, which extends a template in another repo. How do I pass the path of resources.json in the buildresources repo to deploy-iac.yml in the AzureIacPoC/AzureIacPoc repo?
resources:
  repositories:
  - repository: cloud-iac # The name used to reference this repository in the checkout step
    type: git
    name: AzureIacPoC/AzureIacPoc
    ref: refs/heads/features/azure

trigger:
  branches:
    include: [features/*, master]

extends:
  template: deploy-iac.yml@cloud-iac
  parameters:
    resourceGroupName: 'anuj-test-automation'
    location: 'Canada Central'
    csmfile: resources.json
    environment: 'sandbox'
resources:
  repositories:
  - repository: AnotherRepoName
    type: git
    name: YourProjectName/AnotherRepoName

pool:
  vmImage: ubuntu-latest

steps:
- checkout: AnotherRepoName
- script: dir $(Build.SourcesDirectory)
  displayName: 'Run a one-line script'
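As a hedged sketch only: if deploy-iac.yml@cloud-iac treats csmfile as a file path and its job checks out self (so the files of the extending buildresources repo are available on the agent), the parameter could point at the checked-out copy of resources.json:

extends:
  template: deploy-iac.yml@cloud-iac
  parameters:
    resourceGroupName: 'anuj-test-automation'
    location: 'Canada Central'
    # assumption: with a single checkout of self, buildresources lands directly
    # in $(Build.SourcesDirectory); with multiple checkout steps the file would
    # be at $(Build.SourcesDirectory)/buildresources/resources.json instead
    csmfile: $(Build.SourcesDirectory)/resources.json
    environment: 'sandbox'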
In my YAML below, $(Build.SourcesDirectory) holds artifacts that I download and copy to Azure Blob storage. In the next stage I would like to make use of the $(Build.SourcesDirectory) contents, but I get nothing when I reference $(Build.SourcesDirectory) in that stage. Why is that happening, and how can I fix it?
- task: PublishPipelineArtifact@1
  displayName: 'Publish PS'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)\flow_ps
    artifactName: 'Flow_Tools_PS'
- task: DownloadPipelineArtifact@2
  displayName: Download Flow CodeGen
  inputs:
    artifact: 'Flow_Tools_PS'
    path: '$(Build.SourcesDirectory)/Flow_Tools_PS'
- task: AzureFileCopy@2
  displayName: 'Publish Flow_Tools_PS to Blob'
  inputs:
    SourcePath: '$(Build.SourcesDirectory)/Flow_Tools_PS'
    azureSubscription: 'Azure CICD'
    Destination: AzureBlob
    storage: '$(BlobStorageAccount)'
    ContainerName: '$(BlobContainer)'
    BlobPrefix: '$(BlobPrefix)/$(DeploymentVersion)/Flow_Tools_PS'
    AdditionalArgumentsForBlobCopy: '/V /S'
    outputStorageUri: BlobUri
    outputStorageContainerSasToken: BlobSASToken

- stage: PublishFlowNotebooks
  dependsOn: Flow
  jobs:
  - job: DevOpsScripts
    pool:
      vmImage: 'windows-latest'
    environment: 'Flow'
    steps:
    - checkout: DevOpsScripts
    - powershell: |
        Get-ChildItem $(Build.SourcesDirectory) -Recurse
      name: DebugCheckout
      displayName: Debug script checkout
    - task: DownloadPipelineArtifact@2
      inputs:
        buildVersionToDownload: 'latest'
        targetPath: '$(Pipeline.Workspace)'
    - task: PowerShell@2
      displayName: 'PowerShell Script'
      inputs:
        targetType: filePath
        filePath: '$(System.DefaultWorkingDirectory)/ReleaseNoteScripts/UploadWithReleaseNotes.ps1'
If you want to share a directory between jobs, you should publish the content of that directory:
steps:
- publish: $(Build.SourcesDirectory)
  artifact: Flow
and then in the next job download this artifact:
steps:
- download: current
  artifact: Flow
Be aware that
By default, files are downloaded to $(Pipeline.Workspace)/. If an artifact name was not specified, a sub-directory will be created for each downloaded artifact.
You can read more about this here.
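For the artifact above, that means the next job would look for the files under $(Pipeline.Workspace)/Flow, for example:

steps:
- download: current
  artifact: Flow
# the published directory is now available at $(Pipeline.Workspace)/Flow
- powershell: Get-ChildItem "$(Pipeline.Workspace)/Flow" -Recurse
  displayName: 'Inspect downloaded artifact'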
And if you want to check out multiple repositories, this is possible:
resources:
  repositories:
  - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
    type: github
    endpoint: MyGitHubServiceConnection
    name: MyGitHubOrgOrUser/MyGitHubRepo
  - repository: MyBitbucketRepo
    type: bitbucket
    endpoint: MyBitbucketServiceConnection
    name: MyBitbucketOrgOrUser/MyBitbucketRepo
  - repository: MyAzureReposGitRepository # In a different organization
    endpoint: MyAzureReposGitServiceConnection
    type: git
    name: OtherProject/MyAzureReposGitRepo

trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- checkout: self
- checkout: MyGitHubRepo
- checkout: MyBitbucketRepo
- checkout: MyAzureReposGitRepository
- script: dir $(Build.SourcesDirectory)