In an Azure DevOps multi-stage YAML pipeline, I have defined two repository resources, Demo2 and Demo3, under the resources section. I want to access the changes between builds for the repos Demo2 and Demo3. The pipeline summary page has a "view changes" option that shows the commits from each repo, and I am trying to get those details via the REST API.
I searched the Azure DevOps REST API documentation and the az devops CLI but couldn't find anything helpful, so I am reaching out here for help.
resources:
  repositories:
  - repository: Demo2
    type: git
    name: 'Test/Repo2'
  - repository: Demo3
    type: git
    name: 'Test/Repo3'

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- checkout: Demo2
- checkout: Demo3
- script: echo Hello, world!
  displayName: 'Run a one-line script'
Azure DevOps REST API to access resources in YAML pipelines
I am afraid there is no documented REST API that returns the details behind the "view changes" option.
But we can use the browser's F12 developer tools to grab the URL it calls:
https://dev.azure.com/{organization}/{project}/_traceability/runview/changes?currentRunId={Build Id}
The response comes back as HTML; after converting it to JSON we can get some info about the commits.
If we need the content of a commit, we can use the REST API Commits - Get to retrieve the details.
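For example, once a commit ID is known, the Commits - Get call can be scripted from a pipeline step. This is only a sketch: the values in braces are placeholders you need to fill in, and it assumes the job access token has read access to the repository:

```yaml
steps:
- script: |
    # {organization}, {project}, {repositoryId} and {commitId} are placeholders;
    # the commit IDs come from the "view changes" response described above.
    curl -s -u ":$SYSTEM_ACCESSTOKEN" \
      "https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}?api-version=6.0"
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'Get commit details via Commits - Get'
```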
Related
I would like to know how to consume or call Terraform modules from a project in one Azure DevOps organization in a project in another organization. I explored a few options and found the solution below, but my IT team will not let us use it because it breaks subsequent pipelines. Any suggestions, please?
Also, the requirement is only to reference the Terraform modules that live in the other organization, but in my POC the code is downloaded/checked out from that organization/project first, and only then can I reference the modules. I would like to reference the modules directly instead of checking out the code from the other organization.
Below is the reply from the pipeline team:
Can you exclude this part, as it is not ideal, and take a different approach?
echo "Git config update start"
MY_PAT=$(yourPAT)
B64_PAT=$(printf "%s"":$MY_PAT" | base64)
git config --global http.extraheader "Authorization: Basic ${B64_PAT}"
echo "Git config update end"
terraform init
terraform plan
You are introducing your credential into .gitconfig, which breaks all subsequent pipelines on the agent.
POC: the code below clones the entire modules repo from the other organization, after which we reference the modules, but I only need to reference the modules directly instead of downloading the code and then calling/referencing them.
resources:
  repositories:
  - repository: Modules
    type: git
    name: 'Compute Platforms/CES-Terraform-Automation-Service'
    endpoint: Repo-bp-digital # Azure DevOps service connection
    ref: Modules
  - repository: self
    type: git
    name: 'Cloud Onboarding/terraform-testing-by-vivek'
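One way to meet the pipeline team's objection is to avoid writing the credential into the global .gitconfig at all and instead pass the auth header per git invocation, so nothing persists on the agent. A sketch, reusing the hypothetical yourPAT secret variable from the script above; the URL parts in braces are placeholders:

```yaml
steps:
- script: |
    B64_PAT=$(printf "%s" ":$MY_PAT" | base64)
    # -c applies http.extraHeader to this single git command only; the agent's
    # global .gitconfig is never touched, so subsequent pipelines are unaffected.
    git -c http.extraHeader="Authorization: Basic ${B64_PAT}" \
      clone https://dev.azure.com/{organization}/{project}/_git/{repository}
  env:
    MY_PAT: $(yourPAT)
  displayName: 'Clone with a non-persistent credential'
```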
AFAIK, there is only one way to connect to a project in another Azure DevOps organization: create a service connection in the organization from which you want to run the pipeline, create a PAT in the target organization, and reference that PAT in the service connection.
I created two organizations, alpha1 and beta2, each with one project containing a YAML pipeline and a task.
I created a PAT in organization beta2.
I then created a service connection in the alpha organization (from where the pipeline runs) that references the PAT from the beta organization, like below:
trigger:
- master

variables:
  pythonVersion: '3.8'
  vmImageName: 'ubuntu-latest'

resources:
  repositories:
  - repository: remoteRepo
    type: git
    name: remote-access/shared-common-install
    endpoint: remoteaccesstemp # Service connection name
    ref: refs/heads/main

stages:
- stage: remote_git_test
  jobs:
  - job: git_test
    steps:
    # Run the template from the same repository
    - template: templates/hello-alpha.yaml
    # Checkout the remote repository
    - checkout: remoteRepo
      persistCredentials: true
    # Call the template that is located in a repository in another organization
    - template: templates/hello-beta.yaml@remoteRepo
Alternatively, you can create a Terraform task in Azure DevOps and point it at the other organization with a script like the one below:
terraform init -backend-config="repository=organization-beta2/project-beta2/_git/beta-2" -backend-config="token=Pat-token"
and
provider "azuredevops" {
  org_service_url       = var.org_service_url
  personal_access_token = var.personal_access_token
}
You can add this code to your terraform init script in the repo of the organization from which you run the pipeline, and reference the template in System.Artifacts.
Note that even the Azure DevOps REST API does not support connecting to a different Azure DevOps organization with the same credentials.
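If the goal is only to consume the modules without an explicit checkout step, Terraform's native Git module source may also be worth a look; terraform init then fetches just the referenced module path rather than requiring a pipeline checkout. A sketch with placeholder names (credentials still have to be supplied, for example via an http.extraHeader as above):

```hcl
# Placeholder org/project/repo names; "//modules/example" selects a
# subdirectory of the repo and "?ref=main" pins the branch or tag to use.
module "shared_example" {
  source = "git::https://dev.azure.com/{organization}/{project}/_git/{repository}//modules/example?ref=main"
}
```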
References:-
GitHub - Azure-Samples/azure-pipelines-remote-tasks
Trying to setup an Azure DevOps organization using Terraform :: my tech ramblings — A blog for writing about my techie ramblings By Carlos
Azure DevOps Git: Fork into another Repo using Azure DevOps REST API - Stack Overflow By Andi Li-MSFT
Do you think it is possible to reference the same Bitbucket repository under two different names with two different path filters, in order to trigger two different stages? I want to build and publish seven different microservices on K8s, but there are only three repos (divided into subfolders). Do you think this is achievable? The idea is to create a separate template block for each microservice, but during the template checkout to check out only the three macro-repositories (for dev purposes). Let me show you my idea.
resources:
  repositories:
  ############## 3 BITBUCKET BIG REPOS ##############
  - repository: omni-omsf-api
    type: bitbucket
    endpoint: OMNI-OMSF-BitBucket-SC
    name: ovsdev/omni-omsf-api
    trigger: none
  - repository: omni-omsf-extension
    type: bitbucket
    endpoint: OMNI-OMSF-BitBucket-SC
    name: ovsdev/omni-omsf-extension
    trigger: none
  - repository: omni-omsf-core
    type: bitbucket
    endpoint: OMNI-OMSF-BitBucket-SC
    name: ovsdev/omni-omsf-core
    trigger: none
  ###################################################
  ######### 7 SUB-REPOS ONLY FOR TRIGGERING #########
  - repository: ovs-api-service
    type: bitbucket
    endpoint: OMNI-OMSF-BitBucket-SC
    name: ovsdev/omni-omsf-api
    trigger:
      branches:
        include:
        - release_qa
        - master
      paths:
        include:
        - ovs-api-service/*
  ###################################################
stages:
- ${{ if or( and( eq(parameters.ovsapiservice, true), eq(variables['Build.Reason'], 'Manual') ), eq(variables['Build.Repository.Name'], 'ovs-api-service') ) }}:
  - template: microservice-buildRelease.template.yml
    parameters:
      dockerFilePath: omni-omsf-api/ovs-api-service/Dockerfile
      dockerImageName: ovs-api-service
      tag: $(Build.BuildId)
      microservicename: ovs-api-service
- ${{ else }}:
  - stage:
    jobs:
    - job:
      steps:
      - task: Bash@3
        displayName: Showing folder hierarchy
        inputs:
          targetType: 'inline'
          script: |
            tree $(Pipeline.Workspace)
Can I reference the same Bitbucket repository under two different names?
The answer is yes: different aliases can refer to the same repository.
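For example (a sketch; the alias names are hypothetical), both resources below point at the same Bitbucket repository and can be checked out independently:

```yaml
resources:
  repositories:
  - repository: apiRepoA                # first alias
    type: bitbucket
    endpoint: OMNI-OMSF-BitBucket-SC
    name: ovsdev/omni-omsf-api
  - repository: apiRepoB                # second alias, same underlying repo
    type: bitbucket
    endpoint: OMNI-OMSF-BitBucket-SC
    name: ovsdev/omni-omsf-api

steps:
- checkout: apiRepoA
- checkout: apiRepoB
```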
Can I enable custom triggering in Azure DevOps Services YAML?
I read the YAML definition you provided; if you are talking about the trigger of the resources section, then the answer is no.
It should be pointed out that the usage you are attempting does not exist.
Please check these official articles; both of them mention this point:
resources.repositories.repository definition
trigger: trigger # CI trigger for this repository, no CI trigger if skipped (only works for Azure Repos).
Triggers Usage in resources
Repository resource triggers only work for Azure Repos Git repositories in the same organization at present. They do not work for GitHub or Bitbucket repository resources.
So triggering via the resources section of YAML cannot be achieved here; you can only configure the trigger on the Bitbucket side, and the condition has to be based on the ordinary triggers available there.
I have a build that uses the Azure DevOps REST api to do analysis across a collection of repositories in a single Azure DevOps project.
To speed up the build, it only checks out a single repository containing certain build utilities, such as powershell scripts - the rest of the analysis is done via querying specific information via the REST api.
NOTE: This build is running on Azure DevOps Server 2020, which still calls the setting "Limit job authorization scope to referenced Azure DevOps repositories". Looking at doc history, I believe this is equivalent to "Protect access to repositories in YAML pipeline", I use the more recent term below.
This all worked fine until "Protect access to repositories in YAML pipelines" was turned on. With that setting turned on, the REST api only returns information about the repository containing the build utilities. This is due to the reduced scope of the Job Access Token (see Protect access to repositories in YAML pipelines)
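The effect is easy to observe with the Git Repositories - List endpoint. Below is a sketch of a step that calls it with the job access token; with the protection setting on, only the repositories the token is scoped to come back:

```yaml
steps:
- script: |
    # System.CollectionUri already ends with a slash, e.g. https://dev.azure.com/org/
    curl -s -u ":$SYSTEM_ACCESSTOKEN" \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/git/repositories?api-version=6.0"
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'List repositories visible to the job token'
```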
I've attempted to create a template containing the list of all repositories, so that a few select builds can continue to easily access all repositories. Previously, no explicit list of repositories was needed, but now it appears they will have to manually be listed, and I'd like to do that in a single file.
Both yaml files below are in the same repository.
Template allRepos.yaml:
parameters:
- name: steps
  type: stepList
  default: []

jobs:
- job:
  pool: 'swimming'
  uses:
    repositories:
    - R1
    - R2
    - R3
    - Rnnn
  steps:
  - ${{ parameters.steps }}
Yaml for pipeline:
extends:
  template: allRepos.yaml
  parameters:
    steps:
    - checkout: self
    - task: PowerShell@2
      displayName: multiRepoAnalysis
      inputs:
        filePath: analysis.ps1
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)
It seems like this should work according to:
access tokens - Protect access to repositories in YAML pipelines
repos - Protect access to repositories in YAML pipelines
jobs.job definition
However, attempting to run this pipeline results in the errors:
Job 'Job1' references the repository 'R1' which is not defined by the pipeline.
Job 'Job1' references the repository 'R2' which is not defined by the pipeline.
Job 'Job1' references the repository 'R3' which is not defined by the pipeline.
Job 'Job1' references the repository 'Rnnn' which is not defined by the pipeline.
How can I create a template that:
allows the access token for specific builds to access all repositories when "Protect access to repositories in YAML pipelines" is turned on
does so without checking out each repository
Limitations
This solution only works for a list of at most 20 repositories
- checkout: self counts toward the 20 repository limit
You must include submodule repositories in the list, which also counts toward the 20 repository limit.
Exceeding 20 repositories will still result in an error:
##[error]Tokens may be scoped to a maximum of 20 repositories. Reference fewer repositories in YAML or disable Scoped Tokens.
This limitation could be worked around by splitting the build into multiple definitions - but note if your analysis includes accessing submodule repositories, those submodule repositories must be referenced in any list of the primary repositories.
To my current knowledge, there is not a way to scope the access token for a particular pipeline with a generic "all repositories" permission.
Solution
The original templates were on the right track - but missed the fact that any uses: repositories must previously have been declared as a resources: repository.
There is a github issue to clarify the Microsoft documentation: uses: keyword is not explained, does not work as described
The working solution is:
Template allRepos.yaml:
parameters:
- name: steps
  type: stepList
  default: []

resources:
  repositories:
  - repository: R1
    type: git
    name: MyDevOpsProject/R1
  - repository: R2
    type: git
    name: MyDevOpsProject/R2
  - repository: R3
    type: git
    name: MyDevOpsProject/R3
  - repository: Rnnn
    type: git
    name: MyDevOpsProject/Rnnn

jobs:
- job:
  pool: 'swimming'
  uses:
    repositories:
    - R1
    - R2
    - R3
    - Rnnn
  steps:
  - ${{ parameters.steps }}
Yaml for pipeline:
extends:
  template: allRepos.yaml
  parameters:
    steps:
    - checkout: self
    - task: PowerShell@2
      displayName: multiRepoAnalysis
      inputs:
        filePath: analysis.ps1
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)
I want to run a pipeline when a pull request is raised against the master branch on GitHub.
I am using Azure DevOps to run the pipeline.
I want to pass the pull request number from GitHub into the Azure DevOps YAML pipeline as a variable.
Can this be done?
How do you get the pull request number from GitHub for use in an Azure DevOps YAML pipeline?
You could use the predefined variable System.PullRequest.PullRequestNumber to get the pull request number from GitHub.
pr:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    echo The pull request number is: $(System.PullRequest.PullRequestNumber)
  displayName: 'Get the PullRequest Number'
You should create a YAML file in the GitHub repository with the above script.
I have a pipeline with a section like the one below that lists the pipelines that can trigger it.
resources:
  # List all the microservice pipelines to be watched plus infrastructure; the
  # pipeline name is the name of the stack. Note template-maven and
  # template-gradle are not to be part of this build.
  pipelines:
  - pipeline: auth
    project: services
    source: auth
    branch: master
    trigger:
      branches:
        include:
        - master
  - pipeline: ai
    project: services
    source: artificial-intelligence
    branch: master
    trigger:
      branches:
        include:
        - master
  - pipeline: ui
    project: frontend
    source: ui CI
    branch: master
    trigger:
      branches:
        include:
        - master
I then have a job with the following steps (because deployment pulls all files, I just need one folder from each pipeline):
- job: publishDeploymentPipelineFiles
  condition: not(canceled())
  steps:
  - checkout: none
  - download: auth
    artifact: drop
  - download: ai
    artifact: drop
  - download: ui
    artifact: drop
What I am hoping for is some form of template that does
steps:
- checkout: none
- template: pull-deployment-manifests.yml
  parameters:
    sources:
    - project: services
      source: auth
      stackName: auth
    - project: services
      source: artificial-intelligence
      stackName: ai
    - project: frontend
      source: ui CI
      stackName: ui
It would list only the project and CI pipeline, derive the appropriate pipeline ID from stackName, and create both the resources and the steps.
My workaround right now is a project that takes a CSV containing those items and generates the azure-pipelines.yml.
As far as I know, you can't dynamically create resources. So you create this:
steps:
- checkout: none
- template: pull-deployment-manifests.yml
  parameters:
    sources:
    - project: services
      source: auth
      stackName: auth
    - project: services
      source: artificial-intelligence
      stackName: ai
    - project: frontend
      source: ui CI
      stackName: ui
and run the checkouts inside the template, unless you have defined resources with those names at the root level.
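As a sketch of that approach, pull-deployment-manifests.yml could expand the download steps with a compile-time each loop over the sources parameter; note the download aliases still have to match pipeline resources declared at the root level:

```yaml
# pull-deployment-manifests.yml (sketch)
parameters:
- name: sources
  type: object
  default: []

steps:
- ${{ each source in parameters.sources }}:
  - download: ${{ source.stackName }}
    artifact: drop
```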
As documentation says here:
Resources are defined at one place and can be consumed anywhere in your pipeline.
Sure, you can set up a template with resources and use it in a YAML pipeline; see "Extend from a template with resources".
However, note that if the template defines both resources and steps, you can't use it under the steps key in the YAML pipeline. You should use the extends key to extend the resources from the template, just as the example in the document shows.
You may need to define all the required steps in that template, or pull the steps in from other step templates.
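A minimal sketch of that pattern, reusing the auth pipeline resource from the question (the file and job names are illustrative):

```yaml
# resources-template.yml: declares the resources and the jobs that consume them
resources:
  pipelines:
  - pipeline: auth
    project: services
    source: auth

jobs:
- job: consumeResources
  steps:
  - download: auth
    artifact: drop

# azure-pipelines.yml: extends the template instead of inserting it under steps
# extends:
#   template: resources-template.yml
```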