Azure DevOps: Multiple Project and Repo checkout - azure-devops

I would like to run a YAML pipeline from one project. I have a task in my YAML to scan all the source code. Using this YAML, I would like to scan the source code in the master branch for every project and every repository inside the same org.
How can I get all the repos for all the projects and iterate over them? Can someone help me?
test.yaml
resources:
  repositories:
  - repository: justAnotherName
    type: github
    name: myGitRepo
    endpoint: myGitServiceConnection

trigger:
  branches:
    include:
    - master

steps:
- task: CredScan@2
  inputs:
    toolMajorVersion: 'V2'
    outputFormat: 'tsv'
    scanFolder: '$(Build.SourcesDirectory)'

If you're looking to pull every repo within every project, you have two options (see below). However, I'd advise caution before attempting this on a Microsoft-hosted agent: those have a 60-minute timeout by default. If you're using a self-hosted agent, you need not worry. I'd still advise breaking this up to avoid creating a long-running job that also consumes a large amount of disk space with each run.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml#timeouts
That being said, here are the options you have:
Option 1 (not the best)
Manually add a repository: resource for every project and a checkout: step for every repo within the projects.
This is heavily manual and would require maintenance every time a repo is added.
Option 2
You can write a custom PowerShell/bash script that uses the Azure DevOps REST API and git to automatically discover all projects and repos within the org and pull them onto the machine.
Start by issuing a request to get all of the projects within the org:
https://learn.microsoft.com/en-us/rest/api/azure/devops/core/projects/list?view=azure-devops-rest-6.0
Then, iterate through every project and get all repos:
https://learn.microsoft.com/en-us/rest/api/azure/devops/git/repositories/list?view=azure-devops-rest-6.0
Finally, iterate through each repo and run git clone [repository URL] to clone it onto the build agent.
NOTE: You will want to ensure you have plenty of free disk space on the agent machine and that you clean up the build space after this operation.
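A minimal sketch of that script as an inline pipeline step. It assumes the job's System.AccessToken has read access to every project (an assumption; grant it via the Project Collection Build Service identity) and clones the master branch of each repo:

steps:
- pwsh: |
    # List every project in the org, then every repo in each project,
    # and clone each one under the agent's build directory.
    $org      = "$(System.CollectionUri)".TrimEnd('/')   # e.g. https://dev.azure.com/yourOrg
    $headers  = @{ Authorization = "Bearer $(System.AccessToken)" }
    $projects = (Invoke-RestMethod "$org/_apis/projects?api-version=6.0" -Headers $headers).value
    foreach ($p in $projects) {
      $repos = (Invoke-RestMethod "$org/$($p.id)/_apis/git/repositories?api-version=6.0" -Headers $headers).value
      foreach ($r in $repos) {
        # Authenticate the clone with the same job access token
        git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" `
          clone --branch master --single-branch $($r.remoteUrl) "$(Agent.BuildDirectory)/repos/$($p.name)/$($r.name)"
      }
    }
  displayName: Clone every repo in the org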


Reference the Commits of Build Artifacts in Azure Pipelines

I have an Azure Pipeline which:
- is triggered manually
- downloads the build artifacts of 3 other pipelines, all of them based on different (git) repositories
- allows the user to input the BuildId (run number) of these pipeline artifacts, to choose which runs to take them from
- creates a package with them
- is written in YAML
I'm looking for a way to show, in the "Source Code" tab, the source code link for the commit corresponding to each of the pipeline builds whose artifacts have been downloaded during this run.
To do this, I have to name each of the repositories and check out the proper version.
What I can't get my head around is how to use the BuildId variable to get the SourceVersion variable.
I know that the Build.BuildId variable is the one defining the run ID of the pipeline, and we use this to choose which run to take each artifact from.
At the same time, Build.SourceVersion contains the commit ID used for the pipeline run. But normally, Build is the current build.
How can I reference Build_x, starting from Build_x.BuildId, so as to be able to recover Build_x.SourceVersion?
Thank you
Based on your requirement, I suggest that you switch to using the pipelines resource in your YAML pipeline.
Here is an example:
resources:
  pipelines:
  - pipeline: MyCIAlias
    project: MyProject
    source: PipelineName1
  - pipeline: MyCIAlias1
    project: MyProject
    source: PipelineName2

pool:
  vmImage: windows-latest

steps:
- download: MyCIAlias
- download: MyCIAlias1
In this case, you can still select the build runs when you run the pipeline (via the Resources option).
This should also achieve the same function as your existing pipeline.
And this method makes it more convenient to obtain the corresponding build ID and source version of the pipeline artifacts.
You can directly use the variables:
Source version: RESOURCES.PIPELINE.Aliasname.SOURCECOMMIT
Build ID: RESOURCES.PIPELINE.Aliasname.RUNID
For example:
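a minimal sketch, reusing the MyCIAlias alias from the resource sample above:

steps:
- pwsh: |
    # Echo the run ID and source commit recorded for the MyCIAlias pipeline resource
    Write-Host "Build ID:       $(resources.pipeline.MyCIAlias.runID)"
    Write-Host "Source version: $(resources.pipeline.MyCIAlias.sourceCommit)"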

GitHub PR doesn't trigger GitLab pipeline

I'm trying to use GitHub to trigger a GitLab pipeline on PR.
Practically, when a developer creates a PR in GitHub, his/her code gets tested against a GitLab pipeline.
I'm trying to follow this user guide: https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/github_integration.html
and we have a Silver account, but it won't work. When creating the PR, the GitLab pipeline is not triggered.
Anyone with this kind of experience who can help?
Thanks
Joe
I've found the cause of the issue.
In order for GitHub to trigger GitLab for CI/CD, mostly on PRs, you need to have a Silver/Premium account AND, very importantly, you must be the root owner.
In any other case, you won't be able to see GitHub in the integration list on GitLab. The people from GitLab had the brilliant idea to hide it instead of showing it disabled (which would have been a hint that you needed an upgraded license).
It's not explained in the video, either.
Firstly, you need to give us the content of your .gitlab-ci.yml file. In your question you asked about GitHub, but you're following GitLab documentation, which is completely different. Both use git commands to commit and push repos, but GitHub and GitLab are different products.
For GitHub pipelines, you need to create a repository, then go to Actions. GitHub will propose to configure a .github/workflows directory containing a YAML file, in which you can code your pipelines. According to your project, GitHub will propose several Linux machines with the adequate configuration to run your files (a Java project gets Maven machines, Python gets Python machines, React/Angular get machines with npm installed, Docker and Kubernetes for deployments...), and you're limited to 4 private projects as far as I know (please verify this last point).
For GitLab you have two options. You can use preconfigured machines like GitHub's, which you call by adding, for example, a tag: npm in your .gitlab-ci.yml file to request a machine with npm installed, but you need to pay for this. Or you can configure your own runners by following the GitLab documentation (which is the best option), but you'll need capable machines and servers to run npm, mvn, python3, ... commands.
Finally, to answer your question, here is an example .gitlab-ci.yml file with two simple stages, build & test. The only: statement specifies that these pipelines will run when there is a merge request (I use GitLab's preconfigured machines as a sample here). More details in my Python GitHub project https://github.com/mehdimaaref7/Scrapping-Sentiment-Analysis and, for GitLab runners, https://docs.gitlab.com/runner/
stages:
  - build
  - test

build:
  tags:
    - shell
    - linux
  stage: build
  script:
    - echo "Building"
    - mkdir build
    - touch build/info.txt
  artifacts:
    paths:
      - build/
  only:
    - merge_requests

test:
  tags:
    - shell
    - linux
  stage: test
  script:
    - echo "Testing"
    - test -f "build/info.txt"
  only:
    - merge_requests
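If you go the self-managed runner route, registering a runner that matches the shell and linux tags above is a one-time command on the runner machine; a sketch with placeholder URL and registration token:

gitlab-runner register \
  --non-interactive \
  --url https://gitlab.com/ \
  --registration-token <project-registration-token> \
  --executor shell \
  --tag-list "shell,linux" \
  --description "my shell runner"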

Merge GitHub branches from Azure

I am setting up CI/CD at work and there is one step I'm not sure how to do; furthermore, I'm not sure it is the right thing to do.
For background, I am used to developing in C# with Visual Studio, source code in TFS, and deploying with a basic script that copies files over the intranet.
Now, I'm requested to set up build and release pipelines for .NET Core projects in GitHub.
I have three branches on this project: DEV, RELEASE and MASTER.
I created one pipeline that triggers on DEV's commits, creates an artefact and deploys to the DEV server.
Those are the pipelines that deploy all the developers' work to a DEV server where they run their own tests.
Next step: when we want to deploy to the staging servers, we click a button in Azure, and this merges the DEV branch into the RELEASE branch (I know close to nothing about GitHub, I'm not even sure those are the appropriate words).
When the merge is done, this triggers a build pipeline that creates a different artefact; when this artefact is updated, it deploys to the staging server.
Once this release is validated on Staging and Quality, we would merge RELEASE to MASTER and do the same up to the PROD servers. It is all on an intranet with self-hosted agents.
Is that a good way of doing things? Can it be done this way? Do I need a PowerShell task, or is there something that already exists?
If you are using an Azure DevOps pipeline, the pipeline should select GitHub as the repository type; then we can configure the CI trigger.
a. Configure the CI trigger:
Classic steps:
1. Open Project Settings -> Service connections -> select GitHub -> create a new GitHub service connection.
2. Create a new build pipeline via the classic editor -> select GitHub as the source.
3. Open the pipeline -> select the Triggers tab -> enable the option Enable continuous integration and configure the branch filters.
b. YAML steps:
1. Open Project Settings -> Service connections -> select GitHub -> create a new GitHub service connection.
2. Create a new build pipeline and select GitHub (YAML).
c. Sample of checking out GitHub repositories in your pipeline:
resources:
  repositories:
  - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
    type: github
    endpoint: MyGitHubServiceConnection
    name: MyGitHubOrgOrUser/MyGitHubRepo

trigger:
- {branch name}
Configure the CD trigger:
Please refer to this doc to configure the release trigger.
If you are using GitHub Actions:
Please select the correct workflow to configure CI/CD; please refer to this doc for more details.
CI sample:
on:
  push:
    branches:
    - 'DEV'
Update 1
When the Dev branch is updated, it is built and deployed to the Dev server by a pipeline.
Create build pipeline A and release pipeline B, and configure the CI build trigger: when the branch Dev is updated, it triggers build pipeline A; when build pipeline A completes, it triggers release pipeline B.
Click a button to synchronize the Dev branch to the Release branch:
There is no such built-in button. As a workaround, we can add a PowerShell task and call the REST API to create a pull request and then complete it. We can also add a command-line task and publish the code via git commands.
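A minimal sketch of that PowerShell task, assuming a repository named MyRepo in project MyProject (both placeholders) and that the build service identity is allowed to create pull requests:

steps:
- pwsh: |
    # Create a pull request from DEV into RELEASE via the Azure DevOps REST API
    $url  = "$(System.CollectionUri)MyProject/_apis/git/repositories/MyRepo/pullrequests?api-version=6.0"
    $body = @{
      sourceRefName = "refs/heads/DEV"
      targetRefName = "refs/heads/RELEASE"
      title         = "Promote DEV to RELEASE"
    } | ConvertTo-Json
    Invoke-RestMethod $url -Method Post -Body $body -ContentType "application/json" `
      -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
    # To complete it afterwards, PATCH the created pull request's URL with status 'completed'.
  displayName: Create DEV -> RELEASE pull request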

How to use powershell/bash script file in azure pipeline template

Example: in Azure DevOps, I have one organization, a few projects, and a few repositories inside each project (most of them contain build pipelines):
ORGANIZATION
├── Project1
│   ├── Repo1
│   └── Repo2
├── Project2
│   ├── Repo1
│   └── Repo2
└── BuildTemplates
    └── BuildTemplatesRepository
        ├── Template1.yml
        ├── Template2.yml
        └── Template1.ps1
Template1.yml contains a PowerShell task or step:
- pwsh: ./Template1.ps1
Problem:
When Template1.yml executes inside a pipeline from another repo (Project1/Repo1/azure-pipelines.yml), I get an error:
[error]ENOENT: no such file or directory, stat '/home/vsts/work/1/s/Template1.ps1'
I understand why there is an error: the *.ps1 file isn't copied into the container where the process runs. But what is the best way to solve this (without copying the script manually)?
For this issue, you can check out multiple repositories in your pipeline.
Pipelines often rely on multiple repositories. You can have different repositories with source, tools, scripts, or other items that you need to build your code. By using multiple checkout steps in your pipeline, you can fetch and check out other repositories in addition to the one you use to store your YAML pipeline.
So you can check out the other repo that contains Template1.ps1 in the pipeline.
steps:
- checkout: git://MyProject/MyRepo # Azure Repos Git repository in the same organization
For details, please refer to this official document.
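A fuller sketch using the layout from the question; with multiple checkout steps, each repo lands in its own subfolder, named after the repository, under $(Build.SourcesDirectory):

steps:
- checkout: self
- checkout: git://BuildTemplates/BuildTemplatesRepository # Azure Repos Git repository in the same organization
# After both checkouts, the templates repo sits next to this repo's sources
- pwsh: ./BuildTemplatesRepository/Template1.ps1
  workingDirectory: $(Build.SourcesDirectory)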

Deploy individual services from a monorepo using github actions

I have around 10 individual microservices, mostly cloud functions for various data processing jobs, which all live in a single GitHub repository.
The goal is to trigger the selective deployment of these services to Google Cloud Functions on push to a branch, when an individual function has been updated.
I must avoid the situation in which an update to a single service causes the deployment of all the cloud functions.
My current repository structure:
/repo
├── service_A
│   ├── function
│   └── notebook
└── service_B
    ├── function
    └── notebook
On a side note, what are the pros/cons of using Github Actions VS Google Cloud Build for such automation?
GitHub Actions supports monorepos with path filtering for workflows. You can create a workflow to selectively trigger when files on a specific path change.
https://help.github.com/en/articles/workflow-syntax-for-github-actions#onpushpull_requestpaths
For example, this workflow will trigger on a push when any files under the path service_A/ have changed (note the ** glob to match files in nested directories).
on:
  push:
    paths:
    - 'service_A/**'
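Extended into a complete per-service workflow, this might look as follows; the branch filter, the setup-gcloud auth step, and the gcloud flags are assumptions to adapt to your project:

name: Deploy service_A
on:
  push:
    branches: [master]
    paths:
    - 'service_A/**'
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    # Assumption: a service-account key and project id are stored as secrets
    - uses: google-github-actions/setup-gcloud@v0
      with:
        project_id: ${{ secrets.GCP_PROJECT }}
        service_account_key: ${{ secrets.GCP_SA_KEY }}
    - name: Deploy only this service's function
      run: gcloud functions deploy service_A --source service_A/function --runtime python39 --trigger-http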
You could also run a script that discovers which services were changed based on git diff and triggers the corresponding jobs via the GitHub REST API.
There could be two workflows, main.yml and services.yml.
The main workflow is configured to always start on push, and it only runs a script to find out which services were changed. For each changed service, a repository dispatch event is triggered with the service name in the payload (see the sketch after the showcase link below).
The services workflow is configured to start on repository_dispatch, and it contains one job for each service. Jobs have an additional condition based on the event payload.
See showcase with similar setup:
https://github.com/zladovan/monorepo
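A minimal sketch of that two-workflow setup, assuming services live in top-level directories and that a personal access token with repo scope is stored as the REPO_DISPATCH_TOKEN secret (the default GITHUB_TOKEN will not trigger repository_dispatch workflows):

# main.yml - runs on every push and fires one dispatch per changed service
on: push
jobs:
  detect:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
      with:
        fetch-depth: 2   # so HEAD^ is available for the diff
    - name: Dispatch per changed service
      env:
        GH_TOKEN: ${{ secrets.REPO_DISPATCH_TOKEN }}
      run: |
        # Top-level directories touched by this push
        for svc in $(git diff --name-only HEAD^ HEAD | cut -d/ -f1 | sort -u); do
          curl -s -X POST \
            -H "Authorization: token $GH_TOKEN" \
            -H "Accept: application/vnd.github.v3+json" \
            https://api.github.com/repos/${{ github.repository }}/dispatches \
            -d "{\"event_type\":\"deploy\",\"client_payload\":{\"service\":\"$svc\"}}"
        done

# services.yml - one job per service, gated on the dispatched payload
on:
  repository_dispatch:
    types: [deploy]
jobs:
  service_A:
    if: github.event.client_payload.service == 'service_A'
    runs-on: ubuntu-latest
    steps:
    - run: echo "deploying service_A"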
It's not a Monorepo
If you only have apps, then I'm sorry... but all you have is a repo of many apps.
A monorepo is a collection of packages that you can map a graph of dependencies between.
Aha, I have a monorepo
But if you have a collection of packages which depend on each other, then read on.
apps/
  one/
    depends:
      pkg/foo
  two/
    depends:
      pkg/bar
      pkg/foo
pkg/
  foo/
  bar/
  baz/
The answer is that you switch to a tool that can describe which packages have changed between the current git ref and some other git ref.
The following two examples run the release npm script on each package that changed under apps/* and all the packages they would depend on.
I'm unsure if the pnpm method silently skips packages that don't have a release target/command/script.
Use NX Dev
Using NX.dev, it will work this out for you with its nx affected command:
- you need an nx.json in the root of your monorepo
- this assumes you're using the package.json approach with NX.dev; if you have a project.json in each package, then the target would reside there
- your CI would then look like:
pnpx nx affected --target=release
Pnpm Filtering
Your other option is to switch to pnpm and use its filtering syntax:
pnpm --filter "...{apps/**}[origin/master]" release
Naive Path Filtering
If you just try to rely on which paths changed in this git commit, then you miss out on transitive changes that affect the packages you actually want to deploy.
If you have a github action like:
on:
  push:
    paths:
    - 'apps/**'
Then you won't ever get any builds when you only push commits that change anything in pkg/**.
Other interesting github actions
https://github.com/marketplace/actions/nx-check-changes
https://github.com/marketplace/actions/nx-affected-dependencies-action
https://github.com/marketplace/actions/nx-affected-list (a non-Nx alternative here is dorny/paths-filter)
https://github.com/marketplace/actions/nx-affected-matrix
Has Changed Path Action might be worth a try:
name: Conditional Deploy
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 100
      - uses: marceloprado/has-changed-path@v1
        id: service_A_deployment
        with:
          paths: service_A
      - name: Deploy front
        if: steps.service_A_deployment.outputs.changed == 'true'
        run: ./deploy-service_A.sh