I'm trying to build an Azure DevOps pipeline that uses a separate repository for Dockerfiles/templates. What's the cleanest way to use Dockerfiles from another repository?
We have experimented with having the template refer to the Dockerfile, but the build server does not seem to have access to the file path.
In Dockerfile repository:
steps:
- script: docker build -f pathTo/Dockerfile
In build repository:
steps:
- template: Dockerfilerepository.yml
We want this to create a docker build process inside the building repository, but we instead get this error message:
unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat /home/vsts/work/1/s/pipelines: no such file or directory
You'd need to check out that repo separately; then you can use files from it. You can use a script step for that, something like this:
git clone https://x-access-token:$(github-access-token)@github.com/ORG/OTHER_PRIVATE_REPO.git
That said, it's probably not the best idea to keep Dockerfiles in a separate repo.
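On Azure DevOps specifically, a cleaner alternative to a raw git clone is to declare the Dockerfile repository as a repository resource and add a checkout step for it. A minimal sketch, assuming both repos live in the same organization; the alias dockerfiles and the names MyProject/DockerfileRepo, MyBuildRepo, and pathTo/Dockerfile are placeholders:

resources:
  repositories:
  - repository: dockerfiles           # local alias for the other repo
    type: git                         # Azure Repos Git
    name: MyProject/DockerfileRepo    # placeholder project/repository

steps:
- checkout: self          # the build repo
- checkout: dockerfiles   # the Dockerfile repo
# With multiple checkout steps, each repo lands in its own subfolder
# of $(Build.SourcesDirectory), named after the repository.
- script: >
    docker build
    -f $(Build.SourcesDirectory)/DockerfileRepo/pathTo/Dockerfile
    $(Build.SourcesDirectory)/MyBuildRepo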
I would like to run a YAML pipeline from one project. I have a task in my YAML to scan all the source code. Using this YAML, I would like to scan the source code in the master branch of every repository in every project inside the same org.
How can I get all the repos for all the projects and iterate over them? Can someone help me?
test.yaml
resources:
  repositories:
  - repository: justAnotherName
    type: github
    name: myGitRepo
    endpoint: myGitServiceConnection

trigger:
  branches:
    include:
    - master

steps:
- task: CredScan@2
  inputs:
    toolMajorVersion: 'V2'
    outputFormat: 'tsv'
    scanFolder: '$(Build.SourcesDirectory)'
If you're looking to pull every repo within a project, you have two options (see below). However, I'd advise caution before attempting this on a Microsoft-hosted agent: those have a 60-minute job timeout by default. If you're using a self-hosted agent, you need not worry, though I'd still advise breaking this up to avoid creating a long-running job that also consumes a large amount of disk space with each run.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml#timeouts
That being said, here are the options you have:
Option 1 (Not the best)
Manually add a repository: resource for every project and a checkout: step for every repo within the projects.
This is heavily manual and would require maintenance every time a repo is added.
Option 2
You can write a custom PowerShell/bash script that uses the Azure DevOps API and git to automatically scan all projects and repos within the org and pull them onto the machine.
Start by issuing a request to get all of the projects within the org:
https://learn.microsoft.com/en-us/rest/api/azure/devops/core/projects/list?view=azure-devops-rest-6.0
Then, iterate through every project and get all repos:
https://learn.microsoft.com/en-us/rest/api/azure/devops/git/repositories/list?view=azure-devops-rest-6.0
Finally, iterate through each repo and run git clone [repository URL] to clone it onto the build agent.
NOTE: You will want to ensure you have plenty of free disk space on the agent machine and that you clean up the build space after this operation.
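As a rough sketch of Option 2, assuming a Linux agent with jq installed and a job access token ($(System.AccessToken)) that has read access to every project, the loop could look like this (your-org is a placeholder, and project names are assumed to contain no spaces):

steps:
- script: |
    ORG_URL="https://dev.azure.com/your-org"   # placeholder organization URL
    # 1. List all projects in the org
    projects=$(curl -s -H "Authorization: Bearer $TOKEN" \
      "$ORG_URL/_apis/projects?api-version=6.0" | jq -r '.value[].name')
    for project in $projects; do
      # 2. List all repos in the project
      urls=$(curl -s -H "Authorization: Bearer $TOKEN" \
        "$ORG_URL/$project/_apis/git/repositories?api-version=6.0" | jq -r '.value[].remoteUrl')
      # 3. Clone each repo onto the build agent
      for url in $urls; do
        git -c http.extraHeader="Authorization: Bearer $TOKEN" clone "$url"
      done
    done
  env:
    TOKEN: $(System.AccessToken)
  displayName: Clone every repo in the organization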
I'm trying to use GitHub to trigger a GitLab pipeline on PR.
Practically speaking, when a developer creates a PR in GitHub, his/her code gets tested against a GitLab pipeline.
I'm trying to follow this user guide: https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/github_integration.html
We have a Silver account, but it doesn't work: when the PR is created, the GitLab pipeline is not triggered.
Anyone with this kind of experience who can help?
I've found the cause of the issue.
In order for GitHub to trigger GitLab CI/CD, mostly on PRs, you need a Silver/Premium account AND, very importantly, you must be the root owner.
In any other case, you won't be able to see GitHub in the integration list on GitLab. The GitLab folks had the brilliant idea to hide it instead of showing it disabled (which would have been a hint that you needed an upgraded license).
The video in the guide above doesn't explain this.
Firstly, you need to give us the content of your .gitlab-ci.yml file. In your question you asked about GitHub, but you're following GitLab documentation, which is completely different. Both use git commands to commit and push repos, but GitHub and GitLab are different products.
For GitHub pipelines, you need to create a repository, then go to Actions. GitHub will propose that you configure a .github/workflows directory containing a YAML workflow file; in this file you can code your pipelines. Depending on your project, GitHub will propose Linux machines with the adequate configuration to run your files (a Java project gets Maven machines, Python gets Python machines, React/Angular get machines with npm installed, plus Docker and Kubernetes for deployments...), and you're limited to 4 private projects as far as I know (check this last information).
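For illustration, a minimal .github/workflows/file.yaml might look like this (a sketch; the mvn command is a placeholder assuming a Java/Maven project):

name: CI
on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest        # a GitHub-hosted Linux machine
    steps:
      - uses: actions/checkout@v2
      - name: Build and test
        run: mvn -B package       # placeholder build command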
For GitLab you have two options. You can use preconfigured machines as on GitHub, calling them by adding, for example, a tag npm in your .gitlab-ci.yml file to get a machine with npm installed, but you need to pay an amount of money. Or you can configure your own runners by following the GitLab documentation (which is the best option), but you'll need good machines and servers to run npm, mvn, python3, ... commands.
Finally, to answer your question, here is an example of a .gitlab-ci.yml file (in your GitLab repository) with two simple stages, build & test. The only: statement specifies that these pipelines will run when there is a merge request (I use the preconfigured machines of GitLab as a sample here). More details are in my Python project on GitHub, https://github.com/mehdimaaref7/Scrapping-Sentiment-Analysis, and for GitLab runners, https://docs.gitlab.com/runner/
stages:
  - build
  - test

build:
  tags:
    - shell
    - linux
  stage: build
  script:
    - echo "Building"
    - mkdir build
    - touch build/info.txt
  artifacts:
    paths:
      - build/
  only:
    - merge_requests

test:
  tags:
    - shell
    - linux
  stage: test
  script:
    - echo "Testing"
    - test -f "build/info.txt"
  only:
    - merge_requests
Example: In Azure DevOps, I have one organization, a few projects, and a few repositories inside each project (most of them contain build pipelines):
ORGANIZATION:
  -- Project1
     -- Repo1
     -- Repo2
  -- Project2
     -- Repo1
     -- Repo2
  -- BuildTemplates
     -- BuildTemplatesRepository
        -- Template1.yml
        -- Template2.yml
        -- Template1.ps1
Template1.yml contains a PowerShell task or step:
- pwsh: ./Template1.ps1
Problem:
When Template1.yml executes inside a pipeline from another repo (Project1/Repo1/azure-pipelines.yml), I get this error:
[error]ENOENT: no such file or directory, stat '/home/vsts/work/1/s/Template1.ps1'
I understand why there is an error: the *.ps1 file isn't copied into the container where the process runs. But how can I solve this in the best way (without copying the script manually)?
For this issue, you can check out multiple repositories in your pipeline.
Pipelines often rely on multiple repositories. You can have different repositories with source, tools, scripts, or other items that you need to build your code. By using multiple checkout steps in your pipeline, you can fetch and check out other repositories in addition to the one you use to store your YAML pipeline.
So you can check out the other repo, which contains Template1.ps1, in the pipeline:
steps:
- checkout: git://MyProject/MyRepo # Azure Repos Git repository in the same organization
For details, please refer to this official document.
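Putting it together for the layout above, a sketch of Project1/Repo1/azure-pipelines.yml could be (the alias templates is arbitrary; with multiple checkout steps, each repo lands in its own subfolder of $(Build.SourcesDirectory)):

resources:
  repositories:
  - repository: templates                          # arbitrary alias
    type: git
    name: BuildTemplates/BuildTemplatesRepository

steps:
- checkout: self
- checkout: templates
# The script is now on disk next to the checked-out template repo,
# so reference it by its full path:
- pwsh: $(Build.SourcesDirectory)/BuildTemplatesRepository/Template1.ps1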
I have around 10 individual micro-services, mostly cloud functions for various data-processing jobs, which all live in a single GitHub repository.
The goal is to trigger selective deployment of these services to Google Cloud Functions on push to a branch, whenever an individual function has been updated.
I must avoid the situation in which updating a single service causes deployment of all the cloud functions.
My current repository structure:
/repo
  /service_A
    /function
    /notebook
  /service_B
    /function
    /notebook
On a side note, what are the pros/cons of using GitHub Actions vs. Google Cloud Build for such automation?
GitHub Actions supports monorepos with path filtering for workflows. You can create a workflow to selectively trigger when files on a specific path change.
https://help.github.com/en/articles/workflow-syntax-for-github-actions#onpushpull_requestpaths
For example, this workflow will trigger on a push when any files under the path service_A/ have changed (note the ** glob to match files in nested directories).
on:
  push:
    paths:
      - 'service_A/**'
You could also run a script that discovers which services changed based on git diff, and trigger the corresponding job via the GitHub REST API.
There could be two workflows, main.yml and services.yml.
The main workflow would be configured to always start on push, and it would only run a script to find out which services changed. For each changed service, a repository dispatch event would be triggered with the service name in the payload.
The services workflow would be configured to start on repository_dispatch and would contain one job for each service. Jobs would have an additional condition based on the event payload, as sketched below.
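A minimal sketch of that setup, assuming one top-level directory per service and a REPO_DISPATCH_TOKEN secret (a personal access token; events created with the default GITHUB_TOKEN don't trigger new workflow runs):

# main.yml
name: Detect changed services
on: push

jobs:
  dispatch:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 2   # also fetch the previous commit, to diff against
      - name: Fire one dispatch event per changed service
        env:
          TOKEN: ${{ secrets.REPO_DISPATCH_TOKEN }}   # assumed PAT secret
        run: |
          for svc in $(git diff --name-only HEAD^ HEAD | cut -d/ -f1 | sort -u); do
            curl -s -X POST \
              -H "Authorization: token $TOKEN" \
              -H "Accept: application/vnd.github.v3+json" \
              "https://api.github.com/repos/$GITHUB_REPOSITORY/dispatches" \
              -d "{\"event_type\": \"deploy\", \"client_payload\": {\"service\": \"$svc\"}}"
          done

# services.yml
name: Deploy services
on: repository_dispatch

jobs:
  deploy_service_A:
    if: github.event.client_payload.service == 'service_A'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: ./deploy-service_A.sh   # placeholder deployment script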
See showcase with similar setup:
https://github.com/zladovan/monorepo
It's not a Monorepo
If you only have apps, then I'm sorry... but all you have is a repo of many apps.
A monorepo is a collection of packages that you can map a graph of dependencies between.
Aha, I have a monorepo
But if you have a collection of packages which depend on each other, then read on.
apps/
  one/
    depends:
      pkg/foo
  two/
    depends:
      pkg/bar
      pkg/foo
pkg/
  foo/
  bar/
  baz/
The answer is that you switch to a tool that can describe which packages have changed between the current git ref and some other git ref.
The following two examples run the release npm script on each package that changed under apps/* and all the packages they depend on.
I'm unsure if the pnpm method silently skips packages that don't have a release target/command/script.
Use NX Dev
Using NX.dev, it will work this out for you with its nx affected command.
You need an nx.json in the root of your monorepo.
This assumes you're using the package.json approach with nx.dev; if you have a project.json in each package, then the target would reside there.
Your CI would then look like:
pnpx nx affected --target=release
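In CI that might look like the following (a sketch: nx affected needs enough git history to diff against a base ref, hence the full fetch; --base=origin/master and the pnpm version are assumptions, and pnpm exec is used in place of pnpx on newer pnpm versions):

name: Release affected
on: push

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0             # full history, so nx can diff against the base ref
      - uses: pnpm/action-setup@v2   # assumes a pnpm workspace
        with:
          version: 7
      - run: pnpm install --frozen-lockfile
      - run: pnpm exec nx affected --target=release --base=origin/master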
Pnpm Filtering
Your other option is to switch to pnpm and use its filtering syntax:
pnpm --filter "...{apps/**}[origin/master]" release
Naive Path Filtering
If you just try to rely on which paths changed in this git commit, then you miss out on transitive changes that affect the packages you actually want to deploy.
If you have a GitHub action like:
on:
  push:
    paths:
      - 'app/**'
Then you won't ever get any builds when you only push commits that change something in pkg/**.
Other interesting GitHub actions
https://github.com/marketplace/actions/nx-check-changes
https://github.com/marketplace/actions/nx-affected-dependencies-action
https://github.com/marketplace/actions/nx-affected-list (a non-NX alternative here is dorny/paths-filter)
https://github.com/marketplace/actions/nx-affected-matrix
Has Changed Path Action might be worth a try:
name: Conditional Deploy
on: push

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 100
      - uses: marceloprado/has-changed-path@v1
        id: service_A_deployment
        with:
          paths: service_A
      - name: Deploy service_A
        if: steps.service_A_deployment.outputs.changed == 'true'
        run: /deploy-service_A.sh
I'm using GitLab CI for my project. When I push to the develop branch, it runs tests and updates the code on my test environment (a remote server).
But the GitLab runner always uses the same build folder: builds/a3ac64e9/0/myproject/myproject
I would like it to create a new folder every time:
builds/a3ac64e9/1/myproject/myproject
builds/a3ac64e9/2/myproject/myproject
builds/a3ac64e9/3/myproject/myproject
and so on.
Using this, I could update my website just by changing a symbolic link to point at the latest runner directory.
Is there a way to configure GitLab Runner this way?
While it doesn't make sense to use your build directory as your deployment directory, you can set up a custom build directory:
Open config.toml in a text editor (more info on where to find it here).
Set enabled = true under [runners.custom_build_dir] (more info here):
[runners.custom_build_dir]
enabled = true
In your .gitlab-ci.yml file, under variables, set GIT_CLONE_PATH. It must start with $CI_BUILDS_DIR/, e.g. $CI_BUILDS_DIR/$CI_JOB_ID/$CI_PROJECT_NAME, which will probably give you what you're looking for, although if you have multiple stages, they will have different job IDs. Alternatively, you could try $CI_BUILDS_DIR/$CI_COMMIT_SHA, which would give you a unique folder for each commit. (More info here)
variables:
  GIT_CLONE_PATH: '$CI_BUILDS_DIR/$CI_JOB_ID/$CI_PROJECT_NAME'
Unfortunately, there is currently an issue with using $CI_BUILDS_DIR in GIT_CLONE_PATH if you're using Windows and PowerShell, so you may have to use a work-around like this, if all your runners have the same build directory: GIT_CLONE_PATH: 'C:\GitLab-Runner/builds/$CI_JOB_ID/$CI_PROJECT_NAME'
You may want to take a look at the variables available to you (predefined variables) to find the most suitable variables for your path.
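For the symlink-based deployment the question describes, a final job could then point the web root at the fresh clone. A sketch, assuming a shell executor on the target server; /var/www/mysite/current is a placeholder path:

deploy:
  stage: deploy
  script:
    # $CI_PROJECT_DIR is the unique clone produced by GIT_CLONE_PATH above
    - ln -sfn "$CI_PROJECT_DIR" /var/www/mysite/current   # placeholder web root
  only:
    - develop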
You might want to read the following answer: Changing the build intermediate paths for gitlab-runner.
I'll repost my answer here:
Conceptually, this approach is not the way to go: the build directory is not a deployment directory but a temporary directory, to build in or to deploy from, even though on a shell executor its path happens to be stable.
What you need is to deploy from that directory, with a script as in the .gitlab-ci.yml below, to the correct deployment directory.
stages:
  - deploy

variables:
  TARGET_DIR: /home/ab12/public_html/$CI_PROJECT_NAME

deploy:
  stage: deploy
  script:
    - mkdir -pv $TARGET_DIR
    - rsync -r --delete ./ $TARGET_DIR
  tags:
    - myrunner
This will move your project files into /home/ab12/public_html/.
With projects named project1 .. projectn, all of them can use this same .gitlab-ci.yml file.
You cannot achieve this with GitLab CI runner configuration alone, but you can create two runners and assign each exclusively to one branch by using a combination of the only and tags keywords.
Assuming your two branches are named master and develop and two runners have been tagged with master_runner and develop_runner tags, your .gitlab-ci.yml can look like this:
master_job:
  <<: *your_job
  only:
    - master
  tags:
    - master_runner

develop_job:
  <<: *your_job
  only:
    - develop
  tags:
    - develop_runner
(<<: *your_job merges in your actual job definition, which you can factor out with a YAML anchor.)
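For completeness, a sketch of what the factored-out job could look like, placed above the two jobs (the leading dot makes it a hidden job that GitLab won't run directly; the script lines are placeholders):

.your_job: &your_job
  stage: deploy
  script:
    - ./run_tests.sh   # placeholder test command
    - ./deploy.sh      # placeholder deploy command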