How to set up a GitHub Actions workflow / CI build to build each directory when something gets pushed?

I've added a GitHub Actions workflow to my repo and tried to configure it, but failed. I checked a few websites but couldn't find a clear answer. How can one configure the workflow so that the C++ CI/build builds each directory separately whenever I push something to the repository?
Note: My repo contains several folders of source code, and each holds a different project or set of code snippets.

You can filter each workflow to only run when commits affect files in a certain path:
https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions#onpushpull_requestpaths
on:
  push:
    paths:
      - 'sub-project/**'
      - '!sub-project/docs/**'
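For the C++ case in the question, a minimal sketch would be one workflow file per directory, each filtered to its own path (the directory name, file name, and build commands below are hypothetical):

# .github/workflows/build-sub-project.yml (hypothetical file name)
name: Build sub-project
on:
  push:
    paths:
      - 'sub-project/**'
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build only this directory
        working-directory: sub-project
        run: |
          cmake -B build
          cmake --build build

Repeat the same pattern in a separate workflow file for each folder, changing the paths filter and working-directory accordingly.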

Related

Trigger a specific build pipeline in Azure DevOps with a single repository (.sln) having multiple projects (.csproj)

I have a single repository with a Visual Studio solution (.sln) that contains more than one project (.csproj), e.g. a WebAPI project, a WebApp project, etc.
I've created a separate pipeline for each project, which triggers whenever a commit lands on my XYZ branch (e.g. through a PR merge from a feature branch into XYZ).
Now, the issue is...
Whenever a commit touches any project in that repository, all pipelines start building their respective projects. I only want to run the pipeline of the specific project whose files the commit changed.
You can specify file paths to include or exclude.
# specific path build
trigger:
  branches:
    include:
      - master
      - releases/*
  paths:
    include:
      - docs
    exclude:
      - docs/README.md
When you specify paths, you must explicitly specify branches to trigger on. You can't trigger a pipeline with only a path filter; you must also have a branch filter, and the changed files that match the path filter must be from a branch that matches the branch filter.
Check here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=yaml#ci-triggers
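Applied to the question, each project's pipeline would carry its own path filter, so only commits touching that project's folder trigger it (the folder layout and branch name below are hypothetical):

# Pipeline for the WebAPI project only (one such YAML file per project)
trigger:
  branches:
    include:
      - XYZ
  paths:
    include:
      - src/WebAPI/*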

GitHub PR doesn't trigger GitLab pipeline

I'm trying to use GitHub to trigger a GitLab pipeline on PR.
Practically, when a developer creates a PR in GitHub, his/her code gets tested against a GitLab pipeline.
I'm trying to follow this user guide: https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/github_integration.html
and we have a silver account, but it won't work. When creating the PR, the GitLab pipeline is not triggered.
Anyone with this kind of experience who can help?
Thanks
Joe
I've found the cause of the issue.
In order for GitHub to trigger GitLab CI/CD, mostly on PRs, you need to have a Silver/Premium account AND, very importantly, you must be the root owner.
In any other case, you won't be able to see GitHub in the integration list on GitLab. The GitLab folks had the brilliant idea to hide it instead of showing it disabled (which would have been a hint that you needed an upgraded license).
This isn't explained in the video above.
Firstly, you need to share the content of your .gitlab-ci.yml file. In your question you asked about GitHub, but you're following GitLab documentation, which is completely different. Both use git commands to commit and push, but GitHub and GitLab are different platforms.
For GitHub pipelines, you create a repository and then go to Actions. GitHub will offer to set up a .github/workflows directory containing a YAML file; in that file you define your pipelines. Based on your project, GitHub proposes Linux machines with the adequate configuration to run your jobs (a Java project gets Maven machines, Python gets Python machines, React/Angular gets machines with npm installed, plus Docker and Kubernetes for deployments), and you're limited to 4 private projects as far as I know (double-check that last point).
For GitLab you have two options. You can use preconfigured machines, as with GitHub, and select them by adding a tag (for example npm) in your .gitlab-ci.yml file to get a machine with npm installed, but you need to pay for that. Or you can configure your own runners by following the GitLab documentation (which is the best option), though you'll need decent machines and servers to run npm, mvn, python3, etc.
Finally, to answer your question, here is an example .gitlab-ci.yml file with two simple stages, build and test. The only keyword specifies that these jobs run only when there is a merge request (I use GitLab's preconfigured machines as a sample here). More details in my Python GitHub project https://github.com/mehdimaaref7/Scrapping-Sentiment-Analysis and, for GitLab, https://docs.gitlab.com/runner/
stages:
  - build
  - test

build:
  tags:
    - shell
    - linux
  stage: build
  script:
    - echo "Building"
    - mkdir build
    - touch build/info.txt
  artifacts:
    paths:
      - build/
  only:
    - merge_requests

test:
  tags:
    - shell
    - linux
  stage: test
  script:
    - echo "Testing"
    - test -f "build/info.txt"
  only:
    - merge_requests

Azure DevOps triggers - trouble understanding

I've got a problem where I cannot get my pipeline to run when I want it to.
Background
I have a repo in GitHub, and I'm running my pipelines in ADO.
I have two branches: a main branch and a feature branch.
I want a pull request to trigger a pipeline called "pull-request" when I make a pull request in github.com. I want the pipeline to exclude any changes that happen to a file called azure-pipeline-pull-request.yml (which is the pipeline file).
I don't want it to run any other time.
I have tried many different combos in the yml file, but I cannot get the pipeline to run when a PR happens.
This is the code at the top of the YAML file.
trigger:
  branches:
    include:
      - main
  paths:
    exclude:
      - azure-pipelines-pull-request.yml
pr:
  branches:
    include:
      - main
  paths:
    exclude:
      - azure-pipelines-pull-request.yml
I've tried it without the trigger section. I could do with some help in explaining what is happening.
I tried your code and it successfully triggered the pipeline. So something other than your script is causing the pipeline not to run automatically.
There are several possible reasons for the issue. See the troubleshooting entry "I just created a new YAML pipeline with CI/PR triggers, but the pipeline is not being triggered" for detailed information and steps. Here is a brief overview of the content of that document:
Go to "Triggers" in UI. Turn off the "Override the YAML trigger from here" setting.
Check whether your Github repository is connected to multiple Azure DevOps organizations. If so, remove the service connection and re-establish it.
Check whether there is a failure in Webhooks in Github.
Make sure that the YAML file in the correct branch has the necessary CI or PR configuration.
Did you use templates for your YAML file? If so, make sure that your triggers are defined in the main YAML file.
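As a side note on the "I don't want it to run any other time" requirement: a minimal sketch that disables the CI trigger entirely and keeps only the PR trigger would be:

# Run only for pull requests targeting main, never on plain pushes
trigger: none
pr:
  branches:
    include:
      - main
  paths:
    exclude:
      - azure-pipelines-pull-request.yml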

Deploy individual services from a monorepo using GitHub Actions

I have around 10 individual micro-services, mostly cloud functions for various data processing jobs, which all live in a single GitHub repository.
The goal is to trigger the selective deployment of these services to Google Cloud Functions, on push to a branch, when an individual function has been updated.
I must avoid the situation in which update of a single service causes the deployment of all the cloud functions.
My current repository structure:
/repo
--/service_A
----/function
----/notebook
--/service_B
----/function
----/notebook
On a side note, what are the pros/cons of using GitHub Actions vs. Google Cloud Build for such automation?
GitHub Actions supports monorepos with path filtering for workflows. You can create a workflow to selectively trigger when files on a specific path change.
https://help.github.com/en/articles/workflow-syntax-for-github-actions#onpushpull_requestpaths
For example, this workflow will trigger on a push when any files under the path service_A/ have changed (note the ** glob to match files in nested directories).
on:
  push:
    paths:
      - 'service_A/**'
You could also run a script that discovers which services changed based on git diff and then trigger the corresponding job via the GitHub REST API.
There could be two workflows, main.yml and services.yml.
The main workflow would be configured to run on every push; it would only run a script to find out which services changed, and for each changed service it would trigger a repository dispatch event with the service name in the payload.
The services workflow would be configured to run on repository_dispatch and would contain one job for each service. Each job would have an additional condition based on the event payload.
See showcase with similar setup:
https://github.com/zladovan/monorepo
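A rough sketch of that two-workflow idea (the event type, token secret, and service names are hypothetical, and the diff logic assumes each service lives in a top-level folder):

# main.yml - runs on every push and dispatches one event per changed top-level folder
on: push
jobs:
  detect:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 2                            # need the previous commit for the diff
      - name: Dispatch per changed service
        env:
          GH_TOKEN: ${{ secrets.DISPATCH_TOKEN }}   # PAT with repo scope (assumption)
        run: |
          for svc in $(git diff --name-only HEAD^ HEAD | cut -d/ -f1 | sort -u); do
            curl -s -X POST \
              -H "Authorization: token $GH_TOKEN" \
              -H "Accept: application/vnd.github+json" \
              "https://api.github.com/repos/${GITHUB_REPOSITORY}/dispatches" \
              -d "{\"event_type\":\"deploy\",\"client_payload\":{\"service\":\"$svc\"}}"
          done

# services.yml - one job per service, gated on the payload
on:
  repository_dispatch:
    types: [deploy]
jobs:
  service_A:
    if: github.event.client_payload.service == 'service_A'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: echo "deploy service_A here"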
It's not a Monorepo
If you only have apps, then I'm sorry... but all you have is a repo of many apps.
A monorepo is a collection of packages that you can map a graph of dependencies between.
Aha, I have a monorepo
But if you have a collection of packages which depend on each other, then read on.
apps/
  one/
    depends:
      pkg/foo
  two/
    depends:
      pkg/bar
      pkg/foo
pkg/
  foo/
  bar/
  baz/
The answer is that you switch to a tool that can describe which packages have changed between the current git ref and some other git ref.
The following two examples run the release npm script on each package that changed under apps/* and all the packages they depend on.
I'm unsure if the pnpm method silently skips packages that don't have a release target/command/script.
Use NX Dev
Using NX.dev, it will work this out for you with its nx affected command.
You need an nx.json in the root of your monorepo.
This assumes you're using the package.json approach with nx.dev; if you have a project.json in each package, then the target would reside there.
Your CI would then look like:
pnpx nx affected --target=release
Pnpm Filtering
Your other option is to switch to pnpm and use its filtering syntax:
pnpm --filter "...{apps/**}[origin/master]" release
Naive Path Filtering
If you just rely on "which paths" changed in this git commit, then you miss out on transitive changes that affect the packages you actually want to deploy.
If you have a github action like:
on:
  push:
    paths:
      - 'app/**'
Then you will never get a build when you push commits that only change things under pkg/**.
Other interesting github actions
https://github.com/marketplace/actions/nx-check-changes
https://github.com/marketplace/actions/nx-affected-dependencies-action
https://github.com/marketplace/actions/nx-affected-list (a non-nx alternative here is dorny/paths-filter; see the sketch after this list)
https://github.com/marketplace/actions/nx-affected-matrix
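For the dorny/paths-filter alternative mentioned in that list, a minimal sketch (the filter name and path are hypothetical) could look like:

jobs:
  changes:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            service_A:
              - 'service_A/**'
      - name: Build service_A only when its files changed
        if: steps.filter.outputs.service_A == 'true'
        run: echo "build service_A here"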
Has Changed Path Action might be worth a try:
name: Conditional Deploy
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 100
      - uses: marceloprado/has-changed-path@v1
        id: service_A_deployment
        with:
          paths: service_A
      - name: Deploy service_A
        if: steps.service_A_deployment.outputs.changed == 'true'
        run: /deploy-service_A.sh

Azure DevOps - Build Automation

I have an Azure DevOps Git repo with many solutions in it, and am starting down the path of build, test, and deploy automation.
I figured out how to run a rebuild if any file changes in the repo.
However, since the repo has many solutions in it, I only want to run a given rebuild of a solution if a specific subfolder changes.
Is that possible, and if so, how do I accomplish this?
You can use path-based trigger filters (I'm fairly certain they are only supported in YAML builds). Example:
trigger:
  paths:
    include:
      - folder1/*
      - folder2/somefile
      - etc
Reading:
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
https://learn.microsoft.com/en-us/azure/devops/pipelines/create-first-pipeline?view=azure-devops&tabs=tfs-2018-2
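Putting it together for the one-rebuild-per-solution case, a minimal sketch would be one pipeline per solution, each filtered to that solution's folder (the solution path and task choices below are hypothetical):

# azure-pipelines-solution1.yml (one such pipeline per solution)
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - Solution1/*

pool:
  vmImage: 'windows-latest'

steps:
  - task: NuGetCommand@2
    inputs:
      restoreSolution: 'Solution1/Solution1.sln'
  - task: VSBuild@1
    inputs:
      solution: 'Solution1/Solution1.sln'
      configuration: 'Release'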