GitHub Actions: Reusable workflows

So what I am trying to do is to organize my CI workflow. I have a self-hosted runner that is my personal computer. Since the workflow file is getting too large, I am thinking about reducing its size by splitting it into different workflow files instead.
This is my file so far:
name: "Pipeline"
run-name: ${{ github.actor }} just pushed.
on:
schedule:
- cron: "0 2 * * *"
push:
branches-ignore:
- "docs/*"
tags:
- "v*"
workflow_dispatch:
inputs:
cc_platform:
description: "Input of branch name to build metalayers."
required: true
default: "dev"
jobs:
build:
runs-on: self-hosted
name: Build job.
steps:
- name: Checking out the repository.
uses: actions/checkout#v3
- name: Settings Credentials.
uses: webfactory/ssh-agent#v0.6.0
with:
ssh-private-key: ${{ secrets.SSH_KEY_CC }}
- name: Creating config file.
run:
touch conf
- name: Removing old folders & running build script.
run: |
if [[ -d "my_folder" ]]; then rm -rf my_folder; fi
./build
- name: Generate a zip file of artifacts.
uses: actions/upload-artifact#v3
with:
name: art_${{ github.sha }}
path: |
.*.rootfs.wic.xz
.*.rootfs.manifest
if-no-files-found: error
So what I want to do is to have the artifacts and the build step in their own workflow files. But it does give me some concerns:
Will each workflow file get run on its own runner?
Do they share memory?
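For context, a minimal sketch of how the build part could be pulled into a reusable workflow; the file names (build.yml, pipeline.yml) are placeholders I've made up, and the caller passes the secret through explicitly:

# .github/workflows/build.yml (hypothetical split)
name: Build
on:
  workflow_call:
    secrets:
      SSH_KEY_CC:
        required: true
jobs:
  build:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - uses: webfactory/ssh-agent@v0.6.0
        with:
          ssh-private-key: ${{ secrets.SSH_KEY_CC }}
      - run: ./build

# .github/workflows/pipeline.yml (caller)
jobs:
  build:
    uses: ./.github/workflows/build.yml
    secrets:
      SSH_KEY_CC: ${{ secrets.SSH_KEY_CC }}

Reusable workflows called this way run as jobs of the calling run. Each job gets its own job environment, so jobs do not share memory; on a single self-hosted runner they simply queue onto the same machine, and data is normally passed between them with upload-artifact/download-artifact.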

Related

GitHub Action pointing to a different directory than the current directory

When running an action during a pull request, the folder created by mkdir ended up in a different path than the current directory set by a previous step. This also carries over to the next steps, as follows:
name: Publish demo
on:
  push:
    branches:
      - 'develop'
  pull_request:
    branches:
      - 'develop'
jobs:
  web-deploy:
    name: Deploy
    runs-on: windows-latest
    steps:
      - name: Get latest code
        uses: actions/checkout@v3
      - name: Setup MSBuild
        uses: microsoft/setup-msbuild@v1
      - name: Setup NuGet
        uses: NuGet/setup-nuget@v1.1.1
      - name: Navigate to Workspace
        run: cd ${{ github.workspace }}/demo_project
      - name: Create Build Directory
        run: mkdir _build
      - name: Restore Packages
        run: nuget restore demo_project.csproj
      - name: Build Solution
        run: |
          msbuild.exe demo_project.csproj /nologo /nr:false /p:DeployOnBuild=true /p:DeployDefaultTarget=WebPublish /p:WebPublishMethod=FileSystem /p:DeleteExistingFiles=True /p:platform="Any CPU" /p:configuration="Release" /p:PublishUrl="_build"
      - name: Sync files
        uses: SamKirkland/FTP-Deploy-Action@4.3.3
        with:
          local-dir: "_build"
          server: <server>
          username: <username>
          password: ${{ secrets.password }}
So, in this line..
run: mkdir _build
the _build folder should be created in demo_project, but instead it gets created in ${{ github.workspace }}, which I think means that setting the current directory here..
run: cd ${{ github.workspace }}/demo_project
which in turn prints out this in job view..
Run cd D:\a\demo_solution\demo_solution/demo_project
and..
Input file does not exist: D:\demo_project.csproj.
did not take effect. So, what am I missing?
You could specify a different working directory for the whole job: you can provide default shell and working-directory options for all run steps in a job.
jobs:
  web-deploy:
    name: Deploy
    runs-on: windows-latest
    defaults:
      run:
        shell: bash
        working-directory: ${{ github.workspace }}/demo_project
    steps:
      - name: Get latest code
        uses: actions/checkout@v3
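Alternatively, if only a few steps need it, working-directory can also be set on an individual run step rather than as a job-wide default; a small sketch reusing the same demo_project path:

      - name: Restore Packages
        run: nuget restore demo_project.csproj
        working-directory: ${{ github.workspace }}/demo_project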

Lint specific folders using MegaLinter when there is a new push to those folders in GitHub Actions

I am pretty new to GitHub Actions workflows. I have the following question.
What I have:
A repo with subfolders folder1/dotnet and folder2/dotnet.
What I want to achieve:
I want to create a GitHub workflow which lints only folder1 and folder2 when new code is pushed to the specific folder.
Currently the code below lints the entire repo:
name: pr_dotnet
on:
  push:
    paths:
      - "folder1/dotnet/**"
      - "folder2/dotnet/**"
jobs:
  lint:
    name: Lint dotnet specific folders
    runs-on: ubuntu-latest
    strategy:
      matrix: { dir: ['/folder1/dotnet', 'folder2/dotnet'] }
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
        with:
          token: ${{ secrets.PAT || secrets.GITHUB_TOKEN }}
      - name: MegaLinter
        uses: oxsecurity/megalinter/flavors/dotnet@v6.12.0
        working-directory: ${{ matrix.dir }}
      - name: Archive linted artifacts
        if: ${{ success() }} || ${{ failure() }}
        uses: actions/upload-artifact@v2
        with:
          name: MegaLinter reports
          path: |
            megalinter-reports
            mega-linter.log
You can run MegaLinter with a sub-workspace as root using the variable DEFAULT_WORKSPACE:
DEFAULT_WORKSPACE: mega-linter-runner
- name: MegaLinter
  uses: oxsecurity/megalinter/flavors/dotnet@v6.12.0
  env:
    DEFAULT_WORKSPACE: ${{ matrix.dir }}
As MegaLinter won't browse any higher than DEFAULT_WORKSPACE, you may need to define one .mega-linter.yml config file per root workspace, or use EXTENDS to store the shared configuration online.
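Putting that together with the matrix from the question, a rough sketch of what the lint job might look like (assuming DEFAULT_WORKSPACE accepts a path relative to the checkout, as the matrix values are):

jobs:
  lint:
    name: Lint dotnet specific folders
    runs-on: ubuntu-latest
    strategy:
      matrix: { dir: ['folder1/dotnet', 'folder2/dotnet'] }
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: MegaLinter
        uses: oxsecurity/megalinter/flavors/dotnet@v6.12.0
        env:
          DEFAULT_WORKSPACE: ${{ matrix.dir }}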

Execute GitHub Actions workflow/job in the directory where code was changed

I am trying to implement a GitHub Actions workflow with a job which will plan and apply my Terraform code changes only for the directory where changes were made. The problem I am currently facing is that I can't figure out how to switch directories so that terraform plan is executed from the directory where code has been updated/changed.
I have a monorepo setup which is as follows:
repo
  tf-folder-1
  tf-folder-2
  tf-folder-3
Each folder contains an independent Terraform configuration. So, for example, I would like to run a workflow only when files change inside tf-folder-1. Such a workflow needs to switch to the working directory tf-folder-1 and then run terraform plan/apply.
jobs:
  terraform:
    name: "Terraform"
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./tf-folder-1
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Configure AWS credentials from Test account
        uses: aws-actions/configure-aws-credentials@v1
        with:
          role-to-assume: arn:aws:iam::000000000000000:role/deploy-role
          aws-region: eu-west-2
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
      ...
So far, I have the above terraform job, but it only runs with a statically defined working-directory. It doesn't cover the use case where the workflow should run when changes happen within a specific folder. Can someone advise how to fix this pipeline?
Thanks
GitHub Actions has path filtering you can take advantage of when you are working with workflows that are triggered off a push or pull_request event.
For example, say you have a monorepo with the directories tf_1, tf_2, and tf_3. You can do something like the below for when changes occur in the directory tf_1.
name: Demonstrate GitHub Actions on Monorepo
on:
  push:
    branches:
      - master
    paths:
      - 'tf_1/**'
defaults:
  run:
    working-directory: tf_1
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
For more details on path filtering, please refer to the GitHub Actions syntax documentation.
You can use a GitHub action that outputs the directories where files have been changed/modified, for example Changed-files, or even perform the calculation with a shell step using git diff.
If you use the suggested GHA, you can set the input dir_names to true, which outputs unique changed directories instead of filenames; based on those results you can change the directory in which to run your Terraform operations.
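A rough sketch of that approach, looping over the changed directories in a shell step rather than a matrix; the dir_names input and all_changed_files output come from the changed-files README, and the pinned version here is only an example, so verify both against the release you use:

      - uses: actions/checkout@v3
        with:
          fetch-depth: 0   # full history so the action can diff against earlier commits
      - name: Get changed directories
        id: changed
        uses: tj-actions/changed-files@v35
        with:
          dir_names: "true"
      - name: Plan in each changed directory
        run: |
          for dir in ${{ steps.changed.outputs.all_changed_files }}; do
            (cd "$dir" && terraform init -input=false && terraform plan -input=false)
          done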
Here is a solution to run multiple jobs based on the number of directories that have been updated.
In the snippet below, the directories job checks which directories have been updated and then outputs an array, which is used in the matrix strategy of the terraform job.
jobs:
  directories:
    name: "Directory-changes"
    runs-on: ubuntu-latest
    steps:
      - uses: theappnest/terraform-monorepo-action@master
        id: directories
        with:
          ignore: |
            aws/**/policies
            aws/**/templates
            aws/**/scripts
      - run: echo ${{ steps.directories.outputs.modules }}
    outputs:
      dirs: ${{ steps.directories.outputs.modules }}
  terraform:
    name: "Terraform"
    runs-on: ubuntu-latest
    needs: directories
    strategy:
      matrix:
        directories: ${{ fromJson(needs.directories.outputs.dirs) }}
    defaults:
      run:
        working-directory: ${{ matrix.directories }}
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          cli_config_credentials_token: ${{ secrets.TF_CLOUD_TEAM_API_TOKEN_PREPROD }}
      - name: Terraform Format
        id: fmt
        run: terraform fmt -check
      - name: Terraform Init
        id: init
        run: terraform init
      - name: Terraform Validate
        id: validate
        run: terraform validate -no-color
      - name: Terraform Plan
        id: plan
        if: github.event_name == 'pull_request'
        run: terraform plan -no-color -input=false
        continue-on-error: true

GitHub Actions conditions on step

I am trying to run Terraform linting using a GitHub action and can't figure out how to filter outputs. In the first job, I return a list of directories where Terraform files are found. In the second job, I need a condition so that only directories containing modules have terraform init run against them. I tried an 'if' statement to filter out only those directories, but it seems the condition is ignored and the step runs for all directories.
jobs:
  collectInputs:
    name: Collect terraform directories
    runs-on: ubuntu-latest
    outputs:
      directories: ${{ steps.dirs.outputs.directories }}
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Get root directories
        id: dirs
        uses: clowdhaus/terraform-composite-actions/directories@main
      - name: Outputs
        run: echo "${{ steps.dirs.outputs.directories }}"
  tflint:
    name: tflint
    needs: collectInputs
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        directory: ${{ fromJson(needs.collectInputs.outputs.directories) }}
    steps:
      - name: Clone repo
        uses: actions/checkout@v2
      - name: show only directory with 'module' substring
        if: contains("${{ matrix.directory }}", 'module')
        run: echo "This directory contains string 'module'"

GitHub PR checks showing jobs instead of workflows

I have a workflow that checks files on push and pull_request.
It has 2 jobs: one to list the changed files that match a pattern (Dockerfiles), and a second job with a matrix strategy that is executed for every file.
The jobs:
jobs:
  get-files:
    name: Get changed files
    runs-on: ubuntu-latest
    outputs:
      dockerfiles: ${{ steps.filter.outputs.dockerfiles_files }}
    steps:
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          list-files: json
          filters: |
            dockerfiles:
              - "**/Dockerfile"
  check:
    name: Check Dockerfiles
    needs: get-files
    strategy:
      matrix:
        dockerfile: ${{ fromJson(needs.get-files.outputs.dockerfiles) }}
    runs-on: ubuntu-latest
    steps:
      - id: get-directory
        # Remove last path segment to only keep the Dockerfile directory
        run: |
          directory=$(echo ${{ matrix.dockerfile }} | sed -r 's/\/[^\/]+$//g')
          echo "::set-output name=directory::$directory"
      - run: echo "${{ steps.get-directory.outputs.directory }}"
      - uses: actions/checkout@v2
      - name: Build Dockerfile ${{ matrix.dockerfile }}
        run: docker build ${{ steps.get-directory.outputs.directory }} -f ${{ matrix.dockerfile }}
My problem is that the "get-files" ("Get changed files") job appears as a check in the pull requests.
Is there any way to hide it in the PR checks?
If not, is there a better way to have a check per modified file?