Execute GitHub Actions workflow/job in the directory where code was changed

I am trying to implement a GitHub Actions workflow with a job that plans and applies my Terraform changes, but only for the directory where changes were made. The problem I am currently facing is that I can't figure out how to switch directories so that terraform plan is executed from the directory where the code has been updated.
I have a monorepo set up as follows:
repo
  tf-folder-1
  tf-folder-2
  tf-folder-3
Each folder contains an independent Terraform configuration. For example, I would like to run a workflow only when files change inside tf-folder-1. Such a workflow needs to switch its working directory to tf-folder-1 and then run terraform plan/apply.
jobs:
  terraform:
    name: "Terraform"
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./tf-folder-1
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Configure AWS credentials from Test account
        uses: aws-actions/configure-aws-credentials@v1
        with:
          role-to-assume: arn:aws:iam::000000000000000:role/deploy-role
          aws-region: eu-west-2
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
      ...
So far, I have the above Terraform job, but it only runs with a statically defined working-directory. It doesn't cover the case where the workflow should run only when changes happen within a specific folder. Can someone advise how to fix this pipeline?
Thanks

GitHub Actions has path filtering you can take advantage of when your workflows are triggered by a push or pull_request event.
For example, say you have a monorepo with the directories tf_1, tf_2, and tf_3. You can do something like the following to run only when changes occur in the directory tf_1.
name: Demonstrate GitHub Actions on Monorepo
on:
  push:
    branches:
      - master
    paths:
      - 'tf_1/**'
defaults:
  run:
    working-directory: tf_1
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
For more details on path filtering, please refer to the GitHub Actions syntax documentation.

You can use a GitHub Action that outputs the directories where files have been changed/modified, for example this one: changed-files. You can also compute the same thing in a shell step using git diff.
If you use the suggested action, you can set the input dir_names to true, which outputs unique changed directories instead of filenames. Based on those results you can change into each directory and run your Terraform operations.
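As a sketch of that approach (assuming the tj-actions/changed-files action and its dir_names input; the job and step names here are illustrative, not from the original answer):

```yaml
jobs:
  plan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0              # full history so the file diff can be computed
      - name: Get changed directories
        id: changed
        uses: tj-actions/changed-files@v35
        with:
          dir_names: true             # output unique directories instead of file names
      - name: Run terraform in each changed directory
        run: |
          for dir in ${{ steps.changed.outputs.all_changed_files }}; do
            (cd "$dir" && terraform init -input=false && terraform plan -input=false)
          done
```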

Here is a solution that runs multiple jobs based on the number of directories that have been updated.
In the snippet below, the directories job checks which directories have been updated and outputs an array, which is then used in a matrix strategy for the terraform job.
jobs:
  directories:
    name: "Directory-changes"
    runs-on: ubuntu-latest
    steps:
      - uses: theappnest/terraform-monorepo-action@master
        id: directories
        with:
          ignore: |
            aws/**/policies
            aws/**/templates
            aws/**/scripts
      - run: echo ${{ steps.directories.outputs.modules }}
    outputs:
      dirs: ${{ steps.directories.outputs.modules }}
  terraform:
    name: "Terraform"
    runs-on: ubuntu-latest
    needs: directories
    strategy:
      matrix:
        directories: ${{ fromJson(needs.directories.outputs.dirs) }}
    defaults:
      run:
        working-directory: ${{ matrix.directories }}
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          cli_config_credentials_token: ${{ secrets.TF_CLOUD_TEAM_API_TOKEN_PREPROD }}
      - name: Terraform Format
        id: fmt
        run: terraform fmt -check
      - name: Terraform Init
        id: init
        run: terraform init
      - name: Terraform Validate
        id: validate
        run: terraform validate -no-color
      - name: Terraform Plan
        id: plan
        if: github.event_name == 'pull_request'
        run: terraform plan -no-color -input=false
        continue-on-error: true

Related

Github actions: Re-usable workflows

I am trying to organize my CI workflow. I have a self-hosted runner, which is my personal computer. Since the workflow file is getting too large, I am thinking about reducing its size by splitting it into several workflow files.
This is my file so far:
name: "Pipeline"
run-name: ${{ github.actor }} just pushed.
on:
  schedule:
    - cron: "0 2 * * *"
  push:
    branches-ignore:
      - "docs/*"
    tags:
      - "v*"
  workflow_dispatch:
    inputs:
      cc_platform:
        description: "Input of branch name to build metalayers."
        required: true
        default: "dev"
jobs:
  build:
    runs-on: self-hosted
    name: Build job.
    steps:
      - name: Checking out the repository.
        uses: actions/checkout@v3
      - name: Settings Credentials.
        uses: webfactory/ssh-agent@v0.6.0
        with:
          ssh-private-key: ${{ secrets.SSH_KEY_CC }}
      - name: Creating config file.
        run: touch conf
      - name: Removing old folders & running build script.
        run: |
          if [[ -d "my_folder" ]]; then rm -rf my_folder; fi
          ./build
      - name: Generate a zip file of artifacts.
        uses: actions/upload-artifact@v3
        with:
          name: art_${{ github.sha }}
          path: |
            .*.rootfs.wic.xz
            .*.rootfs.manifest
          if-no-files-found: error
So what I want to do is have the artifacts and the build step in their own workflow files. But that gives me some concerns:
Will each workflow file run on its own runner?
Do they share memory?
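On the two concerns: each job runs in its own job context and jobs do not share memory; with a single self-hosted runner, jobs are simply queued one at a time on that machine, so only files left on disk persist between them. One common way to split the file without duplicating setup is a reusable workflow called from the main pipeline. A minimal sketch (the file name and the ssh-key secret name are hypothetical):

```yaml
# .github/workflows/build-reusable.yml (hypothetical file name)
name: Build (reusable)
on:
  workflow_call:
    secrets:
      ssh-key:
        required: true
jobs:
  build:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - uses: webfactory/ssh-agent@v0.6.0
        with:
          ssh-private-key: ${{ secrets.ssh-key }}
      - run: ./build
```

The main Pipeline file would then invoke it as a job:

```yaml
jobs:
  build:
    uses: ./.github/workflows/build-reusable.yml
    secrets:
      ssh-key: ${{ secrets.SSH_KEY_CC }}
```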

GitHub Actions conditions on step

I am trying to run Terraform linting using GitHub Actions and can't figure out how to filter outputs. In the first job, I return a list of directories where Terraform files are found. In the second job, I need a condition so that terraform init is run only against directories containing modules. I tried an 'if' statement to filter out only those directories, but the condition seems to be ignored and the step runs for all directories.
jobs:
  collectInputs:
    name: Collect terraform directories
    runs-on: ubuntu-latest
    outputs:
      directories: ${{ steps.dirs.outputs.directories }}
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Get root directories
        id: dirs
        uses: clowdhaus/terraform-composite-actions/directories@main
      - name: Outputs
        run: echo "${{ steps.dirs.outputs.directories }}"
  tflint:
    name: tflint
    needs: collectInputs
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        directory: ${{ fromJson(needs.collectInputs.outputs.directories) }}
    steps:
      - name: Clone repo
        uses: actions/checkout@v2
      - name: show only directory with 'module' substring
        if: contains("${{ matrix.directory }}", 'module')
        run: echo "This directory contains string 'module'"
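The condition as written is the likely culprit: an if clause is already evaluated as an expression, so the `${{ }}` wrapper is unnecessary, and double quotes are not valid string delimiters in GitHub Actions expressions (strings must be single-quoted). Referencing the matrix value directly avoids both problems. A sketch of the corrected step:

```yaml
- name: show only directory with 'module' substring
  if: contains(matrix.directory, 'module')
  run: echo "This directory contains string 'module'"
```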

How to setup shared github action workflow with secrets and node modules

I have a main job like the one below, and two other parallel jobs depend on that first job's setup, including secret generation and node module installation.
I tried to make it work with needs, but all the environment setup is gone in the dependent jobs.
And reusable workflows seem to just set up keys.
name: build
on: [push]
jobs:
  codepull:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-node@v3
        with:
          node-version: '16.16.0'
      - name: install node module
        run: |
          yarn
      - name: secrets
        run: |
          yarn secrets
  codepull-ios:
    runs-on: ubuntu-latest
    steps:
      - name: build ios
        run: |
          ...
  codepull-android:
    runs-on: ubuntu-latest
    steps:
      ...
I looked at reusable workflows, but those seem to be only for setting up environment variables. Has anyone tried to do something similar?
Jobs run in their own environments, so they share nothing by default.
But you can define a job output and use it in the dependent job, like:
name: build
on: [push]
jobs:
  codepull:
    runs-on: ubuntu-latest
    outputs: # define here the job output
      one-secret: ${{ steps.secret.outputs.my-secret }}
    steps:
      - uses: actions/setup-node@v3
        with:
          node-version: '16.16.0'
      - name: install node module
        run: |
          yarn
      - name: secrets
        id: secret
        run: |
          yarn secrets
          secretkey=$(cat password.txt) # contrived example: take some var from somewhere
          echo "my-secret=$secretkey" >> "$GITHUB_OUTPUT" # the older ::set-output command is deprecated
  codepull-ios:
    runs-on: ubuntu-latest
    needs:
      - codepull
    steps:
      - name: build ios
        run: |
          echo ${{ needs.codepull.outputs.one-secret }} # use the secret
          ...
  codepull-android:
    runs-on: ubuntu-latest
    steps:
      ...
See the documentation on defining job outputs for more details.

Github Actions reusable workflow access repo issues

I moved some common CI work (linting, static checks, etc.) into its own repo, which multiple repos then use to avoid duplication. The issue I am having is that the checks obviously need to be carried out on the repo that invokes the workflow. How is this made possible? When the common workflow is executed, it has no access to the contents of the calling repo; it only checks out itself.
Example source repo:
name: Perform Pre Build Check
on:
  push:
  workflow_dispatch:
jobs:
  checks:
    uses: <org>/<common-repo>/.github/workflows/checks.yml@main
Common workflow:
name: Perform Pre-Build Checks
on:
  workflow_call:
jobs:
  formatting-check:
    name: Formatting Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Run linting check
        run: xxxxxx
      - name: Install cppcheck
        run: sudo apt-get -y install cppcheck
      - name: Run cppcheck
        run: xxxxx
        continue-on-error: true
This is what I ended up doing. Not sure if it was the right approach, but it worked. Basically I just pass in the repo and ref name from the current repo that triggered the workflow.
Example source repo:
name: Perform Pre Build Check
on:
  push:
  workflow_dispatch:
jobs:
  checks:
    uses: <org>/<common-repo>/.github/workflows/checks.yml@main
    with:
      repo-name: ${{ github.repository }}
      ref-name: ${{ github.ref_name }}
Common workflow:
name: Perform Pre-Build Checks
on:
  workflow_call:
    inputs:
      repo-name:
        required: true
        type: string
      ref-name:
        required: true
        type: string
jobs:
  formatting-check:
    name: Formatting Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          repository: ${{ inputs.repo-name }}
          ref: ${{ inputs.ref-name }}
      - name: Run linting check
        run: xxxxxx
      - name: Install cppcheck
        run: sudo apt-get -y install cppcheck
      - name: Run cppcheck
        run: xxxxx
        continue-on-error: true

How to make a zip including submodules with Github actions?

I am trying to deploy to AWS using GitHub Actions. The only problem is that I have a main repo with frontend and backend submodules inside it.
This is the script I am using to deploy:
name: Deploy
on:
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source code
        uses: actions/checkout@v2
      - name: Generate deployment package
        run: |
          git submodule update --init --recursive
          zip -r deploy.zip . -x '*.git*'
      - name: Get timestamp
        uses: gerred/actions/current-time@master
        id: current-time
      - name: Run string replace
        uses: frabert/replace-string-action@master
        id: format-time
        with:
          pattern: '[:\.]+'
          string: "${{ steps.current-time.outputs.time }}"
          replace-with: '-'
          flags: 'g'
      - name: Deploy to EB
        uses: einaregilsson/beanstalk-deploy@v18
        with:
          aws_access_key: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_key: ${{ secrets.AWS_SECRET_KEY }}
          application_name: test-stage
          environment_name: Testenv-env
          version_label: "${{ steps.format-time.outputs.replaced }}"
          region: eu-center-1
          deployment_package: deploy.zip
The problem is that the zip it creates does not include the submodules. Without the submodules the project contains almost nothing. Is it possible to include them somehow? Or do you have a better solution for this?
Consulting the actions/checkout documentation, there is a submodules argument (default value false) that controls whether the checkout includes submodules. So you likely want:
steps:
  - name: Checkout source code
    uses: actions/checkout@v2
    with:
      submodules: true
(If the submodules themselves contain submodules, use submodules: recursive instead.)