I am trying to run Terraform linting using GitHub Actions and can't figure out how to filter outputs. In the first job, I return a list of directories where Terraform files are found. In the second job, I need a condition so that terraform init is run only against directories containing modules. I tried an 'if' statement to filter out only those directories, but the condition seems to be ignored and the step runs for all directories.
jobs:
  collectInputs:
    name: Collect terraform directories
    runs-on: ubuntu-latest
    outputs:
      directories: ${{ steps.dirs.outputs.directories }}
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Get root directories
        id: dirs
        uses: clowdhaus/terraform-composite-actions/directories@main
      - name: Outputs
        run: echo "${{ steps.dirs.outputs.directories }}"
  tflint:
    name: tflint
    needs: collectInputs
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        directory: ${{ fromJson(needs.collectInputs.outputs.directories) }}
    steps:
      - name: Clone repo
        uses: actions/checkout@v2
      - name: show only directory with 'module' substring
        if: contains("${{ matrix.directory }}", 'module')
        run: echo "This directory contains string 'module'"
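For what it's worth, `if:` is already evaluated as an expression, so contexts can be referenced directly; wrapping the value in a quoted `${{ }}` string inside `contains()` is what typically makes the condition misbehave. A minimal sketch of the corrected step, assuming the same matrix as above:

```yaml
- name: show only directory with 'module' substring
  # Reference the matrix value directly; no quoted "${{ }}" inside the expression
  if: contains(matrix.directory, 'module')
  run: echo "This directory contains string 'module'"
```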
So what I am trying to do is organize my CI workflow. I have a self-hosted runner, which is my personal computer. Since the workflow file is getting too large, I am thinking about reducing its size by splitting it into different workflow files instead.
This is my file so far:
name: "Pipeline"
run-name: ${{ github.actor }} just pushed.
on:
  schedule:
    - cron: "0 2 * * *"
  push:
    branches-ignore:
      - "docs/*"
    tags:
      - "v*"
  workflow_dispatch:
    inputs:
      cc_platform:
        description: "Input of branch name to build metalayers."
        required: true
        default: "dev"
jobs:
  build:
    runs-on: self-hosted
    name: Build job.
    steps:
      - name: Checking out the repository.
        uses: actions/checkout@v3
      - name: Setting Credentials.
        uses: webfactory/ssh-agent@v0.6.0
        with:
          ssh-private-key: ${{ secrets.SSH_KEY_CC }}
      - name: Creating config file.
        run: touch conf
      - name: Removing old folders & running build script.
        run: |
          if [[ -d "my_folder" ]]; then rm -rf my_folder; fi
          ./build
      - name: Generate a zip file of artifacts.
        uses: actions/upload-artifact@v3
        with:
          name: art_${{ github.sha }}
          path: |
            .*.rootfs.wic.xz
            .*.rootfs.manifest
          if-no-files-found: error
So what I want to do is have the artifact upload and the build step in their own workflow files. But this gives me some concerns:
Will each workflow file run on its own runner?
Do they share memory?
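One common way to split a large workflow is reusable workflows: each piece becomes its own file with an `on: workflow_call` trigger, and a thin top-level workflow calls it as a job. Jobs never share in-memory state; on GitHub-hosted runners each job gets a fresh VM, while a single self-hosted runner processes jobs one at a time in the same work folder, so files must be passed explicitly as artifacts if you need isolation guarantees. A minimal sketch (file names are hypothetical):

```yaml
# .github/workflows/build.yml (hypothetical called workflow)
name: Build
on:
  workflow_call:
jobs:
  build:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - run: ./build
---
# .github/workflows/pipeline.yml (hypothetical caller)
name: Pipeline
on: [push]
jobs:
  call-build:
    uses: ./.github/workflows/build.yml
```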
I am pretty new to GitHub Actions workflows. I have the following question.
What I have:
Have a repo with subfolders folder1/dotnet and folder2/dotnet
What I want to Achieve:
I want to create a GitHub workflow that lints only folder1 and folder2 when new code is pushed to a specific folder.
Currently the code below lints the entire repo:
name: pr_dotnet
on:
  push:
    paths:
      - "folder1/dotnet/**"
      - "folder2/dotnet/**"
jobs:
  lint:
    name: Lint dotnet specific folders
    runs-on: ubuntu-latest
    strategy:
      matrix: { dir: ['/folder1/dotnet', 'folder2/dotnet'] }
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
        with:
          token: ${{ secrets.PAT || secrets.GITHUB_TOKEN }}
      - name: MegaLinter
        uses: oxsecurity/megalinter/flavors/dotnet@v6.12.0
        working-directory: ${{ matrix.dir }}
      - name: Archive linted artifacts
        if: ${{ success() }} || ${{ failure() }}
        uses: actions/upload-artifact@v2
        with:
          name: MegaLinter reports
          path: |
            megalinter-reports
            mega-linter.log
You can run MegaLinter with a sub-workspace as root using the variable DEFAULT_WORKSPACE:
- name: MegaLinter
  uses: oxsecurity/megalinter/flavors/dotnet@v6.12.0
  env:
    DEFAULT_WORKSPACE: ${{ matrix.dir }}
As MegaLinter won't browse higher than DEFAULT_WORKSPACE, you may need to define one .mega-linter.yml config file per root workspace, or use EXTENDS to store the shared configuration online.
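For example, a per-workspace `.mega-linter.yml` that pulls a shared remote config via EXTENDS could look like this (the repo URL is hypothetical):

```yaml
# folder1/dotnet/.mega-linter.yml (hypothetical location)
# EXTENDS merges a remote base config into this workspace's config
EXTENDS: https://raw.githubusercontent.com/my-org/lint-config/main/.mega-linter.yml
```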
I am trying to implement a GitHub Actions workflow with a job that will plan and apply my Terraform code changes only for the directory where changes were made. The problem I am currently facing is that I can't figure out how to switch directories so that terraform plan is executed from the directory where code has been updated/changed.
I have a monorepo setup which is as follow:
repo
tf-folder-1
tf-folder-2
tf-folder-3
Each folder contains an independent Terraform configuration. So, for example, I would like to run a workflow only when files change inside tf-folder-1. Such a workflow needs to switch to the working directory tf-folder-1 and then run terraform plan/apply.
jobs:
  terraform:
    name: "Terraform"
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./tf-folder-1
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Configure AWS credentials from Test account
        uses: aws-actions/configure-aws-credentials@v1
        with:
          role-to-assume: arn:aws:iam::000000000000000:role/deploy-role
          aws-region: eu-west-2
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
      ...
So far, I have the above Terraform job, but it only runs for a statically defined working-directory. It doesn't handle the use case where the workflow should run when changes happen within a specific folder. Can someone advise how to fix this pipeline?
Thanks
GitHub Actions has path filtering you can take advantage of when your workflows are triggered by a push or pull_request event.
For example, say you have a monorepo with the directories tf_1, tf_2, and tf_3. You can do something like the below for when changes occur in the directory tf_1:
name: Demonstrate GitHub Actions on Monorepo
on:
  push:
    branches:
      - master
    paths:
      - 'tf_1/**'
defaults:
  run:
    working-directory: tf_1
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
For more details on path filtering, please refer to the GitHub Actions syntax documentation.
You can use a GitHub action that outputs the directories where files have been changed/modified, for example this one: Changed-files, or even perform the calculation with a shell step using git diff.
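The git diff variant can be sketched in plain shell. Here the list of changed files is hardcoded to stand in for the output of `git diff --name-only <base> <head>` (the file names are hypothetical):

```shell
# Stand-in for: git diff --name-only "$BASE_SHA" "$HEAD_SHA"
changed_files='tf-folder-1/main.tf
tf-folder-1/variables.tf
tf-folder-2/outputs.tf'

# Reduce each file path to its parent directory and de-duplicate
changed_dirs=$(printf '%s\n' "$changed_files" | xargs -n1 dirname | sort -u)
echo "$changed_dirs"
```

The resulting list can be fed into a job matrix as shown elsewhere in this thread.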
If you use the suggested GHA, you can set the input dir_names to true, which outputs unique changed directories instead of filenames; based on those results you can change the directory in which to run your Terraform operations.
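A sketch of wiring that into a job output, assuming the tj-actions/changed-files action (the exact input and output names may differ between versions, so check the action's README):

```yaml
jobs:
  changed-dirs:
    runs-on: ubuntu-latest
    outputs:
      dirs: ${{ steps.changed.outputs.all_changed_files }}
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0   # full history so the diff base is available
      - id: changed
        uses: tj-actions/changed-files@v35
        with:
          dir_names: true  # emit unique directories instead of file names
          json: true       # JSON output suits fromJson() in a matrix
```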
Here is a solution to run multiple jobs based on the number of directories that have been updated.
In the snippet below, the directories job checks which directories have been updated and outputs an array, which is then used in a matrix strategy for the terraform job.
jobs:
  directories:
    name: "Directory-changes"
    runs-on: ubuntu-latest
    steps:
      - uses: theappnest/terraform-monorepo-action@master
        id: directories
        with:
          ignore: |
            aws/**/policies
            aws/**/templates
            aws/**/scripts
      - run: echo ${{ steps.directories.outputs.modules }}
    outputs:
      dirs: ${{ steps.directories.outputs.modules }}
  terraform:
    name: "Terraform"
    runs-on: ubuntu-latest
    needs: directories
    strategy:
      matrix:
        directories: ${{ fromJson(needs.directories.outputs.dirs) }}
    defaults:
      run:
        working-directory: ${{ matrix.directories }}
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          cli_config_credentials_token: ${{ secrets.TF_CLOUD_TEAM_API_TOKEN_PREPROD }}
      - name: Terraform Format
        id: fmt
        run: terraform fmt -check
      - name: Terraform Init
        id: init
        run: terraform init
      - name: Terraform Validate
        id: validate
        run: terraform validate -no-color
      - name: Terraform Plan
        id: plan
        if: github.event_name == 'pull_request'
        run: terraform plan -no-color -input=false
        continue-on-error: true
I have a workflow that checks files on push and pull_request.
It has 2 jobs: one to list the changed files that match a pattern (Dockerfiles), and a second job with a matrix strategy that is executed for every file.
The jobs:
jobs:
  get-files:
    name: Get changed files
    runs-on: ubuntu-latest
    outputs:
      dockerfiles: ${{ steps.filter.outputs.dockerfiles_files }}
    steps:
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          list-files: json
          filters: |
            dockerfiles:
              - "**/Dockerfile"
  check:
    name: Check Dockerfiles
    needs: get-files
    strategy:
      matrix:
        dockerfile: ${{ fromJson(needs.get-files.outputs.dockerfiles) }}
    runs-on: ubuntu-latest
    steps:
      - id: get-directory
        # Remove last path segment to only keep the Dockerfile directory
        run: |
          directory=$(echo ${{ matrix.dockerfile }} | sed -r 's/\/[^\/]+$//g')
          echo "::set-output name=directory::$directory"
      - run: echo "${{ steps.get-directory.outputs.directory }}"
      - uses: actions/checkout@v2
      - name: Build Dockerfile ${{ matrix.dockerfile }}
        run: docker build ${{ steps.get-directory.outputs.directory }} -f ${{ matrix.dockerfile }}
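The sed expression in the get-directory step just strips the last path segment; `dirname` does the same thing and is easier to read. A local sketch with a hypothetical path:

```shell
dockerfile='services/api/Dockerfile'   # hypothetical matrix value
# dirname drops the final path segment, leaving the Dockerfile's directory
directory=$(dirname "$dockerfile")
echo "$directory"
```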
My problem is that the "get-files" ("Get changed files") job appears as a check in the pull requests:
Is there any way to hide it in the PR checks?
If not, is there a better way to have a check per modified file?
I am trying to deploy to AWS using GitHub Actions. The only problem is that I have a main repo with frontend and backend submodules inside it.
This is the script I am using for deploy:
name: Deploy
on:
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source code
        uses: actions/checkout@v2
      - name: Generate deployment package
        run: |
          git submodule update --init --recursive
          zip -r deploy.zip . -x '*.git*'
      - name: Get timestamp
        uses: gerred/actions/current-time@master
        id: current-time
      - name: Run string replace
        uses: frabert/replace-string-action@master
        id: format-time
        with:
          pattern: '[:\.]+'
          string: "${{ steps.current-time.outputs.time }}"
          replace-with: '-'
          flags: 'g'
      - name: Deploy to EB
        uses: einaregilsson/beanstalk-deploy@v18
        with:
          aws_access_key: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_key: ${{ secrets.AWS_SECRET_KEY }}
          application_name: test-stage
          environment_name: Testenv-env
          version_label: "${{ steps.format-time.outputs.replaced }}"
          region: eu-center-1
          deployment_package: deploy.zip
The problem occurs while it is creating the zip: it does not include the submodules. Without the submodules the project contains almost nothing. Is it possible to include them somehow? Or do you have any better solutions for this?
Consulting the actions/checkout documentation, there is a submodules argument (default value false) that controls whether the checkout includes submodules. So you likely want:
steps:
  - name: Checkout source code
    uses: actions/checkout@v2
    with:
      submodules: true
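If the submodules themselves contain submodules, which the `git submodule update --init --recursive` in the question suggests, actions/checkout also accepts the value recursive for the same input:

```yaml
steps:
  - name: Checkout source code
    uses: actions/checkout@v2
    with:
      submodules: recursive  # also initialize nested submodules
```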