GitHub Runner Reusable workflow report error - github

I am using a reusable workflow, and the calling workflow does not report a failure if the reused workflow fails.
If I call the reused workflow directly, the failure occurs and gets reported, which is the expected behaviour.
What needs to be done to be able to report a failure in the workflow?
Calling Workflow
name: Deploy to dev
on:
  push:
    branches:
      - 'main'
permissions:
  id-token: write
  contents: read
  pull-requests: write
jobs:
  call-workflow-passing-data:
    uses: ./.github/workflows/deploy-to-env.yml
    with:
      environment: dev
    secrets: inherit
    continue-on-error: false
Reused Workflow
name: Deploy to Environment
on:
  workflow_call:
    inputs:
      environment:
        required: true
        type: string
permissions:
  id-token: write
  contents: read
  pull-requests: write
jobs:
  validate:
    runs-on:
      - my-custom-runner
    name: Apply Terraform
    environment:
      name: ${{ inputs.environment }}
    env:
      TF_VAR_environment: dev
    steps:
      - name: Checkout this repo
        uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 14
      - uses: hashicorp/setup-terraform@v2
        with:
          terraform_version: 1.2.3
      - name: Terraform fmt
        id: fmt
        run: terraform fmt -check
        continue-on-error: false
      - name: Terraform Init
        id: init
        run: terraform -chdir=terraform/src init
      - name: Terraform Plan
        id: plan
        run: terraform plan
        continue-on-error: false
      - name: Terraform Apply
        id: apply
        run: terraform apply
        continue-on-error: false
      - name: Terraform outputs
        id: outputs
        run: terraform output -json
        continue-on-error: true
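One workaround worth trying (my sketch, not from the original post): the result of the reusable job is exposed to the caller through the needs context, so a follow-up job in the calling workflow can explicitly fail the run when the called workflow fails. The report-failure job name and the ubuntu-latest runner below are assumptions for illustration.
jobs:
  call-workflow-passing-data:
    uses: ./.github/workflows/deploy-to-env.yml
    with:
      environment: dev
    secrets: inherit
  report-failure:
    # hypothetical follow-up job: turns a failed reusable run into a visible failure of this workflow
    needs: call-workflow-passing-data
    if: ${{ always() && needs.call-workflow-passing-data.result == 'failure' }}
    runs-on: ubuntu-latest
    steps:
      - run: exit 1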

Related

GitHub Actions Map configuration in Matrix to input having only a single execution

I have a GitHub Actions workflow that will be called via either workflow_call or workflow_dispatch. It takes an environment input, and I want to map that to details about this environment.
I tried...
Example workflow:
name: Deploy Console
on:
  workflow_dispatch:
    inputs:
      environment:
        description: "The environment you are releasing to"
        required: true
        type: environment
      release-tag:
        description: "The tag that you are releasing"
        required: true
        type: string
  workflow_call: # Only used for Dev.
    inputs:
      environment:
        description: "The environment you are releasing to"
        required: true
        type: string
      release-tag:
        description: "The tag that you are releasing"
        required: true
        type: string
jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
    environment:
      name: ${{ inputs.environment }}
      url: https://console.${{ inputs.environment == 'prod' && '' || format('{0}.', inputs.environment) }}website.com
    env:
      ARTIFACT_FILENAME: website-console-${{ matrix.angular-build }}-${{ inputs.release-tag }}.tgz
    strategy:
      matrix:
        environment:
          - ${{ inputs.environment }}
        include:
          - environment: dev
            s3-bucket: website-dev-console
            cloudfront-distribution: 10101010101010
            angular-build: dev
          - environment: smoke
            s3-bucket: website-smoke-console
            cloudfront-distribution: 10101010101011
            angular-build: prod
          - environment: prod
            s3-bucket: website-prod-console
            cloudfront-distribution: 10101010101012
            angular-build: prod
    steps:
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          role-to-assume: arn:aws:iam::123412341234:role/github-oidc-console
          aws-region: us-west-2
      - uses: robinraju/release-downloader@v1.6
        with:
          tag: ${{ inputs.release-tag }}
          fileName: ${{ env.ARTIFACT_FILENAME }}
      - run: tar -xzf ${{ env.ARTIFACT_FILENAME }}
      - run: aws s3 sync dist s3://${{ matrix.s3-bucket }} --quiet
      - run: aws cloudfront create-invalidation --distribution-id ${{ matrix.cloudfront-distribution }} --paths '/*'
My goal is that when the workflow is called, there is one-and-only-one job spawned. I want some map of configuration that is first-class yaml rather than using an additional step in the job to define it. It doesn't need to be written in matrix form, but that's what I thought would work, although this iteration has the workflow spawning 3 jobs, one for each environment.
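One pattern that might give exactly one job with a first-class configuration map (this is a sketch I am adding, not from the original post) is to keep the per-environment details as a JSON map in a job-level env value and resolve it with fromJSON, so no matrix is needed. The CONFIG name is hypothetical; the values are taken from the question's matrix.
jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      # hypothetical map, keyed by the environment input
      CONFIG: |
        {
          "dev":   { "s3-bucket": "website-dev-console",   "angular-build": "dev"  },
          "smoke": { "s3-bucket": "website-smoke-console", "angular-build": "prod" },
          "prod":  { "s3-bucket": "website-prod-console",  "angular-build": "prod" }
        }
    steps:
      - name: Resolve configuration for the selected environment
        run: |
          echo "bucket=${{ fromJSON(env.CONFIG)[inputs.environment]['s3-bucket'] }}"
          echo "build=${{ fromJSON(env.CONFIG)[inputs.environment]['angular-build'] }}"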

Github action expressions

Working on Github actions for the first time.
In my .yml file I have the following
on:
  workflow_dispatch:
    branches:
      - main
    inputs:
      environment:
        type: choice
        description: 'Select environment to deploy in'
        required: true
        options:
          - dev
          - non-prod
          - prod
          - staging
Based on the selected option I need to run the following.
For staging:
- name: build
  run: CI=false yarn build-staging
For non-prod:
- name: build
  run: CI=false yarn build
Could you please provide me with some pointers on how this can be achieved?
The simplest way to go about it would be to use an if condition on the jobs within your workflow, for example:
on:
  workflow_dispatch:
    branches:
      - main
    inputs:
      environment:
        type: choice
        description: 'Select environment to deploy in'
        required: true
        options:
          - dev
          - non-prod
          - prod
          - staging
jobs:
  staging:
    runs-on: ubuntu-latest
    if: inputs.environment == 'staging'
    steps:
      - name: build
        run: CI=false yarn build-staging
  prod:
    runs-on: ubuntu-latest
    if: inputs.environment == 'prod'
    steps:
      - name: build
        run: CI=false yarn build
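An alternative sketch (my addition, assuming only the yarn script differs between environments) keeps a single job and picks the command with an expression instead of duplicating jobs:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: build
        # hypothetical single-job variant: selects the yarn script based on the chosen environment
        run: CI=false yarn ${{ inputs.environment == 'staging' && 'build-staging' || 'build' }}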

How to configure manual approval between terraform plan and apply while using Github environment

I'm using a GitHub Environment to deploy into my testing account before merging to master. I have specified the environment keyword in my workflow as "testing". My workflow is triggered on a push to the test branch, which then runs plan and apply against the testing account. I would like to have a manual approval after plan runs, so I can check the plan output before approving the deploy into my test account. How can I configure that?
name: Testing Environment
on:
  push:
    branches:
      - test
jobs:
  plan&apply:
    name: "Run Terragrunt Init, Plan and Apply"
    runs-on: ubuntu-20.04
    environment: testing
    defaults:
      run:
        working-directory: ${{ env.TERRAFORM_WORKING_DIR }}
    steps:
      - name: 'Checkout'
        uses: actions/checkout@v2
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1.3.2
        with:
          terraform_version: ${{ env.TERRAFORM_VERSION }}
          terraform_wrapper: true
      - name: Setup Terragrunt
        uses: autero1/action-terragrunt@v1.1.0
        with:
          terragrunt_version: ${{ env.TERRAGRUNT_VERSION }}
      - name: configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1.6.1
        with:
          aws-region: us-east-1
          role-to-assume: ${{ env.ORCHESTRATION_ROLE_ARN }}
      - name: Terragrunt Init
        id: init
        run: terragrunt run-all init -no-color --terragrunt-non-interactive
      - name: Terragrunt Plan
        id: plan
        run: |
          terragrunt run-all plan -no-color --terragrunt-non-interactive >/dev/null -out=tfplan
      - name: terragrunt Apply
        id: apply
        run: terragrunt run-all apply -no-color --terragrunt-non-interactive
        continue-on-error: true
There are two ways to do this.
Approach 1:
In your GitHub Actions environment settings, add required reviewers. Create two jobs, plan and apply, and add "needs" to the apply job. This approach also requires uploading the plan output as an artifact, since plan and apply run as two separate jobs.
name: Testing Environment
on:
  push:
    branches:
      - test
jobs:
  plan:
    name: "Run Terragrunt Plan"
    runs-on: ubuntu-20.04
    defaults:
      run:
        working-directory: ${{ env.TERRAFORM_WORKING_DIR }}
    steps:
      - name: 'Checkout'
        uses: actions/checkout@v2
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1.3.2
        with:
          terraform_version: ${{ env.TERRAFORM_VERSION }}
          terraform_wrapper: true
      - name: Setup Terragrunt
        uses: autero1/action-terragrunt@v1.1.0
        with:
          terragrunt_version: ${{ env.TERRAGRUNT_VERSION }}
      - name: configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1.6.1
        with:
          aws-region: us-east-1
          role-to-assume: ${{ env.ORCHESTRATION_ROLE_ARN }}
      - name: Terragrunt Init
        id: init
        run: terragrunt run-all init -no-color --terragrunt-non-interactive
      - name: Create Artifact Folder
        shell: bash
        run: |
          sudo mkdir -p -m777 ${{ github.workspace }}/tfplanoutput
      - name: Terragrunt Plan
        id: plan
        run: |
          terragrunt run-all plan -no-color --terragrunt-non-interactive >/dev/null -out=${{ github.workspace }}/tfplanoutput/tf.plan
      - name: Upload Artifact
        uses: actions/upload-artifact@v3
        with:
          name: artifact
          path: ${{ github.workspace }}/tfplanoutput/
          if-no-files-found: error
  apply:
    name: "Run Terragrunt Apply"
    needs: plan
    runs-on: ubuntu-20.04
    environment: testing
    steps:
      - name: 'Checkout'
        uses: actions/checkout@v2
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1.3.2
        with:
          terraform_version: ${{ env.TERRAFORM_VERSION }}
          terraform_wrapper: true
      - name: Setup Terragrunt
        uses: autero1/action-terragrunt@v1.1.0
        with:
          terragrunt_version: ${{ env.TERRAGRUNT_VERSION }}
      - name: configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1.6.1
        with:
          aws-region: us-east-1
          role-to-assume: ${{ env.ORCHESTRATION_ROLE_ARN }}
      - name: Terragrunt Init
        id: init
        run: terragrunt run-all init -no-color --terragrunt-non-interactive
      - name: Download Build Artifact
        uses: actions/download-artifact@v3
        with:
          name: artifact
          path: ${{ github.workspace }}/tfplanoutput
      - name: terragrunt Apply
        run: terragrunt run-all apply tf.plan -no-color --terragrunt-non-interactive
        continue-on-error: true
Approach 2:
You can create composite actions for plan and apply, same as above.
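For illustration, a minimal composite-action sketch (my addition; the file path, action name, and input are hypothetical), e.g. .github/actions/terragrunt-plan/action.yml in the shared repo:
name: 'Terragrunt Plan'
description: 'Runs terragrunt run-all plan'
inputs:
  working-directory:
    description: 'Directory containing the Terragrunt configuration'
    required: true
runs:
  using: 'composite'
  steps:
    - name: Terragrunt Plan
      shell: bash
      working-directory: ${{ inputs.working-directory }}
      run: terragrunt run-all plan -no-color --terragrunt-non-interactive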
Hope this helps!!!

Download terraform plan as a file from GitHub

I am working on a GitHub Actions pipeline where I create a terraform plan and then, after downloading and reviewing the plan file, approve the apply stage. Everything is working smoothly; I get a plan that I save as a txt file using the 'out' flag, but I am not able to figure out how to download the plan file from the runner to my local machine, or even save it as an artifact. Please help me out if there is a workaround.
name: 'Terraform PR'
on:
  push:
    branches:
      - main
  pull_request:
jobs:
  terraform:
    name: 'Terraform'
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: infrastructure/env/dev-slb-alpha/dev
    permissions:
      id-token: write
      contents: write
    steps:
      - name: Clone Repository (Latest)
        uses: actions/checkout@v2
        if: github.event.inputs.git-ref != ''
      - name: Clone Repository (Custom Ref)
        uses: actions/checkout@master
        if: github.event.inputs.git-ref == ''
        with:
          ref: ${{ github.event.inputs.git-ref }}
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@master
        with:
          role-to-assume: arn:aws:iam::262267462662:role/slb-dev-github-actions
          aws-region: us-east-1
          # role-session-name: GithubActionsSession
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1
        with:
          terraform_version: 1.1.2
      - name: Terraform Format
        id: fmt
        run: terraform fmt -check
      - name: Terraform Init
        id: init
        run: |
          # cd infrastructure/env/dev-slb-alpha/dev
          terraform init
      - name: Terraform Validate
        id: validate
        run: terraform validate -no-color
      - name: Terraform Plan
        id: plan
        if: github.event_name == 'pull_request'
        continue-on-error: true
        run: |
          # cd infrastructure/env/dev-slb-alpha/dev
          touch tfplan.txt
          # terraform force-unlock -force d5f2d86a-e0f6-222f-db3f-2c1d792ed528
          # terraform force-unlock -force QOCDA86JVO02CCFV3SB010RGP3VV4KQNSO5AEMVJF66Q9ASUAAJG
          terraform plan -lock=false -input=false -out=tfplan.txt
          readlink -f tfplan.txt
      - name: terraform plan upload
        uses: actions/upload-artifact@v2
        with:
          name: plan
          path: tfplan.txt
          retention-days: 5
      - uses: actions/download-artifact@v3
        with:
          name: my-plan
          path: tfplan.txt
      - name: Terraform Apply
        id: apply
        if: github.event_name == 'pull_request'
        run: |
          cd infrastructure/env/dev-slb-alpha/dev
          terraform force-unlock -force 8de3f689-282e-12fd-72b2-cdd27f94e4da
          terraform apply
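A hedged observation on the snippet above (my note, not from the original post): the upload step names the artifact plan, while the download-artifact step asks for my-plan, so the two names likely need to match before the download succeeds. Once uploaded, the plan also appears under the run's Artifacts section in the GitHub UI and can be fetched to a local machine with the GitHub CLI (gh run download <run-id> --name plan), assuming the CLI is installed. A minimal aligned pair of steps might look like this, where plan-output is a hypothetical destination directory:
- name: terraform plan upload
  uses: actions/upload-artifact@v2
  with:
    name: plan
    path: tfplan.txt
    retention-days: 5
- uses: actions/download-artifact@v3
  with:
    name: plan
    # hypothetical directory the artifact is extracted into
    path: plan-output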

Github Actions reusable workflow access repo issues

I moved some common CI work into its own repo for linting, static checks, etc. Multiple repos will then use this to avoid duplication. The issue I am having is that the checks obviously need to be carried out on the repo that invokes the workflow. How is this made possible? When the common workflow is executed, it has no access to the contents of the initial repo; it only checks out itself.
Example source repo:
name: Perform Pre Build Check
on:
  push:
  workflow_dispatch:
jobs:
  checks:
    uses: <org>/<common-repo>/.github/workflows/checks.yml@main
Common workflow:
name: Perform Pre-Build Checks
on:
  workflow_call:
jobs:
  formatting-check:
    name: Formatting Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Run linting check
        run: xxxxxx
      - name: Install cppcheck
        run: sudo apt-get -y install cppcheck
      - name: Run cppcheck
        run: xxxxx
        continue-on-error: true
This is what I ended up doing. Not sure if this was the right approach, but it worked. Basically I just pass in the repo and ref name from the current repo that triggered the workflow.
Example source repo:
name: Perform Pre Build Check
on:
  push:
  workflow_dispatch:
jobs:
  checks:
    uses: <org>/<common-repo>/.github/workflows/checks.yml@main
    with:
      repo-name: ${{ github.GITHUB_REPOSITORY }}
      ref-name: ${{ github.GITHUB_REF_NAME }}
Common workflow:
name: Perform Pre-Build Checks
on:
  workflow_call:
    inputs:
      repo-name:
        required: true
        type: string
      ref-name:
        required: true
        type: string
jobs:
  formatting-check:
    name: Formatting Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          repository: ${{ inputs.repo-name }}
          ref: ${{ inputs.ref-name }}
      - name: Run linting check
        run: xxxxxx
      - name: Install cppcheck
        run: sudo apt-get -y install cppcheck
      - name: Run cppcheck
        run: xxxxx
        continue-on-error: true
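A small variant worth noting (my addition, not from the original answer): the repository and branch that triggered the run are available directly from the github context, so the caller could pass ${{ github.repository }} and ${{ github.ref_name }} instead:
jobs:
  checks:
    uses: <org>/<common-repo>/.github/workflows/checks.yml@main
    with:
      # built-in context values for the repo and branch that triggered the run
      repo-name: ${{ github.repository }}
      ref-name: ${{ github.ref_name }}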