Need help on the YAML side - GitHub Actions

I have one repo with 3 YAML workflow files: one for the frontend, one for the backend, and one for the admin frontend. Every time a developer pushes backend changes, all 3 workflows run, even though the frontend and admin frontend folders have no changes at that time.
In that case I need only the backend YAML to run and the remaining two workflows to simply do nothing.
For my scenario: if a user pushes only backend code, only the backend CI should run.
How do I configure that?

You can achieve what you want by using the paths filter in each of your workflow triggers.
I recommend checking the official GitHub documentation for more details.
In your case, supposing that you have 3 folders following the structure below:
repository
|__ backend
|__ frontend
|__ admin_frontend
For each workflow, you could use the following implementation:
name: Backend
on:
  push:
    paths:
      - 'backend/**'
jobs:
  [ ... ]

name: Frontend
on:
  push:
    paths:
      - 'frontend/**'
jobs:
  [ ... ]

name: Admin Frontend
on:
  push:
    paths:
      - 'admin_frontend/**'
jobs:
  [ ... ]
That way, each push will trigger a workflow only if at least one file under the corresponding path has been updated.
Note that there is also a paths-ignore filter which you can use for the opposite behavior if needed.
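For example, a minimal sketch of that opposite behavior, assuming the same folder layout as above: the backend workflow runs on any push except one that only touches the two frontend folders.

name: Backend
on:
  push:
    paths-ignore:
      - 'frontend/**'
      - 'admin_frontend/**'
jobs:
  [ ... ]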

Related

GitHub action - how to parameterize container image hostname

I have a GitHub Actions workflow whose job runs inside a container pulled from a private Docker registry (myhostname.com - see below).
jobs:
  myjob:
    name: My Job
    runs-on: [ some-tag-on-runners ]
    container:
      image: myhostname.com/dir/subdir/image:latest
I tried parameterizing myhostname.com using the {{ $secrets.SOME_SECRET }} feature but it doesn't work. I need to have this workflow in several repos and if the hostname changes I would like to be able to change it in only one place. Can't seem to find anything in the docs at GitHub.
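One hedged direction, not a confirmed answer: the expression syntax in the attempt above is off (it would be ${{ secrets.SOME_SECRET }}, not {{ $secrets.SOME_SECRET }}), and storing the hostname as an organization-level secret or variable shared across the repos would keep it changeable in one place. Whether expressions are actually evaluated for container.image should be verified against the contexts availability table in the GitHub docs; REGISTRY_HOSTNAME below is a hypothetical secret name.

jobs:
  myjob:
    name: My Job
    runs-on: [ some-tag-on-runners ]
    container:
      # REGISTRY_HOSTNAME is a hypothetical org-level secret holding e.g. myhostname.com
      image: ${{ secrets.REGISTRY_HOSTNAME }}/dir/subdir/image:latest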

GitHub Actions: Post comment to the PR whose workflow triggered the current workflow

I have two workflows, the first one runs a build script and generates an artifact.
The first one is triggered when a pull request is created like this:
name: build
on:
  pull_request:
    types: [opened, edited, ready_for_review, reopened]
The second flow runs when the first is done, by using the workflow_run trigger like this:
on:
  workflow_run:
    workflows: ["build"]
    types:
      - "completed"
The second flow has to be separate and run after the first one. When done it is supposed to post a comment on the PR that triggered the first workflow, but I am unable to find out how.
According to the GitHub Actions docs this is one of the typical use cases, as per this quote:
For example, if your pull_request workflow generates build artifacts, you can create
a new workflow that uses workflow_run to analyze the results and add a comment to the
original pull request.
But I can't seem to find out how. I can get the first workflow's id from the 2nd workflow's context.payload.workflow_run.id, and workflow_run should also contain information about the pull request, but those fields are empty.
What am I doing wrong, and where can I find the necessary info to be able to comment on my created pull request?
You're not doing anything wrong, it's just that the pull request data from the first workflow is not present in the GitHub context of the second workflow.
To resolve your problem, you could send the pull request data you need from the first workflow to the second workflow.
There are different ways to do it, for example using a dispatch event (instead of a workflow_run trigger), or an artifact.
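For the dispatch-event route, a hedged sketch of what the first workflow could do: fire a repository_dispatch event carrying the PR number (for example with the peter-evans/repository-dispatch action; note the default GITHUB_TOKEN cannot trigger other workflows, so a personal access token stored in a secret is assumed). The second workflow would then trigger on repository_dispatch and read github.event.client_payload.pr_number. The artifact route is the one detailed below.

- name: Notify the comment workflow
  uses: peter-evans/repository-dispatch@v2
  with:
    token: ${{ secrets.REPO_DISPATCH_PAT }}  # hypothetical PAT secret
    event-type: pr-build-finished
    client-payload: '{"pr_number": "${{ github.event.number }}"}'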
For the artifact, it would look something like the example below:
In the FIRST workflow, you get the PR number from the github.event. Then you save that number into a file and upload it as an artifact.
- name: Save the PR number in an artifact
  shell: bash
  env:
    PULL_REQUEST_NUMBER: ${{ github.event.number }}
  run: echo $PULL_REQUEST_NUMBER > pull_request_number.txt
- name: Upload the PULL REQUEST number
  uses: actions/upload-artifact@v2
  with:
    name: pull_request_number
    path: ./pull_request_number.txt
In the SECOND workflow, you get the artifact and the pull request number from the FIRST workflow, using the following actions:
- name: Download workflow artifact
  uses: dawidd6/action-download-artifact@v2.11.0
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    workflow: <first_workflow_name>.yml
    run_id: ${{ github.event.workflow_run.id }}
- name: Read the pull_request_number.txt file
  id: pull_request_number_reader
  uses: juliangruber/read-file-action@v1.0.0
  with:
    path: ./pull_request_number/pull_request_number.txt
- name: Step to add comment on PR
  [...]
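The comment step itself is elided above; a hedged sketch of one way to implement it with actions/github-script, assuming the read-file action exposes the file contents in its content output:

- name: Step to add comment on PR
  uses: actions/github-script@v6
  with:
    script: |
      // pull_request_number_reader is the read step above; its `content` output
      // is assumed to hold the PR number saved by the first workflow
      const prNumber = Number(`${{ steps.pull_request_number_reader.outputs.content }}`.trim());
      await github.rest.issues.createComment({
        owner: context.repo.owner,
        repo: context.repo.repo,
        issue_number: prNumber,
        body: 'Build finished, artifacts are available.'
      });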

How to define a workflow to run based on two push rules

Is there a way to define 2 push rules in the same workflow file, or a workaround?
How can I combine and write the rules below into a single workflow file?
Run when any file is pushed on a non-master branch:
on:
  push:
    branches-ignore:
      - 'master'
    paths:
      - 'path-to-package/**'
Run only when a particular file (package.json) is pushed on the master branch:
on:
  push:
    branches:
      - 'master'
    paths:
      - 'path-to-package/package.json'
Your specific request doesn't appear to be supported by the syntax.
According to the workflow syntax for GitHub Actions documentation, there is no way to declare two independent push trigger configurations in a single workflow file.
GitHub allows free users to open support requests. You could always make a feature request at support.github.com/contact
The closest workaround I know at the moment would be something like the workflow below, using a conditional inside your jobs.
on:
  push:
    paths:
      - 'path-to-package/package.json'
jobs:
  build_pom:
    runs-on: ubuntu-latest
    steps:
      - run: echo 'this is master'
        if: github.ref == 'refs/heads/master'
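Extending the same conditional idea to approximate both rules, a hedged sketch: trigger on any change under path-to-package/ and split the work into branch-gated jobs. Note this does not by itself restrict the master job to package.json changes; that would need an additional changed-files check inside the job.

on:
  push:
    paths:
      - 'path-to-package/**'
jobs:
  non_master_build:
    # runs for pushes to any branch except master
    if: github.ref != 'refs/heads/master'
    runs-on: ubuntu-latest
    steps:
      - run: echo 'non-master build'
  master_build:
    # runs for pushes to master (any file under path-to-package/, not only package.json)
    if: github.ref == 'refs/heads/master'
    runs-on: ubuntu-latest
    steps:
      - run: echo 'master build'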

GitHub Actions: How to dynamically set environment url based on deployment step output?

I found out about a really nice GitHub Actions feature called Environments. Using the appropriate syntax, an Environment can also be created inside a GitHub Actions workflow.yml like this:
environment:
  name: test_environment
  url: https://your-apps-url-here.com
As the docs state, that's a valid way to create GitHub Actions Environments:
Running a workflow that references an environment that does not exist
will create an environment with the referenced name.
But inside my current GitHub Actions workflow, is there a way to dynamically set the url based on a deployment step output? I have a dynamic URL resulting from the deployment process to AWS which I can't define up front.
The job workflow docs tell us that there's also a way of using expressions inside the url field:
environment:
  name: test_environment
  url: ${{ steps.step_name.outputs.url_output }}
Now imagine a ci.yml workflow file that uses the AWS CLI to deploy a static website to S3, where we used a tool like Pulumi to dynamically create an S3 bucket inside our AWS account. We can read the dynamically created bucket name and its URL using the commands pulumi stack output bucketName and pulumi stack output bucketUrl. The deploy step inside the ci.yml could then look like this:
- name: Deploy Nuxt.js generated static site to S3 Bucket via AWS CLI
  id: aws-sync
  run: |
    aws s3 sync ../dist/ s3://$(pulumi stack output bucketName) --acl public-read
    echo "::set-output name=s3_url::http://$(pulumi stack output bucketUrl)"
  working-directory: ./deployment
There are 2 crucial points here: first, we should use id inside the deployment step to define a step name we can easily reference via steps.<step_name> inside our environment.url. Second, we need to define a step output using echo "::set-output name=s3_url::http://$(pulumi stack output bucketUrl)". In this example it creates an output variable called s3_url. You could replace pulumi stack output bucketUrl with any other command or tool that returns your dynamic environment URL.
Also be sure to add an http:// or https:// prefix in order to prevent an error message like this:
Environment URL 'microservice-ui-nuxt-js-hosting-bucket-bc75fce.s3-website.eu-central-1.amazonaws.com' is not a valid http(s) URL, so it will not be shown as a link in the workflow graph.
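As a side note: on newer runners the ::set-output workflow command is deprecated in favor of writing to the GITHUB_OUTPUT file, so the equivalent line in the run block would be:

echo "s3_url=http://$(pulumi stack output bucketUrl)" >> "$GITHUB_OUTPUT"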
Now the environment definition at the top of our ci.yml can access the s3_url output variable from our deployment step like this:
jobs:
  ci:
    runs-on: ubuntu-latest
    environment:
      name: microservice-ui-nuxt-js-deployment
      url: ${{ steps.aws-sync.outputs.s3_url }}
    steps:
      - name: Checkout
      ...
Using steps.aws-sync we reference the deployment step directly, since we defined it with that id. The appended .outputs.s3_url then directly references the variable containing our S3 URL. If you defined everything correctly, the GitHub Actions UI will render the environment URL directly below the finished job.
Here's also a fully working workflow embedded inside an example project.

Selecting Codeship steps in a single repo based on files changed

I have a repository that builds and deploys two different components - a frontend and a backend. Each of these has a specific set of steps that need to be executed for CI/CD. Is there a way to run a selective set of steps based on which component has actually changed? For example, let's say all my frontend is under frontend/ and all my backend is under backend/. Is there a way to run a selective set of steps when there are changes only in the frontend?
The closest approach would be to adopt branch naming conventions that separate frontend and backend test builds.
For example, you could manage all frontend work on branches with a frontend- prefix and all backend work on branches with a backend- prefix. The codeship-steps.yml would then be implemented as:
- name: Frontend tests
  service: your-app
  type: serial
  tag: ^frontend-
  steps:
    - service: your-app
      command: ./run-frontend-test.sh
    - [other step commands...]
- name: Backend tests
  service: your-app
  type: serial
  tag: ^backend-
  steps:
    - service: your-app
      command: ./run-backend-test.sh
    - [other step commands...]
See the Codeship documentation on limiting steps to specific branches for more details.