How to read from GitHub Actions cache without writing to it

I'm using the GitHub Actions cache to persist remotely downloaded dependencies from tests across CI executions. https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows
The issue I'm having is that I only want the action to write to the cache when it's running on a push event on the master branch. If the workflow was triggered by a pull_request, I'd like it to read from the cache, but not write to it.
The reason for this is that caches originating from master are mostly reusable for any PR, but caches generated from a PR may not be very useful for other CI invocations, because the code is yet to be reviewed and the developer may be trying things out, which could mess up the cache for other invocations.
Right now I'm doing something like this:
- name: Cache packages
  uses: actions/cache@v3
  with:
    key: 'cache-${{ github.event_name }}'
    restore-keys: |
      cache-push
    path: |
      /path/to/cache
This way I have 2 cache keys, one for PRs and one for master. Master will always use the cache from the previous master invocation because it will only match cache-push, but PRs will use a different key, cache-pull_request, and fall back to cache-push if it doesn't exist. This way master pushes never use a cache that was generated from a PR, only caches that were generated from the previous master push.
Ideally I'd like the cache-pull_request key to not even exist and just have PRs use cache-push but not write to it at the end of the execution. Is this possible?

EDIT: GitHub's cache action now officially supports this as of version 3.2.0!
Original comment:
I've been looking for the same thing and unfortunately it does not seem to be possible. There are open PRs and issues about it in the repo on GitHub:
https://github.com/actions/cache/pull/489
So until it gets merged or implemented in some other way, it is not possible with the official GitHub cache action.
I also noticed that this PR had been closed
https://github.com/actions/cache/pull/474
The author closed it himself due to inactivity, but forked it to another repo and implemented it there. See https://github.com/MartijnHols/actions-cache
I have not used this repo myself, but it might be worth checking out.

Check actions/cache/restore@v3 and actions/cache/save@v3.
You can restore or save the cache separately.
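For example, here is a minimal sketch (the paths and key names are illustrative, not taken from the question) that restores the cache on every run but only saves it on pushes to master:

- name: Restore packages cache
  uses: actions/cache/restore@v3
  with:
    key: cache-push-${{ github.sha }}
    restore-keys: |
      cache-push
    path: |
      /path/to/cache

# ... steps that run the tests and populate /path/to/cache ...

- name: Save packages cache
  if: github.event_name == 'push' && github.ref == 'refs/heads/master'
  uses: actions/cache/save@v3
  with:
    key: cache-push-${{ github.sha }}
    path: |
      /path/to/cache

Pull requests restore the most recent cache-push-* entry via restore-keys, while only master pushes ever create a new one.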

Related

How to run a custom command unique to a PR on merging a PR?

I am wondering if there is any way to do the following. Say I have an "open data" repo, which allows people to submit content. The repo saves all the data, and changes to the structured JSON/YAML are reviewed in a PR. But then, because I am in a serverless system (like Vercel), I need to upload the changes to the data to the production database on merge of the branch. So a custom data migration should be required in the PR, which runs when the PR is approved and merged.
How can that be accomplished? All I can imagine as a solution is having a special "code block" in markdown with some JSON config explaining what script to run for the data migration; you add that marked code snippet as a comment to the PR, then parse the PR comments and figure out what script to run from that. But that would of course (seemingly) be a super hack, so is there a right way to do something like this?
The other option is to run the script/command manually after merging the PR, but ideally there would be a more automatic way of doing it.
GitHub itself can run code on various events through actions. Actions are configured through YAML files in the directory .github/workflows in the repository. Some actions relative to a branch use the workflow files from that branch, while “global” actions use the workflow files from the default branch (typically called main, or master for older repositories).
For example, this workflow runs bin/update-production-database whenever the main branch is updated (whether from a pull request merge or by pushing directly):
name: Update database
on:
  push:
    branches:
      - main
jobs:
  update-database:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository so the script is available in the workspace
      - uses: actions/checkout@v3
      - run: bin/update-production-database
See more examples in Deploying with GitHub Actions. To pass the credentials needed to access the database, set up an encrypted secret.
To only run the job on a PR merge and not on other pushes, see Running your workflow when a pull request merges.
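For illustration, here is a minimal sketch combining both points (the secret name DATABASE_URL is hypothetical; use whatever your script expects): a job that runs only when a pull request targeting main is actually merged, with the credential passed in from an encrypted secret:

name: Update database
on:
  pull_request:
    types: [closed]
    branches:
      - main
jobs:
  update-database:
    # Run only when the PR was merged, not merely closed
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: bin/update-production-database
        env:
          # Hypothetical secret; create it in the repository's Actions secrets
          DATABASE_URL: ${{ secrets.DATABASE_URL }}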
If you use Vercel (which I know nothing about), it claims it “automatically deploys your GitHub projects” so there may be a built-in solution there (either using actions so that the trigger comes from GitHub, or using some Vercel-owned server which polls GitHub).

Can I prevent Github Actions from updating status checks?

I have recently started creating GitHub Actions workflows to automate some processes; however, one issue I've noticed is that each of the actions creates a new status check on the PRs. This is an issue because I have a few checks I use to limit which PRs should be merged to master (these are tests run through Jenkins that then update the check through the GitHub API), and these action checks add clutter and may prevent a PR from advancing if they fail for some reason. I would like the ability to force GitHub Actions to run quietly and not update the PR's checks, but I haven't been able to find anything. Is there some hidden way to do this?

Merging blocked indefinitely on GitHub

The context is as follows:
I configure my GitHub CI workflow file (the YAML file) such that the workflow runs only when there are changes to certain directories:
name: testing
on:
  pull_request:
    branches:
      - develop
    paths:
      - 'dir_1/**'
      - '!dir_1/README.md'
      - 'dir_2/**'
      - '!dir_2/README.md'
I have set a branch protection rule on the develop branch that makes a merge into it possible only when the status checks are successful.
Now when I create a branch based off the develop branch, make some changes to dir_3 (note that it is different from dir_1 and dir_2 mentioned in the YAML snippet), push that branch and create a pull request, GitHub expects the status checks to be completed and merging is blocked until they are.
When I check the Actions tab, I find no action running.
So the merging is blocked indefinitely. I think that's because the branch protection rule and the YAML snippet contradict each other (the branch protection rule waits for the status check to be completed, but due to the path restriction in the YAML file, no status check is run). I have the following questions:
Is my reasoning correct?
If yes, is there a way to protect certain subdirectories of a branch instead of the whole branch on GitHub? I want to allow merging if the 'protected subdirectories' are unchanged.
If the answer to 1 is yes and 2 is no, is there some other way to allow merging if the subdirectories not specified in the YAML file are changed (while retaining the branch protection rule)?
Thank you for taking the time to read the question.
On Googling the question, I found this result but it wasn't very helpful.
One of my office colleagues suggested an alternative solution to the above problem: use the paths-filter action instead of the paths or paths-ignore keys mentioned in the GitHub Actions documentation. If the changed paths are not supposed to trigger a step in the workflow, that step is shown as skipped, and GitHub does not wait indefinitely for the tests to finish (i.e. it shows that the pull request can be merged).
This problem is also described in this comment of the issue. My colleague has posted the solution as a comment in the same issue; you can refer to it to see how the paths-filter action can be used to solve this.
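As a rough sketch of that approach using the dorny/paths-filter action (the filter name protected and the test command are illustrative): the workflow runs for every pull request, so the required status check always reports a result, but the expensive step is skipped unless the watched directories changed:

name: testing
on:
  pull_request:
    branches:
      - develop
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: dorny/paths-filter@v2
        id: changes
        with:
          filters: |
            protected:
              - 'dir_1/**'
              - 'dir_2/**'
      # Skipped, but still reported as part of the check, when nothing relevant changed
      - name: Run tests
        if: steps.changes.outputs.protected == 'true'
        run: ./run_tests.sh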
If someone has a better solution to solve this problem, please do post it. For now, I am marking this as the accepted solution. Thank you.

GitHub Actions not discovering artifacts uploaded between runs

I have a set of static binaries that I am currently re-downloading every CI run. I need these binaries to test against. I would like to cache these OS-specific binaries on GitHub Actions so I don't need to re-download them every time.
A key consideration here is that the binaries do not change between jobs; they are 3rd-party binaries that I do not want to re-download from the 3rd-party site every time a PR is submitted to GitHub. These binaries are used to test against, and the 3rd party publishes a release once every 6 months.
I have attempted to do this with the upload-artifact and download-artifact flow with github actions.
I first created an action to upload the artifacts. These are static binaries I would like to cache repository-wide and re-use every time a PR is opened.
Here is the commit that did that:
https://github.com/bitcoin-s/bitcoin-s/runs/2841148806
I pushed a subsequent commit and added logic to download-artifact on the same CI job. When it runs, it claims that there is no artifact with that name, despite the prior commit on the same job having uploaded it:
https://github.com/bitcoin-s/bitcoin-s/pull/3281/checks?check_run_id=2841381241#step:4:11
What am I doing wrong?
Artifacts and caching are similar, but they serve different use cases and cannot be used interchangeably. From the GitHub docs:
Artifacts and caching are similar because they provide the ability to store files on GitHub, but each feature offers different use cases and cannot be used interchangeably.
Use caching when you want to reuse files that don't change often between jobs or workflow runs.
Use artifacts when you want to save files produced by a job to view after a workflow has ended.
In your case you could use caching and set up a cache action. You will need a key and a path, and it will look something like this:
- name: Cache dmg
  uses: actions/cache@v2
  with:
    key: "bitcoin-s-dmg-${{steps.previoustag.outputs.tag}}-${{github.sha}}"
    path: ${{ env.pkg-name }}-${{steps.previoustag.outputs.tag}}.dmg
When there's a cache hit (your key is found), the action restores the cached files to your specified path.
When there's a cache miss (your key is not found), a new cache is created.
By using contexts you can update your key and observe changes in files or directories. E.g. to update the cache whenever your package-lock.json file changes you can use ${{ hashFiles('**/package-lock.json') }}.
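As a rough sketch (the path and key prefix are placeholders), a cache step keyed on a lock file could look like this:

- name: Cache node modules
  uses: actions/cache@v3
  with:
    # hashFiles produces a new key, and therefore a new cache, whenever package-lock.json changes
    key: node-modules-${{ runner.os }}-${{ hashFiles('**/package-lock.json') }}
    path: node_modules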

How to auto-merge a pull request on GitHub?

Is it possible to merge a pull request automatically into the master branch on GitHub after the Travis test webhook succeeds?
You can use Mergify to do this.
It allows you to configure rules and define criteria for your pull requests to be merged automatically. In your case, setting something like "the Travis check is OK and one reviewer approved the PR" would allow the PR to be merged automatically.
(Disclosure: I'm part of the Mergify team.)
You can most probably add an after_success action to your .travis.yml that merges the PR using the GitHub API. I do not know of any ready-to-use script for this, but there is no reason for it to be hard. Special care is needed for authentication...
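For illustration only, a rough sketch of that idea (GITHUB_TOKEN would be a secure environment variable you configure in Travis yourself; error handling and branch checks are omitted):

# .travis.yml (fragment)
after_success:
  - |
    if [ "$TRAVIS_PULL_REQUEST" != "false" ]; then
      # Merge the pull request through the GitHub REST API
      curl -X PUT \
        -H "Authorization: token $GITHUB_TOKEN" \
        -H "Accept: application/vnd.github+json" \
        "https://api.github.com/repos/$TRAVIS_REPO_SLUG/pulls/$TRAVIS_PULL_REQUEST/merge"
    fi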
GitHub recently shipped this auto-merge feature in beta. To use this, you can enable it in the repo settings. Just keep in mind you will need to add branch protection rules as well.
See the documentation for more info.
https://docs.github.com/en/free-pro-team@latest/github/collaborating-with-issues-and-pull-requests/automatically-merging-a-pull-request
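Once auto-merge is enabled in the repository settings, it can be turned on per pull request from the PR page or, as a small sketch, via the GitHub CLI (the PR number is illustrative):

# Mark PR #123 to merge automatically once all required checks pass
gh pr merge 123 --auto --squash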
I work on a project that requires pull requests to be up to date with the target branch, and also to have passed all the checks before merging.
This means we can often be waiting for checks to finish, only to find a new commit has been made to the target branch, which requires the pull request to be synchronised and the checks to run all over again. I wanted a simple app to merge the PR automatically once the checks are successful, so I created one.
Mergery is:
Free, including for private repositories.
Fast. It's event-driven, it doesn't run on a schedule.
Simple. No configuration required. Just label your PRs with automerge.