GitHub Actions: Run .sh without checking out the whole project?

Due to bandwidth limits, I'm trying to checkout a subfolder of my project to Github Actions and found this Action: https://github.com/marketplace/actions/checkout-files
New (broken) Script:
name: Create Build Target
run-name: ${{ github.actor }} is creating ${{ github.ref_name }}
on: create
jobs:
  Create:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout to access bash script
        uses: Bhacaz/checkout-files@v2
        with:
          files: CICD
      - name: Create Buildtarget info on Unity Cloud Build
        env:
          api_key: ${{ secrets.api_key }}
          org_id: ${{ secrets.org_id }}
          project_id: ${{ secrets.project_id }}
          branch_name: ${{ github.ref_name }}
          credential_id: ${{ secrets.credential_id }}
        run: CICD/CreateBuildTarget.sh
I get an error in the GitHub Actions terminal when triggering the above .yaml file to check out a subdirectory instead of the whole project:
/home/runner/work/_temp/2ddc6165-7186-415a-8d87-bc4d746f659f.sh: line 1: CICD/CreateBuildTarget.sh: Permission denied
Error: Process completed with exit code 126.
I had this working before, and made sure the files had the correct permissions:
myUserName@myUserNames-MacBook-Pro cicd % ls -l
total 16
-rwxr-xr-x@ 1 myUserName  staff  2239 Feb  3 11:17 CreateBuildTarget.sh
-rwxr-xr-x@ 1 myUserName  staff   449 Feb  3 11:18 DeleteBuildTarget.sh
The only thing I changed was the checkout action.
Before (working):
- name: Checkout to access bash script
  uses: actions/checkout@v2.6.0
After (not working):
- name: Checkout to access bash script
  uses: Bhacaz/checkout-files@v2
  with:
    files: CICD
Is what I'm trying to do even possible? For now, I changed the script to trigger on pull request open/reopen, instead of create, but I still want to only check out a subdirectory instead of the whole project.

The action you're using isn't setting the execute bits on the .sh files. It's relatively simple to add them manually after restoring the files, but you might want to fork the action and make it do the right thing.
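For example, a step like this right after the checkout-files step would restore the execute bit (the path follows the question's layout):
- name: Make the scripts executable
  run: chmod +x CICD/*.sh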
Try using a sparse-checkout action instead; it relies on Git to restore the files and has more of the expected default behaviors built in:
gogaille/sparse-checkout
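If you would rather stay with the official action, newer versions of actions/checkout also support sparse checkouts; a minimal sketch, assuming you are on a version that exposes the sparse-checkout input (check the README of the version you pin):
- name: Checkout only the CICD folder
  uses: actions/checkout@v4
  with:
    sparse-checkout: CICD
Because this goes through Git, the executable bits stored in the repository are preserved.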

While your files are executable locally (on your machine), they might not have been added to your Git repository with the executable bit (+x) set.
That means that once the GitHub Action checks out your files (even if it limits itself to one subfolder), the .sh scripts are not executable.
Try the following locally, using git add --chmod=+x:
cd /path/to/repository
cd CICD
git add --chmod=+x *.sh
git commit -m "Add executable bit in CICD folder"
git push
Then check if your GitHub Action has the same issue.

Related

How to execute a remote script in a reusable GitHub workflow

I have this workflow in a repo called terraform-do-database and I'm trying to use a reusable workflow coming from the public repo foo/git-workflows/.github/workflows/tag_validation.yaml@master:
name: Tag Validation
on:
  pull_request:
    branches: [master]
  push:
    branches:
      - '*'       # matches every branch that doesn't contain a '/'
      - '*/*'     # matches every branch containing a single '/'
      - '**'      # matches every branch
      - '!master' # excludes master
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:
jobs:
  tag_check:
    uses: foo/git-workflows/.github/workflows/tag_validation.yaml@master
And this is the reusable workflow file from the public git-workflows repo that has the script that should run on it. What is happening is that the workflow is trying to use a script inside the repo terraform-do-database
name: Tag Validation
on:
  pull_request:
    branches: [master]
  workflow_call:
jobs:
  tag_check:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest
    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v3
      # Runs a single command using the runners shell
      - name: Verify the tag value
        run: ./scripts/tag_verify.sh
So the question: how can I make the workflow use the script stored in the git-workflows repo instead of terraform-do-database?
I want to have a single repo where I can call the workflow and the scripts, I don't want to have everything duplicated inside all my repos.
I have found that if I wrap the script into a composite action, I can use the GitHub context github.action_path to locate the scripts.
Example:
run: ${{ github.action_path }}/scripts/foo.sh
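A minimal sketch of such a composite action, with hypothetical file and path names:
# .github/actions/tag-check/action.yml (hypothetical location, with a scripts/ folder next to it)
name: Tag check
description: Run the tag verification script bundled with this action
runs:
  using: composite
  steps:
    - name: Verify the tag value
      shell: bash
      # github.action_path resolves to this action's own directory on the runner
      run: ${{ github.action_path }}/scripts/tag_verify.sh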
One way to go about this is to perform a checkout inside your reusable workflow that clones the content of the repo where your scripts are; only then can you access them. It's not the cleanest solution, but it works.
Perform a second checkout to clone the repo that has the reusable workflow into a directory reusable-workflow-repo:
- name: Checkout reusable workflow dir
  uses: actions/checkout@v3
  with:
    repository: <your-org>/terraform-do-database
    token: ${{ secrets.GIT_ACCESS_TOKEN }}
    path: reusable-workflow-repo
Now you have all the code you need inside reusable-workflow-repo. Use ${GITHUB_WORKSPACE} to find the current path and simply append the path to the script.
- name: Verify the tag value
  run: ${GITHUB_WORKSPACE}/reusable-workflow-repo/scripts/tag_verify.sh
I was able to solve it by adding a few more commands to manually download the script and execute it.
steps:
  # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
  - uses: actions/checkout@v3
  # Runs a single command using the runners shell
  - name: Check current directory
    run: pwd
  - name: Download the script
    run: curl -o $PWD/tag_verify.sh https://raw.githubusercontent.com/foo/git-workflows/master/scripts/tag_verify.sh
  - name: Give script permissions
    run: chmod +x $PWD/tag_verify.sh
  - name: Execute script
    run: $PWD/tag_verify.sh
Following Kaleby Cadorin's example, but for the case where the script is in a private repository:
- name: Download & run script
  run: |
    curl --header "Authorization: token ${{ secrets.MY_PAT }}" \
         --header 'Accept: application/vnd.github.raw' \
         --remote-name \
         --location https://raw.githubusercontent.com/COMPANY/REPO/BRANCH/PATH/script.sh
    chmod +x script.sh
    ./script.sh
Note: GITHUB_TOKEN doesn't seem to work here, a PAT is required.
According to this thread on github-community, the script needs to be downloaded/checked out separately.
The "reusable" workflow you posted is not reusable in this sense: since it does not download the script, the workflow can only run within its own repository (or a repository that already has the script).

Permission denied error in GitHub Actions

I have written a GitHub Action to retrieve the changed SQL files and lint those changed files using sqlfluff.
Here is my GitHub Actions workflow:
name: files_lint
on:
  - pull_request
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - name: checkout
        uses: actions/checkout@v2
      - name: Install Python
        uses: "actions/setup-python@v2"
        with:
          python-version: "3.7"
      - name: install sqlfluff
        run: "pip install sqlfluff"
      - name: Get changed .sql files
        id: linting
        run: some code to get the changed files
      - name: Linting files started
        id: sql_linting
        if: steps.linting.outputs.lintees != ''
        shell: bash -l {0}
        run: ${{ steps.linting.outputs.lintees }} > sqlfluff fix --force
But when I run ${{ steps.linting.outputs.lintees }} > sqlfluff fix --force on the changed SQL files in the above GitHub Action, I get an error:
/home/runner/work/_temp/a41i1c89a4.sh: line 1: test.sql: Permission denied
Error: Process completed with exit code 126.
You can’t redirect files like this:
run: ${{ steps.linting.outputs.lintees }} > sqlfluff fix --force
This is attempting to write the output of whatever that command is - but I’d guess it’s a list of files rather than a command?
You should pass as parameters (assuming it’s a list of files):
run: sqlfluff fix --force ${{ steps.linting.outputs.lintees }}
Also, I presume you're going to do something with the result afterwards? If not, the fixed files will not do anything. If you just want to check the files, sqlfluff lint would be better than sqlfluff fix (and it catches more issues, as sqlfluff fix only looks at rules it can fix).
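For example, a lint-only step could look like this (the step id and output name follow the question's workflow):
- name: Lint changed SQL files
  if: steps.linting.outputs.lintees != ''
  run: sqlfluff lint ${{ steps.linting.outputs.lintees }}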
For all developers who created a shell script (.sh) locally on Windows or in Windows Subsystem for Linux (WSL), or who cloned the Git repository without knowing on which file system the shell script was created: make sure the shell script is executable on Linux!
Linux
chmod +x script.sh
Windows
git update-index --chmod=+x script.sh
Finally, don't forget to push your changes.
git add script.sh
git commit -m 'Making script.sh executable'
git push

GitHub Actions - How to clean up unchanged/uncommitted files before upload to SFTP server

I'm trying to configure a GitHub Action to deploy my application to an SFTP server.
My application has 6700 files and I would like to upload only changed/committed files.
How can I remove unchanged and/or uncommitted files before uploading to SFTP?
This way, a one-file-modification deploy would be much faster than uploading 6k files.
name: CI
on:
  push:
    branches: [ main ]
  workflow_dispatch:
jobs:
  deploy:
    runs-on: ubuntu-latest
    name: Deploy Job
    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          fetch-depth: 2
      - name: Deploy files
        uses: wlixcc/SFTP-Deploy-Action@v1.0
        with:
          username: 'deploy_user'
          server: 'server_ip'
          ssh_private_key: ${{ secrets.SSH_PRIVATE_KEY }}
          local_path: './www/*'
          remote_path: '/www'
          args: '-o ConnectTimeout=10'
To list all the files that were updated/committed in a given commit, you can use this command:
$ git diff-tree --no-commit-id --name-only -r $GITHUB_SHA
index.html
src/application.js
Then you can use this list to delete all the files that are not on it. This needs some bash digging. One hack I can think of is to make a temporary directory, copy the updated files into it, and upload only those files, as sketched below.
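A minimal sketch of that hack as a workflow step (the deploy directory name is illustrative; files deleted by the commit are simply skipped):
- name: Collect changed files
  run: |
    mkdir -p deploy
    git diff-tree --no-commit-id --name-only -r "$GITHUB_SHA" | while read -r f; do
      [ -f "$f" ] || continue             # skip files deleted by the commit
      mkdir -p "deploy/$(dirname "$f")"   # keep the relative path
      cp "$f" "deploy/$f"
    done
You could then point the SFTP action's local_path at './deploy/*' instead of './www/*'.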

GitHub Action checkout from specific directory

I am trying to upload a repo to a server via FTP on push to the master branch. I have it set up and working. However, in the repo there is a folder /public. I only want to upload the files in this folder to the server, not other files or the folder itself. I have tried to set up a working directory for the job, but this doesn't seem to do the trick. Any ideas?
on:
  push:
    branches:
      - master
name: 🚀 Deploy website on push
jobs:
  ftp-web-deploy:
    name: 🎉 Deploy
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./public
    steps:
      - name: 🚚 Get latest code
        uses: actions/checkout@v2.4.0
        working-directory: ./public
        with:
          fetch-depth: 2
      - name: 📂 Sync files
        uses: SamKirkland/FTP-Deploy-Action@4.2.0
        with:
          server: ****
          username: ****
          password: ${{ secrets.prod_ftp_password }}
          server-dir: public_html/
Checking out only one directory is not possible, but has been requested in the actions/checkout repository before: https://github.com/actions/checkout/issues/483
There's an action to check out specific files, but I haven't tried it and I'm not sure if it does what you want: https://github.com/marketplace/actions/checkout-files
You might want to ask yourself why you're trying to limit the number of files transferred. Is it because you're concerned about traffic? Or because of the input expected in the subsequent action?
If it's the latter, you could also manually "fix" the structure by running some mv and rm commands.
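A minimal sketch of that approach, placed between the checkout and the deploy step, assuming only the contents of public/ should remain in the workspace:
- name: Keep only the contents of public/
  run: |
    shopt -s dotglob                 # include hidden files in globs
    mkdir /tmp/deploy
    mv public/* /tmp/deploy/         # stash the folder's contents
    find . -mindepth 1 -delete       # drop everything else, including the folder itself
    mv /tmp/deploy/* .               # bring the contents back at the workspace root
The FTP action would then upload from the workspace root as before.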

GitHub Actions – where are the compilation results?

I have defined a little GitHub Actions workflow which is supposed to compile a KSS styleguide from SCSS.
The steps of that workflow basically trigger building the resulting CSS and the respective KSS styleguide.
When I run the build process locally on my dev machine, the built styleguide is written to the styleguide folder located in the root of my project.
However, on GitHub, despite everything being marked green, I don't know what or where the resulting files are being written to.
How can I deploy the generated styleguide, if I don't know where it is?
Here's my yaml file for this workflow:
name: Node.js CI
on:
  push:
    branches: [ mk-node-ci ]
  pull_request:
    branches: [ mk-node-ci ]
jobs:
  build:
    name: Build Styleguide
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [14.x]
    steps:
      - uses: actions/checkout@v2
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}
      - uses: borales/actions-yarn@v2.0.0
        with:
          cmd: install
        env:
          NODE_ENV: development
      - name: "build CSS files"
        uses: borales/actions-yarn@v2.0.0
        with:
          cmd: "build:css"
      - name: "build styleguide files"
        uses: borales/actions-yarn@v2.0.0
        with:
          cmd: "build:styleguide"
Updated 2020.10.14 19:25 GMT
GitHub Actions are performed on a separate "clean" Runner Machine.
actions/checkout@v2 is an action that copies your repository to that machine, typically to run tests etc.
In order to get produced results (like modified files) from the runner machine back to the original repository, we can use:
(1) upload-artifact action.
(2) git push.
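For option (1), a minimal sketch, assuming the styleguide ends up in a styleguide folder at the project root as it does locally:
- name: Upload the generated styleguide
  uses: actions/upload-artifact@v3
  with:
    name: styleguide
    path: styleguide/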
For option (2), here is my script that modifies files from the source directory in and puts them into the output directory out (I run it as a bash script step: - run: ./wrap.sh). The script wrap.sh:
echo "Copy directory structure from 'in' to 'out':";
find ./in -type d | while read i;
do
if [ ! -d "${i/in/out}" ]; then
mkdir "${i/in/out}"
echo "${i/in/out}";
fi
done
echo "Wrap files:";
find ./in -type f -name "*" | while read i;
do
echo "${i/in/out}";
cat ./tpl/header.html "$i" ./tpl/footer.html >"${i/in/out}"
git add "${i/in/out}"
done
git config user.name "chang-zhao"
git commit . -m "Wrapping"
git push origin main
Here git add "${i/in/out}" adds a new file with that name to Git. git config user.name "..." is required for the commit to work. git commit . -m "Wrapping" is the commit that puts the new files into the repository ("Wrapping" is just the message I give to such commits).
This way files produced on a runner server get pushed to the original repository.