Flutter: Running flutter pub get in GitHub Actions with private dependencies

I'm using plugins that are hosted privately on GitHub with SSH access. When running flutter pub get in GitHub Actions, this command fails. I followed a tutorial that uses a deploy key and tried this:
jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v1
      - name: Setup SSH Keys and known_hosts
        env:
          SSH_AUTH_SOCK: /tmp/ssh_agent.sock
        run: |
          mkdir -p ~/.ssh
          ssh-keyscan github.com >> ~/.ssh/known_hosts
          ssh-agent -a $SSH_AUTH_SOCK > /dev/null
          ssh-add - <<< "${{ secrets.SSH_PRIVATE_KEY }}"
      - name: Some task that fetches dependencies
        env:
          SSH_AUTH_SOCK: /tmp/ssh_agent.sock
        run: flutter pub get
I also tried:
      - uses: webfactory/ssh-agent@v0.4.0
        with:
          ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}
      - name: Fetch flutter dependencies
        run: flutter pub get
But the command still fails. What am I doing wrong, and is there another way to make this command fetch the private dependencies?

You can use this action to add your ssh key.
https://github.com/marketplace/actions/install-ssh-key
Insert the private key and the known hosts in the secrets of your repository.
NOTE: OPENSSH format (key begins with -----BEGIN OPENSSH PRIVATE KEY-----) may not work due to the OpenSSH version on the VM. Please use PEM format (begins with -----BEGIN RSA PRIVATE KEY-----) instead. To convert your key to PEM format in place, simply run ssh-keygen -p -m PEM -f ~/.ssh/id_rsa.
You can get the known hosts using:
ssh-keyscan github.com
After this, add the SSH step to your workflow:
- uses: shimataro/ssh-key-action@v2
  with:
    key: ${{ secrets.SSH }}
    name: id_rsa
    known_hosts: ${{ secrets.KNOWN_HOSTS }}
I hope this helps.
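Putting it together, here is a minimal sketch of a full job using that action. It assumes the secrets are named SSH and KNOWN_HOSTS, that Flutter is already available on the runner, and that pubspec.yaml references the private plugins via SSH git URLs (git@github.com:...):
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Installs the private key and known_hosts so git+ssh fetches work
      - uses: shimataro/ssh-key-action@v2
        with:
          key: ${{ secrets.SSH }}
          name: id_rsa
          known_hosts: ${{ secrets.KNOWN_HOSTS }}
      # Assumes flutter is on PATH (preinstalled or set up by an earlier step)
      - run: flutter pub get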

Related

npm run prod not actually running in github action despite showing successful

I don't believe that npm run prod is actually running in my GitHub Action, despite it not throwing any kind of error. The reasons I believe that are:
If I delete my public/js/app.js file locally and push the change, it doesn't get rebuilt and my production site breaks as there's no app.js file.
If I leave the file in place and push my code to production, it's not minified, and one of the keys I need to reference still contains the dev value.
If I replace the aforementioned key with a different value and run npm run prod locally, then app.js is minified and contains my updated value.
Why would the npm run prod command not work within a github action, and also indicate that it ran successfully?
Here's my entire workflow file:
name: Prod
on:
  push:
    branches: [ main ]
jobs:
  laravel_tests:
    runs-on: ubuntu-20.04
    env:
      DB_CONNECTION: mysql
      DB_HOST: localhost
      DB_PORT: 3306
      DB_DATABASE: testdb
      DB_USERNAME: root
      DB_PASSWORD: root
    steps:
      - name: Set up MySQL
        run: |
          sudo systemctl start mysql
          mysql -e 'CREATE DATABASE testdb;' -uroot -proot
          mysql -e 'SHOW DATABASES;' -uroot -proot
      - uses: actions/checkout@main
      - name: Copy .env
        run: php -r "file_exists('.env') || copy('.env.example', '.env');"
      - name: Install Dependencies
        run: composer install -q --no-ansi --no-interaction --no-scripts --no-progress
      - name: Generate key
        run: php artisan key:generate
      - name: Directory Permissions
        run: chmod -R 777 storage bootstrap/cache
      - name: Clean Install
        run: npm ci
      - name: Compile assets
        run: npm run prod
      - name: Execute tests (Unit and Feature tests) via PHPUnit
        run: vendor/bin/phpunit
  forge_deploy:
    runs-on: ubuntu-20.04
    needs: laravel_tests
    steps:
      - name: Make Get Request
        uses: satak/webrequest-action@master
        with:
          url: ${{ secrets.PROD_DEPLOY_URL }}
          method: GET
UPDATE:
My suspicion is that running the build process in the action isn't actually updating the repo (in fact I'm fairly certain it isn't, as that would likely not be the desired behavior). So the deploy URL that I'm using to push the code is likely just grabbing the repo as-is and deploying it.
I need a way to update only the public folder in the repo with the output of the npm run prod command. I'm not sure if this is possible, or advisable, but I'm nearly positive that's what's going on.
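One possible sketch of that idea (not from the original thread): add a step right after the "Compile assets" step in the laravel_tests job that commits the regenerated public folder back to the branch before forge_deploy runs. This assumes the workflow's GITHUB_TOKEN is allowed to push to main (contents: write permission) and that actions/checkout keeps its default credentials:
      - name: Commit compiled assets
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add public
          # Only commit if the build actually changed something
          git diff --staged --quiet || git commit -m "Build production assets"
          git push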

Introducing secret variables to dockerfile on Github Actions

I am trying to configure my etc/pip.conf file to download from a private PyPI repository on Artifactory while using a secret variable in my Dockerfile.
Dockerfile
FROM python
WORKDIR ./app
COPY . /app
RUN pip install --upgrade pip
RUN pip install -r pre-requirements.txt
RUN echo ${{ secrets.PIP }} > etc/pip.conf
RUN pip install -r post-requirements.txt
CMD ["python", "./simpleflask.py"]
docker-image.yml
name: CI
on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Setup JFrog CLI
        uses: jfrog/setup-jfrog-cli@v2
        env:
          JF_ARTIFACTORY_SERVER: ${{ secrets.JFROG_CLI }}
      - name: Checkout
        uses: actions/checkout@v3
      - name: Build
        run: |
          docker build -t simple-flask .
          docker tag simple-flask awakzdev.jfrog.io/docker-local/simple-flask:latest
          docker push awakzdev.jfrog.io/docker-local/simple-flask:latest
Pretty simple and straightforward, but my pipeline returns the following:
Step 6/8 : RUN echo ${{ secrets.PIP }} > etc/pip.conf
---> Running in deb3e3f4167f
/bin/sh: 1: Bad substitution
The command '/bin/sh -c echo ${{ secrets.PIP }} > etc/pip.conf' returned a non-zero code: 2
Error: Process completed with exit code 2.
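The Bad substitution error happens because ${{ secrets.PIP }} is GitHub Actions expression syntax; it is never expanded inside a Dockerfile, so /bin/sh receives the literal ${{ ... }} and cannot parse it. One sketch of a workaround (not from the original post) is to pass the value in through a hypothetical PIP_CONF build argument, with the caveat that build arguments are recorded in the image history:
Dockerfile
# PIP_CONF is supplied at build time and written where pip expects its global config
ARG PIP_CONF
RUN echo "$PIP_CONF" > /etc/pip.conf
Workflow step
- name: Build
  run: docker build --build-arg PIP_CONF="${{ secrets.PIP }}" -t simple-flask .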
Edit:
I tried a slightly different approach and went with installing the dependencies in the pipeline. My .yml now looks like this:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Setup JFrog CLI
        uses: jfrog/setup-jfrog-cli@v2
        env:
          JF_ARTIFACTORY_SERVER: ${{ secrets.JFROG_CLI }}
      - name: Checkout
        uses: actions/checkout@v3
      - name: install dependencies
        run: |
          pip config -v list
          echo "${{ secrets.PIP }}" > /etc/pip.conf
          pip install ganesha-experimental==2.0.1
      - name: Build
        run: |
          docker build -t simple-flask .
          docker tag simple-flask awakzdev.jfrog.io/docker-local/simple-flask:latest
          docker push awakzdev.jfrog.io/docker-local/simple-flask:latest
but the following error is being returned:
Run pip config -v list
For variant 'global', will try loading '/etc/xdg/pip/pip.conf'
For variant 'global', will try loading '/etc/pip.conf'
For variant 'user', will try loading '/home/runner/.pip/pip.conf'
For variant 'user', will try loading '/home/runner/.config/pip/pip.conf'
For variant 'site', will try loading '/usr/pip.conf'
/home/runner/work/_temp/09382b8f-ce09-4646-816f-fb337f40ad4b.sh: line 2: /etc/pip.conf: Permission denied
Error: Process completed with exit code 1.
I've placed the secret in my .yml file instead.
As for the broken pip permissions, I used:
sudo chown runner /etc/
echo ${{ secrets.PIP }} > /etc/pip.conf
which resulted in another error with the contents of the pip.conf file (it was configured correctly through secrets), so I found you can specify the URL like so:
ganesha_experimental==5.0.0 --find-links=https://awakzdev.jfrog.io/artifactory/
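An alternative sketch (not what the poster ended up doing) that avoids touching /etc at all is to write the user-level config that pip already probes, as shown in the pip config -v list output above:
      - name: install dependencies
        run: |
          # /home/runner/.config/pip/pip.conf is the 'user' variant pip looks for
          mkdir -p ~/.config/pip
          echo "${{ secrets.PIP }}" > ~/.config/pip/pip.conf
          pip install ganesha-experimental==2.0.1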

GitHub Actions with multiple private submodules

I'm trying to create a GH Actions job, which will download two submodules from private repositories. I want them to be downloaded with SSH keys which I have already generated.
I've been trying to do it like so:
- uses: actions/checkout@v2
  with:
    submodules: repo_1
    ssh-key: ${{ secrets.REPO_1 }}
- uses: actions/checkout@v2
  with:
    submodules: repo_2
    ssh-key: ${{ secrets.REPO_2 }}
This code will create the repo_1 and repo_2 folders, but they will be empty.
I have not found a possible solution. Does anyone know how to download multiple private submodules with separate SSH keys?
The documentation mentions:
# Whether to checkout submodules: `true` to checkout submodules or `recursive` to
# recursively checkout submodules.
#
# When the `ssh-key` input is not provided, SSH URLs beginning with
# `git@github.com:` are converted to HTTPS.
#
# Default: false
submodules: ''
So submodules: repo_2 should not be correct.
For instance, this is a workflow with a recursive checkout of submodules (inside an existing repository reference):
# Submodules recursive
- name: Checkout submodules recursive
  uses: ./
  with:
    ref: test-data/v2/submodule-ssh-url
    path: submodules-recursive
    submodules: recursive
- name: Verify submodules recursive
  run: __test__/verify-submodules-recursive.sh
It will check out the github.com/actions/checkout repo at branch test-data/v2/submodule-ssh-url, which includes a .gitmodules with the names and SSH URLs of the submodules.
To answer your original question:
Change your .gitmodules URLs to:
repo1:org1/repo1
repo2:org2/repo2
Add a GIT_SSH_COMMAND environment variable set to ssh -F config, with config being a file containing:
Host repo1
  Hostname github.com
  User git
  IdentityFile key1
Host repo2
  Hostname github.com
  User git
  IdentityFile key2
I don't know if it is possible to reference that file, generated with the right secrets.REPO_x, but from what I can see in the checkout action, you won't have a native way to specify multiple keys for multiple submodule repositories.
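For completeness, a sketch of how that config and the key files could be generated from secrets in a workflow step (an assumption on my part, not something the checkout action provides natively), given .gitmodules URLs rewritten to the repo1/repo2 host aliases above:
      - name: Write per-repo SSH keys and config
        run: |
          mkdir -p ~/.ssh
          echo "${{ secrets.REPO_1 }}" > ~/.ssh/key1
          echo "${{ secrets.REPO_2 }}" > ~/.ssh/key2
          chmod 600 ~/.ssh/key1 ~/.ssh/key2
          ssh-keyscan github.com >> ~/.ssh/known_hosts
          # Each host alias maps to github.com with its own deploy key
          printf '%s\n' \
            'Host repo1' '  Hostname github.com' '  User git' '  IdentityFile ~/.ssh/key1' \
            'Host repo2' '  Hostname github.com' '  User git' '  IdentityFile ~/.ssh/key2' \
            > ~/.ssh/config
      - name: Checkout submodules
        env:
          GIT_SSH_COMMAND: ssh -F ~/.ssh/config
        run: git submodule update --init --recursive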
Found a workaround:
steps:
  - name: Switch to HTTPS git
    run: |
      rm -f ~/.gitconfig
      git config --local --unset url."git@github.com".insteadOf https://github.com || echo "OK"
      git config --local --unset url."git://".insteadOf https:// || echo "OK"
  - uses: actions/checkout@v2
  - name: Switch to SSH git
    run: |
      git config --local --replace-all url."git@github.com".insteadOf https://github.com
      git config --local --add url."git://".insteadOf https://
  - name: Checkout submodules
    env:
      GIT_SSH_COMMAND: "ssh -o StrictHostKeyChecking=no"
    run: |
      eval `ssh-agent -s`
      echo "${{secrets.REPO_1}}" | ssh-add -
      git submodule update --init repo_1
      ssh-add -D
      echo "${{secrets.REPO_2}}" | ssh-add -
      git submodule update --init repo_2
      ssh-add -D
      eval `ssh-agent -k`

GitHub Actions azure/login@v1 not working on a self-hosted runner?

Is anyone familiar with this issue? The example from https://github.com/Azure/cli does not seem to work on a self-hosted GitHub runner, as az is missing.
gitaction.yml
name: auzure-deployment
on:
  push:
    branches: [ main ]
jobs:
  myjob:
    runs-on: [self-hosted, linux]
    steps:
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - uses: azure/CLI@v1
        with:
          azcliversion: 2.0.72
          inlineScript: |
            az account list
error
Runner group name: 'Default'
Machine name: '98de1add3979'
GITHUB_TOKEN Permissions
Prepare workflow directory
Prepare all required actions
Getting action download info
Download action repository 'azure/login#v1'
Download action repository 'azure/CLI#v1'
Run azure/login#v1
Error: Az CLI Login failed. Please check the credentials. For more information refer https://aka.ms/create-secrets-for-GitHub-workflows
Error: Error: Unable to locate executable file: az. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also check the file mode to verify the file is executable.
I hacked together a workaround to avoid using login@v1, but it's not elegant, as the secrets are printed to the log output:
name: auzure-deployment
on:
  push:
    branches: [ main ]
jobs:
  buildandpush:
    runs-on: [self-hosted, linux]
    env:
      credentials: ${{ secrets.AZURE_CREDENTIALS }}
      AZURE_CLIENT_ID: ${{ fromJSON(secrets.AZURE_CREDENTIALS)['clientId'] }}
      AZURE_CLIENT_SECRET: ${{ fromJSON(secrets.AZURE_CREDENTIALS)['clientSecret'] }}
      AZURE_TENANT_ID: ${{ fromJSON(secrets.AZURE_CREDENTIALS)['tenantId'] }}
    steps:
      - uses: azure/CLI@v1
        with:
          azcliversion: 2.0.72
          inlineScript: |
            az login --service-principal -u $AZURE_CLIENT_ID -p $AZURE_CLIENT_SECRET --tenant $AZURE_TENANT_ID
            az account list
There's now an open issue to install the CLI in the login action if it doesn't exist: https://github.com/Azure/login/issues/154
The workaround on self-hosted runners is to install the CLI before the login action, either using another action such as https://github.com/elstudio/action-install-azure-cli,
or, to avoid depending on someone else's action, by running the commands from that repo's script directly:
- name: Install Azure cli
  run: |
    sudo apt-get install ca-certificates curl apt-transport-https lsb-release gnupg
    curl -sL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/microsoft.gpg > /dev/null
    AZ_REPO=$(lsb_release -cs)
    echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ $AZ_REPO main" | sudo tee /etc/apt/sources.list.d/azure-cli.list
    sudo apt-get update
    sudo apt-get install azure-cli
- uses: azure/login@v1
  with:
    creds: ${{ secrets.AZURE_CREDENTIALS }}
- uses: azure/CLI@v1
  with:
    azcliversion: 2.0.72
    inlineScript: |
      az account list

Github actions scp into VPS via ssh only

This is currently my workflow:
name: CI
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - uses: actions/setup-node@v1
        with:
          node-version: '10.x'
      - run: npm install
      - run: npm install -g @angular/cli > /dev/null
      - run: ng build --prod
      - run: scp -o StrictHostKeyChecking=no -r ./dist/pwa/* user@domain.com:/home/user/domain.com/pwa
The above is roughly a translation of what I have on CircleCI. However, obviously the above fails.
CircleCI allowed adding 'SSH Permissions' to a project, so that when the build is set up to run, it attaches those keys to the environment, making any SSH commands to the VPS easy.
How can I accomplish a similar approach on GitHub? Does GitHub Actions support SSH permissions? If not, is there a workaround?
How do you folks copy files from your workflow builds to an external server via SSH (i.e. scp)?
This is what I do, after adding the SSH key to the GitHub secrets:
run: |
  mkdir -p ~/.ssh
  echo "${{ secrets.SSH_KEY }}" > ~/.ssh/id_rsa
  chmod 700 ~/.ssh/id_rsa
  ssh-keyscan -H domain.com >> ~/.ssh/known_hosts
  scp -o StrictHostKeyChecking=no -r ./dist/pwa/* user@domain.com:/home/user/domain.com/pwa
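A variation on the same idea (a sketch only, with hypothetical secret and host names): pass the key through an environment variable instead of interpolating it straight into the script, tighten the key permissions to 600, and use rsync over SSH so repeated deploys only transfer changed files:
- name: Deploy over SSH
  env:
    SSH_KEY: ${{ secrets.SSH_KEY }}
  run: |
    mkdir -p ~/.ssh
    echo "$SSH_KEY" > ~/.ssh/id_rsa
    chmod 600 ~/.ssh/id_rsa
    ssh-keyscan -H domain.com >> ~/.ssh/known_hosts
    # Trailing slashes sync the directory contents rather than the directory itself
    rsync -az ./dist/pwa/ user@domain.com:/home/user/domain.com/pwa/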