Running a Docker image via SSH in GitHub Actions

I'm currently trying to make GitHub Actions/CI SSH into my VPS and run a Docker image. The main problem is that the job doesn't finish after running the final command.
This is my YML file:
name: SSH & Deploy Image
on:
  workflow_run:
    workflows: ["Timmy Docker Build"]
    branches: [ main ]
    types:
      - completed
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - name: Run Docker CMD
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.HOST }}
          username: ${{ secrets.USERNAME }}
          password: ${{ secrets.PASSWORD }}
          port: ${{ secrets.PORT }}
          script: |
            docker stop ss-timmy && docker rm ss-timmy
            docker pull spaceturtle0/ss-timmy:latest
            docker run --env-file=Timmy-SchoolSimplified/.env spaceturtle0/ss-timmy &
Even though I put the & sign on the final script command, the job just hangs until the process is killed. Is there a way to fix this?

You should use the -d (detached) flag instead of the & sign for the last docker command. The full command will be:
docker run -d --env-file=Timmy-SchoolSimplified/.env spaceturtle0/ss-timmy
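The & only backgrounds the process; the backgrounded container still holds the SSH session's output streams, so the session never closes. With -d, the Docker daemon takes over the container and the script can exit. For context, the question's script block with this fix applied would read:

          script: |
            docker stop ss-timmy && docker rm ss-timmy
            docker pull spaceturtle0/ss-timmy:latest
            docker run -d --env-file=Timmy-SchoolSimplified/.env spaceturtle0/ss-timmy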

Related

Java Springboot Deployment using GitHub

I am trying to deploy my Spring Boot app to my Linux VM using GitHub Actions. The deployment itself works, but the GitHub Action never finishes, because the last command it executes keeps running (and is supposed to keep running). How can I solve this?
name: Backend Deployment to Linux VM
on:
  push:
    branches:
      - main
jobs:
  build-and-deploy:
    name: Backend Deployment to Linux VM
    runs-on: ubuntu-latest
    steps:
      - name: update and start project
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.HOST }}
          username: ${{ secrets.USERNAME }}
          password: ${{ secrets.PASSWORD }}
          script: |
            kill -9 $(lsof -t -i:8080)
            cd /home/github_deploy_backend
            cd backend-P2cinema
            git pull
            mvn clean package
            nohup java -jar target/*.jar &
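This is the same hang as in the first question above: the backgrounded java process keeps the SSH session's stdin/stdout/stderr open, so sshd never closes the connection and the action keeps running. A minimal sketch of a fully detached launch (the app.log path is an assumption):

            # Redirect all three standard streams so the SSH session can close
            nohup java -jar target/*.jar > app.log 2>&1 < /dev/null &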

How to pull a private image from Docker Hub using github actions

I have a workflow where I need to pull an image from a private repository on Docker Hub. My job is the following:
run-flake8:
  name: Run Flake 8
  runs-on: "ubuntu-20.04"
  needs: [django-dev-image]
  container:
    image: docker://index.docker.io/v1/<repository_name>/<image_name>:latest
    credentials:
      username: ${{ secrets.DOCKERHUB_USERNAME }}
      password: ${{ secrets.DOCKERHUB_PASSWORD }}
  steps:
    - name: Print something
      run: echo "Testing flake8 job"
The job fails with repository does not exist or may require 'docker login': denied: requested access to the resource is denied. I feel like my Docker Hub registry URL is wrong, but I can't figure out what the correct one is. Any help is more than appreciated.
Thank you all
When referring to Docker Hub, you do not need to specify the registry at all.
run-flake8:
  name: Run Flake 8
  runs-on: "ubuntu-20.04"
  needs: [django-dev-image]
  container:
    image: <repository_name>/<image_name>:latest
    credentials:
      username: ${{ secrets.DOCKERHUB_USERNAME }}
      password: ${{ secrets.DOCKERHUB_PASSWORD }}
  steps:
    - name: Print something
      run: echo "Testing flake8 job"
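The index.docker.io/v1/ path in the question is Docker Hub's legacy authentication endpoint, not part of an image name, which is why the pull was denied. If you do want to be explicit about the registry, the fully qualified form (a sketch, keeping the question's placeholders) would be:

  container:
    image: docker.io/<repository_name>/<image_name>:latest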

Github CI/CD reuse step

I'm still a beginner with CI/CD, but I wrote a workflow that deploys a Vue/Vite project to an Ubuntu VPS. It doesn't work the way it should, though. So what am I actually doing?
First, as usual, I install dependencies and build the project.
jobs:
  build:
    name: "Build"
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install
        run: yarn
      - name: Build
        run: yarn build
The problem starts once that passes. I'm connecting over SSH like this:
deploy:
  name: "Deploy"
  needs: project_setup
  runs-on: ubuntu-latest
  steps:
    - name: Deploy to server
      uses: appleboy/ssh-action@master
      env:
        GIT_REPO: Comet-Frontend
        GIT_SSH: ${{ github.repositoryUrl }}
      with:
        host: ${{ secrets.VPS_IP }}
        username: ${{ secrets.VPS_USER }}
        password: ${{ secrets.VPS_PASSWORD }}
        port: ${{ secrets.VPS_PORT }}
        envs: GIT_SSH, GIT_REPO
and at the very bottom:
        script: |
          cd /var/www/vue
          git pull
          ls
          yarn
          yarn build
          cp -R /root/Frontend/dist /var/www/vue
I would like to define the SSH connection once and then run those scripts separately under different step names. Is that possible, or do I have to connect over SSH for every step?
If each step needs SSH to access either a remote repository URL or your VPS target server, then yes, you would need SSH for each step.
The alternative is to copy a deployment script to the server (through SSH): the steps in that script are then executed directly on the server where it was copied, so the script itself needs no further SSH, since it is already running on the target.
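A minimal sketch of that alternative, assuming a deploy.sh committed at the repository root (checked out earlier with actions/checkout@v2 so it exists in the workspace) and appleboy/scp-action to copy it over; the target path and secrets follow the question:

    - name: Copy deploy script
      uses: appleboy/scp-action@master
      with:
        host: ${{ secrets.VPS_IP }}
        username: ${{ secrets.VPS_USER }}
        password: ${{ secrets.VPS_PASSWORD }}
        port: ${{ secrets.VPS_PORT }}
        source: deploy.sh
        target: /var/www/vue
    - name: Run deploy script
      uses: appleboy/ssh-action@master
      with:
        host: ${{ secrets.VPS_IP }}
        username: ${{ secrets.VPS_USER }}
        password: ${{ secrets.VPS_PASSWORD }}
        port: ${{ secrets.VPS_PORT }}
        script: bash /var/www/vue/deploy.sh

The individually named steps then live inside deploy.sh itself, and only the copy and the single execution go over SSH.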

Github Actions "unauthorized: You don't have the needed permissions to perform this operation, and you may have invalid credentials"

I have created a GitHub workflow to deploy to GCP, but when it comes to pushing the Docker image to GCP, I get this error:
...
346fddbbb0ff: Waiting
a6fc7a8843ca: Waiting
unauthorized: You don't have the needed permissions to perform this operation, and you may have invalid credentials. To authenticate your request, follow the steps in: https://cloud.google.com/container-registry/docs/advanced-authentication
Error: Process completed with exit code 1.
Here is my YAML file:
name: Build for Dev
on:
  workflow_dispatch:
env:
  GKE_PROJECT: bi-dev
  IMAGE: gcr.io/bi-dev/bot-dev
  DOCKER_IMAGE_TAG: JAVA-${{ github.sha }}
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          ref: ${{ github.event.inputs.commit_sha }}
      - name: Build Docker Image
        run: docker build -t ${{env.IMAGE}} .
      - uses: google-github-actions/setup-gcloud@v0.2.0
        with:
          project_id: ${{ env.GKE_PROJECT }}
          service_account_key: ${{ secrets.GKE_KEY }}
          export_default_credentials: true
      - name: Push Docker Image to GCP
        run: |
          gcloud auth configure-docker
          docker tag ${{env.IMAGE}} ${{env.IMAGE}}:${{env.DOCKER_IMAGE_TAG}}
          docker push ${{env.IMAGE}}:${{env.DOCKER_IMAGE_TAG}}
      - name: Update Deployment in GKE
        env:
          GKE_CLUSTER: bots-dev-test
          GKE_DEPLOYMENT: bot-dev
          GKE_CONTAINER: bot-dev
        run: |
          gcloud container clusters get-credentials ${{ env.GKE_CLUSTER }} --zone us-east1-b --project ${{ env.GKE_PROJECT }}
          kubectl set image deployment/$GKE_DEPLOYMENT ${{ env.GKE_CONTAINER }}=${{ env.IMAGE }}:${{ env.TAG }}
          kubectl rollout status deployment/$GKE_DEPLOYMENT
Surprisingly, when I run docker push manually, it works fine.
I am also using a similar YAML file to push other projects, and they work totally fine. It's just this GitHub Action that fails.
Any leads would be appreciated.
It turned out that I had missed a step and had not added the service account key to the repository's Secrets for GitHub Actions, which caused the failure of this particular workflow.
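For reference, one way to add that key as a repository secret is the GitHub CLI (a sketch; key.json is an assumed filename for the downloaded service account key):

# Store the service account's JSON key as the GKE_KEY secret used above
gh secret set GKE_KEY < key.json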

Github Workflow Actions And EC2: Error Loading Key Invalid Format

I am trying to set up CI for my Node.js server. I would like to use GitHub Actions to SSH into my EC2 instance, where I can then git clone/pull my updated repo.
I can SSH into my EC2 instance from my local machine with no issues. I just do something like ssh -i keypair.pem username@some-ip.region.compute.amazonaws.com and it connects. However, I can't seem to get the connection working in the workflow/actions script. Here is what I have in my workflow YML file:
name: CI
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Connect
        env:
          DEPLOY_KEY: ${{ secrets.EC2 }}
        run: |
          eval `ssh-agent`
          ssh-add - <<< "${DEPLOY_KEY}"
          ssh ec2-user@ec2-instance-ip-here.us-east-2.compute.amazonaws.com
This script gives me the error Error loading key "(stdin)": invalid format. Also, when I look at the deploy keys section under the repo settings, it says the key has never been used.
(Obviously I would need to install, clone, and perform other steps in addition to what is listed above.)
In summary:
1. How do I fix the invalid format error?
2. How do I load and reference the key pair?
There is a better way to run SSH commands on an EC2 instance:
name: CI
on: [push, pull_request]
jobs:
  # test:
  #   ...
  deploy:
    name: "Deploy to staging"
    runs-on: ubuntu-latest
    if: github.event_name == 'push' && github.ref == 'refs/heads/master'
    # needs: test
    steps:
      - name: Configure SSH
        run: |
          mkdir -p ~/.ssh/
          echo "$SSH_KEY" > ~/.ssh/staging.key
          chmod 600 ~/.ssh/staging.key
          cat >>~/.ssh/config <<END
          Host staging
            HostName $SSH_HOST
            User $SSH_USER
            IdentityFile ~/.ssh/staging.key
            StrictHostKeyChecking no
          END
        env:
          SSH_USER: ${{ secrets.STAGING_SSH_USER }}
          SSH_KEY: ${{ secrets.STAGING_SSH_KEY }}
          SSH_HOST: ${{ secrets.STAGING_SSH_HOST }}
      - name: Stop the server
        run: ssh staging 'sudo systemctl stop my-application'
      - name: Check out the source
        run: ssh staging 'cd my-application && git fetch && git reset --hard origin/master'
      - name: Start the server
        if: ${{ always() }}
        run: ssh staging 'sudo systemctl start my-application'
Credit: GitHub Actions: How to run SSH commands (without third-party actions)
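As for the original invalid format error: ssh-add is strict about key formatting, and a common cause is the pasted secret losing its trailing newline. If you would rather keep the ssh-agent approach from the question, a sketch that guards against that (DEPLOY_KEY as in the question):

          eval `ssh-agent`
          # ssh-add rejects a key without a trailing newline; printf '%s\n' guarantees one
          printf '%s\n' "$DEPLOY_KEY" | ssh-add -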