How to gain SSH access from one AWS instance to another without a private key?

I have an SSH keypair: private lives on my local Mac, public lives on several AWS cloud machines.
From my Mac, I can SSH to a cloud instance, call it "deploy server". From there, I need to deploy my application to several instances (I cannot deploy locally).
I authenticate to the other instances with my private key. I can do this either by leaving my private key on the deploy server (insecure) or by using SSH agent forwarding (probably not much better).
Moreover, the deploy takes a while, so I run it in a GNU screen or tmux session; I then detach and end the SSH session with the deploy server, which means I cannot use SSH agent forwarding (as I believe it requires the SSH connection to remain open).
What other options are available to me?

You can use a deploy key. That is a server-specific key that grants read-only access to the repository.
To use this, you need to:
Generate a key pair on the server (run ssh-keygen on the server)
Add the public key to the GitHub repo as a deploy key (https://github.com/<user>/<repo>/settings/keys). That grants read-only access to the repo; there is a checkbox if you also need write access.
Read more in the GitHub help guide on deploy keys; it also covers other ways for a server to access a repository when deploying.
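For example, a minimal sketch of the setup (the key filename and the repo address are placeholders):

    # On the deploy server: generate a key pair dedicated to this repository
    ssh-keygen -t ed25519 -f ~/.ssh/myrepo_deploy_key
    # Print the *public* half and paste it into the repo's deploy keys page
    cat ~/.ssh/myrepo_deploy_key.pub
    # Tell SSH to offer that key for github.com, then clone or pull over SSH
    printf 'Host github.com\n  IdentityFile ~/.ssh/myrepo_deploy_key\n  IdentitiesOnly yes\n' >> ~/.ssh/config
    git clone git@github.com:<user>/<repo>.git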

Related

Configuring an Airflow SSH connection in Google Cloud Composer

I'm trying to configure an SSH connection from the Airflow UI in a Google Cloud Composer environment to an on-premises PostgreSQL server.
Where should I store my private key?
How do I pass the private key location to the SSH connection config?
First, you will need to add an SSH connection under:
Airflow -> Admin -> Connections -> Connection Type (SSH)
That will allow you to use this connection in an operator to access the remote instance. Reference your key in the Extra field (see the key_file and host_key options).
Documentation here: https://airflow.apache.org/docs/apache-airflow-providers-ssh/stable/connections/ssh.html
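If you prefer to create the connection from the command line instead of the UI, the Airflow CLI can do it; a rough sketch (the connection id, host, login, and key path are placeholders, and the path assumes the Composer workers see the bucket's dags folder under /home/airflow/gcs/dags):

    airflow connections add my_ssh_conn \
        --conn-type ssh \
        --conn-host 203.0.113.10 \
        --conn-login deploy \
        --conn-extra '{"key_file": "/home/airflow/gcs/dags/keys/id_rsa"}'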
Adding the file to the same GCS bucket that holds the DAGs will make it reachable by the Airflow workers. You can create a new directory under dags and name it keys if you want.
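For example, copying the key into the environment's bucket (the bucket name and paths are placeholders):

    gsutil cp ~/.ssh/id_rsa gs://<your-composer-bucket>/dags/keys/id_rsa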
Then you will need to design your pipeline (DAG) so that it uses the private key to reach the remote instance.
You can use the SSHExecuteOperator or any other operator based on your design.
Check this question for more helpful details:
Airflow: How to SSH and run BashOperator from a different server

Create new SSH keys w/ new server, or use existing key?

My home ~/.ssh/ directory holds the local SSH keys for my personal GitHub account: the id_rsa and id_rsa.pub files that allow me to authenticate with GitHub.
I am setting up a server (compute engine on GCP), and this server needs these keys because it needs to authenticate to my GitHub to pull a repo. Should I either:
(1) transfer (SCP) the currently-existing id_rsa and id_rsa.pub that I have locally onto the server, and use those on the server;
(2) create a brand new SSH key on the server, use that key, and add it to my GitHub profile;
(3) it doesn't matter, either (1) or (2) is fine;
(4) or something else?
(2) seems like the right approach, but we are not certain.
Indeed, option 2. As a best practice, you should not share the same private key across machines.
Go ahead and generate a new SSH key following the docs: https://docs.github.com/en/github/authenticating-to-github/connecting-to-github-with-ssh
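A minimal sketch of option (2) on the server (the key type and comment are illustrative; the public key can go on your GitHub account or be added as a repository deploy key):

    # On the GCP instance: generate a fresh key pair
    ssh-keygen -t ed25519 -C "gcp-deploy-server"
    # Print the public key and add it on GitHub, then verify authentication
    cat ~/.ssh/id_ed25519.pub
    ssh -T git@github.com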

How to authenticate with GitHub when using a shared machine

My current workflow involves typing the password: log into a server, pull (or sometimes clone, check out, or even push), type in the credentials, and leave. I do not want to store my credentials on that machine, and I do not always have a chance to access my own password manager on the same machine.
How are we supposed to do this now that a password can no longer be used with GitHub on the command line? Should I actually carry a paper slip with an access token? Or am I obliged to configure an SSH deploy key for every project on every server? That seems to require logging into the GitHub website, and it's not like I have a GUI on those machines.
Is there any sane way? How would you do it if you sat down in front of a Linux shell and had to deploy a project on that machine, using that machine?
How you should handle this depends on what your needs are.
If you want to automate a deployment process for a machine, then using a deploy key for that machine is a good idea, since that's the exact purpose for which they're designed. Ideally your deployment processes are automated, and deploy keys are a good way to do that.
If your goal is to log into several machines via SSH and perform Git operations with a remote, you can use an SSH key. If you're logging in via SSH, then add your SSH key to your agent and forward your agent to the remote system with the -A option, which will let you perform the access as if you had that key on the remote system. This is the easiest and simplest solution if you can do so, and is even more convenient than typing your username and password.
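A rough sketch of the agent-forwarding flow (hostnames and key paths are illustrative):

    # On your own machine: load your key into the agent, then forward the agent
    ssh-add ~/.ssh/id_ed25519
    ssh -A user@shared-machine
    # On the shared machine: Git over SSH now authenticates via the forwarded agent
    git pull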
If you need to log in to machines at the console, then generate an SSH key, add it to GitHub, and store it on a flash drive, at which point you can mount the flash drive and use the keys with Git by setting the environment variable GIT_SSH_COMMAND to ssh -oIdentitiesOnly=yes -i /mnt/path-to-key (substituting the path to the key).
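For instance, assuming the drive is mounted at /mnt/usb and the key lives at /mnt/usb/keys/github_key (both paths are illustrative):

    export GIT_SSH_COMMAND='ssh -oIdentitiesOnly=yes -i /mnt/usb/keys/github_key'
    git clone git@github.com:<user>/<repo>.git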

Capistrano 3 deployment with multiple developers from GitHub using forward agent

I have an existing Capistrano 3 deployment script which I run on my local machine (MacBook). I use agent forwarding and connect with my SSH key. This all works fine: my SSH key is added to GitHub and deployments work like a treat with no password. Now I have a new developer who also needs to be able to deploy from his own machine. First, I have added his public key to the server and to known hosts so he has SSH access.
What do I need to do now so that agent forwarding works for him too?
I tried to copy his public key to the SSH keys in my GitHub account, but GitHub showed an error saying the key was already added. I don't understand why I get this error, as only my own SSH key has been added. Should I give him access to the GitHub repository and have him add his SSH key to his own account?
Does the key referenced in the deploy script need to have the same name as it does on the server, or as it does on his machine?
Thanks for any help with this, I can't find anything online for this scenario.
Should I give him access to the GitHub repository and then he adds his SSH key to his own account?
Yes. The preferred way to do this is to give the new developer access to the GitHub project via his account.
You also add his public key to .ssh/authorized_keys on the server so that he can deploy. At this point, deployment should work for both of you using your own keys.
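Concretely, that might look something like this (file names and the stage name are placeholders, and it assumes the deploy script already enables agent forwarding as described in the question):

    # On the server, as the deploy user: authorize the new developer's key
    cat new_developer_id.pub >> ~/.ssh/authorized_keys

    # On the new developer's machine: load his key into the agent and deploy
    ssh-add ~/.ssh/id_ed25519
    bundle exec cap production deploy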

Git pulling onto a VM without an SSH key

I'm trying to pull an existing GitHub repo, created on my local machine, onto a VM running on EC2 that will be used by multiple people. I have some concerns about using an SSH key without a passphrase, so I was wondering whether there is any way to pull directly onto the VM either anonymously, or by providing the username and password of the account that originally pushed the repo. That way my personal information won't have to be stored on the VM, and there's no security risk in someone getting hold of a passphrase-less SSH key for the VM. Is this possible?
Currently running Ubuntu 12.04
I recommend generating a new key and adding it as a deploy key to the specific repo.
These keys are linked to a specific repo, not your account.
A lot of options are also covered here:
https://help.github.com/articles/managing-deploy-keys
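A sketch of that on the VM, using an SSH host alias so only the repo-specific key is offered (names are placeholders; RSA is used because Ubuntu 12.04's OpenSSH predates ed25519 support):

    # Generate a key pair used only for this repository
    ssh-keygen -t rsa -b 4096 -f ~/.ssh/myrepo_deploy_key

    # ~/.ssh/config entry pointing a host alias at github.com with that key:
    Host github-myrepo
        HostName github.com
        User git
        IdentityFile ~/.ssh/myrepo_deploy_key
        IdentitiesOnly yes

    # After adding ~/.ssh/myrepo_deploy_key.pub as a deploy key on the repo:
    git clone git@github-myrepo:<user>/<repo>.git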