Manage ssh keys within Bluemix - ibm-cloud

How do I clean up the SSH keys for my Virtual Servers? I have defined a number of keys, but I want to delete all of them and start afresh.

I guess right now your only option for deleting them is an OpenStack client on the command line. See here for details and for where to obtain that client.
List available ssh keys:
openstack keypair list
Delete a specific ssh key:
openstack keypair delete myKey
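Since the goal is to delete every key and start afresh, the two commands can be combined in a small loop. This is only a sketch, assuming a Bash shell and that openstack keypair list -f value -c Name prints one key name per line:
# delete each keypair returned by the list command
for key in $(openstack keypair list -f value -c Name); do
  openstack keypair delete "$key"
done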

I connected to Bluemix via the OpenStack client (installed on my workstation) and was able to delete the SSH keys with the commands above.

Related

Setting up an SSH Key and connecting to GitHub

I am new to coding and have been tasked with setting up a new SSH Key and connecting to GitHub. I have followed all the steps, and when I check if I have successfully paired I get this message:
$ ssh -T git@github.com
The authenticity of host 'github.com (140.82.121.4)' can't be established.
ED25519 key fingerprint is SHA256:+DiY3wvvV6TuJJhbpZisF/zLDA0zPMSvHdkr4UvCOqU.
This key is not known by any other names
Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
Warning: Permanently added 'github.com' (ED25519) to the list of known hosts.
Hi indiataylor1! You've successfully authenticated, but GitHub does not provide shell access.
Can anyone point me in the right direction?
I have tried creating a new SSH key and starting from scratch.
You are good to go.
The "ED25519 key fingerprint" message is seen only at the first SSH connection, and yours does match the official GitHub ones.
If you try again ssh -T git#github.com, you will only see:
Hi indiataylor1!
You've successfully authenticated, but GitHub does not provide shell access.
From there, start cloning your repository with:
git clone git@github.com:You/yourRepository
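If you later want to confirm which local key GitHub is accepting, a verbose test connection prints the identity files being offered, and ssh-add -l lists the keys currently loaded in the agent (a debugging sketch, assuming OpenSSH):
# -v shows each key ssh tries during authentication
ssh -vT git@github.com
# lists the keys currently held by the ssh-agent
ssh-add -l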

How to save ssh password to vscode?

I am using vscode to connect to a remote host. I use Remote-SSH (ms-vscode-remote.remote-ssh) extension to do so. Every time I want to connect to the remote host, I need to enter the password.
Is there a way to save the ssh password to vscode?
To set up passwordless authentication for SSH in Visual Studio Code, perform the following steps.
These examples assume the following (replace with your actual details):
Host: myhost
Local User: localuser
Remote User: remoteuser
Remote User Home Dir: remoteuserhome
SSH Port: 22
I'm using a Mac, so Windows will be a bit different, but the basics are the same.
First, tell VS Code (and your machine in general) how you will be connecting to <myhost>.
Edit:
/Users/<localuser>/.ssh/config
Add:
Host <myhost>
HostName <myhost>
User <remoteuser>
Port 22
PreferredAuthentications publickey
IdentityFile "/Users/<localuser>/.ssh/<myhost>_rsa"
Next, generate a public and private key pair with ssh-keygen (part of OpenSSH):
ssh-keygen -q -b 2048 -P "" -f /Users/<localuser>/.ssh/<myhost>_rsa -t rsa
This should make two files:
<myhost>_rsa (private key)
<myhost>_rsa.pub (public key)
The private key (<myhost>_rsa) can stay in the local .ssh folder
The public key (<myhost>_rsa.pub) needs to be copied to the server (<myhost>)
I copied it with FTP, but you can transfer it however you wish; it just needs to end up in the right directory on the server.
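If ssh-copy-id is available locally, it can do the copy and the server-side setup in one step. A sketch, assuming the key pair generated above and that password authentication is still enabled for this first connection:
# appends the public key to the remote authorized_keys and fixes its permissions
ssh-copy-id -i /Users/<localuser>/.ssh/<myhost>_rsa.pub <remoteuser>@<myhost>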
ON THE SERVER
There is a file on the server which has a list of public keys inside it.
<remoteuserhome>/.ssh/authorized_keys
If it exists already, you need to add the contents of <myhost>_rsa.pub to the end of the file.
If it does not exist, you can upload <myhost>_rsa.pub and rename it to authorized_keys, with permissions of 600.
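Done by hand on the server, that step looks roughly like this (a sketch, assuming the .pub file was uploaded to the remote user's home directory):
# create the .ssh directory if needed, lock down permissions, then append the key
mkdir -p ~/.ssh && chmod 700 ~/.ssh
cat ~/<myhost>_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys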
If everything goes according to plan you should now be able to go into terminal and type
ssh <remoteuser>@<myhost>
and you should be in without a password. The same will now apply in Visual Studio Code.
Let's answer the OP's question first:
How to 'save ssh password'?
Since there is no such thing as an "ssh password", the answer to "how to save the remote user password" is:
This is not supported by VSCode.
VS Code proposes setting up an SSH agent in order to cache the passphrase (in case you are using an encrypted key).
But if the public key was not properly registered in the remote account's ~/.ssh/authorized_keys, the SSH daemon will fall back to the remote user credentials (username/password).
It is called PasswordAuthentication, often the remote user password.
And caching that password is not supported for SSH sessions.
It is only supported by a Git credential helper, when using HTTPS URLs.
(it defers to the underlying OS credential manager)
But I don't know of a remote user password cache when SSH is used.
As Chagai Friedlander comments, the answer to the original question is therefore:
No, but you can use SSH keys and that is better.
Speaking of SSH keys:
"ssh password": Assuming you are referring to a ssh passphrase, meaning you have created an encrypted private key, then "saving the ssh password" would mean caching that passphrase in order to avoid entering it every time you want to access the remote host.
Check first if you can setup the ssh-agent, in order to cache the passphrase protecting your private key.
See "VSCode: Setting up the SSH Agent"
This assumes you are using an SSH key, as described in "VSCode: Connect to a remote host", and are not directly using the remote user password.
Using an SSH key means its public key would have been registered to the remote account ~/.ssh/authorized_keys file.
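As a rough illustration of that agent setup on macOS or Linux (a sketch, assuming an OpenSSH key at the default ~/.ssh/id_rsa path):
# start an agent for this shell, then cache the passphrase once for the session
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa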
The section below is the workaround the OP ended up accepting: registering the public key on the remote user account, combined with caching the local private key passphrase, worked.
For those trying to connect through the VS Code Remote - SSH extension, the steps are provided at https://code.visualstudio.com/docs/remote/troubleshooting#_ssh-tips
For Windows (host) --> Linux (remote):
Create an SSH key pair on your Windows machine: ssh-keygen -t rsa -b 4096
Copy the contents of the .pub key (default path C:\Users\username/.ssh/id_rsa.pub)
SSH into the remote machine and append the contents of the .pub key to the authorized keys file: echo "pub-key" >> ~/.ssh/authorized_keys
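The copy and the append can also be done from the Windows side in one command. A sketch, assuming a classic Command Prompt (cmd.exe), the Windows OpenSSH client, and the default key path; user and remote-host are placeholders:
# pipe the local public key into an append on the remote authorized_keys file
type %USERPROFILE%\.ssh\id_rsa.pub | ssh user@remote-host "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"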

Kubernetes cluster in GCloud with kops: public key denied

I am deploying a 2-node Kubernetes cluster on GCloud using kops with the following commands:
kops create cluster part1.k8s.local --zones europe-west3-a --node-count 2 --node-image ubuntu-os-cloud/ubuntu-2004-focal-v20210129 --node-size "e2-standard-2" --ssh-public-key ~/.ssh/id_rsa.pub --state ${KOPS_STATE_STORE}/ --project=${PROJECT}
kops update cluster --name part1.k8s.local --yes --admin
I then wait for the cluster to be ready and get the external IP of one of the nodes using:
kubectl get nodes -o wide
However when I try to login to the node I get:
ssh -i ~/.ssh/id_rsa admin@<PUBLIC_IP>
admin@<PUBLIC_IP>: Permission denied (publickey).
Checking the permissions, the nodes should be able to accept SSH connections, and I can connect to the VMs using the GCloud UI.
What am I doing wrong? Thanks!
The answer can be found here: https://github.com/kubernetes/kops/issues/10770
I've encountered this issue when testing some scenarios with SSH keys (add, remove, overwrite, etc.).
When you log in to the GCloud console, your SSH keys are stored in ~/.ssh. If the folder is empty, those keys (google_compute_engine and google_compute_engine.pub) will be created once you connect to a VM.
$ ls ~/.ssh
google_compute_engine google_compute_engine.pub google_compute_known_hosts known_hosts
Information about the SSH key is also stored in your project. You can find it in Navigation Menu > Compute Engine > Metadata; select the SSH Keys tab to view instance SSH keys.
Additional information about SSH Keys can be found in Managing SSH keys in metadata guide.
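The same metadata can also be inspected from the command line. A sketch, assuming the gcloud CLI is installed and configured for the project (look for the ssh-keys entry in the output):
# dumps project-wide metadata, including any registered ssh-keys entries
gcloud compute project-info describe --format=json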
If you encounter this kind of issue, you can remove the SSH key from the UI and delete google_compute_engine and google_compute_engine.pub. The next time you SSH to the machine, gcloud will ask you to create a new SSH key, and the Permission denied (publickey) issue should be fixed.
The command that should be used to SSH to a GCE VM is gcloud compute ssh:
gcloud compute ssh <instance-name> --zone <zone>
Why?
gcloud compute ssh is a thin wrapper around the ssh(1) command that takes care of authentication and the translation of the instance name into an IP address.
In addition, if you encounter other SSH issues on GCE you can check the Troubleshooting SSH guide.
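If you still prefer plain ssh, a verbose connection attempt shows which keys are offered and why the server rejects them. Note that the login user depends on the node image; for an Ubuntu image it is often ubuntu rather than admin, which is an assumption worth checking. A debugging sketch:
# -v prints the authentication attempts; try the image's default user if admin is refused
ssh -v -i ~/.ssh/id_rsa ubuntu@<PUBLIC_IP>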

Failed to add the SSH key to the ssh-agent with an empty passphrase (Bitrise CLI)

Summary:
As I'm integrating CI into the development workflow, I'm also trying to move the execution of Bitrise workflows to our local iOS Mac computer, which is set up as a Jenkins slave.
The projects that I'm trying to build therefore need to be built on this iOS computer.
Problem:
I'm trying to establish an SSH connection to an integration user (a GitHub account that has access to my repositories). I have created a key and added it to the GitHub user as well as to the .bitrise.secrets.yml file.
But when the initial step, the activate-ssh-key step, is executed, it fails with an error saying the SSH key can't be added to the ssh-agent with an empty passphrase. (Is this somehow configurable? Can I just evade this?)
Here is the output log:
https://pastebin.com/FCHhZNDb
Step in bitrise.yml:
- activate-ssh-key@4.0.2: {getenv "SSH_RSA_PRIVATE_KEY"}
.bitrise.secrets.yml:
envs:
- SSH_RSA_PRIVATE_KEY: ssh-rsa *KEY*
I have also tried putting the SSH key directly in the .ssh directory, which did not work.
Any help is really appreciated! :)
TL;DR
Trying to connect the Bitrise CLI to GitHub via SSH doesn't work.
The SSH key you used seems to be protected with a passphrase. You should generate one that does not require a passphrase and register that for the repository.
How to generate such an SSH key: https://devcenter.bitrise.io/faq/how-to-generate-ssh-keypair/
ssh-keygen -t rsa -b 4096 -P '' -f ./bitrise-ssh -m PEM
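To check whether an existing key is the culprit, you can try to read it with an empty passphrase; if the command fails asking for a passphrase, the key is encrypted. A sketch, assuming OpenSSH and a key at ./bitrise-ssh:
# prints the public key only if the private key is not passphrase-protected
ssh-keygen -y -P '' -f ./bitrise-ssh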
Alternatively, you can replace the Activate SSH Key step with a Script step and activate the SSH key any way you like.
Or, if you prefer not to use SSH keys, you can switch to https:// git clone URLs (instead of the SSH / git@ ones) and replace the Activate SSH Key step with the Authenticate with GitHub OAuth step (https://www.bitrise.io/integrations/steps/authenticate-with-github-oauth).
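For reference, the two clone URL forms look like this (YourOrg/your-repo is a placeholder):
# SSH form - requires a key registered with the account
git clone git@github.com:YourOrg/your-repo.git
# HTTPS form - works with a token or the GitHub OAuth step instead of a key
git clone https://github.com/YourOrg/your-repo.git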

ssh-copy-id is copying my public key but I still need to enter my credentials when logging in to IAE

I'm trying to set up passwordless SSH access to my cluster.
I've used ssh-copy-id clsadmin@my-clusterhostname and entered the cluster password when prompted. The output from ssh-copy-id shows:
Number of key(s) added: 1
However, when I try to ssh into the cluster, I'm prompted for my password. If I log in to the cluster, I can see the key has been added to ~/.ssh/authorized_keys.
Why is passwordless ssh not working after these steps?
The problem seemed to be that I had used a DSA key. After creating an RSA key and copying that to the cluster, I was able to log in over SSH without entering my credentials.
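A minimal sketch of the sequence that worked, assuming OpenSSH defaults (clsadmin and my-clusterhostname are taken from the question):
# generate an RSA key, copy it to the cluster, then log in without a password
ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa
ssh-copy-id -i ~/.ssh/id_rsa.pub clsadmin@my-clusterhostname
ssh clsadmin@my-clusterhostname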