I want to copy a file from an Ubuntu 16.04 Azure DevOps agent to a remote Linux host (which also has an Azure DevOps agent installed).
I copied the public key to ~/.ssh/authorized_keys on the remote host.
From the terminal, everything works fine:
scp myagent/_work/10/s/docker-compose.yml root@192.168.1.76:/opt
docker-compose.yml 100% 1036 1.0KB/s 00:00
I created a step in the pipeline to execute exactly the same command,
but now I'm getting this error:
2020-07-08T08:54:43.5359334Z [command]/bin/bash --noprofile --norc /home/user/myagent/_work/_temp/3ce8bc1e-7842-4f97-bc35-884893882d3c.sh
2020-07-08T08:54:43.5442624Z Pseudo-terminal will not be allocated because stdin is not a terminal.
2020-07-08T08:54:43.6019929Z Host key verification failed.
2020-07-08T08:54:43.6074975Z
2020-07-08T08:54:43.6245687Z ##[error]Bash exited with code '255'
You can use the Copy Files Over SSH task to copy files to the remote server.
First you need to create an SSH service connection to connect to the remote server.
Go to Project settings --> Pipelines --> Service connections --> New service connection --> select SSH.
Then add the Copy Files Over SSH task to your pipeline to copy the files to the remote server.
If you want to run a script on the remote server, you can use the SSH task.
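Alternatively, if you want to keep the plain scp step: "Host key verification failed" simply means the agent has never recorded the remote host's key, and without a terminal attached it cannot prompt you to accept it. A minimal sketch of pre-seeding the key in the pipeline's bash step, using the IP from the question:
# Record the remote host's key so scp can verify it non-interactively
ssh-keyscan -H 192.168.1.76 >> ~/.ssh/known_hosts
scp myagent/_work/10/s/docker-compose.yml root@192.168.1.76:/opt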
What I am trying to do is run a few lines of shell script on a remote machine via an Azure pipeline. I used the SSH Deployment task to accomplish this, with the script path argument pointing to the .sh file that contains the script that should be run. The SSH task was able to connect to the remote host, but the following permission error pops up.
Can someone tell me what's going wrong here? The .sh file I am using was created on the Linux box itself and had its permissions set to 777 before being moved to the repo.
There is another CopyFilesOverSSH@0 task in the same stage of the pipeline which works perfectly, without any permission issues, for the same user.
2021-12-31T12:41:42.1763039Z ##[section]Starting: SSH
2021-12-31T12:41:42.1894277Z ==============================================================================
2021-12-31T12:41:42.1894676Z Task : SSH
2021-12-31T12:41:42.1895010Z Description : Run shell commands or a script on a remote machine using SSH
2021-12-31T12:41:42.1895347Z Version : 0.189.0
2021-12-31T12:41:42.1895637Z Author : Microsoft Corporation
2021-12-31T12:41:42.1896023Z Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/deploy/ssh
2021-12-31T12:41:42.1896437Z ==============================================================================
2021-12-31T12:41:42.8200834Z Trying to establish an SSH connection to ***@80.xxx.xxx.xxx:22
2021-12-31T12:41:43.1333018Z Successfully connected.
2021-12-31T12:41:43.5698433Z ##[error]Failed to copy script to remote machine. Error: Error: put: Permission denied //checkFileAvailability.sh.
2021-12-31T12:41:43.6050230Z ##[section]Finishing: SSH
Firstly, if you want to copy files to the remote machine, it's recommended to use the Copy Files Over SSH task. This task lets you connect to a remote machine using SSH and copy files matching a set of minimatch patterns from a specified source folder to a target folder on the remote machine. Supported protocols for file transfer are SFTP and SCP via SFTP.
As for the SSH Deployment task: it lets you connect to a remote machine using SSH and run commands or a script.
According to your error message, the SSH connection succeeds but copying the script to the remote machine fails. It appears that the service account doesn't have permission to copy the specified file to that path on the remote machine. Please check the permission settings on the target path. Also try using an inline script instead of a script file to see whether that works.
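To narrow this down, you could check from the agent whether the connecting account can actually write to the directory it lands in on the remote machine (a sketch; user and host are placeholders):
# The task uploads its generated script to the user's landing directory;
# verify that this directory is writable by the connecting account
ssh user@remote-host 'pwd && touch permtest && rm permtest && echo writable'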
I had the same issue when running the SSH task under a user that was not root. For an inline script to run under a different user, that user needs:
Read/write/execute access to the root folder, as TFS puts all commands into a generated bash script file and copies it to the target machine's root folder (below is another command, which is executed on the already-copied script file):
tr -d '\015' < ./sshscript_099d4e8c-44ac-482d-b1bf-84a52c7ab810 > ./sshscript_099d4e8c-44ac-482d-b1bf-84a52c7ab810._unix
A home directory, as TFS switches to it.
So to fix this issue I granted rwx permissions to everyone on the root folder:
chmod 777 /
ls -ld /
drwxrwxrwx 20 root root 4096 Feb 10 14:54 /
And made sure that the home folder for my user exists.
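A minimal sketch of that last step, assuming the account is called deployuser (a placeholder name):
# Create the missing home directory and hand it over to the user
sudo mkdir -p /home/deployuser
sudo chown deployuser:deployuser /home/deployuser
sudo usermod -d /home/deployuser deployuser  # point the account at it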
What I want to do:
Deploy a docker-compose solution from GitHub to my virtual private server, which has docker and docker-compose installed.
I saw that there are GitHub Actions that let me copy files over SSH after a push to master, but I don't know how to run docker-compose up on my server after the source has been copied.
On my VPS I have Ubuntu 18.04 installed.
I believe GitHub Actions also allow you to run arbitrary commands on remote servers via SSH (there are a few such actions in their library).
Assuming you copy your docker-compose.yml to /home/user/app/docker-compose.yml, you could run a command like this:
ssh user@yourserver.example.com "cd /home/user/app/ && docker-compose up -d"
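Putting the copy and the restart together, a minimal deploy script the workflow could run (a sketch; the paths and hostname are assumptions, and key-based authentication is presumed to be in place):
#!/bin/bash
# Copy the compose file to the server, then (re)create the stack
set -e
scp docker-compose.yml user@yourserver.example.com:/home/user/app/
ssh user@yourserver.example.com 'cd /home/user/app && docker-compose pull && docker-compose up -d'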
I'm trying to copy files over SSH. I'm using the same SSH service connection, and it works just fine with other SSH tasks, but copying files seems to run into trouble. Here's what it looks like when I monitor for user logins:
sshd[32240]: Accepted publickey for azurePPL1 from 13.69.175.211 port 1984 ssh2: ECDSA SHA256:0...
And this seems to be fine, but apparently it's not.
Here's the error Azure Pipelines is throwing:
Error: Failed to connect to remote machine. Verify the SSH service connection details. Error: Cannot parse privateKey: Unsupported key format.
Now, I would have suspected my SSH service connection configuration, but since the other SSH tasks work, I'm not sure what it could be.
Any help is appreciated.
Using the same SSH Service Connection and it's just fine with other SSH tasks but copying files seems to run into trouble
Since the same SSH service connection works for all the other SSH tasks and only Copy Files over SSH fails, there's no error in your SSH key pair or the connection. The issue lies in the key parser used by the Copy Files over SSH task.
See the task's script, which is open source on GitHub: the run function in CopyFileOverSSH.ts, and the definition of the SshHelper class in sshhelper.ts. The Copy Files over SSH task uses the ssh2 npm package for the SSH connection and verification, and the error message you are seeing comes from there; the copy file task itself does not do any key parsing.
For the key parsing, see this source file: keyParser.js. At line 1447 you will find exactly the error message you received in the Azure DevOps task.
As far as I know, the task has been using ssh2 library v0.8 since task v0.148, while the ssh2 library itself has since been updated to v0.8.5.
So, to solve this issue, please regenerate the key pair with ssh-keygen -t rsa -m PEM, which forces ssh-keygen to export in PEM format. The key will then work with the copy file task.
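A sketch of regenerating and installing such a key (the file name and host are placeholders):
# Generate an RSA key pair in PEM format, which the bundled ssh2 version can parse
ssh-keygen -t rsa -b 4096 -m PEM -f ~/.ssh/azure_pipelines_rsa
# Install the public key on the remote host
ssh-copy-id -i ~/.ssh/azure_pipelines_rsa.pub user@remote-host
Then paste the private key (~/.ssh/azure_pipelines_rsa) into the SSH service connection.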
It's now clear that the Azure task is using an old version of ssh2 in which Ed25519 keys are not supported, which results in this issue, so I'll just have to use RSA for now.
I am trying to use VSCode - Insiders to run code in a docker container on a remote AWS machine using the Remote - SSH plugin. I have opened a terminal and set up port forwarding like so: ssh -L 2201:localhost:2222 user@host -N -i ~/.ssh/id_rsa. Then in VSCode I try to connect to root@localhost and it starts up, but then gives me an error message:
> Found existing installation...
> Found running server...
>
> bash: no job control in this shell
"install" terminal command done
Received install output: bash: no job control in this shell
Failed to parse remote port from server output: bash: no job control in this shell
I started doing this process a couple of days ago and it worked. Yesterday it was in and out a bit, and today it's not working at all. I've tried turning it off and on again, but can't get it to work. In case it's relevant, I am on macOS Mojave.
Edit:
Magically, it worked today (the following day) on the first try. I would still be interested in knowing how to fix this the next time it breaks. In case this helps, here's the output from when it is working:
SSH Resolver called for "ssh-remote+7b22686..."
SSH Resolver called for host: root#localhost
Setting up SSH remote "localhost"
Using commit id "473af338..." and quality "insider" for server
Using SSH config file "/Users/user/config"
Install and start server if needed
> Found existing installation...
> Found running server...
>
> bash: no job control in this shell
> 368805d0-03...==38466==
"install" terminal command done
Received install output: 368805d0-03...==38466==
Server is listening on port 38466
Using SSH config file "/Users/user/config"
Spawning tunnel with: ssh -F /Users/user/config root#localhost -N -L localhost:39003:localhost:38466
Spawned SSH tunnel between local port 39003 and remote port 38466
Waiting for ssh tunnel to be ready
Tunneling remote port 38466 to local port 39003
Resolving "ssh-remote+7b22686f737..." to "localhost:39003", attempt: 1
Edit 2: And now (the day after that) it's not working again.
Edit 3: I have a config file at ~/config. Here are the contents:
Host *
    User root
    Port 2201
    IdentityFile ~/id_rsa
In the specific setup shown above, you have User root in your config and are logging in with root@localhost, so you have your username twice. Leave the config file as is and just enter localhost in VSCode. This still doesn't solve the instability issue, but it does fix one problem.
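In other words, with User, Port, and IdentityFile already supplied by the config file, the connection reduces to this (using the config path from the log above):
# Username, port and key all come from the config file
ssh -F /Users/user/config localhost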
I had the same issue when configuring my server. It was solved by this issue. After saving your config file for the remote server, change the remote shell path as described in the issue, and then connect; you will get in.
https://github.com/microsoft/vscode-remote-release/issues/220#issuecomment-490374437
Excuse my DevOps naiveté, but I assume all you need to deploy to a machine is a proper SSH key, a port to expose, the machine's IP address, a login, and the code to deploy.
So are there any simple solutions that deploy code to a remote server with the only inputs being an SSH key, a Dockerfile, and the code itself? I'm thinking it could be set up in a deterministic (almost functional) manner, where the inputs are the server IP address and a login, and the output is a running server.
I've tried setting up Dokku on DigitalOcean (https://www.digitalocean.com/community/tutorials/how-to-use-the-digitalocean-dokku-application), and that requires a DNS record and git. I don't need those as dependencies.
Thanks
If I understand your question correctly, you don't need anything more than scp, ssh, and a couple of shell scripts.
Let's say you want to deploy your code from serverA to serverB.
On serverB, create a directory with your Dockerfile. Also create a shell script, let's call it build_image.sh, that runs your docker build command using sudo.
Also, on serverB, create a shell script that builds your code from source (if necessary).
Finally, on serverB, create a shell script that calls your code build script and your docker build script, and at the end runs your new docker image. Let's call this script do_it_all.sh.
Make sure that you chmod 755 all the shell scripts.
Now, on serverA, you have a directory with the source code. scp that directory to serverB into the directory with the Dockerfile.
Next, from serverA use ssh to call do_it_all.sh on serverB. This will build your code, build your image and deploy a container without the need for extra software, packages, git, DNS records, etc.
You can even automate this process using cron or something else to have nightly deployments, if you wish, or deployments under other conditions.
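For example, a nightly deployment could be a single crontab entry on serverA (a sketch; the schedule and paths are placeholders, and the scripts are defined below):
# Run the full deployment every night at 03:00
0 3 * * * scp -r /path/to/my/code serverB:/path/to/my/dockerfile && ssh serverB '/path/to/my/dockerfile/do_it_all.sh'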
Example scripts/commands:
On serverB:
build_image.sh:
#!/bin/bash
# Build the image from the Dockerfile in the current directory
sudo docker build -t my_image .
build_code.sh (optional, adjust to your code):
#!/bin/bash
cd /path/to/my/code
./configure
make
do_it_all.sh:
#!/bin/bash
cd /path/to/my/dockerfile
sudo docker stop my_container   # stop the old container
sudo docker rm my_container     # remove the old container
sudo docker rmi my_image        # remove the old image
./build_code.sh                 # comment out if not needed
./build_image.sh
sudo docker run -d --name my_container my_image
On serverA:
scp -r /path/to/my/code serverB:/path/to/my/dockerfile
ssh serverB '/path/to/my/dockerfile/do_it_all.sh'
That should be it. Adjust for your system.
To deploy to a brand new system, just write a script on serverA that uses ssh to create the necessary directories on serverB (ssh serverB 'mkdir -p /path/to/my/dockerfile'). Next, copy your Dockerfile, your build scripts, and your code from serverA to serverB using scp. Then run do_it_all.sh on serverB from serverA using ssh.
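Wrapped into one bootstrap script on serverA, that might look like this (a sketch; the paths and the serverB host alias carry over from the examples above):
#!/bin/bash
# One-time bootstrap of a fresh serverB, followed by a first deployment
set -e
ssh serverB 'mkdir -p /path/to/my/dockerfile'
scp Dockerfile build_image.sh build_code.sh do_it_all.sh serverB:/path/to/my/dockerfile/
scp -r /path/to/my/code serverB:/path/to/my/dockerfile/
ssh serverB '/path/to/my/dockerfile/do_it_all.sh'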