What is the best practice to move my dockerfile and related files to the server? (private files) - deployment

I have a Dockerfile, a .sh file, an nginx config file, and the private keys. On a clean server, how do I get those files onto the server for the first time (before building the Docker image)?
Should I FTP them over?
Should I git pull my project? // but I still need the keys, or I could use a password
What do you do?
I'm not using DigitalOcean, and I would rather not pay for a private Docker repo like https://registry.hub.docker.com/plans/

Are you using a physical server or a VPS?
If you can SSH to your server, there are many ways to add files:
1. The easiest way is SFTP (you can use an SFTP client or the sftp command-line tool). It only needs your SSH login, and you can upload the files to your home directory.
2. Another way is scp, with a command like:
scp YOUR_FILE username@ipaddress_or_hostname:/home/username/
This also only needs your SSH login.
git or FTP is not a good way to push files to a server for the first time.
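If you have several files to upload, an sftp batch session handles them in one go. A minimal sketch, assuming placeholder names; the hostname, user, and file names below are examples, not from the question:

```shell
# Non-interactive sftp upload of the deployment files in one session.
# "deploy@example.com" and the file names are placeholders -- substitute your own.
sftp deploy@example.com <<'EOF'
put Dockerfile
put deploy.sh
put nginx.conf
EOF
```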

Most git remotes (GitHub, GitLab, etc.) support an HTTPS access mechanism; the data is encrypted in transit, much like sftp and ssh, and you'll get a password challenge instead of needing keys:
git remote add myHttpsRemote https://my/foo/bar/project.git
git pull myHttpsRemote [branch]


Ansible playbook: pipe local command output (e.g. git archive) to server?

My project has a special infrastructure: the server only has an SSH connection, so I have to upload my project code to the server via SSH/SFTP every time, manually. The server cannot fetch.
Basically I need something like git archive master | ssh user@host 'tar -zxvf -' done automatically by a playbook.
I looked at the docs; local_action seems to work, but it requires a local SSH setup. Are there other ways around this?
How about something like this. You may have to tweak to suit your needs.
tasks:
  - local_action: shell git archive --format=tar.gz --output=/tmp/master.tar.gz master
  - unarchive: src=/tmp/master.tar.gz dest={{ dir_to_untar }}
The unarchive module copies the archive from your control machine to the remote host and unpacks it there. I still do not understand why it would require a local SSH setup, as mentioned in your question.
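You can sanity-check the archive step locally before wiring it into the playbook. Run this inside any git repository with a master branch; the output path is just a scratch file:

```shell
# Build the same archive the playbook's first task would produce,
# then list its contents. /tmp/master.tar.gz is a scratch path.
git archive --format=tar.gz --output=/tmp/master.tar.gz master
tar -tzf /tmp/master.tar.gz
```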

Private Github Repositories with Envoy

Anybody has any problems deploying with Laravel's envoy when using private Github repos?
When manually cloning my repo from the production server, the SSH key seems to be accessible, but when using Envoy, I always get a "Permission denied (publickey)" error.
Thanks
It is probably because the ssh key on your remote server requires a password.
If you change the Envoy.blade.php to perform some other task you should be able to establish whether you are connecting to your remote correctly.
@servers(['web' => 'user@domain.com'])
@task('deploy')
cd /path/to/site
git status
@endtask
Should return something like:
[user@domain.com]: On branch master
Your branch is up-to-date with 'origin/master'.
nothing to commit, working directory clean
If you are connecting using a Mac or Linux you probably don't have to enter your password because your terminal is using ssh-agent which silently handles your authentication.
Wikipedia article on ssh-agent
When connecting over SSH, ssh-agent isn't running, so the script is prompted for a password, which is where it fails.
To get around this you could generate a new key on the remote machine that doesn't use a password.
If you want to restrict the SSH key to a single repository on GitHub, have a look at deploy keys.
You need to pass -A in your ssh string (per the man page: it enables forwarding of the authentication agent connection; this can also be specified on a per-host basis in a configuration file).
You will also need to add your SSH key to the agent for forwarding (on the machine that can access the git remote, which I assume is your localhost):
ssh-add -K ~/.ssh/your_private_key
Something like this
@servers(['web' => '-A user@domain.com'])
@task('deploy')
cd /path/to/site
git status
@endtask
Git remote commands should now work.
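If you would rather not embed -A in the server string, the same behavior can be set per-host in your local SSH configuration. A sketch, with the hostname as a placeholder:

```
# ~/.ssh/config on your local machine -- same effect as passing -A
# on every connection. "domain.com" is a placeholder for your server.
Host domain.com
    ForwardAgent yes
```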

SSH versus FTP: Push local files to a server via Terminal

I am a junior front-end developer and I am working on my process using command line. I would like to push my local changes to my server without having to use an FTP client like Filezilla. I am manually dragging and dropping files using the client and would like to learn how developers perform this process. I am building a static site using SiteLeaf on a Mac. Thanks in advance for help with this workflow.
If your target has SSH installed you can use scp to push files:
$ scp -r /local/dir/to/transfer your_remote_user@remote_address:/path/to/save/dir
This can also be used to transfer single files: just remove the -r (recursive) option and specify file paths instead of directories.

How do I specify the key file that capistrano will use when cloning the repository on the remote server?

Ideally, I want something like set :scm_keyfile, "~/.ssh/server-deploy-key". The path specified would of course be a path on the remote server.
If the remote user already has a ~/.ssh/id_rsa or ~/.ssh/id_dsa then git will use it by default.
If you wish to use an alternate file name for your private key, create a file ~/.ssh/config on your remote server and put these lines in it:
Host github.com
User git
IdentityFile ~/.ssh/server-deploy-key
Now when you run a command like git clone git@github.com:xxx/yyy.git, your ~/.ssh/server-deploy-key will be used.
Another method is to use ssh-agent forwarding. With this method, you don't need to put your deploy key on the remote server at all: as long as it is on your local machine and you have enabled agent forwarding, your remote server will have access to the key and will use it. There is a nice article on GitHub explaining this.
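As a sketch of the agent-forwarding approach, run from your local machine; the user, host, and key path are examples, not values from the question:

```shell
# Load the deploy key into your local agent, then connect with agent
# forwarding (-A) so git commands on the server authenticate to GitHub
# with your local key.
ssh-add ~/.ssh/server-deploy-key                  # example key path
ssh -A deploy@yourserver.example 'ssh -T git@github.com'
```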

Download file while in ssh mode?

I usually navigate my remote servers with SSH. Sometimes I would like to download a file to open on my own computer.
But the only way I know how to do it is to open a new terminal window and use scp from local to remote.
Is there a way to do this directly from the SSH session?
Like a command that knows my current IP and can set everything up automatically?
(It would also be wonderful to upload files in the same way...)
There is no easy way to do it; I used ssh and scp for many years the way you just described. But you can configure ssh and scp so that they don't require a password each time, which is very comfortable. For this, you need to:
generate keys with ssh-keygen (they can also be passphrase-protected)
copy the public key to the remote machine's ~/.ssh/authorized_keys
Then, each time you start a session, run ssh-agent and ssh-add. You enter the passphrase once, and after that you can run scp/ssh many times, from scripts, etc., without entering a password each time.
I don't remember the exact way to configure all of this, but have a look at the man pages of those tools. Many things can be automated by putting them in your ~/.bash_profile or ~/.bashrc.
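For reference, a sketch of that setup, assuming an ed25519 key and a placeholder user@remote:

```shell
# One-time setup: generate a key pair and install the public half remotely.
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519          # optionally passphrase-protected
ssh-copy-id -i ~/.ssh/id_ed25519.pub user@remote    # appends to remote ~/.ssh/authorized_keys

# Per-session: unlock the key once, then ssh/scp need no password.
eval "$(ssh-agent)"
ssh-add ~/.ssh/id_ed25519
scp file.txt user@remote:~/
```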
I found this while trying to answer your question for myself:
https://askubuntu.com/a/13586/137980
Just install zssh and use Ctrl-@ to go into file transfer mode.