I want a folder on my Chromebook that stays in sync with the server: "cloned" folders on both the Chromebook and the server, so that when files are changed on either side, rsync updates both directories. I'd appreciate some pointers on how to use the rsync command to accomplish this.
Maybe this would work:
From Chromebook to server:
rsync -av /Volumes/NewHDD1_DATA/Projects_archive_Small bernardo@128.139.17.11:/home/bernardo/Projects_archive_Small
From server to Chromebook:
rsync -av /home/bernardo/Projects_archive_Small/Projects_archive_Small bernardo@dhcp17-207.agri.huji.ac.il:/Volumes/NewHDD1_DATA/
I also want deletions of old files to carry over, whether they happen on the local or the remote side. Perhaps there is a better choice than rsync.
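For the deletion side, rsync's --delete flag removes files on the destination that no longer exist on the source; it is a one-way mirror rather than a true two-way sync. A hedged example, reusing the paths above (the trailing slash on the source copies the directory's contents instead of nesting another Projects_archive_Small inside the target):
rsync -av --delete /Volumes/NewHDD1_DATA/Projects_archive_Small/ bernardo@128.139.17.11:/home/bernardo/Projects_archive_Small/
For genuine bidirectional synchronization, a dedicated tool such as unison may be a better fit than two opposing rsync runs.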
I have Proxmox VE, and I want to install Git Enterprise on it via a qcow2 file. I have downloaded the Git Enterprise qcow2 to my local machine, but I don't know how to upload it to Proxmox, because there is no option in Proxmox to upload anything other than an ISO image.
How should I upload a qcow2 file from my local system to Proxmox?
The easiest way would be to scp the .qcow2 file to your root user's home directory. After that you have to import it (<storagepool> is whatever storage you use in your setup), then rescan your storage. So something like the following:
scp git.qcow2 root@proxmox1:/root/
ssh root@proxmox1
qm importdisk 9000 git.qcow2 <storagepool>
qm rescan
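After the import the disk normally shows up as an unused disk on VM 9000; it still has to be attached to the VM before it can be used. A hedged sketch, assuming the imported volume ended up named vm-9000-disk-0 (the exact volume name and bus depend on your storage type and VM configuration):
qm set 9000 --scsi0 <storagepool>:vm-9000-disk-0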
I have a Dockerfile, a .sh file, an nginx config file, and the private keys. But on a clean server, how do I add those files for the first time (before building the Docker image)?
Should I FTP and put those files there?
Should I git pull my project? (But then I still need the keys, or I could use a password.)
What do you do?
I'm not using DigitalOcean, and I would not like to have a paid private Docker repo like https://registry.hub.docker.com/plans/.
Do you use a physical server or a VPS?
If you can SSH to your server, there are many ways to add files.
1. The easiest way is to use sftp (you can find an SFTP client to do this or use the command-line sftp tool); it only needs your SSH login permission, and you can upload the files to your user's home directory (see the short sftp sketch after this list).
2. The other way is to use scp, with a command like:
scp YOUR_FILE username@ipaddressORhostname:/home/username/
This also only needs your SSH login permission.
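For option 1, a minimal interactive sftp session could look like this, using the same placeholders as the scp example above:
sftp username@ipaddressORhostname
sftp> put YOUR_FILE /home/username/
sftp> exit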
Git or FTP is not a good way to push files to the server for the first time.
Most Git remote repos (GitHub, GitLab, etc.) support an HTTPS access mechanism; the data is encrypted in transit much like SFTP and SSH. You'll get a password challenge. No keys...
git remote add myHttpsRemote https://my/foo/bar/project.git
git pull myHttpsRemote [branch]
I am a junior front-end developer and I am working on improving my process using the command line. I would like to push my local changes to my server without having to use an FTP client like FileZilla. I am currently dragging and dropping files manually in the client and would like to learn how developers perform this process. I am building a static site using SiteLeaf on a Mac. Thanks in advance for help with this workflow.
If your target has SSH installed you can use SCP to push your files to it:
$ scp -r /local/dir/to/transfer your_remote_user@remote_address:/path/to/save/dir
This can also be used to transfer single files: just remove the -r (recursive) option and specify file paths instead of directories.
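If you redeploy often, rsync over SSH is a common alternative because it only transfers changed files. A hedged sketch, assuming a hypothetical local build folder ./_site and a hypothetical web root /var/www/mysite on the server:
$ rsync -avz ./_site/ your_remote_user@remote_address:/var/www/mysite/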
NetBeans is great and I use it with FTP remote connections all the time. However, one of my clients currently only has an SSH connection. Is there any way to connect to it and upload/download files?
As was mentioned, SFTP is supported in NetBeans by default.
So select "remote connection" in your project's run configuration and use your SSH connection information (host, login, and password). You don't have to provide any private key file.
I've had luck using sshfs (SSH file system) on Ubuntu. I create a mount folder in my home folder and run the following:
$ sshfs domain\\user@server:/path/to/remote/folder ~/mount/local-mount-point
From there I start a new (or existing) project in NetBeans at that local folder, ~/mount/local-mount-point.
For a nicer setup, do a key exchange between your local box and the server (ssh-copy-id) for password-less SSH connections. Then put the above command line in your .bashrc file.
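For example, with the same placeholder names as above:
$ ssh-copy-id domain\\user@server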
I do the same as Richard.
In general it is easier just to mount the remote filesystem and use NetBeans in the mounted directory.
I just do the following:
sudo sshfs -o allow_other root@www.khosmos.com:/var/www/html /mnt/droplet/
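To detach the share again, unmount it with fusermount (or plain umount); for example:
sudo fusermount -u /mnt/droplet/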
I am using EclipsePHP on Ubuntu 10.10 and am trying to use Mercurial (hg) to work with a repository located on my network-connected staging server (a Samba share).
When trying to refresh the repository from within Eclipse (really hg status), I get the following error thrown in my face: abort: Operation not permitted: /media/sharename/myrepository/.hg/.dirstate*.
While trying to find out what's wrong, I went to the network share from a terminal and ran hg status; the same error occurs, so it's not only occurring from within Eclipse. I tried to chmod the files from both my computer and the server (chmod 777 /media/sharename/myrepository/ -R), but nothing changes.
But when I accidentally ran sudo hg status from the repo directory, Mercurial started the fireworks and worked like a charm.
What on earth is going wrong with my computer? Why can't I run my hg commands without being root?
You can mount your network drive like this. Open /etc/fstab and add the following line:
//IP_OR_HOSTNAME/DIRECTORY_NAME /MOUNT_DIRECTORY cifs user=sambauser,pass=sambapassword,auto,exec,umask=002,gid=1000,uid=1000,file_mode=0777,dir_mode=0777 0 0
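After saving /etc/fstab you can mount the new entry without rebooting (this assumes the cifs-utils package is installed):
sudo mount -a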
Hope it works.
chmod will not help you here, I guess. The ownership and permissions on the server side are not replicated to the client (no Unix extensions on the server), or your UID/GID differ between the two machines. You can override file ownership when mounting via:
mount -t cifs //SERVER/SHARE /MOUNTPOINT -o uid=USERNAME
This is from memory though, check man mount.cifs for details. Also, alternative networked filesystems like NFS might serve you better in this case, or try sshfs.