Is there a command I can run over SSH (CentOS 6.3) that will monitor a certain directory and, if any new files/folders are created in it, copy those files to another folder?
I have looked at various sync programs, but rather than mirroring the folder I need to keep a copy of any new files/folders even after they are deleted from the original directory.
I am hoping the cp command can be used somehow, but I couldn't work out how to do it myself.
Thanks for any help, and please let me know if you need further information to help or there is a better way to achieve my needs.
Rsync would do it; it doesn't delete files from the destination when they are deleted from the source unless you explicitly tell it to.
Do you need it to run periodically or monitor constantly for changes? If the latter you might want to look into something using inotify or FAM.
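A rough sketch of both approaches (the directories are placeholders, and the inotifywait loop assumes the inotify-tools package is installed, which is available from EPEL on CentOS 6):

# periodic: copy anything new into the archive, never delete from it
rsync -av /data/incoming/ /data/archive/

# constant: react to files being created or moved into the watched directory
inotifywait -m -r -e create -e moved_to --format '%w%f' /data/incoming |
while read f; do
    cp -rn "$f" /data/archive/    # -n avoids overwriting files already archived
done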
I have used rsync to move files across previously; however, I want to know the best solution for moving files from a directory that has new files added to it regularly, and keeping it in sync on another remote server. What is the best approach to do this?
You could use cron to update your folder regularly, for example as in this case: Using crontab to execute script every minute and another every 24 hours
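For instance, a crontab entry along these lines would re-run the copy every minute (the directories and remote host are placeholders):

# edit your crontab with: crontab -e
# every minute, push new files to the mirror without ever deleting anything there
* * * * * rsync -av /data/outgoing/ user@remote:/data/mirror/ >> /var/log/mirror.log 2>&1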
I've been working on a project that's fairly far along now, and I decided it's time to use some sort of version control. I decided to go with GitHub. Before I get in too deep, let me state explicitly that I am new to GitHub.
My project resides in a directory that contains myriad subdirectories and files of all different kinds. I'd like to take my project directory as is (structure and all) and put it in my github repo.
I followed the tutorials on GitHub's webpage, created the repo, and manually added some files. Obviously I don't want to manually add every file (there are several hundred). I'd like to know how I can add the root directory, or for that matter any parent directory, and all files/folders in said directory. In other words, I'm looking for a recursive add.
I read on this SO page (How to create folder in github repository?) that you can just use
git add directory/
That works fine for me when I'm dealing with the lowest level directory, but when I try the same command on a directory with subdirectories my terminal just sits there and I have to ctrl-c. I can't tell if it's just taking a long time (as I mentioned there are lots of files) or if this is just the wrong way to add a directory with subdirectories.
Apologies in advance if this is a super ignorant question -- I have looked at a lot of blogs/posts/etc and I cannot find a solution that seems to work.
Use the Current Working Directory
Assuming you're on Linux or OS X, from the command line you would do the following:
git add .
from the root of your repository tree. That will add all non-ignored files, including everything inside subdirectories, to the repository (Git tracks directories implicitly through the files they contain).
From the root directory (the one with all the subdirectories), use git add -A.
If you have a ton of subdirectories and files, it may take a long while, so just let it sit there until it's done.
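A typical sequence from the repository root might look like this (the commit message and branch name are just examples; older Git versions default to master):

git add -A          # stage all new, modified and deleted files recursively
git status          # review what is about to be committed
git commit -m "Import existing project tree"
git push origin master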
I'm working a lot with virtual machines and am looking for an efficient way to easily manipulate the files on the VM while keeping them in sync with my local filesystem, from where I commit them to the VCS.
I tried the Remote Systems Explorer for Eclipse. This gives me easy access to the files on the remote system, but has no synchronize option, so I can work directly on the remote files but need to sync them back to my local directory to commit them.
Basically I need some kind of rsync (on a Windows machine, though) so that I only need to manipulate my local files and sync them to the VM, or vice versa.
Can anybody help with that issue?
Looking at your reputation, I guess you can probably create some script to synchronize the local files on the command line outside Eclipse. If so, then you can have this script be invoked automatically from Eclipse as part of your normal (automated) project build. To do so, you only have to add a new builder to the project, which invokes that script.
This tutorial shows how to invoke an ant script during each build. You can restrict the invoked builder to changes of specific files and working sets, if you don't want to trigger that script on each file change.
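As a rough idea of what such a sync script could contain, assuming an rsync build is available on the Windows host (for example through Cygwin) and that the host name and paths are placeholders for your own setup:

# push the local working copy to the VM; --delete keeps the VM an exact mirror
rsync -avz --delete /cygdrive/c/work/myproject/ user@my-vm:/home/user/myproject/

# or pull changes made on the VM back into the local tree before committing
rsync -avz user@my-vm:/home/user/myproject/ /cygdrive/c/work/myproject/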
I'm a VSS (Visual SourceSafe) and Dropbox guy, but new to GitHub. I'm using the GitHub for Windows tool to manage repositories on our remote server, as I concluded in my previous SO post. I was glad to have found this single, easy-to-use tool without needing any deeper knowledge of Git.
Things had been working fine until one day I had to add a new folder to my repository. The GitHub for Windows tool wouldn't recognize the folder as new content to be pushed! After some struggle I worked out that it does keep my "initial folders" in sync, but simply creating a new folder in the repository directory wouldn't sync it the way Dropbox does!
I searched for how to do it, or whether I had to use Git Shell. I tried that too, but failed. Finally, I decided to purge everything and re-create the repository folder structure with this new folder, like I did with my initial setup. But for some reason it kept saying that the /.git/index file was being used by another process. I tried to empty the folder but it wouldn't let me. In the end a logoff freed that file for me and I re-created everything. Phew!
I might be doing it wrong as a newbie, or even misusing Git because of my Dropbox habits. Please correct me! What would be the best way?
My usage is more like VSS and Dropbox (with version control) in a small, remotely connected team. I started with this simple Windows GitHub tutorial. What about the following two -
TortoiseGit
msysgit
Do they provide better management? Please suggest whether GitHub for Windows is the best option (and if so, how do I add folders later?).
Just in case, do note that adding a folder won't trigger anything for Git: you won't be able to push it if that folder is empty, because Git will consider it as "no content", and will ignore that new folder.
See also "How do I add an empty directory to a git repository?".
If you add a folder and some files in it, then the GitHub for Windows interface will detect that new content, and ask you to add and commit, which means you will be able to push.
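If you want the folder to show up before it has any real content, the usual workaround (a convention, not a Git feature) is to drop a placeholder file into it, for example from Git Shell (the folder name here is just an example):

mkdir reports
touch reports/.gitkeep      # any file name works; .gitkeep is only a convention
git add reports/.gitkeep
git commit -m "Add reports folder"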
I'm trying to mirror files from an FTP server.
Those files can be very large so downloads might be interrupted.
I'd like to keep the original files while downloading partial files to a temporary folder, and once a download completes, overwrite the older local version.
Can I do this? How?
Is there another easy to use (command line) tool that I can use?
First, download the files to a temp directory. Use -c so you can resume.
After the download has completed, use copy, rename or rsync to copy the files to the final place.
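With wget that could look roughly like this (the URL and directories are placeholders; -P sets the download directory and -c resumes a partial file):

wget -c -P /data/tmp ftp://ftp.example.com/pub/bigfile.iso
# only replace the live copy once the download has finished
mv -f /data/tmp/bigfile.iso /data/mirror/bigfile.iso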
Note: Consider using rsync for the whole process, because it was designed for exactly this use case and it will cause much less strain on the server and the Internet. Most site admins are happy if you ask them for rsync access for just this reason.
Looking at the wget manual, I can't see this functionality; however, you could write a bash script to do what you want, which would essentially run an individual wget for each file and then move it using the normal mv command.
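A sketch of such a script, assuming you have a list of the remote file names (everything here is a placeholder):

#!/bin/bash
# fetch each file into a temp dir, then move it into place only once it is complete
while read name; do
    wget -c -P /data/tmp "ftp://ftp.example.com/pub/$name" && \
        mv -f "/data/tmp/$name" "/data/mirror/$name"
done < filelist.txt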
Alternatively, have a look at rsync; according to the manual there is a parameter that sets a temp dir:
-T --temp-dir=DIR create temporary files in directory DIR
I am not 100% sure whether this is where it puts the files during downloads, as I have not had a chance to try it out.
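If you do get rsync access, a sketch combining --partial (keep interrupted transfers so they can be resumed) with that temp dir might look like this (host and paths are placeholders, and whether the partial data really stays in the temp dir is worth testing, as noted above):

rsync -avz --partial --temp-dir=/data/tmp user@ftp.example.com:/pub/ /data/mirror/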