robocopy - how to copy from source to destination while the source is being copied to

I have a source and a destination, and I am trying to copy from the source to the destination using robocopy.
The catch is that I want to start robocopy as soon as I start copying files to the source. This means that the source will keep updating while I copy it to the destination.
Any idea how this can be accomplished?

Have a look at FileSystemWatcher: http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.changed(v=vs.110).aspx
Create an app that keeps running in the background and watches the directory for changes; whenever a change is detected, trigger a copy event.
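Alternatively, robocopy itself can monitor the source: the /MON:n switch re-runs the copy whenever at least n changes are seen, and /MOT:m re-runs every m minutes if anything changed. A minimal sketch, with hypothetical paths:
:: keep copying new and changed files whenever the source changes
robocopy C:\source D:\dest /E /MON:1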

Related

How to clean up and create an archiving strategy for a folder on GitHub

I have
A git repository
A folder in this repository
To this folder I upload .SQL files. Each new DDL is a new .SQL file, uploaded to the same folder, because this is the place from which a CI/CD process kicks off to act on the new file. I change the SQL now and then, but have no use for the files after a certain point, as they get executed against the target database via Liquibase.
The Problem
Over time this folder has accumulated close to 5000 .SQL files, and it is growing every day
It's getting cumbersome to navigate and find anything in this folder
The CI/CD build out of this folder is taking a lot of time because it zips the entire folder
I want to
Archive/Remove everything more than 3 months old from the main folder
Move the old files to an Archived location so that I can refer to them
Get the file count down to a manageable level
Possibly do the archiving in an automated way without manual intervention
I do not want to
Delete the files as I have to maintain a history
Change the process, e.g. to having only one SQL file that keeps changing.
"I do not want to delete the files as I have to maintain a history"
And yet, this is the simplest solution, as you can still list the history of a deleted file.
# filter the deleted file to find one:
git log --diff-filter=D --summary | grep pattern_to_search
# Find the log of a deleted file:
git log --all -- FILEPATH
That means your process would simply:
list all files older than a certain date
add their names to a "catalog" file (in order to query them easily later on)
delete them (git rm); their history is still there
For any file present in the "catalog" file, you still can check their log with:
git log -- a/deleted/file
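As a sketch of that process (the ddl folder name, catalog file, and 90-day cutoff are assumptions; note that mtimes in a fresh clone reflect checkout time, not commit time, so run this in a long-lived working copy or derive file ages from git log instead):
# list .SQL files untouched for roughly 3 months
find ddl -name '*.sql' -mtime +90 > /tmp/to_archive.txt
# record their names in the catalog for later lookup
cat /tmp/to_archive.txt >> catalog.txt
# remove them from the tree; their history stays in git
xargs -a /tmp/to_archive.txt git rm --
git add catalog.txt
git commit -m "Archive SQL files older than 90 days"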

Force RoboCopy to copy files even if they have not changed

What is the proper syntax to force RoboCopy to copy files even if they have not changed (i.e., replace the file in the destination with the source file even when neither the source nor the destination file has changed)?
The file types I'm working with are .XMP (Adobe Camera Raw files) and .PSD (Photoshop document files).
(I use RoboCopy to automate the process of copying these files from my photo editing drive to a backup drive and want to ensure that RoboCopy always copies all changes made to these files.)
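RoboCopy's /IS (Include Same files) and /IT (Include Tweaked files) switches force it to copy files that its default timestamp/size checks would otherwise skip. A minimal sketch, with hypothetical drive paths:
:: copy .xmp and .psd files even when source and destination appear identical
robocopy E:\photos F:\backup *.xmp *.psd /S /IS /IT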

Perforce - Recover deleted file

I opened a file (p4 edit) and made a few changes to it. I then deleted (rm -rf) the directory which contained this file, followed by a p4 sync -f to bring back the depot files (in hopes of getting rid of a lot of untracked/generated files in the directory).
However, it only helped partially. While I was able to get rid of the undesired files, the sync step could not bring back the edited file.
I can see that the file is in opened state with p4 opened, but I can't seem to find a way to bring back this opened file along with my changes.
Any clues?
Edited files are not stored on the server; they are only stored locally. Since you removed the modified file with rm -rf, you cannot get it back (unless the file was backed up by another process, such as a NetApp .snapshot directory).
The server keeps track of the state of files but the changes are not stored until you submit.
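What you can still recover is the last submitted revision: p4 revert restores the client file from the depot and clears the opened state. A minimal sketch, with a hypothetical depot path:
# discard the (already lost) edit state and restore the depot copy
p4 revert //depot/project/file.c
# or force-sync the head revision after reverting
p4 sync -f //depot/project/file.c#head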

Automatically copying new files to another folder (CentOS 6.3)?

Is there a command I can run via SSH (CentOS 6.3) that will monitor a certain directory and, if any new files/folders are created in it, copy those files to another folder?
I have looked at various sync programs, but rather than mirroring the folder, I need to keep a copy of any new files/folders even if they are later deleted from the original directory.
I am hoping the cp command can be used somehow, but I couldn't work out how to do it myself.
Thanks for any help, and please let me know if you need further information or if there is a better way to achieve this.
Rsync would do it; it doesn't have to delete files from the destination when they are deleted from the source.
Do you need it to run periodically or to monitor constantly for changes? If the latter, you might want to look into something using inotify or FAM, as sketched below.
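For constant monitoring, here is a minimal sketch using inotifywait from the inotify-tools package (available for CentOS 6 via EPEL; the paths are hypothetical). rsync without --delete keeps copies in the destination even after the originals are removed from the source:
#!/bin/sh
# block until something is created in (or moved into) the watched tree,
# then copy the new content across; never delete from the destination
while inotifywait -r -e create -e moved_to /data/watched; do
    rsync -a /data/watched/ /data/copies/
done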

solving msdeploy recursive skip on sync conflict

I'm trying to use an msdeploy sync command, skipping a given folder name that occurs throughout the destination directory.
Command I'm trying is:
msdeploy.exe -verb:sync -source:dirPath="C:\SomeFullSourcePath"
-dest:dirPath="C:\SomeFullDestPath"
-skip:objectName=dirPath,absolutePath=.*\\FolderToIgnoreAtAnyLevel
It works fine, except when trying to delete a folder; it goes like this:
folders are synced and up to date
we want to delete a folder, so it is deleted from the source
like pretty much any folder, in the destination it contains one of the skipped folders
sync is started, but it fails. I've reproduced it down to this particular scenario, so I'm assuming the skip rule on these child folders prevents the parent from being deleted (the error is "the directory is not empty")
Is there any simple way to workaround this issue?
Constraint: it must only delete the ignored folders if the parent folder is being deleted. The ignored folders are not present at the source.