Automatically copy files in use

I have some files that are always in use, and I want to make a copy of them automatically on another disk. I have tried Shadow Copy, but it does not put them in a specific path where I can access them and send them in an email automatically.

One way to make a copy of files that are always in use is to use the robocopy command-line utility. robocopy is a robust file-copying tool built into Windows. It keeps retrying files that are locked when it first tries them (a file held under an exclusive lock for the whole run will still fail), and it has a wide range of options that allow you to customize the copy process.
For example, you can use the /mir option to mirror the source folder to the destination folder, the /r:n option to retry the copy operation a specified number of times when errors occur, and the /w:n option to specify the wait time between retries.
You can also use the /log option to create a log file of the copy operation, and the /np option to suppress the display of file-transfer progress.
robocopy C:\SourceFolder D:\DestinationFolder /mir /r:3 /w:3 /log:C:\Robocopy.log /np
You can schedule this command to run at specific intervals using Task Scheduler, so the in-use files are copied regularly without manual intervention.
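For example, here is a minimal PowerShell sketch that registers such a task, assuming the ScheduledTasks module is available (Windows 8 / Server 2012 or later); the task name and the one-hour interval are placeholders:

# Register a task that re-runs the robocopy command from above every hour.
# Older systems may also require -RepetitionDuration on the trigger.
$action = New-ScheduledTaskAction -Execute 'robocopy.exe' -Argument 'C:\SourceFolder D:\DestinationFolder /mir /r:3 /w:3 /log:C:\Robocopy.log /np'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 1)
Register-ScheduledTask -TaskName 'MirrorInUseFiles' -Action $action -Trigger $trigger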
Additionally, you can use any language or automation tool to send the files by email once the copy operation is done.
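For instance, here is a hedged PowerShell sketch using the built-in Send-MailMessage cmdlet; the addresses, SMTP server, and attachment path are all placeholders:

# Email one of the copied files; replace the placeholder values with your own.
Send-MailMessage -From 'backup@example.com' -To 'admin@example.com' -Subject 'Copied files' -Body 'Latest copy attached.' -Attachments 'D:\DestinationFolder\report.txt' -SmtpServer 'smtp.example.com'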

Related

Powershell: Copy-Item -Recurse -Force is not copying all sub files

I have a one-liner that is baked into a larger script for some high-level forensics. It is just a simple Copy-Item command that writes the destination folder and its contents back to my server. The code works great, BUT even with the switches:
-Recurse -Force
it is not returning the file with an extension of .dat. As you can guess, I need the .dat file for analysis. I am running this from a privileged account. My only thought was that it is a read/write conflict and the host was currently using the file (or another system file). What switch am I missing? The mode for the file that will not copy over is -a--- (not hidden, just not copying). Suggestions elsewhere have said to use xcopy/robocopy; if possible I do not want to call another dependency. I'm already using PowerShell for the majority of the script, so I'd prefer to stick with it. Any thoughts? Thanks in advance, this one has been tickling my brain for a while.
The only way to copy a file that is in use is to find the locking handle, close it, and then retry the copy operation (handle.exe can do this).
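For illustration, a sketch using Sysinternals handle.exe (a separate download; the install path here is a placeholder):

# List which process holds ntuser.dat open.
& 'C:\Tools\handle.exe' -nobanner ntuser.dat
# handle.exe -c <handleValue> -p <pid> can then force the handle closed, but
# closing a handle that a process is actively using risks corrupting its data.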
From your question it looks like you are trying to remotely copy user profiles, which include ntuser.dat and other files that are needed to keep the profile working properly. Even if you did manage to find a way to unload the .dat file(s), you would have to consider the impact that would have on the remote system.
Shadow Copy is typically used by backup programs to copy files in use, so your best bet would be to find the latest backup of each remote computer and try to extract the needed files from the backed-up copies, or wait for the users to log off and then try again.

Robocopy option to force copy fail if destination folder does NOT exist

I'm trying to use Robocopy as part of my backup procedures, whereby backup files are copied from one disk to another (for secondary and tertiary backups). I basically have one key external drive for primary backups, and then various backups get siphoned off to other disks, depending on which disk it is. I want to develop a sequence of Robocopy commands in a batch file that I can use in any situation, so that if a Robocopy command tries to copy a folder to a destination drive where there is no matching folder, it automatically FAILS and moves on to the next command. Robocopy by default creates the destination folder if it does not already exist, and I want to switch this default OFF if at all possible.
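As far as I know, Robocopy has no switch that suppresses creation of the destination folder, so one workaround is to test for the folder before each copy. A hedged PowerShell sketch with placeholder paths (the same "if exist" test works in a plain batch file):

# Skip the copy when the destination folder does not already exist.
if (Test-Path 'E:\Backups\Projects') {
    robocopy 'F:\Primary\Projects' 'E:\Backups\Projects' /mir /r:3 /w:3
} else {
    Write-Warning 'Destination missing - skipping this copy.'
}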

robocopy monitor source, save versions

Is it possible to use the robocopy command with the monitor source switch to copy files with a new file name when they change?
I use the command below, but would like to explore leaving it running for several hours and capturing changes. In its current state, the command overwrites changes in the destination folder.
report.txt can change several times (say at 10:00 and 3:00); I would like to have each version saved as report_1000.txt and report_0300.txt.
robocopy \\temp\output c:\users\eric\desktop\robocopy\ report.txt /mon:1 /r:4000
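Robocopy itself always writes to the same destination name, so one hedged alternative is a small PowerShell watcher that copies each change under a timestamped name; the HHmm stamp format and the use of FileSystemWatcher are assumptions, and the paths are the ones from the question:

# Copy report.txt under a new timestamped name each time it changes.
$watcher = New-Object System.IO.FileSystemWatcher '\\temp\output', 'report.txt'
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent $watcher Changed -Action {
    $stamp = Get-Date -Format 'HHmm'
    Copy-Item '\\temp\output\report.txt' "C:\users\eric\desktop\robocopy\report_$stamp.txt"
} | Out-Null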

Execute robocopy powershell continuously between two times established

I have a program that creates temporary files in a specific folder. Then, automatically, after a few seconds, these files are deleted.
I want to copy those temporary files to a specific folder, and I would like to use a PowerShell script to do this:
robocopy startFolder destinationFolder *.TIFF *.JPEG *.jpg *.PNG *.GIF *.BMP *.ICO *.PBM *.PGM *.PPM /s /XO
My problem is that I couldn't use a scheduled task (because it cannot fire at second-level intervals), and installing this PowerShell script as a Windows service is, as far as I know, bad practice. I need this script running all the time, trying to grab the files the moment they are created, before they are deleted.
Could you give me a hand please? Thanks!
Not sure it's quite what you want, but robocopy does have directory-monitoring functionality built in. You could add /mon:1, which should monitor the source directory and re-run the copy when it detects one change (a new or changed file, for example).
However, a downside of this method is that robocopy won't exit; it will run until you kill it.
Edit: I've just noticed you specify in your question title that this should run between two established times, in which case you could add the /rh:hhmm-hhmm option to specify times between which new copies can be started. For example, /rh:1000-1200 should only perform the copies (and hence monitoring) between 10am and midday.
Caveat: I've not tried using the "monitor" option of robocopy, so I'm not sure what sort of delay there would be between a change taking place, and the copy being re-run, but it's worth a shot.
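Putting the two switches together with the command from the question (the 10:00-18:00 window is just an example):

robocopy startFolder destinationFolder *.TIFF *.JPEG *.jpg *.PNG *.GIF *.BMP *.ICO *.PBM *.PGM *.PPM /s /XO /mon:1 /rh:1000-1800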

how to check for activity or lack thereof on a unix file directory using perl or unix commands

Scenario:
I have a process where many files are being copied (scp'd) to a DestinationServer by Host1, Host2, Host3, and Host4, for example, all going to the same common directory: DestinationServer:/home/target. All the files are unique, so no files will be overwritten. Host1-Host4 each have a cron job that launches their scp script to DestinationServer. The caveat is that the hosts are in different time zones and locations, so they finish at different times.
Need:
Since the files are being scp'd to DestinationServer:/home/target, what is the best way to programmatically check when those scp's from the other hosts are done?
Options:
My options are to do this programmatically, either in Perl or shell, if possible.
What should I look for? What Unix commands or Perl modules could I use to help determine when the processes have finished? Any ideas or examples would be great! Thanks.
Use a Maildir-style approach: copy all files to a temporary directory, then, after the transfer is complete, have the originating host perform a rename into the target directory via ssh. That way, when a file appears in the target directory, you know that it is complete.
I suggest this because if you just scp files into the target directory and monitor the directory in whatever way, you cannot distinguish a complete transfer from an interrupted scp command or a network failure.
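A hedged sketch of that idea from one originating host (host names, file names, and the staging directory are placeholders); mv is atomic within a filesystem, so a file only appears in /home/target once it is complete:

#!/bin/sh
# Upload to a staging directory, then atomically move into the target.
scp report_host1.dat DestinationServer:/home/target/tmp/ &&
ssh DestinationServer 'mv /home/target/tmp/report_host1.dat /home/target/'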
For watching the directory from Perl, the SGI::FAM or Sys::Gamin modules (interfaces to the File Alteration Monitor) can report file events as they occur.
A similar but alternative approach to Jouni's is to use semaphore files: before scp-ing its files, the originating host puts up a semaphore file, and when finished, removes it. When no semaphore files remain, you know the transfers are done.
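A hedged sketch of the semaphore idea from one host (the host name, paths, and the .transferring suffix are placeholders); a checker on DestinationServer simply waits until no *.transferring files remain:

#!/bin/sh
# Announce the transfer, copy the files, then clear the semaphore.
ssh DestinationServer 'touch /home/target/host1.transferring'
scp *.dat DestinationServer:/home/target/
ssh DestinationServer 'rm /home/target/host1.transferring'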