I have a large directory that I'm syncing from old to new storage with Robocopy. Even when there isn't much data to copy, and few files exist on the destination that aren't on the source, I still see a very large amount of time in the Extras column. For example, on the last sync less than one hour was in the Copied column (around 10K files), but over 30 hours in Extras (even though Extras showed only 23 files and 0 directories)! What could take so much time in Extras? These are the options being used:
/FFT /S /E /DCOPY:AT /COPY:DAT /PURGE /Z /NP /R:1 /W:1
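One way to see what those Extras actually are, before /PURGE touches them, is a listing-only pass with the same options plus /L, sending the output to a log (the paths and log location here are placeholders):

robocopy <source> <dest> /FFT /S /E /DCOPY:AT /COPY:DAT /PURGE /Z /NP /R:1 /W:1 /L /LOG:C:\temp\extras.txt

/L lists what would be copied or deleted without actually doing it, so the extras show up in the log as *EXTRA entries that can be inspected.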
I'm trying to create a robocopy job that runs every Sunday and pulls in the last 7 days of folders (and their sub-folders). This is because the folder stores years' worth of data, so I only want to pull in the new folders.
Is there a way to do this? I've been playing around with the max/min age, but it's pulling in all folders.
What I have so far:
robocopy \\myserver\test\main_folder \\myserver\test\main_folder_archive /MIR
I want to run this every Sunday and have it be smart enough to pull in only the last 7 days.
Any help is appreciated, thanks.
robocopy \\server\source_share \\server\dest_share /MAXAGE:7 /S /E
Should do it. This is not compatible with /MIR though:
/MIR :: Mirror a complete directory tree.
So either mirror, or select 7 days' worth of files.
Edit: added /S, which I think is the switch that was actually desired instead of /MIR.
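To get the every-Sunday part, one option is a scheduled task; a sketch, where the task name and start time are just examples (the paths are the ones from the question):

schtasks /Create /TN "WeeklyArchive" /SC WEEKLY /D SUN /ST 02:00 /TR "robocopy \\myserver\test\main_folder \\myserver\test\main_folder_archive /MAXAGE:7 /S /E"

Because /MAXAGE:7 selects files by age, running the task weekly picks up roughly the files that are new since the previous run.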
I am using omake for my project build.
As part of my delivery build I need to copy some delivery-related files into a folder.
How can I use robocopy so that it copies only the files I'm interested in (which are selected by running a for loop over a list I have) into that folder?
list_paths = filepath1
             filepath2
             ...
             filepathN
for file in list_paths
    %exec robocopy $(file,A) $(destin)
end for
When I do that, robocopy always complains that there is no such file; it appends a \ to the end of each file path, and to the destination folder too.
I understand that robocopy works fine for copying files from one folder to another, but my requirement is to copy only selected files from various folders (for which I have the list of paths).
Please help with this.
If you have a text file with complete pathnames (e.g. list.txt), with this content:
d:\temp\a.sql
d:\temp\b.sql
d:\temp\c.sql
and a batch file roboDo.bat like this:
@echo off
SET dest=d:\temp\temp\
REM usebackq allows the back-quoted command; delims= reads each line whole,
REM so pathnames are not cut off at the first space
FOR /f "usebackq delims=" %%f IN (`type list.txt`) DO (
    REM %%~dpf = drive and folder of the path, %%~nxf = file name and extension
    ECHO copying %%~nxf from %%~dpf to %dest%
    ROBOCOPY /NP /NJH /NJS %%~dpf %dest% %%~nxf >NUL
)
Running the batch file will output this:
D:\TEMP>robodo
copying a.sql from d:\TEMP\ to d:\temp\temp\
copying b.sql from d:\TEMP\ to d:\temp\temp\
copying c.sql from d:\TEMP\ to d:\temp\temp\
I have a drive with several TB of data, most of which doesn't change often. When I run robocopy the spew contains reams and reams of skipped files. I'd like to run /L to see what will be copied and what will be deleted before mirroring to my backup drive, but there is so much noise in the log due to the skipped files that sorting through it is time-consuming.
How do I tell robocopy to log only those files that are copied or deleted?
I have found this switch useful to reduce the amount of unnecessary output:
/ndl Specifies that directory names are not to be logged.
All robocopy Parameters
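Putting it together, a dry run that logs only file-level changes might look like this (drive letters and the log path are placeholders):

robocopy D:\data E:\backup /MIR /L /NDL /NJH /NJS /NP /LOG:C:\temp\changes.txt

/L prevents any copying or deleting, /NDL suppresses the directory lines, /NJH and /NJS drop the job header and summary, and what remains in the log are the files that would be copied (New File, Newer) or deleted (*EXTRA File).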
Here is the simplified version of the situation I'm dealing with:
Folder Files1 containing a.txt, b.txt, c.txt
Folder Files2 containing a.txt, b.txt, c.txt
I want to find the best way to compare these files. All I need to know is which files are different. For example, I need to know that a.txt in Files1 is different from a.txt in Files2 (I don't need to know what is different inside them).
Right now I load both in Notepad++ and use the compare function, but this is a pain.
I tried fc, but I'm getting cryptic output.
You could use Robocopy for that:
robocopy c:\Temp\different\folder1 c:\temp\different\folder2 /e /l /ns /njs /njh /ndl /fp /log:c:\temp\whatIsDifferent.txt
Newer c:\Temp\different\folder1\b.txt
New File c:\Temp\different\folder1\d.txt
The key parameter is /L, which allows you to "compare" instead of actually copying.
From Robocopy /? help
::
:: Logging Options :
::
/L :: List only - don't copy, timestamp or delete any files.
Do a dry-run (/l) with robocopy:
robocopy C:\files1 C:\files2 /njh /njs /ndl /l
/l Specifies that files are to be listed only (and not copied, deleted, or time stamped).
Rsync can be used to do this.
rsync -rvnc --delete delme/ delme2/
This will show you which files differ in the 2 directories.
See http://www.techrepublic.com/blog/linux-and-open-source/how-to-compare-the-content-of-two-or-more-directories-automatically/ for more details.
I am using ROBOCOPY to copy some files from one drive to another on the same computer. Every once in a while I get an error, and instead of retrying 999 times as it should, it retries once and fails. I have a couple of questions:
Why would this error be happening in the first place?
Why isn't ROBOCOPY retrying the 999 times as defined?
Commands are below:
mkdir C:\Users\tempuser\AppData\Local\temp\test1
robocopy /R:999 /W:5 /NP /E /XO /NFL /NDL E:\test1 C:\Users\tempuser\AppData\Local\temp\test1 test*
-------------------------------------------------------------------------------
ROBOCOPY :: Robust File Copy for Windows
-------------------------------------------------------------------------------
Started : Monday, March 25, 2013 4:20:51 AM
Source : E:\test1
Dest : C:\Users\tempuser\AppData\Local\temp\test1
Files : test*
Options : /NDL /NFL /S /E /DCOPY:DA /COPY:DAT /NP /XO /R:999 /W:5
------------------------------------------------------------------------------
2013/03/25 04:20:51 ERROR 32 (0x00000020) Accessing Destination Directory C:\Users\tempuser\AppData\Local\temp\test1
The process cannot access the file because it is being used by another process.
Waiting 5 seconds... Retrying...
------------------------------------------------------------------------------
               Total    Copied   Skipped  Mismatch    FAILED    Extras
    Dirs :         1         0         0         0         1         0
   Files :         0         0         0         0         0         0
   Bytes :         0         0         0         0         0         0
   Times :   0:00:05   0:00:00                       0:00:05   0:00:00
Ended : Monday, March 25, 2013 4:20:56 AM
This is quite probably a bug in robocopy. It has at least one other bug surrounding Error 32 / in-use files: /b (backup) mode will fail with this error even if a file is not exclusively locked (and is copyable with copy, xcopy, Windows Explorer, and robocopy without /b), so I suspect there are bugs in how it handles in-use files in general.
There is no "bug" in ROBOCOPY.
Something is locking your source folders and files from time to time; not always, as proven by the fact that your copies work at times.
I would use a script to copy your source files out into another location (preferably another drive), and then run ROBOCOPY from that new location into your final location. Use the move option to keep the intermediate location free for the next backups, etc.
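A minimal sketch of that staging approach, with all paths assumed: copy out of the contended source first, then move out of the staging area so it stays empty between runs:

robocopy E:\source D:\staging /E /R:2 /W:5
robocopy D:\staging F:\final /E /MOV /R:2 /W:5

/MOV deletes each file from the staging location once it has been copied to the final destination.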
I use this...
robocopy <source path> <target path> <files> /s /j /r:2 /w:5 /log+:robocopy.log
I hope someone finds this helpful. This "fix" has worked for me several times, so good luck.
Simply log out of the server while File Explorer and PowerShell are running, force-close any apps shown on the Windows sign-out page, and log back into the server. Run your robocopy script and the locked process should have been released.
Sorry if it doesn't work for you.