Alternatives to XCopy for copying lots of files? [closed]

The situation: I have a pieceofcrapuous laptop. One of the things that make it pieceofcrapuous is that the battery is dead, and the power cable pulls out of the back with little effort.
I recently received a non-pieceofcrapuous laptop, and I am in the process of copying everything from old to new. I'm trying to xcopy c:\*.* from the old machine to an external hard drive, but because the cord pulls out so frequently, the xcopy is interrupted fairly often.
What I need is a switch in XCopy that will copy everything except for files that already exist in the destination folder -- the exact opposite of the behavior of the /U switch.
Does anyone know of a way to do this?

I find RoboCopy is a good alternative to xcopy. It handles high-latency connections much better and can resume an interrupted copy.
References
Wikipedia - robocopy
Downloads
Edit: Robocopy was introduced as a standard feature of Windows Vista and Windows Server 2008.
Robocopy also shipped as part of the Windows Server 2003 Resource Kit and can be downloaded from the Microsoft download site.
A very simple GUI has also been released for RoboCopy on TechNet: http://technet.microsoft.com/en-us/magazine/cc160891.aspx
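As a concrete sketch for the question's scenario (the drive letters below are placeholders, not from the question): robocopy already skips files that exist unchanged in the destination, and /Z enables restartable mode so an interrupted copy can resume where it left off:
robocopy C:\Users E:\Backup\Users /E /Z /R:2 /W:5
/E copies all subdirectories (including empty ones), while /R:2 and /W:5 limit retries to two attempts five seconds apart, so a pulled power cord doesn't stall the run for long.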

/D may be what you are looking for. I find it works quite fast for backing up, as existing files are not copied.
xcopy "O:\*.*" N:\Whatever /C /D /S /H
/C       Continues copying even if errors occur.
/D:m-d-y Copies files changed on or after the specified date.
         If no date is given, copies only those files whose source time
         is newer than the destination time.
/S       Copies directories and subdirectories except empty ones.
/H       Copies hidden and system files also.
More information: http://www.computerhope.com/xcopyhlp.htm

I'm a big fan of TeraCopy.

Beyond Compare 3 is the best utility I've seen for things like this. It makes everything really easy to assess, and really easy to manipulate.

It was not clear if you only wanted a command-line tool, but Microsoft's free SyncToy program is great for maintaining replication between a pair of volumes. It supports pushing changes in either or both directions; that is, it supports several different types of replication modes.

XcopyGUI. A small, standalone GUI front-end for xcopy. Free. http://lorenstuff.weebly.com/

robocopy c:\sourceDirectory d:\destinationDirectory /R:5 /W:3 /Z /XX /TEE
(Note that robocopy takes source and destination directories, not file masks. /R:5 retries each failed copy five times, /W:3 waits three seconds between retries, /Z uses restartable mode, /XX leaves "extra" files already in the destination alone, and /TEE echoes output to the console as well as any log file.)
This will work as your alternative to XCopy... best method imho.
Good luck!

I would suggest using rsync; several ports are available, but cwRsync seems to work nicely on Windows.
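As a rough illustration (paths are placeholders; cwRsync exposes drives under /cygdrive), rsync's --ignore-existing flag gives exactly the "copy everything except files that already exist in the destination" behavior the question asks for:
rsync -av --ignore-existing /cygdrive/c/ /cygdrive/e/backup/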

How about unison?

Related

Comparing & Copying Newer Files

I have a series of E-mail templates stored on a DFS fileshare.
I would like to have a logon script so that when a user logs on, it will cycle through each template in \\LAN\Files\Office Templates\Outlook, compare the LastWriteTime and then copy across any of the newer files from the DFS share to the local folder %APPDATA%\Microsoft\Templates
Currently the folders look like this:
(I am aware at the minute they have the same date, but they won't in the future)
If anyone can help me with this then I would appreciate it very much.
Thanks in advance.
I'd use XCOPY; it's built into most versions of Windows and is purpose-designed for copy operations.
xcopy <Source> <Destination> <Parameters>
It has many options, so it's worth reading the documentation.
Your copy is about the simplest case: with the /D switch (no date given), xcopy copies only those files from Source that are newer than, or missing from, Destination, and /Y suppresses the overwrite prompts so it can run unattended in a logon script:
xcopy "\\LAN\Files\Office Templates\Outlook" "%APPDATA%\Microsoft\Templates" /D /Y
Or the other option is to use Group Policy Preferences, but that's off-topic here and more suited to Server Fault.
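If you'd rather do the LastWriteTime comparison explicitly, a minimal PowerShell sketch of the logic described in the question might look like this (the paths come from the question; treat it as a starting point rather than a tested script):
$source = '\\LAN\Files\Office Templates\Outlook'
$dest   = Join-Path $env:APPDATA 'Microsoft\Templates'
Get-ChildItem -Path $source -File | ForEach-Object {
    $target = Join-Path $dest $_.Name
    # Copy when the file is missing locally or the DFS copy is newer
    if (-not (Test-Path $target) -or ($_.LastWriteTime -gt (Get-Item $target).LastWriteTime)) {
        Copy-Item -Path $_.FullName -Destination $target -Force
    }
}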

Powershell: Copy-Item -Recurse -Force is not copying all sub files

I have a one-liner that is baked into a larger script for some high-level forensics. It is just a simple Copy-Item command that writes the destination folder and its contents back to my server. The code works great, BUT even with the switches:
-Recurse -Force
it is not returning the file with an extension of .dat. As you can guess, I need the .dat file for analysis. I am running this from a privileged account. My only thought was that it is a read/write conflict and the host file was currently in use (or some other system file). What switch am I missing? The "mode" for the file that will not copy over is -a---. Not hidden, just not copying. Suggestions elsewhere have said to use xcopy/robocopy; if possible I do not want to call another dependency. I'm already using PowerShell for the majority of the script, and I'd prefer to stick with it... Any thoughts? Thanks in advance, this one has been tickling my brain for a little while...
The only way to copy a file in use is to find the locking handle, close it, and then retry the copy operation (handle.exe can do this).
From your question it looks like you are trying to remotely copy user profiles, which include ntuser.dat and other files that are needed to keep the profile working properly. Even if you did manage to find a way to unload the .dat file(s), you would have to consider the impact that would have on the remote system.
Shadow copy is typically used by backup programs to copy files in use, so your best bet would be to find the latest backup of each remote computer and try to extract the needed files from the backed-up copies, or perhaps wait for the users to log off and then try.
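If you want to confirm that a lock is what's blocking the copy, a small diagnostic sketch like this (paths are placeholders) will surface exactly which files fail instead of skipping them silently:
Copy-Item -Path 'C:\Users\SomeUser\*' -Destination 'D:\Capture' -Recurse -Force -ErrorAction SilentlyContinue -ErrorVariable copyErrors
# Each error record names the file that could not be copied (e.g. a locked ntuser.dat)
$copyErrors | ForEach-Object { Write-Warning $_.Exception.Message }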

Created copy taking too much free space

I found a problem using robocopy in PowerShell. I used this tool to back up files from one disk (around 220GB) using the command:
robocopy $source $destination /s /mt:8
The problem is that the created copy took up far more space in the destination location (I stopped the backup when it reached around 850GB). Does anyone know why this happened?
Maybe there are some loops involved.
robocopy has the ability to skip NTFS junction points, which can otherwise cause copying failures because of infinite loops.
Try running with the /XJ flag, or simply list/log which files are copied to check for loops.
See the robocopy help (robocopy /?) for details.
Update, for those who face the same problem: there were indeed infinite loops, which I found using WinDirStat. Mostly they were in the Application Data, Documents and Settings, and Users folders.
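For reference, the original command with junction skipping and a log added (the log path is a placeholder) would look like:
robocopy $source $destination /s /mt:8 /XJ /log:C:\temp\backup.log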

Execute robocopy powershell continuously between two times established

I have a program that creates temporary files in a specific folder. Then, automatically, after a few seconds, these files are deleted.
I want to copy those temporary files to a specific folder, and I would like to use a PowerShell script to do it:
robocopy startFolder destinationFolder *.TIFF *.JPEG *.jpg *.PNG *.GIF *.BMP *.ICO *.PBM *.PGM *.PPM /s /XO
My problem is that I can't use a scheduled task (it can't fire at second-level intervals), and installing the PowerShell script as a Windows service is, as far as I know, bad practice. I need this script running all the time, grabbing the files the moment they are created, before they are deleted.
Could you give me a hand please? Thanks!
Not sure it's quite what you want, but robocopy does have directory-monitoring functionality built in. You could add /mon:1, which should monitor the source directory and re-run the copy when it detects at least one change (a new or changed file, for example).
However, a down-side of this perhaps is that using this method, robocopy won't exit - it will run until you kill it.
Edit: I've just noticed you specify in your question title that this should run between two established times, in which case you could add the /rh:hhmm-hhmm option to specify times between which new copies can be started. For example, /rh:1000-1200 should only perform the copies (and hence monitoring) between 10am and midday.
Caveat: I've not tried using the "monitor" option of robocopy, so I'm not sure what sort of delay there would be between a change taking place, and the copy being re-run, but it's worth a shot.
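Putting both suggestions together with the question's own command, a sketch of the full invocation (using the example 10am-to-midday window from above) would be:
robocopy startFolder destinationFolder *.TIFF *.JPEG *.jpg *.PNG *.GIF *.BMP *.ICO *.PBM *.PGM *.PPM /s /XO /mon:1 /rh:1000-1200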

How do I copy from numerous release directories to a single folder

Okay this is and isn't programming related I guess...
I've got a whole bunch of little useful console utilities scattered across a suite of projects that I wrote and I want to dump them all to a single directory to make using them simpler. The only issue is that I have them all compiled in both Debug and Release mode.
Given that I only want the release mode versions in my utilities directory, what switch would allow me to specify that I want all executables from my tree structure but only from within Release folders:
Example:
Projects\
    Project1\
        Bin\
            Debug\
                Project1.exe
            Release\
                Project1.exe
    Project2\
    etc etc...
To
Utilities\
    Project1.exe
    Project2.exe
    Project3.exe
    Project4.exe
    ...
    etc etc...
I figured this would be a cinch with XCopy - but it doesn't seem to allow me to exclude the Debug directories - or rather - only include items in my Release directories.
Any ideas?
You can restrict it to only release executables with the following. However, I do not believe the other requirement of flattening is possible using xcopy alone. To do the restriction:
First create a file such as exclude.txt and put this inside:
\Debug\
Then use the following command:
xcopy /e /EXCLUDE:exclude.txt *.exe C:\target
You can, however, accomplish what you want using xxcopy (free for non-commercial use). Read technical bulletin #16 for an explanation of the flattening features.
If the claim in that technical bulletin is correct, then it confirms that flattening cannot be accomplished with xcopy alone.
The following command will do exactly what you want using xxcopy:
xxcopy /sgfo /X:*\Debug\* .\Projects\*.exe .\Utilities
I recommend reading the technical bulletin, however, as it gives more sophisticated options for the flattening. I chose one of the most basic above.
Sorry, I haven't tried it yet, but shouldn't you be using:
xcopy release\*.exe d:\destination /s
I am currently on my Mac, so I can't really check to be sure.
This might not help you with assembling them all in one place now, but going forward, have you considered adding a post-build event to the projects in Visual Studio? (I'm assuming you are using it, based on the directory names.)
xcopy /Y /I /E "$(TargetDir)$(TargetFileName)" "c:\somedirectory\$(TargetFileName)"
Ok, this is probably not going to work for you since you seem to be on a windows machine.
Here goes anyway, for the logic.
# From the base directory
mkdir Utilities
find . -type f -name '*.exe' | grep -w Release > utils.txt
for f in $(<utils.txt); do cp "$f" Utilities/; done
You can combine the find and cp lines into one, I split them for readability.
To do this on a windows machine you'll need Cygwin or some such Unix Utilities handy.
Maybe there are tools in the Windows shell to do this...
This may help get you started:
C:\>for /D %i in (*) do dir "%~fi\*.exe"
Used as a modifier to i in the dir command, ~f expands each directory found by (*) to its full path. If I run the above in a folder that has several subfolders containing executables, I get a dir listing of the executables in each subfolder.
You should be able to modify that to append '\bin\release' after the "%~fi" portion and change dir to xcopy. A little experimentation should make it pretty easy.
To use the for statement above in a batch file, change '%' to '%%' in both places.
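For completeness, a PowerShell sketch that does both the Release-only filtering and the flattening in one pipeline (paths taken from the question's example tree; untested against it):
# Create the target folder, then copy every Release-built exe into it
New-Item -ItemType Directory -Path .\Utilities -Force | Out-Null
Get-ChildItem -Path .\Projects -Recurse -Filter *.exe |
    Where-Object { $_.FullName -match '\\Release\\' } |
    Copy-Item -Destination .\Utilities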