Compare different versions of the same directory (by date modified) - PowerShell

This is a multi-part question. I can fill in details once I get to a working prototype.
Situation: Due to a comedy of errors, I have three copies of a very large directory, and each copy has some new files or versions of files that are unique to it. I would like to combine these, keeping the newest version of every file.
Breakdown of things I don't know: how to recursively compare directories to one another (probably two at a time: 1 vs. 2 = 1+2, then 1+2 vs. 3 = 1+2+3). Crucial to this step: how to use the path/filename of a file in directory 1 to check whether it exists in directory 2, and then, if found, use the date modified to decide whether the copy placed in the new combined directory comes from 1 or from 2.
I think with these 3 pieces of information (recursively compare files between two directories, by path, and by date modified), I can piece together how to script this. While I can look up these bits separately, it's going to be tough to convince myself the process was done correctly, and I'd like a little help with the actual assessment/moving step so I worry less that I've overlooked some small but crucial detail.
Will post the script when I have it put together, along with any caveats about my confidence in it.
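In the meantime, here is a minimal sketch of the compare-and-copy logic described above, assuming placeholder paths (C:\copy1 through C:\copy3 for the copies, C:\merged for the combined directory). It walks every file in each copy and keeps whichever version has the newest date modified:

# Placeholder paths for the three copies and the combined directory.
$sources = 'C:\copy1', 'C:\copy2', 'C:\copy3'
$dest    = 'C:\merged'
foreach ($src in $sources) {
    Get-ChildItem -Path $src -Recurse -File | ForEach-Object {
        # Rebuild the file's path relative to its source root.
        $rel    = $_.FullName.Substring($src.Length).TrimStart('\')
        $target = Join-Path $dest $rel
        # Copy only if the destination is missing or older (by date modified).
        if (-not (Test-Path $target) -or $_.LastWriteTime -gt (Get-Item $target).LastWriteTime) {
            New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
            Copy-Item -Path $_.FullName -Destination $target -Force
        }
    }
}

Processing all three copies against one combined destination in a single pass avoids the pairwise 1+2, then 1+2+3 staging.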

Don't waste time writing a script when robocopy is built for file copying and has enough options to cover pretty much any situation...
By default it will only copy a file if the source and destination have different time stamps or different file sizes.
Adding /XO excludes older files that differ, so you will only end up with the newest version of each file in the destination.
/E includes subfolders, including empty ones; change it to /S to skip empty ones.
robocopy C:\source1 C:\destination /E /XO
robocopy C:\source2 C:\destination /E /XO
[etc]

Related

Using PowerShell to read only the last month of folders, then copying in the last 7 days' worth of folders with robocopy

Apologies to start: I'm new to PowerShell and robocopy.
I have a robocopy command that pulls in any files within the many subfolders that fall within a maxage of 7. However, the main folder has a huge number of folders dating back years (and I only need the last 7 days each week it runs), so it is slow: robocopy reads every file in every folder before it even copies.
It looks like PowerShell commands may be a way to limit the search for my robocopy; would this be possible? Currently robocopy searches every file in every folder of my main folder; ideally I would want it to be smart enough to search only, say, a month's worth of folders and then copy over the last 7 days. This would speed up the run time hugely.
If possible, to go even further: I only want the CSV files in each of the folders in my main folder, but currently robocopy searches the other folders and their files as well, which takes time. All the CSV files are in a folder called "run" in each parent folder (each parent folder is a unique number within the "mainfolder").
My robocopy command:
robocopy \\server\mainfolder \\server\new_main_folder /S /maxage:7 /r:0 /w:0
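For what it's worth, a rough PowerShell wrapper along the lines the question describes might look like the sketch below. The 30-day pre-filter and the assumption that a parent folder's own timestamp moves when new files arrive in it are both guesses; the paths, the "run" subfolder, and the *.csv and maxage:7 filters come from the question:

# Only consider parent folders touched in roughly the last month (assumption).
$cutoff = (Get-Date).AddDays(-30)
Get-ChildItem -Path '\\server\mainfolder' -Directory |
    Where-Object { $_.LastWriteTime -ge $cutoff } |
    ForEach-Object {
        $run = Join-Path $_.FullName 'run'
        if (Test-Path $run) {
            # Copy only the last 7 days of CSV files, as in the original command.
            robocopy $run (Join-Path '\\server\new_main_folder' $_.Name) '*.csv' /maxage:7 /r:0 /w:0
        }
    }

One caveat: a folder's LastWriteTime only updates when its direct children change, so this filter can miss parent folders whose changes happened deeper in the tree.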
I was going to point you to either FastCopy or FreeFileSync; both handle long file-name paths and work well for me. But I found problems running FastCopy when trying to filter folders the way you described. I wasn't getting the results I expected, so that leaves FreeFileSync. There is a bit of a learning curve with FreeFileSync, but really, the only problem/complaint I've had with it is that the XML-based batch script you can use to automate the program kept changing formats, and they haven't provided a way to read the old XML batch scripts with the new version of the software. Maybe that has changed; I haven't looked into it lately.
Maybe other people have had better experiences with RoboCopy, but I found it to take many times longer to do the same job as other copy programs. I don't think FreeFileSync is as fast as FastCopy, but I've never seen it behave as badly as what I experienced with RoboCopy.
The way FreeFileSync works is:
You define 1 or more source/destination pairs.
There is a global setting at the top to set the defaults for all copy pairs.
There are individual settings for each copy pair that, when set, override the global settings.
In the filter tab of the settings you can set "Time span:" to "Last x days:" and set it to the 7 days that you want.
You can change the include filter from * to something like \run\*.csv. I didn't try that exact pattern, but the patterns I did try worked as expected (unlike FastCopy).
The Synchronization tab is the tricky/fun one. You can do logs, versioning, tell the system to shutdown or restart when done, maintain a database for tracking moved files ("Detect moved files" checkbox), and all kinds of adjustments to how it behaves when files don't match.
When you're done, there are, I believe, at least two options for saving the configuration, though I've always just created the XML-based batch script and called it from another scripting language or from an icon on the desktop.

How to create a script that uses a path list as a reference for copying files in PowerShell or a .bat script

I'm looking for a way to automate archiving so that after I plug in my two external drives I can copy all my resources. The problem is that I have different file structures on my laptop and on the two external drives, so I need to select specific folders to be copied; that means I can't just select one root folder and copy it straight across. I tried to find a way to declare more than one path in the cp command and in the copy command, without success. An example path:
/my_programming_stuff
    /folder1
    /folder2
    /folder3
    /folder4
I want to select only the first 3 folders and copy them to external drive 1 and external drive 2. The idea is to create a .bat file that will copy everything at once (in the best-case scenario it would be copied to both external drives simultaneously, which would be much faster). Another requirement is a way to bypass the NTFS long-path limitation (max. 260 characters).
Flags that I want to use:
- Copy the files and directories and all of their attributes, including ownerships and permissions.
- Recursively copy directories and their contents.
- When copying files from one directory to another, only copy files that either don't exist in the destination directory or are newer than the existing corresponding files there.
- Data verification (so it's certain that the copy was verified).
- A progress bar with an ETA.
Until now I was using Total Commander for this, but picking out just a few folders to copy every day takes time and is inefficient.
I have experience with Bash and PowerShell, but I am not sure how to approach this topic.
Create a static batch file with robocopy commands. I think /copyall is the only switch you need to specify for all of this; the other defaults should satisfy your requirements.
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
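For illustration, a static batch file along those lines might look like this (E: and F: are stand-ins for the two external drives). /E recurses into subfolders, /COPYALL copies the data plus attributes, ownership, and permissions (it generally needs an elevated prompt), /XO skips files older than what is already in the destination, and /ETA adds an estimated time to the per-file progress robocopy already shows:

robocopy C:\my_programming_stuff\folder1 E:\folder1 /E /COPYALL /XO /ETA
robocopy C:\my_programming_stuff\folder2 E:\folder2 /E /COPYALL /XO /ETA
robocopy C:\my_programming_stuff\folder3 E:\folder3 /E /COPYALL /XO /ETA
robocopy C:\my_programming_stuff\folder1 F:\folder1 /E /COPYALL /XO /ETA
robocopy C:\my_programming_stuff\folder2 F:\folder2 /E /COPYALL /XO /ETA
robocopy C:\my_programming_stuff\folder3 F:\folder3 /E /COPYALL /XO /ETA

One caveat: as far as I know, robocopy compares timestamps and sizes rather than verifying copied data, so the verification requirement would need a separate checksum pass.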
I think your time would be better spent learning to use either FastCopy or FreeFileSync. I used FreeFileSync some years ago but got fed up with the constantly changing format of the XML file it uses for starting a backup, so I switched to FastCopy. But it looks like FreeFileSync may be getting its act together, and I aim to do some experiments over the summer to see if I want to switch back to it.
Both can handle the long-filename issues, both can be executed from a batch file, and both seem to be high quality, but FreeFileSync has more features (and is more bloated because of them). Speed-wise, I think FastCopy is probably one of the better products out there, and very streamlined in use and design.

Using diff3 where filenames contain a dash (-)

I'm trying to use diff3 this way:
diff3 options... mine older yours
My problem is that I probably can't use it, since all three of my files contain a dash in their names.
The manual mentions:
At most one of these three file names may be `-', which tells diff3 to read the standard input for that file.
so I probably have to rename the files before running diff3.
If you know of a better solution or a workaround, please let me know. Thank you!
At most one of these three file names may be `-', which tells diff3 to read the standard input for that file.
It does not state that your filenames must not contain dash symbols. It simply says that, if you want, you can put - in place of one of the names, in which case standard input will be read instead of one of the files.
So you can have as many dashes in your filenames as you like, and diff3 should work just fine.
However, on Windows, wrapping the filenames in "" to escape space characters does not work here, and I failed to find a suitable workaround. You can, however, automate the process of renaming the files (if the files are relatively small, this is not even too inefficient):
@echo off
rem Copy the three inputs to temporary names that need no escaping.
copy %1 tempfile_1.txt
copy %2 tempfile_2.txt
copy %3 tempfile_3.txt
"C:\Program Files (x86)\KDiff3\bin\diff3.exe" -E tempfile_1.txt tempfile_2.txt tempfile_3.txt
rem Clean up the temporary copies.
del tempfile_1.txt tempfile_2.txt tempfile_3.txt
Put this in a file like diff3.cmd, then run diff3.cmd "first file.txt" "second file.txt" "third file.txt".
P.S. Moving the files would be more efficient (if they were on the same disk volume as the script, which they are not in your case); you could even move them back to where they were initially, but for some time they would not be present in their original folder.

Erasing old folders when moving folders/files with Robocopy

I am using Robocopy to archive files/folders more than X days old on our server, and I'm finding that my filters must not be set correctly. The move executes correctly, but the old folders are left behind on the source server once the move is complete, leaving me with many empty folders and subfolders.
Here is my script:
Robocopy "source" "destination" /DCOPY:T /tee /mt:16 /MOVE /MINAGE:120 /LOG+:Log.txt
What am I missing?
You need /E to copy (empty) subfolders
http://ss64.com/nt/robocopy.html
One problem I have found with some versions of Robocopy is that if you use the /mt switch with the /move switch, it appears to leave behind folders that are now empty. Try removing the /mt switch and see if that works better for you; that helped for me.
The /MT: option has nothing to do with dates; it sets the number of threads robocopy uses. The original question remains: if you use robocopy to MOVE a lot of folders with subfolders, the subfolder at the deepest level is indeed moved, but the folders higher up in the tree remain (albeit empty). This is not "by design"; it's a bug. In older versions it worked as expected: if you moved a folder with subfolders 10 levels deep, everything was moved. Now the deepest one is moved and all the rest remain as empty folders. Files are moved as expected.
You can remove the /MT switch, but it doesn't change anything, because the default value of 8 is applied automatically.
If your folder was modified less than 120 days ago, it will not be "moved" (and so deleted), since it does not pass the /MINAGE:120 filter.
You may need a routine before robocopy runs that sets each folder's date to that of the most recently modified file it contains.
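As for the leftover empty folders, one workaround is a small PowerShell cleanup pass after robocopy finishes; this is only a sketch, with C:\source standing in for the source tree. Sorting by path length, deepest first, means parents that become empty as their children are removed get caught in the same pass:

# Placeholder for the source tree robocopy moved files out of.
$source = 'C:\source'
# Deepest paths first, so parents emptied along the way are removed too.
Get-ChildItem -Path $source -Recurse -Directory |
    Sort-Object -Property { $_.FullName.Length } -Descending |
    ForEach-Object {
        if (-not (Get-ChildItem -Path $_.FullName -Force)) {
            Remove-Item -Path $_.FullName
        }
    }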

Copy all files of one type from directory into one folder

This is a new one for me so I'm pretty much flying blind.
I have a folder at \\192.168.1.2\mainFolder that contains folder1, folder2, and folder3. Inside each of those folders are a handful of different file extensions, and a couple of files of each type. I need to take all files of the .dep type that exist inside mainFolder and copy them to \\192.168.1.2\copyFolder.
copyFolder will not have any folders inside it, just many, many files.
What is the best way to go about doing this? I have been told by TPTB that robocopy would be helpful; however, I have never used robocopy and thought you guys may know of something better.
So you don't want the .dep files inside folder1, folder2, etc.? Robocopy/xcopy is usually a good choice; PowerShell is slow for such a simple operation. If you just want the .dep files directly in mainFolder but not those inside the subfolders, try:
robocopy \\192.168.1.2\mainFolder \\192.168.1.2\copyFolder *.dep
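If the .dep files inside folder1, folder2, and folder3 should also be pulled in and flattened into copyFolder (robocopy can't flatten on its own; it preserves the folder structure), a short PowerShell sketch would be:

# Recurse through mainFolder and drop every .dep file flat into copyFolder.
Get-ChildItem -Path '\\192.168.1.2\mainFolder' -Recurse -Filter '*.dep' -File |
    Copy-Item -Destination '\\192.168.1.2\copyFolder'

Be aware that two .dep files sharing a name in different subfolders will collide in the flat destination, with the later one overwriting the earlier.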