I am using Robocopy to archive files/folders older than X days on our server, and I'm finding that my filters must not be set correctly. The move executes correctly, but the old folders are left on the source server once the move is complete, leaving me with many empty folders and subfolders.
Here is my script:
Robocopy "source" "destination" /DCOPY:T /tee /mt:16 /MOVE /MINAGE:120 /LOG+:Log.txt
What am I missing?
You need /E to copy (empty) subfolders
http://ss64.com/nt/robocopy.html
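For example, the command from the question with /E added:
Robocopy "source" "destination" /E /DCOPY:T /tee /mt:16 /MOVE /MINAGE:120 /LOG+:Log.txt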
One problem I have found with some versions of Robocopy is that if you use the /mt switch together with the /move switch, it appears to leave behind folders that are now empty. Try removing the /mt switch and see if that works better for you; it helped in my case.
The /MT: option has nothing to do with dates; it is the number of threads used by robocopy. The original question remains: if you use robocopy to MOVE a lot of folders with subfolders, the subfolder at the deepest level is indeed MOVED, but the folders higher up in the tree remain (albeit empty). This has nothing to do with "by design"; it's a bug. In older versions it worked as expected: if you moved a folder with subfolders 10 levels deep, everything was MOVED. Now only the deepest one is moved and all the rest remain as empty folders. Files are moved as expected.
You may remove the /MT switch, but it doesn't change anything, because the default value of 8 is applied automatically.
If your folder was modified less than 120 days ago, it will not be "moved" (and so not deleted), since it does not pass the /MINAGE:120 filter.
You may need a routine before robocopy that sets each folder's date to that of the most recently modified file it contains.
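A rough, untested PowerShell sketch of such a routine (the "source" path is a placeholder for the one in your command):

# Stamp every folder under the source with the LastWriteTime of the newest
# file it directly contains, so the /MINAGE filter sees the folder's real age.
$source = 'source'   # placeholder for your source path
Get-ChildItem -Path $source -Directory -Recurse | ForEach-Object {
    $newest = Get-ChildItem -Path $_.FullName -File |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1
    if ($newest) { $_.LastWriteTime = $newest.LastWriteTime }
}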
I have a set of folders which have new files added to them regularly, but I have to process those files as they come in. Digging into each folder one by one can be a time-consuming process. I need to figure out how to write a script that will filter out the new files and copy them into a new directory.
So far I have figured out how to use the Get-ChildItem -Path -Recurse command in powershell to list the new items in the corresponding folders as shown in the third script on this Microsoft page.
So I can see the new files in their folders. How do I copy those items to the destination folder while replicating their original folder structure? I want to be able to recreate the original folders so that I can just overwrite the originals with the edited versions later.
I discovered robocopy, and was able to use it to solve my problem. The /maxage:x option was perfect for my needs.
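For reference, something along these lines (paths are placeholders) copies only files modified in the last 7 days while recreating the original folder structure in the destination:
robocopy C:\sourceRoot C:\newDirectory /S /maxage:7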
Apologies to start with, as I'm new to PowerShell and Robocopy.
I have a robocopy command that pulls in any files within its many subfolders that fall within a maxage of 7. However, the main folder has a huge number of folders dating back years (and I only need the last 7 days each week it runs), so it is slow reading each file in each folder before it even starts copying.
It looks like PowerShell commands may be a way for me to limit the search of files for my robocopy; would this be possible? Currently robocopy searches every file in every folder in my main folder; ideally I would want it to be smart enough to search only, say, a month's worth of folders and then copy over the last 7 days. This would speed up the run time hugely.
If possible, to go even further: I only want CSV files from each of the folders in my main folder, but currently robocopy searches the other folders and their files as well, which takes time. All the CSV files are in a folder called "run" in each parent folder (each parent folder is a unique number within the "mainfolder").
my robocopy command:
robocopy \\server\mainfolder \\server\new_main_folder /S /maxage:7 /r:0 /w:0
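What I had in mind is something roughly like this (completely untested, and it assumes the parent folders' timestamps change when new files land in their "run" subfolders):

# For each parent folder touched in roughly the last month, copy the last
# 7 days of .csv files from its "run" subfolder into the new main folder.
$cutoff = (Get-Date).AddDays(-30)
Get-ChildItem '\\server\mainfolder' -Directory |
    Where-Object { $_.LastWriteTime -ge $cutoff } |
    ForEach-Object {
        robocopy (Join-Path $_.FullName 'run') (Join-Path '\\server\new_main_folder' $_.Name) *.csv /maxage:7 /r:0 /w:0
    }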
I was going to point you to either FastCopy or FreeFileSync; both handle long file name paths and work well for me. But I found problems running FastCopy when trying to filter folders the way you described. I wasn't getting the results I expected, so that leaves FreeFileSync. There is a little bit of a learning curve with FreeFileSync, but really, the only problem/complaint I've had with it is that the XML-based batch script you can use to automate the program kept changing formats, and they haven't been providing a way to read the old XML batch scripts with the new version of the software. Maybe that has changed; I haven't looked into it lately.
Maybe other people have had better experience with RoboCopy, but I found it to take literally many multiples longer to do the same job as many other copy programs. I don't think FreeFileSync is as fast as FastCopy, but I've never seen it act as badly as what I experienced with RoboCopy.
The way FreeFileSync works is:
You define 1 or more source/destination pairs.
There is a global setting at the top to set the defaults for all copy pairs.
There are individual settings for each copy pair that, when set, override the global settings.
In the filter tab of the settings you can set "Time span:" to "Last x days:" and set it to the 7 days that you want.
You can change the include filter from * to something like \run\*.csv. I didn't try that exact pattern, but the patterns I did try worked as expected (unlike FastCopy).
The Synchronization tab is the tricky/fun one. You can do logs, versioning, tell the system to shutdown or restart when done, maintain a database for tracking moved files ("Detect moved files" checkbox), and all kinds of adjustments to how it behaves when files don't match.
When done, there are, I believe, at least two options for saving the configuration, though I've always just created the XML-based batch script and called that from another scripting language or an icon on the desktop.
This is a multi-part question. I can fill in details once I get to a working prototype.
Situation: Due to a comedy of errors, I have three copies of a very large directory; each copy has some new files/versions of files that are unique to it. I would like to combine these, keeping the newest version of every file.
Breakdown of things I don't know: How to compare directories to one another recursively (probably going to do two at a time; 1 vs 2 = 1+2, then 1+2 vs 3 = 1+2+3). The crucial step in this: how to use the path/filename of a file in directory 1 to first see whether it can be found in directory 2, then, if found, use the date modified to determine whether to copy from 1 or 2 into the new combined directory.
I think with these 3 pieces of information (recursively compare files between two directories, by path, and by date modified), I can piece together how to script this. While I can look up these bits separately, it's going to be tough to convince myself this process was done correctly, and I'd like a little help with the actual assessment/moving step so I have less worry that I've overlooked some small but crucial detail.
Will post the script when I have it put together, along with any caveats about my confidence in it.
Don't waste time writing a script when robocopy is built for file copying and has enough options to cover pretty much any situation...
By default it will only copy a file if the source and destination have different time stamps or different file sizes.
Using /XO will exclude older files that differ, so you will only end up with the newest files in destination.
/E includes subfolders, including empty ones; change it to /S to skip empty folders.
robocopy C:\source1 C:\destination /E /XO
robocopy C:\source2 C:\destination /E /XO
[etc]
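For comparison, the hand-rolled version of the same logic the question describes (check each relative path, keep whichever copy was modified last) would look roughly like this untested sketch, which is exactly the work /XO saves you:

# For every file in copy 1, keep the newer of copy 1 / copy 2 at the same
# relative path in the combined directory. Paths are placeholders, and files
# that exist only in copy 2 would still need a second pass.
$dir1 = 'C:\copy1'; $dir2 = 'C:\copy2'; $combined = 'C:\combined'
Get-ChildItem $dir1 -File -Recurse | ForEach-Object {
    $relative = $_.FullName.Substring($dir1.Length).TrimStart('\')
    $other    = Join-Path $dir2 $relative
    $winner   = $_
    if ((Test-Path $other) -and ((Get-Item $other).LastWriteTime -gt $_.LastWriteTime)) {
        $winner = Get-Item $other
    }
    $target = Join-Path $combined $relative
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    Copy-Item $winner.FullName $target -Force
}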
I know this sort of question has been asked many times, but no one ever highlights the issue I am facing.
I have a script that will look at user profiles, mark the ones that are over X amount of days old as ones to delete, and then remove them. Remove-Item with -Force and -Recurse removes all folders/files apart from the standard NTFS junction points present for all users; for these sorts of folders it gets access denied. I have even tried taking ownership of the user folders first, and still it happens. On Windows 7 these folders are ones like:
C:\Users\<NAME>\My Documents
C:\Users\<NAME>\Start Menu
No matter how I write the script, it cannot delete the top-level user folder. With the same account and the same PC, if I just use Windows Explorer to right-click and delete, the folder is removed along with its sub-folders.
For the record, these are the methods I have tried:
Remove-Item (with -force -recurse)
[io.directory]::delete()
$variablename.delete()
I could post the script, but it is kind of irrelevant, as the bulk of it works; it's just these junction points.
I suppose this is my question: how do I invoke the same delete command Windows Explorer uses from within PowerShell?
Thanks in advance.
Try this answer; it shows you how to remove symlinks, and you can incorporate that into your code: Delete broken link
Post errors if it doesn't work.
For me this has to do with the read-only attribute being set on the folders. If I create a junction, I have the habit of immediately changing the icon shown for it in Explorer. This sets the read-only attribute of the junction, which you can't change with Explorer, but I can change it with the attrib command:
attrib -r /d /s Junk
where Junk is a symbolic link to a folder. After that, I can remove the folder with the 'rm' command.
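In PowerShell terms, a rough sketch of the same idea against one of the junctions from the question (the user name is a placeholder; I've used cmd's rmdir for the removal step because it deletes only the link itself, never the target's contents):

$junction = 'C:\Users\SomeUser\My Documents'   # placeholder path
attrib -r $junction /d    # clear the read-only attribute on the junction itself
cmd /c rmdir $junction    # remove the junction without touching what it points to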
This is a new one for me so I'm pretty much flying blind.
I have a folder at \\192.168.1.2\mainFolder that contains folder1, folder2, and folder3. Inside each of those folders are a handful of different file extensions, and a couple of files of each type. I need to take all files of the .dep type that exist inside mainFolder and copy them to \\192.168.1.2\copyFolder.
copyFolder will not have any folders inside it, just many, many files.
What is the best way to go about doing this? I have been told by TPTB that robocopy would be helpful; however, I have never used robocopy and thought you guys might know of something better.
So you don't want the .dep files inside folder1, folder2, etc.? Robocopy/xcopy is usually a good choice; PowerShell is slow for such a simple operation. If you just want the .dep files in mainFolder but not those inside the subfolders, try:
robocopy \\192.168.1.2\mainFolder \\192.168.1.2\copyFolder *.dep
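If the .dep files actually live inside folder1, folder2, and folder3 and you want them flattened into copyFolder, robocopy /S would recreate the subfolder structure, so a short PowerShell alternative (untested sketch) would be:

# Copy every .dep file found anywhere under mainFolder straight into copyFolder,
# flattening the structure; same-named files in different subfolders overwrite each other.
Get-ChildItem '\\192.168.1.2\mainFolder' -Filter *.dep -File -Recurse |
    Copy-Item -Destination '\\192.168.1.2\copyFolder'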