A long time ago I created a subfolder of Local Folders (I'll call it "bugs" here); it has some messages in it, and also some subfolders.
Now I have created a new subfolder of Local Folders (I'll call it "bugs-save" here), and at first I couldn't find it. From the sort order I expected it to appear just after "bugs", but instead it is way down at the bottom, after folders named "Sent", "Archives", "Trash" and "Outbox" (which I don't use, but have no option to remove either). I guess since they were at the bottom, I was ignoring them. But this has made me realize that the subfolders of Local Folders are not in alphabetical order.
Why not?
What is the order?
Can I change the order to alphabetical?
I have a set of folders to which new files are regularly added, and I have to process those files as they come in. Digging into each folder one by one is a time-consuming process, so I need to figure out how to write a script that will filter out the new files and copy them into a new directory.
So far I have figured out how to use Get-ChildItem with -Path and -Recurse in PowerShell to list the new items in the corresponding folders, as shown in the third script on this Microsoft page.
So I can see the new files in their folders. How do I copy those items to the destination folder while replicating their original folder structure? I want to be able to recreate the original folders so that I can just overwrite the originals with the edited versions later.
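Something along these lines is what I'm trying to end up with; a minimal sketch, where the paths are placeholders and anything modified in the last day counts as "new":

# Placeholders; substitute your real paths and your own definition of "new"
$sourceRoot = 'C:\incoming'
$destRoot   = 'C:\staging'
$cutoff     = (Get-Date).AddDays(-1)

Get-ChildItem -Path $sourceRoot -Recurse -File |
    Where-Object { $_.LastWriteTime -gt $cutoff } |
    ForEach-Object {
        # Rebuild the file's path relative to the source root under the destination
        $relative = $_.FullName.Substring($sourceRoot.Length).TrimStart('\')
        $target   = Join-Path $destRoot $relative
        New-Item -ItemType Directory -Path (Split-Path $target -Parent) -Force | Out-Null
        Copy-Item -LiteralPath $_.FullName -Destination $target
    }

Because the layout is recreated under $destRoot, the edited files can later be copied back over the originals using the same relative paths.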
I discovered robocopy, and was able to use it to solve my problem. The /maxage:x option was perfect for my needs.
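For reference, a robocopy call along those lines might look like this (placeholder paths; /S recurses into subfolders and recreates them under the destination, and /maxage:1 restricts the copy to files changed within the last day):

robocopy C:\incoming C:\staging /S /maxage:1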
I am really new to PowerShell and really need help with this.
I have a directory containing multiple folders with multiple files; some files in different folders have the same name, with or without the same content. All I need to do is filter out the files that have both the same name and the same content and move them to another folder.
I tried the approach described in this link:
https://sid-500.com/2020/04/26/find-duplicate-files-with-powershell/
The issue here is that Get-FileHash can find the duplicates by content but not by name (for example, dir\a\a.txt and dir\a\b.txt will be considered duplicates, whereas I want dir\a\a.txt and dir\b\a.txt to be the duplicates), and it also cannot identify empty files.
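A sketch of one way to approach this in PowerShell: group files by name first, then by content within each name group, and move every duplicate except the first, keeping its relative path. The $root and $dupDir paths are placeholders, and zero-length files are grouped together without hashing so they don't depend on Get-FileHash:

$root   = 'C:\dir'          # folder tree to scan (placeholder)
$dupDir = 'C:\duplicates'   # where duplicates are moved to (placeholder)

Get-ChildItem -Path $root -Recurse -File |
    Group-Object Name |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        # Within each same-name group, group again by content
        # (empty files are lumped together without hashing)
        $_.Group |
            Group-Object { if ($_.Length -eq 0) { 'EMPTY' } else { (Get-FileHash -LiteralPath $_.FullName).Hash } } |
            Where-Object { $_.Count -gt 1 } |
            ForEach-Object {
                # Keep the first copy, move the rest, preserving the relative path
                foreach ($dup in ($_.Group | Select-Object -Skip 1)) {
                    $relative = $dup.FullName.Substring($root.Length).TrimStart('\')
                    $target   = Join-Path $dupDir $relative
                    New-Item -ItemType Directory -Path (Split-Path $target -Parent) -Force | Out-Null
                    Move-Item -LiteralPath $dup.FullName -Destination $target
                }
            }
    }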
Apologies to start with: I'm new to PowerShell and robocopy.
I have a robocopy command that pulls in any files within its many subfolders that fall within a maxage of 7. However, the main folder has a huge number of folders dating back years (and I only need the last 7 days each week it runs), so robocopy is slow reading every file in every folder before it even starts copying.
It looks like PowerShell commands may be a way for me to limit the search of files for my robocopy; would this be possible? Currently robocopy searches every file in every folder in my main folder; ideally I would want it to be smart enough to search only, say, a month's worth of files and then copy over the last 7 days. This would speed up the run time hugely.
If possible, to go even further: I only want CSV files from each of the folders in my main folder, but currently robocopy searches the other folders and their files as well, which takes time. All the CSV files are in a folder called "run" in each parent folder (each parent folder is a unique number within the "mainfolder").
My robocopy command:
robocopy \\server\mainfolder \\server\new_main_folder /S /maxage:7 /r:0 /w:0
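One way to narrow things down, sketched below: use PowerShell to enumerate the numbered parent folders and run robocopy only against each "run" subfolder with a *.csv filter, so it never touches the other folders. The UNC paths are the ones from the command above; the rest is an assumption about the layout:

$src = '\\server\mainfolder'
$dst = '\\server\new_main_folder'

Get-ChildItem -Path $src -Directory |
    ForEach-Object {
        $run = Join-Path $_.FullName 'run'
        if (Test-Path $run) {
            # Copy only CSVs changed in the last 7 days, keeping the same layout under the destination
            $target = Join-Path $dst (Join-Path $_.Name 'run')
            robocopy $run $target *.csv /S /maxage:7 /r:0 /w:0
        }
    }

This doesn't stop robocopy from scanning the old files inside each "run" folder (it still has to look at them to apply /maxage:7), but it does skip every folder that isn't a "run" folder.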
I was going to point you to either FastCopy or FreeFileSync; both handle long file name paths and work well for me. But I found problems running FastCopy when trying to filter folders the way you described, and I wasn't getting the results I expected, so that leaves FreeFileSync. There is a bit of a learning curve with FreeFileSync, but really, the only complaint I've had with it is that the XML-based batch script you can use to automate the program kept changing formats, and they haven't provided a way to read the old XML batch scripts with the new version of the software. Maybe that has changed; I haven't looked into it lately.
Maybe other people have had better experiences with RoboCopy, but I found it to take many times longer to do the same job than many other copy programs. I don't think FreeFileSync is as fast as FastCopy, but I've never seen it behave as badly as what I experienced with RoboCopy.
The way FreeFileSync works is:
You define 1 or more source/destination pairs.
There is a global setting at the top to set the defaults for all copy pairs.
There are individual settings for each copy pair that, when set, override the global settings.
In the filter tab of the settings you can set "Time span:" to "Last x days:" and set it to the 7 days that you want.
You can change the include filter from * to something like \run\*.csv. I didn't try that exact pattern, but the patterns I did try worked as expected (unlike FastCopy).
The Synchronization tab is the tricky/fun one. You can do logs, versioning, tell the system to shutdown or restart when done, maintain a database for tracking moved files ("Detect moved files" checkbox), and all kinds of adjustments to how it behaves when files don't match.
When done, there are, I believe, at least two options for saving the configuration, though I've always just created the XML-based batch script and called it from another scripting language or a desktop icon.
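For what it's worth, that batch script can also be started from PowerShell; something like the line below, assuming the default install path and a job file saved as CopyRunCsv.ffs_batch (both placeholders):

& 'C:\Program Files\FreeFileSync\FreeFileSync.exe' 'C:\Jobs\CopyRunCsv.ffs_batch'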
The MATLAB function precedence page states that function precedence goes:
Functions in the current folder.
Functions elsewhere on the path, in order of appearance.
My question is: when they say "Functions in the current folder", does this exclude functions in subfolders of the current folder? If so, is there a way to have subfolders called preferentially without changing the order of my folders in the path?
I need to do this because I have two folders (each with subfolders) of code that run functions with the same name. It seems the subfolders aren't given automatic precedence. I really don't want to have to change my path order every time I run one folder, and I really don't want to have to rename hundreds of functions and function calls that my team has written.
The only solution I can think of would be to remove the whole subfolder system and just have a jumbled mess of files in one folder. Are there any other things I can do?
Thanks in advance for the help!
I am using Robocopy to archive files/folders older than X days on our server and am finding that my filters must not be set correctly. The move executes correctly, but the old folders are left on the source server once the move is complete, leaving me with many empty folders and subfolders.
Here is my script:
Robocopy "source" "destination" /DCOPY:T /tee /mt:16 /MOVE /MINAGE:120 /LOG+:Log.txt
What am I missing?
You need /E to copy (empty) subfolders
http://ss64.com/nt/robocopy.html
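In other words, something along these lines (the command from the question with /E added):

Robocopy "source" "destination" /E /DCOPY:T /tee /mt:16 /MOVE /MINAGE:120 /LOG+:Log.txt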
One problem I have found with some versions of Robocopy is that if you use the /mt switch together with the /move switch, it appears to leave behind folders that are now empty. Try removing the /mt switch and see if that works better for you; it helped in my case.
The /MT: option has nothing to do with the date; it is the number of threads used by robocopy. The original question remains: if you use robocopy to MOVE a lot of folders with subfolders, the subfolder at the deepest level is indeed MOVED, while the folders higher up in the tree remain (albeit empty). This has nothing to do with "by design"; it's a bug. In older versions it worked as expected: if you moved a folder with subfolders 10 levels deep, everything was MOVED. Now the deepest one is moved and all the rest remain as empty folders. Files are moved as expected.
You may remove the /MT switch, but it doesn't change anything, because the default value of 8 is applied automatically.
If your folder was modified less than 120 days ago, it will not be "moved" (and so deleted), since it does not fit the /MINAGE:120 filter.
You may need a routine before robocopy that sets each folder's date to that of the most recently modified file it contains.
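A sketch of such a pre-pass in PowerShell, assuming "source" stands for the same root you pass to robocopy:

$sourceRoot = 'source'   # same path as in the robocopy command

Get-ChildItem -Path $sourceRoot -Recurse -Directory |
    ForEach-Object {
        $newest = Get-ChildItem -Path $_.FullName -Recurse -File |
                  Sort-Object LastWriteTime -Descending |
                  Select-Object -First 1
        if ($newest) {
            # Stamp the folder with its newest file's timestamp so /MINAGE evaluates it consistently
            $_.LastWriteTime = $newest.LastWriteTime
        }
    }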