This is similar to PowerShell Flatten Directory Structure, but different.
I have a bunch of folders from ripping albums of the form artist/title/<disc number (an int)>. This is fine if there's more than one disc, but if not, you end up with artist/title/1/, where it would make more sense to move the contents of 1 up a level and delete the now-empty 1 directory.
This would be only for the case where 1 is the only sub-directory; and note that there may be multiple titles under an artist.
Any suggestions on doing this in PowerShell, or another way without going to Python?
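In case it helps, here is a minimal PowerShell sketch of one way to do it, assuming the library root is C:\Music (a placeholder) and that the lone disc folder is literally named 1; it walks each artist\title folder and, where 1 is the only sub-directory, moves its contents up and removes it.

# Flatten artist\title\1 into artist\title when "1" is the only sub-directory.
$root = 'C:\Music'    # assumed library root; adjust to your rip location
$titleFolders = Get-ChildItem -Path $root -Directory | Get-ChildItem -Directory
foreach ($title in $titleFolders) {
    $subDirs = @(Get-ChildItem -Path $title.FullName -Directory)
    if ($subDirs.Count -eq 1 -and $subDirs[0].Name -eq '1') {
        # Move everything inside "1" up one level, then delete the empty folder.
        Get-ChildItem -Path $subDirs[0].FullName -Force | Move-Item -Destination $title.FullName
        Remove-Item -Path $subDirs[0].FullName
    }
}

Add -WhatIf to Move-Item and Remove-Item first if you want a dry run.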
I am really new to PowerShell and really need help with this.
I have a directory with multiple folders containing multiple files; some files in different folders have the same name, with or without the same content. All I need to do is filter out all the files that have both the same name and the same content and move them to another folder.
I tried the approach in this link:
https://sid-500.com/2020/04/26/find-duplicate-files-with-powershell/
The issue here is that Get-FileHash can find duplicates by content but not by name (for example, dir\a\a.txt and dir\a\b.txt will be considered duplicates, but I want dir\a\a.txt and dir\b\a.txt to be the duplicates), and it also cannot identify empty files.
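For what it's worth, here is a rough sketch of one way to key on both the (lower-cased) file name and the Get-FileHash result; $source and $dupes are placeholder paths, and duplicates are moved while keeping their relative sub-folder so identically named files can't collide in the target.

# Group files by (name, content hash) and move every copy after the first.
$source = 'C:\data'          # assumed folder to scan
$dupes  = 'C:\duplicates'    # assumed folder to receive duplicates

Get-ChildItem -Path $source -File -Recurse | ForEach-Object {
    # Note: all empty files share the same hash, so they group together by name.
    [PSCustomObject]@{
        File = $_
        Key  = '{0}|{1}' -f $_.Name.ToLower(), (Get-FileHash -Path $_.FullName -Algorithm SHA256).Hash
    }
} | Group-Object -Property Key | Where-Object { $_.Count -gt 1 } | ForEach-Object {
    # Keep the first file of each duplicate group; move the rest, preserving relative paths.
    $_.Group | Select-Object -Skip 1 | ForEach-Object {
        $relative = $_.File.FullName.Substring($source.Length).TrimStart('\')
        $target   = Join-Path $dupes $relative
        New-Item -ItemType Directory -Path (Split-Path $target -Parent) -Force | Out-Null
        Move-Item -Path $_.File.FullName -Destination $target
    }
}

Drop Select-Object -Skip 1 if you want every copy moved rather than leaving one behind.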
Apologies to start, as I'm new to PowerShell and robocopy.
I have a robocopy command that pulls in any files within its many subfolders that are within a maxage of 7. However, the main folder has a huge number of folders dating back years (and I only need the last 7 days each week it runs), so it's slow reading each file in each folder before it even copies anything with robocopy.
It looks like PowerShell commands may be a way for me to limit the search of files for my robocopy; would this be possible? Currently robocopy searches every file in every folder in my main folder; ideally I would want it to be smart enough to only search, say, a month's worth of files and then copy over the last 7 days. This would speed up the run time hugely.
If possible, going even further, I only want the CSV files in each of the folders in my main folder, but currently robocopy searches the other folders and their files as well, which takes time. All the CSV files are in a folder called "run" in each parent folder (each parent folder is a unique number within the "mainfolder").
My robocopy command:
robocopy \\server\mainfolder \\server\new_main_folder /S /maxage:7 /r:0 /w:0
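One possible PowerShell wrapper, on the assumption that the numbered parent folders sit directly under \\server\mainfolder and each holds a run sub-folder: it hands only those run folders to robocopy, so the years of other folders never get scanned.

$source = '\\server\mainfolder'
$dest   = '\\server\new_main_folder'

# Walk only the numbered parent folders and target just their "run" sub-folder.
Get-ChildItem -Path $source -Directory | ForEach-Object {
    $runFolder = Join-Path $_.FullName 'run'
    if (Test-Path -Path $runFolder) {
        # Recreate parent\run on the destination; copy only CSVs changed in the last 7 days.
        $target = Join-Path $dest (Join-Path $_.Name 'run')
        robocopy $runFolder $target *.csv /S /maxage:7 /r:0 /w:0
    }
}

Robocopy still has to check each CSV inside run to apply /maxage, but it no longer touches anything outside those folders.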
I was going to point you to either FastCopy or FreeFileSync; both handle long file-name paths and work well for me. But I found problems running FastCopy when trying to filter folders the way you described. I wasn't getting the results I expected, so that leaves FreeFileSync. There is a bit of a learning curve with FreeFileSync, but really, the only problem/complaint I've had with it is that the XML-based batch script you can use to automate the program kept changing formats, and they haven't been providing a way to read the old XML batch scripts with the new version of the software. Maybe that has changed; I haven't looked into it lately.
Maybe other people have had better experiences with RoboCopy, but I found it to take literally many multiples longer to do the same job as many other copy programs. I don't think FreeFileSync is as fast as FastCopy, but I've never seen it behave as badly as what I experienced with RoboCopy.
The way FreeFileSync works is:
You define 1 or more source/destination pairs.
There is a global setting at the top to set the defaults for all copy pairs.
There are individual settings for each copy pair that, when set, override the global settings.
In the filter tab of the settings you can set "Time span:" to "Last x days:" and set it to the 7 days that you want.
You can change the include pattern from * to something like \run\*.csv. I didn't try that exact pattern, but the patterns I did try worked as expected (unlike FastCopy).
The Synchronization tab is the tricky/fun one. You can do logs, versioning, tell the system to shut down or restart when done, maintain a database for tracking moved files ("Detect moved files" checkbox), and make all kinds of adjustments to how it behaves when files don't match.
When done, there are, I believe, at least two options for saving the configuration, though I've always just created the XML-based batch script and called that from another scripting language or an icon on the desktop.
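As an aside, once the XML batch job is saved it can be kicked off from a script; a small sketch, where both paths are assumptions for illustration:

# Launch a saved FreeFileSync batch job from PowerShell and wait for it to finish.
$ffs = 'C:\Program Files\FreeFileSync\FreeFileSync.exe'   # assumed install location
$job = 'D:\backup-jobs\nightly.ffs_batch'                 # assumed saved batch job
Start-Process -FilePath $ffs -ArgumentList "`"$job`"" -Wait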
I'm looking for a way to automate archiving so that after I plug in my two external drives I can copy all my resources. The problem is that I have different file structures on my laptop and on both external drives, so I need to select specific folders to be copied. That means I can't select one root folder and copy it straight across. I tried to find a way to declare more than one path in the cp command and in the copy command, without success. An example path:
/my_programming_stuff
/folder1
/folder2
/folder3
/folder4
I want to select only the first 3 folders and copy them to external drive 1 and external drive 2. The idea is to create a .bat file that will copy everything at once (in the best-case scenario it will be copied to both external drives simultaneously, so it will be much faster). Another problem is that there needs to be a way to bypass the NTFS long-path limitation (max. 260 characters).
Flags that I want to use:
Copy the files and directories and all of their attributes, including ownerships and permissions.
Recursively copy directories and their contents.
When copying files from one directory to another, only copy files that either don't exist or are newer than the existing corresponding files in the destination directory.
Data verification (so it's certain that the copy was verified).
A progress bar with an ETA.
Until now I have been using Total Commander to do this, but every day I need to pick only a few folders to be copied, which takes time and is inefficient.
I have experience with Bash and PowerShell but I am not sure how to handle this topic.
Create a static batch file with robocopy commands. I think /copyall is the only switch you need to specify for all of this; the other defaults should satisfy your requirements.
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
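A rough sketch of what such a static script could look like (written as PowerShell calling robocopy rather than a plain .bat); the source folders and the E: / F: drive letters are placeholders, and /COPYALL needs an elevated prompt to copy auditing information:

# Copy the three selected folders to both external drives, preserving attributes.
$folders = 'C:\my_programming_stuff\folder1',
           'C:\my_programming_stuff\folder2',
           'C:\my_programming_stuff\folder3'     # assumed source paths
$drives  = 'E:\backup', 'F:\backup'              # assumed targets on the external drives

foreach ($drive in $drives) {
    foreach ($folder in $folders) {
        $name = Split-Path -Path $folder -Leaf
        # /E recurse incl. empty dirs, /COPYALL keep attributes/ownership/permissions,
        # /XO skip files older than the destination copy, /ETA show per-file progress with ETA.
        robocopy $folder (Join-Path $drive $name) /E /COPYALL /XO /ETA /R:1 /W:1
    }
}

Robocopy copes with paths past the 260-character limit on its own, and you could wrap the inner loop in Start-Job if you want both drives written at the same time; as far as I know it has no built-in post-copy data verification, so that part of the wish list isn't covered here.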
I think your time will be better spent learning how to use either FastCopy or FreeFileSync. I used FreeFileSync some years ago but got disgusted with its constantly changing format for the XML file used to start a backup, so I switched to FastCopy. But it looks like FreeFileSync may be getting its act together, and I aim to do some experiments over the summer to see if I want to switch back to it.
Both can handle the long-filename issues, both can be executed from a batch file, and both seem to be high quality, but FreeFileSync has more features - and is more bloated because of them. Speed-wise, though, I think FastCopy is probably one of the better products out there and very streamlined in use and design.
Is there a way to get a file list recursively based on one file collection that points to a directory in fileadmin?
Currently I have only got it to work with files directly in that directory, not with files in sub-directories of that directory.
So instead of setting up lots of file collections, one for each sub-directory,
I'd like to set only the "top"-level directory (here "Kurs77") and have the files, even those from sub-directories, displayed.
The reason is that editors may add an unknown number of (sub)directories, and I'd like to have the files automagically displayed in the file list in the front end -- without the need to create an ever-increasing number of file collections.
cheers,
Tom
It seems that this is a missing feature. Check out https://forge.typo3.org/issues/61238. It seems that the underlying API is able to do that.
So one solution would be to use TypoScript to make that work.
To give the correct answer now: The recursive option is of course available but it is part of the sys_file_collection record.
In TYPO3 9 this works out of the box. It's a pity it doesn't show the folder as the title, but recursive works.
I would like to delete multiple files that contain a substring. Say, for example, I would like to delete all the files that have the substring my. Assume that my directory contains 4 files: photo.jpg, myPhoto.jpg, beachMyPhoto.jpg, and anyPhoto.jpg. Since the search term is my, the files I am interested in deleting are myPhoto.jpg and beachMyPhoto.jpg (case-insensitive).
My proposed solution (which I know how to do) is to use the NSFileManager class, call contentsOfDirectoryAtPath:error: to read all the directory contents, and then loop over them searching for a hit. If a hit is found, I delete that file.
What I don't like about my proposed solution is that it is not very efficient, especially if the directory contains a lot of files and the number of hits is small. Is there a more efficient way to do this?
If you don't want a big array loaded into memory, you can try -[NSFileManager enumeratorAtURL:includingPropertiesForKeys:options:errorHandler:]. Since you only want the immediate contents of the directory, you would invoke -[NSDirectoryEnumerator skipDescendants] for each directory that it returns.
If your concern is iterating over all of the items in the directory, testing for your match pattern, well that's unavoidable. Any technique you would hope to use has to somehow iterate over all of the items in the directory and test for a match. The only question is whether that iteration is exposed to you or not. In Cocoa, it is. You could drop down to the glob() function if you want an alternative where it isn't.