How to delete a folder with its contents to the recycle bin in Matlab

I am running Matlab R2020b on Windows 10.
Is there a way to move a folder with its contents to the recycle bin programmatically in Matlab?
For example, the following works for single files:
previousState=recycle('on');
delete(filename); % if replaced with rmdir(DIR,'s'), the folder is deleted permanently
recycle(previousState);
but the same toggle doesn't work for folders. Is there a way?
The only possible workaround I can think of is to use a wildcard: delete(fullfile(DIR,'*')) and then rmdir(DIR) on the empty folder. But that doesn't work for my application. I wish to preserve temporary copies of folders in the recycle bin in case a script that manipulates them throws a warning in some unexpected way, in which case I get a second chance to see the original files. There are hundreds of folders, each containing hundreds to thousands of files in this particular use case. The wildcard approach does put individual files into the recycle bin, but it loses the original folder structure, making it impractical to selectively recover folders. Hence the question.

Since the OS is specifically Windows 10, one can use PowerShell as shown here.
For example, with the following recycleFile.ps1 file in C:\
Param(
    [string]$filePath # full path; can be a file or a folder
)
# the Shell COM object exposes the same "delete" verb as Explorer's context menu,
# which sends the item to the recycle bin instead of deleting it permanently
$sh = New-Object -ComObject "Shell.Application"
$ns = $sh.Namespace(0).ParseName($filePath) # resolve the full path from the Desktop namespace
$ns.InvokeVerb("delete")
the following call in Matlab will move the folder fullfilepath to the recycle bin:
[status,cmdout]=system(['powershell.exe -ExecutionPolicy Bypass -File "',...
    fullfile('C:','recycleFile.ps1'),'" "',...
    fullfilepath,'"']);
There are several downsides, however.
The operation is equivalent to context menu -> delete. It brings up the recycle progress UI, which seizes the foreground. It is also somewhat slow. It may even require user confirmation if the folder in question is not eligible for recycling but only for direct deletion.
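If the progress UI and possible prompts are a problem, one possible alternative (not from the original answer, so treat it as a sketch to verify) is the .NET method Microsoft.VisualBasic.FileIO.FileSystem.DeleteDirectory, which can send a whole folder to the recycle bin. A minimal recycleFolder.ps1 in the same style as above:
Param(
    [string]$folderPath # folder to send to the recycle bin
)
# Microsoft.VisualBasic ships with the .NET Framework on Windows
Add-Type -AssemblyName Microsoft.VisualBasic
[Microsoft.VisualBasic.FileIO.FileSystem]::DeleteDirectory(
    $folderPath,
    [Microsoft.VisualBasic.FileIO.UIOption]::OnlyErrorDialogs,
    [Microsoft.VisualBasic.FileIO.RecycleOption]::SendToRecycleBin)
Called from Matlab with the same system('powershell.exe ...') pattern as above, this avoids the context-menu verb; whether every dialog is suppressed in a given setup is something to verify.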

Related

Scripting help for copying new items added to a set of directories

I have a set of folders which have new files added to them regularly, and I have to process those files as they come in. It can be time-consuming to dig into each folder one by one. I need to figure out how to write a script that will filter out the new files and copy them into a new directory.
So far I have figured out how to use the Get-ChildItem -Path -Recurse command in PowerShell to list the new items in the corresponding folders, as shown in the third script on this Microsoft page.
So I can see the new files in their folders. How do I copy those items to the destination folder while replicating their original folder structure? I want to be able to recreate the original folders so that I can just overwrite the originals with the edited versions later.
I discovered robocopy, and was able to use it to solve my problem. The /maxage:x option was perfect for my needs.
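For reference, a minimal sketch of that kind of robocopy call (the paths and the one-day window here are placeholders, not taken from the answer):
# copy only files modified within the last day, keeping the folder structure
# (the two UNC paths below are placeholders)
robocopy \\server\watched_folders \\server\staging /E /MAXAGE:1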

Using PowerShell to only read the last month of folders, then copy in 7 days' worth of folders using robocopy

Apologies to start, as I'm new to PowerShell and robocopy.
I have a robocopy command that pulls in any files within its many subfolders that are within a maxage of 7. However, the main folder has a huge number of folders dating back years (and I only need the last 7 days each week it runs), so it is slow reading each file in each folder before it even copies with robocopy.
It looks like PowerShell commands may be a way for me to limit the search of files for my robocopy; would this be possible? Currently robocopy searches each file in each folder in my main folder; ideally I would want it to be smart enough to only search, say, a month's worth of files and then copy over the last 7 days. This would speed up the run time hugely.
If possible, going even further: I only want csv files in each of the folders in my main folder, but currently robocopy searches the other folders and their files as well, which takes time. All the csv files are in a folder called "run" in each parent folder (a parent folder is a unique number within the "mainfolder").
My robocopy command:
robocopy \\server\mainfolder \\server\new_main_folder /S /maxage:7 /r:0 /w:0
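One possible way to do the pre-filtering the question asks about, as a sketch rather than a tested answer: the one-month window, the "run" subfolder layout, and the paths are taken from the question, and note that a parent folder's LastWriteTime does not always change when files deeper inside it do.
# list only parent folders touched within the last month, then let robocopy
# copy the last 7 days of csv files from each folder's "run" subfolder
$cutoff = (Get-Date).AddDays(-30)
Get-ChildItem -Path '\\server\mainfolder' -Directory |
    Where-Object { $_.LastWriteTime -ge $cutoff } |
    ForEach-Object {
        $src = Join-Path $_.FullName 'run'
        $dst = Join-Path '\\server\new_main_folder' (Join-Path $_.Name 'run')
        if (Test-Path $src) {
            robocopy $src $dst '*.csv' /S /maxage:7 /r:0 /w:0
        }
    }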
I was going to point you to either FastCopy or FreeFileSync; both handle long file name paths and work well for me. But I found problems running FastCopy when trying to filter folders the way you described. I wasn't getting the results I expected, so that leaves FreeFileSync. There is a bit of a learning curve with FreeFileSync, but really the only problem/complaint I've had with it is that the xml-based batch script you can use to automate the program kept changing formats, and they haven't been providing a way to read the old xml batch scripts with the new version of the software. Maybe that has changed; I haven't looked into it lately.
Maybe other people have had better experience with RoboCopy, but I found it to take literally many multiples longer to do the same job as many other copy programs. I don't think FreeFileSync is as fast as FastCopy, but I've never seen it behave as badly as what I experienced with RoboCopy.
The way FreeFileSync works is:
You define 1 or more source/destination pairs.
There is a global setting at the top to set the defaults for all copy pairs.
There are individual settings for each copy pair that, when set, override the global settings.
In the filter tab of the settings you can set "Time span:" to "Last x days:" and set it to the 7 days that you want.
You can change the include filter from * to something like \run\*.csv. I didn't try that exact pattern, but the patterns I did try worked as expected (unlike FastCopy).
The Synchronization tab is the tricky/fun one. You can do logs, versioning, tell the system to shutdown or restart when done, maintain a database for tracking moved files ("Detect moved files" checkbox), and all kinds of adjustments to how it behaves when files don't match.
When done, there are, I believe, at least two options for saving the configuration, though I've always just created the xml-based batch script and called that from another scripting language or an icon on the desktop.

How to create a script that uses a path list as a reference for copying files in PowerShell in a .bat script

I'm looking for a way to automate archiving so that after I plug in my two external drives I can copy all my resources. The problem is that I have different file structures on my laptop and on both external drives, so I need to select specific folders to be copied. That means I can't select one root folder and copy it straight over. I tried to find a way to declare more than one path in the cp command and in the copy command, without success. An example path:
/my_programming_stuff
/folder1
/folder2
/folder3
/folder4
I want to select only the first 3 folders and copy them onto external drive 1 and external drive 2. The idea is to create a .bat file that will copy everything at once (in the best-case scenario it will be copied simultaneously to both external drives, so it will be much faster). Another problem is that there needs to be a way to bypass the NTFS long-path limitation (max. 260 characters).
Flags that I want to use:
Copy the files and directories and all of their attributes, including ownerships and permissions.
Recursively copy directories and their contents.
When copying files from one directory to another, only copy files that either don't exist or are newer than the existing corresponding files in the destination directory.
Data verification (so it's certain that the copy was verified).
Progress bar with an ETA.
Until now I was using Total Commander to do this, but every day I need to pick only a few folders to be copied, which takes time and is inefficient.
I have experience with Bash and PowerShell but I am not sure how to handle this topic.
Create a static batch file with robocopy commands. I think /copyall is the only switch you need to specify for all this. Other defaults should satisfy requirements.
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
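A rough sketch of what that static approach could look like, written here as PowerShell calling robocopy rather than a raw .bat (the drive letters and folder names are placeholders; note that robocopy has no built-in post-copy verification, and by default it handles paths longer than 260 characters on its own):
# copy the same three folders to both external drives
# (E:\ and F:\ and the folder list are placeholders)
$sources = 'C:\my_programming_stuff\folder1',
           'C:\my_programming_stuff\folder2',
           'C:\my_programming_stuff\folder3'
foreach ($drive in 'E:\', 'F:\') {
    foreach ($src in $sources) {
        $dst = Join-Path $drive (Split-Path $src -Leaf)
        # /E recurse including empty dirs, /COPYALL keep data/attributes/timestamps/ACLs/owner,
        # /XO skip files that are not newer than the destination copy,
        # /ETA show per-file progress with an estimated time of arrival
        robocopy $src $dst /E /COPYALL /XO /ETA
    }
}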
I think your time will be better spent learning how to use either FastCopy or FreeFileSync. I used FreeFileSync some years ago but got disgusted with its constantly changing format for the xml file used to start a backup, so I switched to FastCopy. But it looks like FreeFileSync may be getting their act together, and I aim to do some experiments over the summer to see if I want to switch back to it.
Both can handle the long-filename issues, both can be executed by a batch file, and both seem to have a lot of quality, but FreeFileSync has more features - and is more bloated because of the features. Speed-wise, though, I think FastCopy is probably one of the better products out there and very streamlined in use and design.

Extracting Multiple 7z Files Overwrites the Same Folder

I'm currently working on a project where I take multiple 7z files and extract the contents of each into a folder named the same as the 7z file itself. I also apologize if something like this has been answered already; I spent time trying to investigate this issue, but I can't seem to find anyone else who has had a similar one.
For example:
a1.7z -> <targetpath>/a1/<contents within a1.7z>
The following shell line: a1.7z | % {& "C:\Program Files\7-Zip\7z.exe" "x" $_.fullname "-o<targetpath>\a1" -y -r}
works like a dream, but only for one 7z file. However, whenever I start extracting a second 7z file, it won't create a new folder but instead keeps adding into the same first folder that was created; a second folder is never made. When I manually highlight all of the 7z files I want to extract, right-click and select "Extract to "*\"", it does what I would like it to do, but I can't figure out how to script this action. I should also mention that some of the 7z files, when extracted, can contain subfolders of the same name. I'm not sure if this is throwing off the recursion cycle, but I'm assuming this might be the case.
Any help or advice on this topic would be greatly appreciated!
If you get all the .7z files as [IO.FileInfo] objects (using Get-ChildItem), you can use Mathias' comment as one way to do this with the pipeline, but I recommend you put this inside a loop and look for a better way to choose the folder names, e.g. "NofFolder_$_.BaseName", just in case there is more than one folder with the same name.
It really depends on the format you want.
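As a concrete version of that suggestion, a sketch that names each output folder after the archive's BaseName (the 7z.exe location and the source/target paths are placeholders):
# extract every .7z into its own folder named after the archive,
# e.g. a1.7z -> C:\extracted\a1\<contents>
$sevenZip = 'C:\Program Files\7-Zip\7z.exe'
Get-ChildItem -Path 'C:\archives' -Filter *.7z | ForEach-Object {
    $outDir = Join-Path 'C:\extracted' $_.BaseName
    & $sevenZip x $_.FullName ("-o" + $outDir) -y
}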

Delete user folder even with all contents, PowerShell

I know this sort of question has been asked many times, but no one ever highlights the issue I am facing.
I have a script that will look at user profiles and mark ones that are over X days old as ones to delete, then remove them. Remove-Item with -Force and -Recurse removes all folders/files apart from the standard NTFS junction points for all users. For these sorts of folders it gets access denied. I have even tried taking ownership of the user folders first - still it happens. On Windows 7 these folders are like:
C:\Users\<NAME>\My Documents
C:\Users\<NAME>\Start Menu
No matter how I write the script, it cannot delete the top-level user folder. With the same account and the same PC, if I just use Windows Explorer to right-click and delete, the folder is removed along with the sub-folders.
For the record, these are the methods I have tried:
Remove-Item (with -force -recurse)
[io.directory]::delete()
$variablename.delete()
I could post the script, but it is kind of irrelevant, as the bulk of it works; it's just these junction points.
I suppose this is my question - how do I invoke the same delete command Windows Explorer is using from within PowerShell?
Thanks in advance.
Try this answer; it shows you how to remove symlinks, and you can incorporate that into your code: Delete broken link
Post errors if it doesn't work.
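In the same spirit as that link, a sketch that removes the junctions themselves first and only then deletes the profile (the profile path is a placeholder, and this only looks one level deep; Windows 7 profiles also hide junctions further down, e.g. under AppData, so treat it as a starting point):
# delete each junction/reparse point directly under the profile
# (this removes the link itself, not the folder it points to),
# then remove what is left of the profile
$profilePath = 'C:\Users\SomeUser'
Get-ChildItem -Path $profilePath -Force |
    Where-Object { $_.Attributes -band [IO.FileAttributes]::ReparsePoint } |
    ForEach-Object { $_.Delete() }
Remove-Item -Path $profilePath -Recurse -Force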
For me this has to do with the read-only attribute being set on folders. If I create a junction, I have the habit of immediately changing the icon for it as shown in Explorer. This sets the read-only attribute of the junction, which you can't change with Explorer, but I can change it with the attrib command:
attrib -r /d /s Junk
where Junk is a symbolic link to a folder. After that, I can remove the folder with the rm command.