I know this sort of question has been asked many times, but no one seems to highlight the issue I am facing.
I have a script that looks at user profiles, marks ones older than X days for deletion, and then removes them. Remove-Item with -Force and -Recurse removes all folders/files apart from the standard NTFS junction points each profile contains. For these folders it gets access denied. I have even tried taking ownership of the user folders first; it still happens. On Windows 7 these folders are:
C:\Users\<NAME>\My Documents
C:\Users\<NAME>\Start Menu
No matter how I write the script, it cannot delete the top-level user folder. With the same account on the same PC, if I just use Windows Explorer to right-click and delete, the folder is removed along with its sub-folders.
For the record, these are the methods I have tried:
Remove-Item (with -force -recurse)
[io.directory]::delete()
$variablename.delete()
I could post the script, but it is mostly irrelevant: the bulk of it works, it's just these junction points.
I suppose this is my real question: how do I invoke the same delete command Windows Explorer uses from within PowerShell?
Thanks in advance.
Try this answer; it shows you how to remove symlinks, and you can incorporate that into your code: Delete broken link
Post errors if it doesn't work.
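As a starting point, here is a minimal sketch of one common approach: delete the reparse points (junctions) themselves first, without recursing into them, then remove the rest of the profile tree. The profile path is an example, and note that on older PowerShell versions Get-ChildItem -Recurse may still try to descend into junctions.

```powershell
# Sketch: remove junction points inside a profile before deleting the tree.
# 'C:\Users\SomeUser' is an example path, not from the original question.
$profilePath = 'C:\Users\SomeUser'

Get-ChildItem -Path $profilePath -Recurse -Force |
    Where-Object { $_.Attributes -band [IO.FileAttributes]::ReparsePoint } |
    ForEach-Object { $_.Delete() }   # deletes the link itself, not its target

Remove-Item -Path $profilePath -Recurse -Force
```

Calling .Delete() on the DirectoryInfo of a junction removes only the link, which avoids the access-denied errors Remove-Item -Recurse can hit when it tries to walk through the junction.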
For me this has to do with the read-only attribute being set on folders. If I create a junction, I have the habit of immediately changing the icon for it as shown in Explorer. This sets the read-only attribute of the junction, which you can't change with Explorer, but I can change it with the attrib command:
attrib -r /d /s Junk
where Junk is a symbolic link to a folder. After that, I can remove the folder with the rm command.
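The attrib step above can also be done natively in PowerShell, in case you want to keep it all in one script. This is a sketch; 'Junk' is the example path from the answer.

```powershell
# Equivalent of `attrib -r /d /s Junk`: clear the ReadOnly attribute
# on the junction itself, then remove it.
$junction = Get-Item -Path 'Junk' -Force
$junction.Attributes = $junction.Attributes -band (-bnot [IO.FileAttributes]::ReadOnly)
Remove-Item -Path 'Junk' -Force
```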
Related
I am running Matlab R2020b on Windows 10.
Is there a way to move a folder with content to recycle bin programmatically in Matlab?
For example, the following works for single files
previousState=recycle('on');
delete(filename); % if replaced with rmdir(DIR,'s');, folder is deleted permanently
recycle(previousState);
but the same toggle doesn't work for folders. Is there a way?
The only possible workaround I can think of is to use a wildcard: delete(fullfile(DIR,'*')) and then rmdir(DIR) on the empty folder. But that doesn't work for my application. I wish to preserve temporary copies of folders in the recycle bin in case the script that manipulates them fails in some unexpected way, in which case I get a second chance to see the original files. There are hundreds of folders, each containing hundreds to thousands of files in this particular use case. The wildcard approach does put individual files into the recycle bin, but it loses the original folder structure, making it impractical to selectively recover folders. Hence the question.
Since the OS is specifically Windows 10, one can use PowerShell as shown here.
For example, with the following recycleFile.ps1 file in C:\
Param(
[string]$filePath # can be folder
)
$sh = new-object -comobject "Shell.Application"
$ns = $sh.Namespace(0).ParseName($filePath)
$ns.InvokeVerb("delete")
the following passage in Matlab will move the folder fullfilepath to the recycle bin:
[status,cmdout]=system(['powershell.exe -ExecutionPolicy Bypass -File "',...
fullfile('C:','recycleFile.ps1'),'" "',...
fullfilepath,'"']);
There are several downsides, however.
The operation is equivalent to context menu -> Delete. It will bring up the recycle progress UI, which seizes the foreground. It is also somewhat slow, and it may even require user confirmation if the folder in question is not eligible for recycling but only for direct deletion.
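If the progress dialog is a problem, one possible alternative is the Microsoft.VisualBasic FileIO API, which can send a folder to the recycle bin without the Explorer UI. This is a sketch; the path is an example, and I have not benchmarked it against the Shell.Application approach.

```powershell
# Send a folder to the recycle bin via the VisualBasic FileIO API,
# suppressing all dialogs except errors. Path is an example.
Add-Type -AssemblyName Microsoft.VisualBasic
[Microsoft.VisualBasic.FileIO.FileSystem]::DeleteDirectory(
    'C:\temp\SomeFolder',
    [Microsoft.VisualBasic.FileIO.UIOption]::OnlyErrorDialogs,
    [Microsoft.VisualBasic.FileIO.RecycleOption]::SendToRecycleBin)
```

The same system() call from Matlab would work with this snippet in place of the InvokeVerb("delete") line.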
I have a set of folders which have new files being added to them regularly, and I have to process those files as they come in. Digging into each folder one by one is time-consuming. I need to figure out how to write a script that will filter out the new files and copy them into a new directory.
So far I have figured out how to use the Get-ChildItem -Path -Recurse command in PowerShell to list the new items in the corresponding folders, as shown in the third script on this Microsoft page.
So I can see the new files in their folders. How do I copy those items to the destination folder while replicating their original folder structure? I want to be able to recreate the original folders so that I can just overwrite the originals with the edited versions later.
I discovered robocopy and was able to use it to solve my problem. The /maxage:x option was perfect for my needs.
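For anyone landing here later, a minimal sketch of that robocopy approach looks like this. The paths and the 7-day cutoff are examples, not from the original question.

```powershell
# Copy only files modified within the last 7 days, recreating the
# source folder structure under the destination. /S copies subfolders.
robocopy 'C:\Source' 'D:\Staging' /S /MAXAGE:7
```

Because robocopy preserves the relative folder layout, the edited copies can later be robocopied back over the originals.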
I am moving CIFS share files and subfolders from one system to another, and I want to set the top level folder at the destination to have same ACLs as the top level folder at the source. In some cases this is up to 25 users and groups.
Is there a way to get the ACLs from the source top level folder, and pipe that output so it is applied to the top level destination folder?
You can copy an ACL very easily:
Get-Acl -Path <SourceFolder> | Set-Acl -Path <DestinationFolder>
But this isn't very flexible: it only takes the ACL from one folder and applies it to another. Given you are going to copy a whole tree, your mileage may vary.
Robocopy is often used in these situations with the /COPYALL parameter. You can create the tree without copying with /CREATE. You may have to tinker around to get it to do only one folder. Hard to say without knowing the particulars of your project, but if you're interested check the help file.
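Another option worth mentioning is icacls, which can save and restore ACLs without copying any data. This is a sketch with example paths; note that /save without /T captures only the one folder, and /restore matches the saved (relative) folder name against the directory you pass, so it works most cleanly when the destination folder has the same name as the source.

```powershell
# Save the ACL of just the top-level source folder, then restore it
# under the destination parent. Paths are examples.
icacls 'C:\Share\Source' /save 'C:\temp\acl.txt'
icacls 'D:\Share' /restore 'C:\temp\acl.txt'
```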
I'd also point out there is an awesome NTFS module here. I use it all the time; it's very capable and very easy to script around.
Let me know if this is helpful.
I'm currently working on a project where I take multiple 7z files and extract the contents of each into a folder named the same way as the 7z file itself. I apologize if something like this has been answered already; I spent time investigating but can't find anyone else with a similar issue.
For example:
a1.7z -> <targetpath>/a1/<contents within a1.7z>
The following shell line: Get-ChildItem a1.7z | % {& "C:\Program Files\7-Zip\7z.exe" "x" $_.fullname "-o<targetpath>\a1" -y -r}
works like a dream, but only for one 7z file. However, whenever I start extracting a second 7z file, it won't create a new folder but will instead continue to add into the same first folder that was created; a second folder is never made. When I manually highlight all of the 7z files I want to extract, right-click and select "Extract to *\", it does what I would like, but I can't figure out how to script this action. I should also mention that some of the 7z files, when extracted, can contain subfolders of the same name. I'm not sure if this is throwing off the recursion, but I'm assuming it might be.
Any help or advice on this topic would be greatly appreciated!
If you get all the .7z files as FileInfo objects (using Get-ChildItem), you can use Mathias' comment as one way to do this with the pipeline, but I recommend you put it inside a loop and look for a better way to choose the folder names, e.g. "NofFolder_$($_.BaseName)", just in case there is more than one folder with the same name.
It really depends on the format you want.
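Putting that together, a sketch of the loop might look like the following. 'C:\Target' stands in for the question's <targetpath>, and the 7-Zip install location is taken from the question; the key change is deriving the output folder from each archive's BaseName instead of hard-coding it.

```powershell
# Extract each .7z into a folder named after the archive itself,
# so every archive gets its own output directory.
Get-ChildItem -Path '.' -Filter '*.7z' |
    ForEach-Object {
        & "C:\Program Files\7-Zip\7z.exe" x $_.FullName "-oC:\Target\$($_.BaseName)" -y
    }
```

This mirrors what Explorer's "Extract to *\" does: one destination folder per archive.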
I have a need to familiarize myself with PowerShell, and was looking for a point of reference for the particular problem I am trying to solve with it.
To keep it short: a while back someone gave permissions to 'Everyone' in a public-facing web directory with X number of websites running in a live environment. A lot of files and folders have the permissions applied, but it's like finding a needle in a haystack. We are trying to patch the server for security reasons, and as such need to locate these vulnerabilities (as we have no reason to have these permissions). I also have an immediate need to learn PowerShell for another project, so I would like to solve this problem with PowerShell scripting or commands. Ultimately, exporting the results would be good, but I've found resources for that.
Can someone provide me a jumping-off point? I have experience writing batch files, ASP.NET, VB.NET, jQuery, HTML, etc., and I can figure out the code; I just can't seem to turn up any results on Google.
Thank you!
This is pretty straightforward. You simply need to get all child directories, recursively, from the root directory (e.g. C:\test), and then filter that list to the directories whose access control list (ACL) contains "Everyone."
Here is the code to achieve this:
# Get child items (directories only) recursively, where the ACL contains 'Everyone'
Get-ChildItem c:\test -Directory -Recurse | Where-Object -FilterScript { (Get-Acl -Path $PSItem.FullName).AccessToString -match 'Everyone'; };
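Since the question mentions that files as well as folders are affected, here is a sketch of a variant that drops the -Directory switch to include files and writes the matches out for review. The output path is an example.

```powershell
# Find files and directories whose ACL mentions 'Everyone' and
# export the full paths to CSV for later review.
Get-ChildItem -Path 'C:\test' -Recurse |
    Where-Object { (Get-Acl -Path $_.FullName).AccessToString -match 'Everyone' } |
    Select-Object FullName |
    Export-Csv -Path 'C:\temp\everyone-acls.csv' -NoTypeInformation
```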