Somehow, I do not know how, I have managed to invoke Microsoft Azure Information Protection on, I guess, my user directory. I just want to undo it. How do I get rid of all these .pfile, .ptxt, .pjpg, .ppdf, etc. files?
You can try using PowerShell: https://learn.microsoft.com/en-us/powershell/module/azureinformationprotection/unprotect-rmsfile?view=azureipps
Something recursive like:
Get-ChildItem -Recurse -File |
    Where-Object { (Get-AIPFileStatus -Path $_.FullName).IsRMSProtected } |
    ForEach-Object { Unprotect-RMSFile -File $_.FullName }
I haven't tested that, but it looks like it'll work on paper.
I'm trying to come up with a way to quickly remove multiple files that live in similar child directories. I need to perform the removals quickly on several domain controllers so that replication doesn't put the deleted files back.
The file paths vary slightly, but enough that I was hoping to use wildcards (*) to cut the six possible path variations down to two.
Something along these lines (the {} characters actually exist in the paths, since the folder names are hash values):
C:\Windows\SYSVOL\domain\Policies\{*}\User*\Script\Logon\abc*
C:\Windows\SYSVOL\sysvol\xxxx\Policies\{*}\User*\Script\Logon\abc*
Initially I was thinking along the lines of using a recursive Select-String search and piping it to Remove-Item, but I was getting an access-denied error when I was just trying out the search without the pipe to Remove-Item.
gci -path C:\Windows\SYSVOL\ -rec | select-string -pattern "domain\Policies\{*}\User*\Script\Logon\abc*"
and
gci -path C:\Windows\SYSVOL\ -rec | select-string -pattern "sysvol\xxxx\Policies\{*}\User*\Script\Logon\abc*" | remove-item $_
(the remove-item at the end is basically pseudocode)
I tried escaping the { and } characters, but that didn't work either.
Once I have this working, in theory I would then need to use Invoke-Command to get this removal process to work its way through the list of systems that need it.
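Here's roughly what I'm picturing, completely untested; the computer names are just placeholders:

$paths = @(
    'C:\Windows\SYSVOL\domain\Policies\{*}\User*\Script\Logon\abc*',
    'C:\Windows\SYSVOL\sysvol\xxxx\Policies\{*}\User*\Script\Logon\abc*'
)
Invoke-Command -ComputerName DC01, DC02 -ScriptBlock {
    param($targets)
    foreach ($t in $targets) {
        # Get-ChildItem -Path accepts wildcards directly, so Select-String may not be needed
        Get-ChildItem -Path $t -File -ErrorAction SilentlyContinue |
            Remove-Item -WhatIf    # drop -WhatIf once the matches look right
    }
} -ArgumentList (,$paths)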
I don't know if this is the right approach / best approach, or if it is even doable.
Any advice would be greatly appreciated.
I'm working on a project where I have to apply a file to multiple folders every so often. I'm trying to learn some PowerShell commands to make this a little easier. I came up with the following script, which works, but I feel that this is too verbose and could be distilled down with a better script:
[string]$sourceDirectory = "C:\Setup\App Folder Files\*"
# Create an array of folders
$destinationDirectories = @(
'C:\Users\GG_RCB1\Documents\',
'C:\Users\GG_RCB2\Documents\',
'C:\Users\LA_RCB1\Documents\',
'C:\Users\PR_RCB1\Documents\',
'C:\Users\PQ_RCB1\Documents\',
'C:\Users\PQ_RCB2\Documents\',
'C:\Users\XC_RCB1\Documents\',
'C:\Users\XC_RCB2\Documents\',
'C:\Users\XC_RCB3\Documents\',
'C:\Users\XC_RCB4\Documents\',
'C:\Users\XC_RCB5\Documents\',
'C:\Users\XC_RCB6\Documents\',
'C:\Users\XC_RCB7\Documents\',
'C:\Users\XC_RCB8\Documents\')
# Perform iteration to create the same file in each folder
foreach ($i in $destinationDirectories) {
    Copy-Item -Force -Recurse -Verbose $sourceDirectory -Destination $i
}
I go into this process knowing that every folder in the Users folder area is going to have the same format: [A-Z][A-Z]_RCB<#>\Documents\
I know that I can loop through those files using this code:
Get-ChildItem -Path 'C:\Users'| where-object {$_.Name -match "^[A-Z][A-Z]_RCB"}
What I'm not sure how to do is, within that loop, drill down to the Documents folder and do the copy. I want to avoid having to keep updating the array from the first code sample, particularly since I know the naming convention of the subfolders in the Users folder. I'm just looking for a cleaner way to do this.
Thanks for any suggestions!
Ehh, I'll go ahead and post what I had in mind as well. Not to take away from @Mathias' suggestion in the comments, but here's my take:
Get-ChildItem -Path "C:\users\[A-Z][A-Z]_RCB*\documents" |
Copy-Item -Path $sourceDirectory -Destination { $_.FullName } -Recurse -WhatIf
Since everyone loves one-liners that can accomplish your needs: Get-ChildItem accepts wildcard expressions in its -Path, which lets us accomplish this in one go, given that your directories follow a consistent naming pattern ([A-Z][A-Z]_*) and the destination folder (Documents) is always the same.
Luckily, Copy-Item also has some cool features of its own, such as accepting a script block for -Destination, which lets us pass each item's $_.FullName property as the destination as the folders come down the pipeline one at a time.
Remove the -WhatIf common parameter once you've confirmed the results are what you're after.
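If you'd rather keep the explicit loop from your question, an untested equivalent (making the same assumptions about the folder layout) might look like this:

Get-ChildItem -Path 'C:\Users' -Directory |
    Where-Object { $_.Name -match '^[A-Z][A-Z]_RCB' } |
    ForEach-Object {
        # Build each destination from the matched profile folder
        $documents = Join-Path -Path $_.FullName -ChildPath 'Documents'
        Copy-Item -Path $sourceDirectory -Destination $documents -Recurse -Force -Verbose
    }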
Is there a way to manually clear the Pester TestDrive, other than something like Remove-Item "TestDrive:\" -Recurse -Force?
AFAIK there isn't a function to trigger a clear of the TestDrive, so yes, I would recommend using something like Remove-Item $TestDrive -Recurse -Force if you have a specific need to.
However, you should also be aware that a TestDrive is scoped to a Describe or Context block and is automatically cleaned up at the end of those blocks. So if you want to avoid conflicts between different usages of TestDrive, just put those tests in different Context blocks.
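As a quick illustration of that scoping (a minimal sketch using Pester v5 syntax; the file name is made up):

Describe 'TestDrive scoping' {
    Context 'first context' {
        It 'creates a file in the TestDrive' {
            Set-Content -Path 'TestDrive:\example.txt' -Value 'data'
            Test-Path 'TestDrive:\example.txt' | Should -BeTrue
        }
    }
    Context 'second context' {
        It 'does not see files created in the previous Context' {
            # Files created inside a Context are removed when that Context ends
            Test-Path 'TestDrive:\example.txt' | Should -BeFalse
        }
    }
}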
Let's say I wrote a PowerShell script that includes this command:
Get-ChildItem -Recurse
But instead I wrote:
Get-ChildItem -Re
I did that to save time. Now suppose that, after some time passes, I upgrade my PowerShell version and Microsoft has decided to add a parameter to Get-ChildItem called "-Return" which, for example, returns True or False depending on whether any items are found.
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected? I understand Microsoft's attempt to save me typing time, but this is my concern, and therefore I will probably always try to write the complete parameter name.
Unless of course you know something I don't. Thank you for your insight!
This sounds more like a rant than a question, but to answer:
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected?
Yes!
You should always use the full parameter names in scripts (or any other snippet of reusable code).
Automatic resolution of partial parameter names, aliases and other shortcuts are great for convenience when using PowerShell interactively. It lets us fire up powershell.exe and do:
ls -re *.ps1|% FullName
when we want to find the path to all scripts in the profile. Great for exploration!
But if I were to incorporate that functionality into a script I would do:
Get-ChildItem -Path $Home -Filter *.ps1 -Recurse |Select-Object -ExpandProperty FullName
not just for the reasons you mentioned, but also for consistency and readability - if a colleague of mine comes along and maybe isn't familiar with the shortcuts I'm using, he'll still be able to discern the meaning and expected output from the pipeline.
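You don't even need the hypothetical -Return parameter to see the failure mode; a prefix that is already ambiguous today shows what happens (illustrative snippet, not from the question):

# -F matches several parameters (-Filter, -Force, -File, ...), so instead of running,
# this fails with an error saying the parameter name 'F' is ambiguous:
Get-ChildItem -F *.ps1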
Note: There are currently three open issues on GitHub to add warning rules for this in PSScriptAnalyzer - I'm sure the project maintainers would love a hand with this :-)
If I run the following:
Measure-Command -Expression {gci -Path C:\ -Recurse -ea SilentlyContinue | where Extension -eq ".txt"}
Measure-Command -Expression {gci -Path C:\ -Filter *.txt -Recurse -ea SilentlyContinue}
The second expression is always faster than the first one; I'm guessing it's because it doesn't have to use the pipeline.
I thought maybe, in the pipeline method, PowerShell recursed my drive and then passed a collection of objects to the Where clause, which would have to iterate through the items again, but I ruled that out, because if you run the first expression you can see it return output as it is recursing. So why is the pipeline method slower?
Using Where-Object is always slower than using the built-in parameters of the command on the left-hand side. You first bring ALL objects to your shell and only then start filtering them (client-side filtering).
With regard to the -Filter parameter, it works faster because it operates at the provider level (server-side filtering): objects are checked as they are accessed, and you get back only the ones that match your criteria.
Shay's answer is totally correct. I wanted to touch on your secondary question a bit, though. Here's what's happening under the hood in the pipeline:
gci -Path C:\ -Recurse -ea SilentlyContinue | where Extension -eq ".txt"
gci will start searching for files and directories at or under C:\, with any extension. As it finds each one, that one result is passed on to Where-Object, which discards it if the extension is not .txt. If the extension is .txt, the object is passed on down the pipeline and out to the console (or to a variable, or whatever). Then gci continues its search; when it finds the next file, it passes it on, and so on. So although it might take a couple of minutes to search the entire C:\ drive, you get partial results streamed back to you almost immediately, since the pipeline operates one object at a time.
What is not happening is gci doing the full disk search all at once and only handing the complete result set to Where-Object once it has finished.
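If you want to see that one-at-a-time behaviour for yourself, a tiny toy pipeline (unrelated to the file search) makes it visible:

# Each value is emitted by the first stage and received by the second
# before the next value is produced, so the output interleaves:
# emitting 1, received 1, emitting 2, received 2, emitting 3, received 3
1..3 |
    ForEach-Object { Write-Host "emitting $_"; $_ } |
    ForEach-Object { Write-Host "  received $_" }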