This is my first post on Stack Overflow; for many years I have just read the many fantastic questions, answers and other posts here, and I have learned a lot from this fantastic community. Now that I have taken the brave step to really sink my teeth into PowerShell and join this community, I hope I may be able to contribute in some way!
So I have started working on a project which, at its core, lists all files that are older than 7 years so they can then be reviewed and deleted where possible.
I have, however, broken the script up into several stages, and I am currently stuck at a step in stage 2.
I have been stuck for about 2 days on what, to many of you PowerShell geniuses out there, may only take 10 minutes to figure out!
I must apologise for my stupidity and lack of knowledge; my experience with PowerShell scripting is limited to literally 5 working days. I am currently diving in and learning with books, but I also have a job to do, so I don't get to learn the easy way!
My script essentially has 3 steps:
Run Get-Acl against the top-level DATA folders to create a listing of all the groups that have permissions on each folder. I want to then either export this data or simply hold it for the next step.
Filter this gathered information based on a CSV which contains a column labelled Role (Role will contain a group that the Folder Manager is exclusively in), and check the members of this exclusive group (maybe this last bit needs to be another step as well? see the sketch just after this list).
Store or export this list of exclusive members, with their relevant folder, to later use as a variable when sending an email listing the files that need to be deleted.
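For the membership-check part of step 2, this is roughly what I have in mind (just a sketch, assuming the ActiveDirectory module is available; the group name is made up):
Import-Module ActiveDirectory
$roleGroup = 'FolderManagers-Accounts'   # illustrative role/group name from the CSV
Get-ADGroupMember -Identity $roleGroup -Recursive |
    Where-Object { $_.objectClass -eq 'user' } |
    Get-ADUser -Properties EmailAddress |
    Select-Object Name, SamAccountName, EmailAddress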
With the script below I am essentially stuck on step 2 and how to create a filter from the CSV (or stored variables?) and apply it to the Get-Acl foreach loop. I may be going about this entirely the wrong way using regex; to be honest, most of this is copy and paste from reading around the internet where people have done similar tasks. So again, I apologise if this is just a dumb way to go about it from the start!
I want to thank everyone in advance for all the help, opinions and advice; I will listen to it all and try to take on board as much as my brain can handle - I promise!
#$RoleList = import-csv "C:\DATA\scripts\GB_CMS\CSV\datafolders_rolelist.csv"
#foreach ($Manager in $RoleList) {
#$FolderManager = $RoleList.Role
$FolderManagers = Import-Csv C:\DATA\scripts\GB_CMS\CSV\datafolders_rolelist.csv | ForEach-Object {
    New-Object PSObject -Property @{
        Folder  = $_.Folder
        Manager = $_.'Folder Manager'
        Role    = $_.Role
    }
}
$Role = $FolderManagers.Role
$Role
gci "c:\DATA" | Where {$_.PSIsContainer} | get-acl |
ForEach $_.Name {
[regex]$regex="\w:\\\S+"
$path=$regex.match($_.Path).Value
$_ | select -expand access |
$
where {$_.identityreference -like "$Role"} |
Select #{Name="Path";Expression={$Path}},IdentityReference
}
Thanks,
Daniel.
Bit of a guess at what you want here. e.g. if you have folders
C:\Data\Accounts
C:\Data\Marketing
C:\Data\Sales
You might have permissions
C:\Data\Accounts {'FolderManagers-Accounts', 'Accounts', 'Directors'}
C:\Data\Marketing {'FolderManagers-Marketing', 'Marketing', 'Sales'}
C:\Data\Sales {'FolderManagers-Sales', 'Sales', 'Directors'}
and your CSV is
Name, Role, Email
Alice, FolderManagers-Accounts, alice@example.com
Bob, FolderManagers-Marketing, bob@example.com
And there will be a clear mapping of one (1) row in the CSV to one of the groups in the ACLs.
And you want, from your script:
Identify who to email about "C:\Data\Accounts"
How close am I?
# Import the managers. This will turn the CSV into an array of objects,
# so there is no need to do that explicitly.
$FolderManagers = Import-Csv C:\DATA\scripts\GB_CMS\CSV\datafolders_rolelist.csv

# This will be a hashtable pairing up folder names with people
# e.g. 'C:\DATA\Accounts' -> Alice
$FolderMap = @{}

# Run through all the folders
Get-ChildItem -Path "C:\Data" -Directory | ForEach-Object {

    # Run through the group/user ACL entries on the folder
    foreach ($group in (Get-Acl $_.FullName).Access.IdentityReference)
    {
        # Look for a matching row in the CSV
        # (the group, e.g. DOMAIN\FolderManagers-Accounts, is the string being
        #  tested and the Role column is used as the pattern)
        $CsvRow = $FolderManagers | Where-Object { $group -match $_.Role }

        if (-not $CsvRow)
        {
            Write-Error "No manager found for folder $_"
        }
        else
        {
            # Add to the map
            # $_ converts to folder path, C:\DATA\Accounts
            # $CsvRow is the person, @{Name=Alice; Role=...; Email=...}
            $FolderMap[$_.FullName] = $CsvRow
        }
    }
}
Then it (the FolderMap) will be
Name                Value
----                -----
C:\Data\Accounts    {Name='Alice'; Role=...
C:\Data\Marketing   {Name='Bob'; Role=...
you can query it with
$person = $FolderMap["c:\data\Accounts"]
$person.email
and if you really want to export it, maybe
$FolderMap | ConvertTo-Json | Set-Content foldermanagers.json
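If you really want a CSV instead of JSON, note that Export-Csv won't take the hashtable directly; a sketch of flattening it first (the column names assume the example CSV above):
$FolderMap.GetEnumerator() |
    Select-Object @{N='Folder'; E={$_.Key}},
                  @{N='Manager'; E={$_.Value.Name}},
                  @{N='Email'; E={$_.Value.Email}} |
    Export-Csv foldermanagers.csv -NoTypeInformation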
Nb. I wrote most of this off the top of my head, and it probably won't just run. And that's a problem with big, not very specific questions on StackOverflow.
I am looking into recreating the same results as the Get-NTFSEffectiveAccess cmdlet provided in the NTFSSecurity module. Unfortunately, I need to re-invent the wheel using just PowerShell (as it's the only thing I know).
For the most part, I feel like I'm on the right track with my code:
$Path = "\\MyFile\Share\Path"
$User = 'Abe'
$ADUC = @(Get-ADUser -Identity $User -Properties DisplayName)
$ACL = Get-Acl -Path $Path
$Groups = $ACL.Access.IdentityReference.Where{$_ -notmatch "BUILTIN"} -replace "AREA52\\",""
foreach ($Group in $Groups) {
if ($ADUC.DistinguishedName -in $((Get-ADGroup -Identity $Group -Properties members).members)) {
[array]$ACL.Access.Where{ $_.IdentityReference -match $Group } |
Select-Object -Property @{
Name = 'DisplayName';
Expression = {
$ADUC.DisplayName
}
},@{
Name = 'GroupName';
Expression = {
$Group
}
}, FileSystemRights, AccessControlType
}
else {
#$ADUC.DisplayName + " not in " + $Group
}
}
. . . but I am stuck. Stuck in regards to what the logic should be. So I'm trying to do the following:
Compare the groups that the user is in to one another, to determine what actual rights they have.
The biggest issue is probably this: we manage folder permissions by groups, and do not add users directly to the folder with specific rights.
I am also trying to list whether the group's (user's) permission applies to the current folder only, or to the sub-directories as well,
just like the output of Get-NTFSEffectiveAccess.
Example:
Account  Access Rights  Applies to                    Type   IsInherited  Group
-------  -------------  ----------                    ----   -----------  -----
Abraham  Read           ThisFolderSubfoldersAndFiles  Allow  False        Grp1
Jrose    Read           ThisFolderSubfoldersAndFiles  Allow  False        Grp1
QUESTION: Is there a way I could compare the groups the user is in to one another and get the dominant access to that folder, like the Windows Effective Permissions function?
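To illustrate the kind of combining I mean, here is just a sketch ($MatchedAces is a made-up variable holding the ACEs already collected for the groups the user is in): union the Allow rights, then mask out anything denied.
$allow = 0
$deny  = 0
foreach ($ace in $MatchedAces) {   # $MatchedAces: hypothetical collection of matched ACEs
    if ($ace.AccessControlType -eq 'Allow') { $allow = $allow -bor [int]$ace.FileSystemRights }
    else                                    { $deny  = $deny  -bor [int]$ace.FileSystemRights }
}
# Effective rights = everything allowed that is not explicitly denied
[System.Security.AccessControl.FileSystemRights]($allow -band (-bnot $deny))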
Reasons on why I'd like to re-invent the wheel:
The environment I work in is very strict on which modules are installed on computers and, unfortunately, the NTFSSecurity module is not allowed.
Why not? Trying to become more edumecated:)
Been googling and looking at articles all day, with one question on Experts Exchange that had a similar problem, but. . . I'm not going to pay for that. haha
Would like to mention that this isn't a task, but a problem, as I just can't understand the proper logic to go by here to get this done. I mentioned both my goals, but only asked for assistance with one problem, as asking for both might come off as unfair.
Hello & thanks in advance for the help!
Looking to delete computers (Workstations OU) in AD if they meet a certain criteria.
I need to make sure they have the "LOCATION" part of the canonical name in common before proceeding to delete. If they are not at my location, that could be a reason to investigate, and I do not want to delete them. This is an example of one PC (the capitalised parts are fields I changed):
ORGANIZATION.COM/Workstations/BUSINESS UNIT/Desktops/LOCATION/COMPUTER NAME
I currently have the following script that will print them to a .csv, which is helpful, but to take it one step further it would be nice to print this on screen, review it quickly, and then proceed with the delete. Any tips??
Get-Content C:\Temp\Powershell\hosts.txt | ForEach-Object {
Get-ADComputer $_ -Properties Name,CanonicalName |Select-Object Name,CanonicalName
} -ErrorAction Ignore | Export-Csv C:\Temp\Powershell\Output.csv
Or even a second line of code I can utilize the output.csv with, not sure where to go from here...
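One possible next step (a sketch only, untested; the location string is a placeholder) is to filter the exported CSV on the LOCATION part of the CanonicalName, review the matches on screen, and only then delete with confirmation:
$location = 'LOCATION'   # placeholder for your site name
$candidates = Import-Csv C:\Temp\Powershell\Output.csv |
    Where-Object { $_.CanonicalName -like "*/$location/*" }

# Review on screen first
$candidates | Format-Table Name, CanonicalName

# Then delete, prompting for each one (swap -Confirm for -WhatIf to do a dry run)
$candidates | ForEach-Object { Get-ADComputer $_.Name } | Remove-ADComputer -Confirm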
Again, Thanks!
Added -Recursive and it seems to be working as expected.
I am new to PowerShell scripting, like brand new. I have some experience using Exchange PowerShell, but that's always been for very specific items like adjusting calendar permissions and such. Nothing too robust.
Currently I am working on a PowerShell script to push out via Group Policy that will run a search on each domain PC. I've been getting help from a co-worker, but he isn't available right now and I have a hard time following him sometimes. I am hoping this site and its users might be able to assist me. What I am trying to do (and I believe I am close) is pull a list of drives for each computer on the domain. Once I pull that list, I pipe it into a variable and then search that variable for any files that end with .pst. Once the search is complete, if there were results, a file should be created with the FullName (path) of each file, and the computer name should be used for naming the file. If there are no results from the search, the file would be empty but should still be named after the computer. I believe I have gotten everything correct except that I do not know how to name the file based on the computer name. Thank you for your time and help with this.
Here is my code so far:
$drives = Get-WmiObject -Query "SELECT * FROM win32_logicaldisk WHERE DriveType = '3'" |
    Select-Object deviceid

foreach ($drive in $drives) {
    # search each fixed drive for .pst files and collect their paths
    $pstfound = Get-ChildItem $drive.deviceid *.pst -Recurse | Select-Object fullname
    $pst += $pstfound
}

if ($null -eq $pst) {
    $pst | Out-File "\\Servername\Searchresults\Null"
} else {
    $pst | Out-File "\\Servername\Searchresults\HasItems"
}
Thank you. I wasn't initially planning on using the UNC path but changed it up anyway, and I think that will make it easier to go through later. I also figured out my issue with naming the generated file after the computer it ran on: I just set a variable $hostname = hostname and then set the files as \$hostname.csv
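Another option, just as a sketch, is the built-in $env:COMPUTERNAME variable, which avoids the extra hostname call (the share path below is the same placeholder as above):
$pst | Out-File "\\Servername\Searchresults\$($env:COMPUTERNAME).csv"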
I have been tasked with reporting all of the ACLs on each folder in our shared drive structure. Added to that, I need to do a lookup of the membership of each unique group that gets returned.
I'm using the NTFSSecurity module in conjunction with the Get-ChildItem2 cmdlet to get past the 260-character path length limit. The paths I am traversing are many hundreds of folders deep and long since passed the 260-character limit.
I have been banging on this for a couple of weeks. My first challenge was crafting my script to do the task all at once, but now I'm thinking that's my problem... The issue at hand is resources, specifically memory exhaustion. Once the script gets into one of the deep folders, it consumes all RAM and starts swapping to disk, and I eventually run out of disk space.
Here is the script:
$csvfile = 'C:\users\user1\Documents\acl cleanup\dept2_Dir_List.csv'
foreach ($record in Import-Csv $csvFile)
{
$Groups = get-childitem2 -directory -path $record.FullName -recurse | Get-ntfsaccess | where -property accounttype -eq -value group
$groups2 = $Groups | where -property account -notmatch -value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
$groups3 = $groups2 | select account -Unique
$GroupMembers = ForEach ($Group in $Groups3) {
(Get-ADGroup $Group.account.sid | Get-ADGroupMember | Select-Object Name, @{N="GroupName";e={$Group.Account}}
)}
$groups2 | select FullName,Account,AccessControlType,AccessRights,IsInherited | export-csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name).csv"
$GroupMembers | export-csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name)_GroupMembers.csv"
}
NOTE: The dir list it reads in is the top level folders created from a get-childitem2 -directory | export-csv filename.csv
During the run, it appears not to be flushing memory properly. This is just a guess from observation. At the end of each pass through the loop the variables should be getting overwritten, I thought, but memory usage doesn't go down, so it looks to me like the memory isn't being released properly. Like I said, a guess... I have been reading about runspaces, but I am confused about how to implement them with this script. Is that the right direction for this?
Thanks in advance for any assistance...!
Funny you should post about this, as I just finished a modified version of the script that I think works much better. A friend turned me on to 'function filters' that seem to work well here. I'll test it on the big directories tomorrow to see how much better the memory management is, but so far it looks great.
# Define the function 'filter' here and call it 'GetAcl'. PROCESS is the keyword that tells the function to deal with each item in the pipeline one at a time.
Function GetAcl {
PROCESS {
Get-NTFSAccess $_ | where -property accounttype -eq -value group | where -property account -notmatch -value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
}
}
#Import the directory top level paths
$Paths = import-csv 'C:\users\rknapp2\Documents\acl cleanup\dept2_Dir_List.csv'
# Process each line from the imported CSV one at a time and run Get-ChildItem2 against it.
# Notice the second part - I pipe the results of Get-ChildItem2 into the function, which, because of the type of function it is, handles each item one at a time.
# When done, pass the results to Export-Csv and send them to a file named after the path name. This puts each dir into its own file.
ForEach ($Path in $paths) {
    (Get-ChildItem2 -Path $path.FullName -Recurse -Directory) | GetAcl |
        Export-Csv "C:\Users\rknapp2\Documents\acl cleanup\TestFilter\$($path.name).csv"
}
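As a side note, PowerShell's filter keyword is shorthand for exactly this kind of process-only function; an equivalent sketch of the same thing:
filter GetAcl {
    Get-NTFSAccess $_ |
        Where-Object -Property AccountType -EQ -Value Group |
        Where-Object -Property Account -NotMatch -Value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
}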
I spent some time searching through similar questions on here to see if I could find some answers, but I'm so clueless about AD that I'm not even sure how to tell if I'd found what I was looking for...
I have a number of folders in one place. All these folders are similarly named:
Reports_January_2011
Reports_March_2012
Reports_March_2012
All of these folders have a pair of identically named subfolders:
Export
Raw
I need to see all groups that have any permissions configured in these folders. Basically I have a Reports folder for every month for the past 5 years, each of those with the two subfolders. I need to make sure they all have the right groups added to them.
I started trying to figure out the regex to pick out only the right reports folders, but I'm totally lost on where to start for the "Get groups" part of the script. My experience with PS is limited to batch renaming, moving, etc. Simple one line stuff.
There is a nice PowerShell module (File System Security PowerShell Module 1.3) that could make your life easier. With that module in place, you can use the Get-Ace cmdlet to list permissions for files using a command like the one below:
Get-Item F:\backup | Get-Ace | Where-Object { $_.ID -like "*users*" }
Have a look at it.
You can try something like this:
dir c:\ | ? { $_.PSIsContainer } |
    Get-Acl | fl -Property @{n="Path"; E={ Convert-Path $_.PSPath }},
        @{N="AccessList"; E={ $_.AccessToString -split '\n' | ? { $_.StartsWith("MyDomain") } }}
to get a list with the path and the access list. If you remove the last pipe (the "MyDomain" filter), local users are listed as well.
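To tie that back to the Reports folders from the question, here is a rough sketch; the root path and the regex are assumptions based on the example names, and the "MyDomain" filter is borrowed from the snippet above:
Get-ChildItem 'D:\Reports' -Directory |                          # assumed root containing the Reports_* folders
    Where-Object { $_.Name -match '^Reports_[A-Za-z]+_\d{4}$' } |
    ForEach-Object { Get-ChildItem $_.FullName -Directory } |    # the Export and Raw subfolders
    ForEach-Object {
        $path = $_.FullName
        (Get-Acl $path).Access |
            Where-Object { $_.IdentityReference -like 'MyDomain\*' } |
            Select-Object @{N='Path'; E={$path}}, IdentityReference, FileSystemRights
    }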