Get ACL Folder & Subfolder + Users Using Powershell - powershell

Is it possible to get the permissions of a folder and its sub-folders, then display the path, group, and users associated with that group, so it looks something like this? Or will it have to be one folder at a time?
-Folder1
-Line separator
-Group
-Line separator
-List of users
-Folder2
-Line separator
-Group
-Line separator
-List of users
Here's the script I've come up with so far. Be warned, I have very little experience with PowerShell. (Don't worry, my boss knows.)
Param([string]$filePath)
$Version=$PSVersionTable.PSVersion
if ($Version.Major -lt 3) {Throw "Powershell version out of date. Please update powershell." }
Get-ChildItem $filePath -Recurse | Get-Acl | where { $_.Access | where { $_.IsInherited -eq $false } } | select -exp Access | select IdentityReference -Unique | Out-File .\Groups.txt
$Rspaces=(Get-Content .\Groups.txt) -replace 'JAC.*?\\|','' |
Where-Object {$_ -notmatch 'BUILTIN|NT AUTHORITY|CREATOR|-----|Identity'} | ForEach-Object {$_.TrimEnd()}
$Rspaces | Out-File .\Groups.txt
$ErrorActionPreference= 'SilentlyContinue'
$Groups=Get-Content .\Groups.txt
ForEach ($Group in $Groups)
{Write-Host ""; $Group; Write-Host --------------;
Get-ADGroupMember -Identity $Group -Recursive | Get-ADUser -Property DisplayName | Select Name}
This only shows the groups and users, but not the path.

Ok, let's take it from the top! Excellent, you actually declare a parameter. What you might want to consider is setting a default value for the parameter. What I would do is use the current directory, which is conveniently available in the automatic variable $PWD (the current working directory).
Param([string]$filePath = $PWD)
Now if a path is provided it will use that, but if no path is provided it just uses the current folder as a default value.
Version check is fine. I'm pretty sure there are more elegant ways to do it, but I honestly haven't done much version checking myself.
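As a side note (purely optional), one alternative I'm aware of is a #Requires statement at the very top of the script, which makes PowerShell itself refuse to run the file on older versions, so no manual check is needed:
#Requires -Version 3.0
#PowerShell will not load this script at all on anything older than 3.0,
#so the $PSVersionTable check becomes unnecessary.
Param([string]$filePath = $PWD)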
Now you are querying AD for each group and user that is found (after some filtering, granted). I would propose that we keep track of groups and members so that we only have to query AD once for each one. It may not save a lot of time, but it'll save some if any group is used more than once. So for that purpose we're going to make an empty hashtable to track groups and their members.
$ADGroups = @{}
Now starts a bad trend... writing to files and then reading those files back in. Outputting to a file is fine, or saving configurations, or something that you'll need again outside of the current PowerShell session, but writing to a file just to read it back into the current session is just a waste. Instead you should either save the results to a variable, or work with them directly. So, rather than getting the folder listing, piping it directly into Get-Acl, and losing the paths we're going to do a ForEach loop on the folders. Mind you, I added the -Directory switch so it will only look at folders and ignore files. This happens at the provider level, so you will get much faster results from Get-ChildItem this way.
ForEach($Folder in (Get-ChildItem $filePath -Recurse -Directory)){
Now, you wanted to output the path of the folder, and a line. That's easy enough now that we aren't ditching the folder object:
$Folder.FullName
'-'*$Folder.FullName.Length
Next we get the ACLs for the current folder:
$ACLs = Get-Acl -Path $Folder.FullName
And here's where things get complicated. I'm getting the group names from the ACLs, but I've combined a couple of your Where statements, and also added a check to see if it is an Allow rule (because including Deny rules in this would just be confusing). I've used ?, which is an alias for Where, as well as %, which is an alias for ForEach-Object. You can have a natural line break after a pipe, so I've done that for ease of reading. I included comments on each line for what I'm doing, but if any of it is confusing just let me know what you need clarification on.
$Groups = $ACLs.Access | #Expand the Access property
?{ $_.IsInherited -eq $false -and $_.AccessControlType -eq 'Allow' -and $_.IdentityReference -notmatch 'BUILTIN|NT AUTHORITY|CREATOR|-----|Identity'} | #Only instances that allow access, are not inherited, and aren't a local group or special case
%{$_.IdentityReference -replace 'JAC.*?\\'} | #Expand the IdentityReference property, and replace anything that starts with JAC all the way to the first backslash (likely domain name trimming)
Select -Unique #Select only unique values
Now we'll loop through the groups, starting off by outputting the group name and a line.
ForEach ($Group in $Groups){
$Group
'-'*$Group.Length
For each group I'll see if we already know who's in it by checking the list of keys on the hashtable. If we don't find the group there we'll query AD and add the group as a key, and the members as the associated value.
If($ADGroups.Keys -notcontains $Group){
$Members = Get-ADGroupMember $Group -Recursive -ErrorAction Ignore | % Name
$ADGroups.Add($Group,$Members)
}
Now that we're sure that we have the group members we will display them.
$ADGroups[$Group]
We can close the ForEach loop pertaining to groups, and since this is the end of the loop for the current folder we'll add a blank line to the output, and close that loop as well
}
"`n"
}
So I wrote this up and then ran it against my C:\temp folder. It did tell me that I need to clean up that folder, but more importantly it showed me that most of the folders don't have any non-inherited permissions, so it would just give me the path with an underline, a blank line, and move to the next folder so I had a ton of things like:
C:\Temp\FolderA
---------------
C:\Temp\FolderB
---------------
C:\Temp\FolderC
---------------
That doesn't seem useful to me. If it is to you then use the lines above as I have them. Personally I chose to get the ACLs, check for groups, and then if there are no groups move to the next folder. The below is the product of that.
Param([string]$filePath = $PWD)
$Version=$PSVersionTable.PSVersion
if ($Version.Major -lt 3) {Throw "Powershell version out of date. Please update powershell." }
#Create an empty hashtable to track groups
$ADGroups = @{}
#Get a recursive list of folders and loop through them
ForEach($Folder in (Get-ChildItem $filePath -Recurse -Directory)){
# Get ACLs for the folder
$ACLs = Get-Acl -Path $Folder.FullName
#Do a bunch of filtering to just get AD groups
$Groups = $ACLs |
% Access | #Expand the Access property
where { $_.IsInherited -eq $false -and $_.AccessControlType -eq 'Allow' -and $_.IdentityReference -notmatch 'BUILTIN|NT AUTHORITY|CREATOR|-----|Identity'} | #Only instances that allow access, are not inherited, and aren't a local group or special case
%{$_.IdentityReference -replace 'JAC.*?\\'} | #Expand the IdentityReference property, and replace anything that starts with JAC all the way to the first backslash (likely domain name trimming)
Select -Unique #Select only unique values
#If there are no groups to display for this folder move to the next folder
If($Groups.Count -eq 0){Continue}
#Display Folder Path
$Folder.FullName
#Put a dashed line under the folder path (using the length of the folder path for the length of the line, just to look nice)
'-'*$Folder.FullName.Length
#Loop through each group and display its name and users
ForEach ($Group in $Groups){
#Display the group name
$Group
#Add a line under the group name
'-'*$Group.Length
#Check if we already have this group, and if not get the group from AD
If($ADGroups.Keys -notcontains $Group){
$Members = Get-ADGroupMember $Group -Recursive -ErrorAction Ignore | % Name
$ADGroups.Add($Group,$Members)
}
#Display the group members
$ADGroups[$Group]
}
#Output a blank line, for some separation between folders
"`n"
}
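For what it's worth, if you save all of that as a script file (the name Get-FolderPermissions.ps1 below is just a placeholder, as is the share path), running it against a share and capturing the report might look something like this:
#Hypothetical file and path names - substitute your own
.\Get-FolderPermissions.ps1 -filePath '\\server\share\Folder1' | Out-File .\FolderPermissions.txt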

I have managed to get this working for me.
I edited the below section to show the Name and Username of the user.
$Members = Get-ADGroupMember $Group -Recursive -ErrorAction Ignore | % Name | Get-ADUser -Property DisplayName | Select-Object DisplayName,Name | Sort-Object DisplayName
This works really well, but would there be a way to get it to stop listing the same group access if it's repeated down the folder structure?
For example, "\example1\example2" was assigned a group called "group1" and we had the following folder structure:
\\example1\example2\folder1
\\example1\example2\folder2
\\example1\example2\folder1\randomfolder
\\example1\example2\folder2\anotherrandomfolder
All the folders are assigned the group "group1", and the current code will list each directory's group and users, even though it's the same. Would it be possible to get it to only list the group and users once if it's repeated down the directory structure?
I tried -notcontains for this, but it doesn't seem to work for me. I hope that makes sense?
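Not a definitive answer, but one possible sketch for that last part, building on the $ADGroups idea above: keep a second collection of groups that have already been reported (the $ReportedGroups name is made up here) and filter them out for deeper folders:
#Before the folder loop, next to $ADGroups
$ReportedGroups = @()
#Then, inside the folder loop, right after $Groups is built:
#drop any group already reported for a folder higher up the tree
$Groups = $Groups | Where-Object { $ReportedGroups -notcontains $_ }
If($Groups.Count -eq 0){Continue}
$ReportedGroups += $Groups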

Related

Windows PowerShell script for verifying several Active Directory groups exist

I need to run a PowerShell script to verify that a huge list of Active Directory groups exist based on an .xlsx file. I would also like to see the owners of the AD groups if possible.
I can run separate scripts, if needed. Please help.
Reading and writing to an excel file from PowerShell is its own adventure, but one that is heavily documented if you google it.
For the query you are asking about, it's important to know whether you have the group's Display Name or the group's SamAccountName. If you have both, you could put an -or in the filter.
Here is a quick example of how it could work based on a text list.
$list = 'Cross Functional Team
Security Group
Bobs Team' -split([Environment]::NewLine)
$SelectSplat = @{
Property = @{n='Name';E={$_}},
@{N='Found';e={[Bool](Get-adgroup -Filter {DisplayName -eq $_})}},
@{N='ManagedBy';e={Get-adgroup -Filter {DisplayName -eq $_} -Properties ManagedBy | %{(Get-ADUser -Identity $_.ManagedBy).Name}}}
}
$list | Select-Object @SelectSplat
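If the list actually lives in an .xlsx file, one low-tech option (just a sketch, assuming you can save the sheet as a CSV with a header column named Group) is to read it in with Import-Csv instead of the inline string:
#Assumes the spreadsheet was saved as C:\temp\groups.csv with a 'Group' header column
$list = Import-Csv C:\temp\groups.csv | Select-Object -ExpandProperty Group
$list | Select-Object @SelectSplat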

Powershell - GetChilditem grab first result

I have an issue with a PowerShell script I have made. The purpose of the script is to gather information from various resources, the CMDB and other systems, combine it into a report, and send it.
I have everything working just fine, except one single thing that keeps bothering me. In my script I do a lot of parsing and trimming of the information I get, and in some functions I need to get some XML files. Example:
$filter = "D:\WEC\Script\Rapportering\BigFixData\"
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" -Recurse -Force |where {$_.psIsContainer -eq $false }
$xmlfile = $xmlfiles | ogv -OutputMode Single
There will always be only one file to grab, and that's why I use the Filter option and give the specific name. The code above triggers a pop-up asking me to select the file. It works fine except for the file picker popup - I want to get rid of that.
I then changed the code to this:
$filter = "D:\WEC\Script\Rapportering\BigFixData\"
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" | Select-Object -First 1 |where {$_.psIsContainer -eq $false }
This no longer shows the popup, but it does not seem to select the file, resulting in a ReferenceObject error later in the script because the variable is null.
The script is about 1000 lines and I have narrowed the error down to the command above.
Can anyone help me figure out what I'm doing wrong?
Thanks in advance
Your 2nd command is missing the -Recurse switch, which may explain why you're not getting any result.
While it is unlikely that directories match with a filter pattern as specific as "Bigfix_trimmed_JN.xml", the more concise and faster way to limit matching to files only in PSv3+ is to use the -File switch (complementarily, there's also a -Directory switch).
$xmlfile = Get-ChildItem $filter -Filter Bigfix_trimmed_JN.xml -Recurse -File |
Select-Object -First 1
You should add a check to see if no file was returned.
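That check could be as simple as the following (a sketch; the error message is mine):
if (-not $xmlfile) {
    throw "No file matching 'Bigfix_trimmed_JN.xml' was found under $filter"
}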
If you want the first file, you'll need to filter out the directories before piping to Select-Object -First 1 - otherwise you run the risk of the first input element being a directory, in which case your pipeline evaluates to $null:
$xmlfiles = Get-ChildItem -path $filter -Filter "Bigfix_trimmed_JN.xml" | Where-Object {-not $_.PsIsContainer} | Select-Object -First 1

Get-Aduser -Filter Option -notlike does not work

Attempting to use Get-ADUser to find entries in Active Directory that are not in a text file. The -like option appears to work, but I cannot seem to get -notlike to work.
When I use the -notlike option, the entries in the text file still appear in the output file. With the -like option the script works.
Here is the contents of the text file
svcXXSQL001Agent
svcXXSQL001DBEng
svcXXSQL001Int
svcXXSQLUAT501DBEng
svcxxapp211
Here is my existing code:
$server=get-content C:\temp\test.txt
foreach ($name in $server) {
Get-ADUser -SearchBase "OU=ServiceAccts,DC=nlong,DC=com" -Filter "name -notlike '$name'" | sort | Where-Object {$_.Name -like "svcxxsql*"} | Select-Object Name | Out-File -FilePath C:\temp\foo.txt
}
Thanks for the input, Norm.
Expecting the output to exclude all of the given names is a false assumption. Let me demonstrate using the numbers 1 through 3, with the Where-Object cmdlet standing in for the Get-ADUser filter - we'll try to output all numbers from 1 to 3 except those in our exclusion list, filtering against one number at a time:
$numbers = 1,2,3
foreach($number in $numbers){
# Let's output all numbers from 1 to 3, except for $number
1..3 |Where-Object {$_ -notlike $number}
}
You will find the output to be:
2
3
1
3
1
2
In the first iteration of the loop, we receive the number 2 along with the number 3 - this is obviously not our intention, since 2 was supposed to be filtered out, but it ends up in the output because we only filter against one number at a time.
We can use either the -notcontains or -notin operators to filter against a collection of terms instead:
$numbers = 1,2,3
1..3 |Where-Object {$numbers -notcontains $_}
# or
1..3 |Where-Object {$_ -notin $numbers}
In your example, you would have to retrieve all the AD users and filter using the Where-Object cmdlet:
Get-ADUser -SearchBase "OU=ServiceAccts,DC=nlong,DC=com" |Where-Object {$_.Name -notin $server} | sort | Where-Object {$_.Name -like "svcxxsql*"} | Select-Object Name | Out-File -FilePath C:\temp\foo.txt
Since you're only interested in accounts that start with svcxxsql, we might as well place that as the filter:
Get-ADUser -SearchBase "OU=ServiceAccts,DC=nlong,DC=com" -Filter {Name -like "svcxxsql*"} |Where-Object {$_.Name -notin $server} | sort | Select-Object Name | Out-File -FilePath C:\temp\foo.txt
While this is old, here's a more efficient method using an actual LDAP filter that you construct from information supplied in the file.
Assuming the file contains the actual sAMAccountNames you wish to exclude:
$servers = get-content C:\temp\test.txt
# Begin the filter - sAMAccountType=805306368 is user objects only, no contacts
$filter = '(&(sAMAccountType=805306368)(!(|'
# recursively append each samAccountName to exclude in the filter
foreach ($u in $servers) {
$filter = $filter + "(samAccountName=$u)"
}
#finish the filter
$filter = $filter + ')))'
#ldap filter looks like this
# (&(sAMAccountType=805306368)(!(|(samAccountName=svcXXSQL001Agent)(samAccountName=svcXXSQL001DBEng)(samAccountName=svcXXSQL001Int)(...))))
# run the query
Get-aduser -LDAPFilter $filter -SearchBase "OU=ServiceAccts,DC=nlong,DC=com"
Active Directory can technically take an LDAP query that's 10MB in size, although obviously that'd be really excessive. So I recommend the above method be used only if it's a limited number of items you want to exclude.
I use a similar method to build a query for users that are members of certain groups but not others, and for that, it's significantly faster than grabbing groups with thousands of users each and trying to compare members that are exclusive to one.
As always, test and compare the time it takes to execute the different methods (there's a rough Measure-Command sketch after this list):
grabbing everything and discarding unwanted results after
grabbing a partially-filtered set and discarding after (like the original answer)
constructing a more complex targeted query (this example)
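A rough sketch of how such a comparison might look with Measure-Command (the OU, $servers and $filter are taken from the examples above; the actual numbers will depend entirely on your environment):
$broad = Measure-Command {
    Get-ADUser -SearchBase "OU=ServiceAccts,DC=nlong,DC=com" -Filter * |
        Where-Object { $_.Name -notin $servers }
}
$targeted = Measure-Command {
    Get-ADUser -LDAPFilter $filter -SearchBase "OU=ServiceAccts,DC=nlong,DC=com"
}
"Broad query, filtered locally : $($broad.TotalSeconds) seconds"
"Targeted LDAP query           : $($targeted.TotalSeconds) seconds"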
Also consider that processing loads occur in different places. It can take a long time for a DC to execute a very long, complex LDAP query, with extra CPU and potential timeouts. But it can take even longer to grab all the properties from every single object from AD, with potential connection timeouts, with more data travelling across the wire to the target system, then getting all that data processed locally.
In my experience, it's the "grab everything" queries with big result sets that cause the most load on DCs, the network and the target systems (Perfmon LDAP monitoring on the DCs can be interesting). That's why it's often best to let the DC do the filtering work, as long as the LDAP filters are sensible.

Adding AD group members to CSV list of AD groups

I have a list of AD security groups contained in a .csv file, in the following format:
Groups,
Groupname1,
Groupname2,
Groupname3,
What I want to do is feed this .csv file to a PowerShell script, so it can report on the group membership of the groups listed in the .csv file (I don't care about recursive groups, as I don't have any). The script should then dump the results to a further .csv file.
So, ideally, I want the final .csv produced output from the scripts to be something like.
Groupname1, name
Groupname1, name
Groupname2, name
Groupname2, name
Groupname3, name
Groupname3, name
You get the idea.
What I am struggling with is getting some sort of for loop going, to look through all the groups from the .csv and output the results as shown above (groupname and then user).
$list = Import-Csv -Path C:\temp\ADGroups.csv
foreach ($groups in $list) {
$ADGroup = $groups.groupname
Get-ADGroupmember -Identity $ADGroup | Get-ADUser -Property Name
Export-Csv -Path "c:\temp\dump.csv"
}
I've seen other suggestions (see example here), but these don't read the groups from a .csv file.
You need to add the group name to the output. I would do this with a calculated property. You also need to move the Export-Csv outside the loop or use -Append. If not, the groups will overwrite the file every time and it will only contain the results from the last group. Try this:
Import-Csv -Path C:\temp\ADGroups.csv | Foreach-Object {
$ADGroup = $_.Groups
Get-ADGroupmember -Identity $ADGroup | Select-Object @{n="Groupname";e={$ADGroup}}, Name
} | Export-CSV -Path "c:\temp\dump.csv" -NoTypeInformation
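Alternatively, if you would rather keep Export-Csv inside the loop, the -Append variant mentioned above (PowerShell 3.0+) would look roughly like this:
Import-Csv -Path C:\temp\ADGroups.csv | Foreach-Object {
    $ADGroup = $_.Groups
    Get-ADGroupMember -Identity $ADGroup |
        Select-Object @{n="Groupname";e={$ADGroup}}, Name |
        Export-Csv -Path "c:\temp\dump.csv" -NoTypeInformation -Append
}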

Powershell memory exhaustion using NTFSSecurity module on a deep folder traverse

I have been tasked with reporting all of the ACL's on each folder in our Shared drive structure. Added to that, I need to do a look up on the membership of each unique group that gets returned.
I'm using the NTFSSecurity module in conjunction with the Get-ChildItem2 cmdlet to get past the 260 character path length limit. The path(s) I am traversing are many hundreds of folders deep and long since passed the 260 character limit.
I have been banging on this for a couple of weeks. My first challenge was crafting my script to do my task all at once, but now I'm thinking that's my problem... The issue at hand is resources, specifically memory exhaustion. Once the script gets into one of the deep folders, it consumes all RAM and starts swapping to disk, and I eventually run out of disk space.
Here is the script:
$csvfile = 'C:\users\user1\Documents\acl cleanup\dept2_Dir_List.csv'
foreach ($record in Import-Csv $csvFile)
{
$Groups = get-childitem2 -directory -path $record.FullName -recurse | Get-ntfsaccess | where -property accounttype -eq -value group
$groups2 = $Groups | where -property account -notmatch -value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
$groups3 = $groups2 | select account -Unique
$GroupMembers = ForEach ($Group in $Groups3) {
(Get-ADGroup $Group.account.sid | get-adgroupmember | select Name, @{N="GroupName";e={$Group.Account}}
)}
$groups2 | select FullName,Account,AccessControlType,AccessRights,IsInherited | export-csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name).csv"
$GroupMembers | export-csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name)_GroupMembers.csv"
}
NOTE: The dir list it reads in is the top level folders created from a get-childitem2 -directory | export-csv filename.csv
During the run, it appears not to be flushing memory properly - this is just a guess from observation. At the end of each pass through the loop the variables should be getting overwritten, I thought, but memory usage never goes down, so it looks to me like the memory isn't being released properly. Like I said, a guess... I have been reading about runspaces, but I am confused about how to implement them with this script. Is that the right direction for this?
Thanks in advance for any assistance...!
Funny you should post about this, as I just finished a modified version of the script that I think works much better. A friend turned me on to 'Function Filters', which seem to work well here. I'll test it on the big directories tomorrow to see how much better the memory management is, but so far it looks great.
#Define the function ‘filter’ here and call it ‘GetAcl’. Process is the keyword that tells the function to deal with each item in the pipeline one at a time
Function GetAcl {
PROCESS {
Get-NTFSAccess $_ | where -property accounttype -eq -value group | where -property account -notmatch -value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
}
}
#Import the directory top level paths
$Paths = import-csv 'C:\users\rknapp2\Documents\acl cleanup\dept2_Dir_List.csv'
#Process each line from the importcsv one at a time and run GetChilditem against it.
#Notice the second part – I ‘|’ pipe the results of the GetChildItem to the function that because of the type of function it is, handles each item one at a time
#When done, pass results to Exportcsv and send it to a file name based on the path name. This puts each dir into its own file.
ForEach ($Path in $paths) {
(Get-ChildItem2 -path $path.FullName -Recurse -directory) | getacl | export-csv "C:\Users\rknapp2\Documents\acl cleanup\TestFilter\$($path.name).csv" }
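Side note: PowerShell also has a built-in filter keyword that is exactly this pattern (a function whose entire body is treated as an implicit PROCESS block), so the same idea could be written a little more compactly; a sketch using the same NTFSSecurity properties as above:
#A 'filter' runs its body once per pipeline item, just like a function with only a PROCESS block
filter Get-GroupAce {
    Get-NTFSAccess $_ |
        where -property accounttype -eq -value group |
        where -property account -notmatch -value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
}
#The loop body then becomes:
#Get-ChildItem2 -path $Path.FullName -Recurse -directory | Get-GroupAce | export-csv "C:\Users\rknapp2\Documents\acl cleanup\TestFilter\$($Path.name).csv"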