Unable to retrieve accessed folders from Event Logs using PowerShell - powershell

I am trying to audit access to a specific folder, so I have the Audit Object Access policy enabled and I've also enabled auditing on the folder I want. Now I plan to record these accesses in a CSV file.
I have the following script that is supposed to achieve this:
$OutputFileName = "EventsFrom-{0}.csv" -f (Get-Date -Format "MMddyyyy-HHmm")
Get-EventLog -LogName Security | Where-Object {$_.EventID -eq 4656} | Select-Object -Property TimeGenerated, MachineName, @{n='AccountName';e={$_.ReplacementStrings[1]}} | Export-Csv "C:\scripts\$OutputFileName" -NoTypeInformation
but the condition
Where-Object {$_.EventID -eq 4656}
causes the resulting CSV file to come out completely empty (not even table headers). But when I change the event ID (from 4656 to something like 4673) or remove the condition altogether, I do get results in the resulting CSV.
Also, when I filter the results in Event Viewer with the ID 4656, results do show up. Right now I genuinely don't know what to do. Thanks in advance for any help.
I'd appreciate it if anyone could help me track down the cause of this. I don't have much experience with PS scripting, so a detailed explanation of why this is happening (or the actual solution to my problem) would be very helpful.
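Not part of the original question, but one hedged possibility: `Get-EventLog` uses the legacy event log API, which can behave inconsistently against the modern Security log. `Get-WinEvent` with `-FilterHashtable` filters server-side and is the currently recommended cmdlet. A sketch (assuming, as in the original script, that the second element of the event's property array holds the account name):

```powershell
$OutputFileName = "EventsFrom-{0}.csv" -f (Get-Date -Format "MMddyyyy-HHmm")

# Filter on the server side: only 4656 events from the Security log are returned.
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4656 } |
    Select-Object -Property TimeCreated, MachineName,
        @{ n = 'AccountName'; e = { $_.Properties[1].Value } } |
    Export-Csv "C:\scripts\$OutputFileName" -NoTypeInformation
```

Note that `Get-WinEvent` exposes `TimeCreated` rather than `TimeGenerated`, and event data via the `Properties` array rather than `ReplacementStrings`.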

Related

Export more than 1000 rows in Powershell

I am trying to access some data about Unified Groups using PowerShell.
Get-UnifiedGroup | ? {$_.AccessType -eq "Public"}
This is the command I am using, however I am also trying to export this data to CSV.
So the command becomes
Get-UnifiedGroup | ? {$_.AccessType -eq "Public"} | Export-Csv c:\temp\azureadusers.csv
But it only displays the first 1000 results in the CSV file, and I am trying to get all of the data. I am new to PowerShell, so I am still learning this.
How can I achieve this?
You may want to look at the -Filter parameter too. It's always a good thing to filter as far left as possible, mostly because it's a free performance gain.
-Filter {AccessType -eq "Public"}
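That said, the 1000-row cap itself comes from Get-UnifiedGroup's default result size. A combined sketch, hedging a bit on the exact filter syntax (the string form shown here is what Exchange Online generally expects):

```powershell
# -ResultSize Unlimited lifts the default cap of 1000 results;
# -Filter pushes the AccessType check server-side instead of filtering locally.
Get-UnifiedGroup -ResultSize Unlimited -Filter "AccessType -eq 'Public'" |
    Export-Csv C:\temp\azureadusers.csv -NoTypeInformation
```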

Reading from and appending to CSV or XLSX in Powershell

For the last few years I've been using PowerShell in my limited capacity to perform lookups in Active Directory to fill in information that's missing from a lot of different reports. It usually involves Get-ADUser to find company, location, account status, manager, and a handful of other properties. What I'll do is take a report that's been handed to my team, which is ALWAYS lacking key info and is usually anywhere from 500 to 5000 rows, export the user ID (or whatever property I'm searching by) to a text file, and reference that in a foreach loop to get the info I need. I'll then export that data to CSV, copy it into the original report, make sure whatever property I'm searching on matches at the top, middle, and bottom of the sheet, then delete what's extra and go from there.
For reference, here's what I'm using now.
$TotalRows = (Get-Content '[path to file]').Length
$rowcounter = 0   # initialize so the progress percentage starts at zero
foreach ($EmployeeID in Get-Content '[path to file]') {
    Write-Progress -Activity 'Processing IDs' -Status 'Searching Active Directory' -PercentComplete ((($rowcounter++) / $TotalRows) * 100)
    Get-ADUser -Filter {EmployeeID -eq $EmployeeID} -Properties EmployeeID, company, l, department, title, manager |
        Select-Object EmployeeID, SamAccountName, UserPrincipalName, company, l, enabled, department, title, manager |
        Export-Csv -NoTypeInformation -Path '[path to output directory]\output.csv' -Append
}
This does a good job of things, but it requires a lot of copy/paste work and checking to make sure things still line up.
The ask... I want to take all the garbage work out of the middle. I want to take a CSV, tell PowerShell to look in, let's say, E7 for a UPN, execute a Get-ADUser query, then export the results to E8 and beyond based on how many properties I'm returning. The script would then chew through the rest of the sheet and export Get-ADUser info for each row.
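Not from the original post, but a hedged sketch of that middle step: addressing cells like E7 directly would require the ImportExcel module or Excel COM automation, but if the report can be saved as CSV, Import-Csv plus calculated properties can enrich each row in place with no copy/paste step. The column name `UserPrincipalName` and the file paths here are assumptions:

```powershell
# Read the report, look up each row's UPN in AD, and emit the original
# columns plus the AD properties as new columns.
Import-Csv 'C:\reports\input.csv' | ForEach-Object {
    $ad = Get-ADUser -Filter "UserPrincipalName -eq '$($_.UserPrincipalName)'" `
                     -Properties company, l, department, title, manager
    $_ | Select-Object *,
        @{ n = 'Company';    e = { $ad.company } },
        @{ n = 'City';       e = { $ad.l } },
        @{ n = 'Department'; e = { $ad.department } },
        @{ n = 'Title';      e = { $ad.title } },
        @{ n = 'Manager';    e = { $ad.manager } }
} | Export-Csv 'C:\reports\output.csv' -NoTypeInformation
```

Because the output preserves every original column, there is no need to re-align rows by hand afterwards.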

Need to scan all domain computers for .pst files

I am new to PowerShell scripting, like brand new. I have some experience using Exchange PowerShell, but that's always been for very specific items like adjusting calendar permissions and such. Nothing too robust.
Currently I am working on a PowerShell script to push out via Group Policy that will run a search on each domain PC. I've been getting help from a co-worker, but he isn't available right now and I have a hard time following him sometimes. I am hoping this site and its users might be able to assist me. What I am trying to do (and I believe I am close to doing) is pulling a list of drives for each computer on the domain. Once I pull that list, I pipe it into a variable and then search that variable for any files that end with .pst. Once the search is complete, if there were results, a file should be created with the FullName (path) of each file, and the computer name should be used for naming the file. If there are no results from the search, the file would be empty but should still be named after the computer. I believe I have gotten everything correct except that I do not know how to name the file based on the computer name. Thank you for your time and help with this.
Here is my code so far:
$drives = Get-WmiObject -Query "SELECT * FROM Win32_LogicalDisk WHERE DriveType = '3'" |
    Select-Object DeviceID
$pst = @()   # collect results across all drives
foreach ($drive in $drives) {
    $pstfound = Get-ChildItem "$($drive.DeviceID)\" -Filter *.pst -Recurse -ErrorAction SilentlyContinue |
        Select-Object FullName
    $pst += $pstfound
}
if ($null -eq $pst -or $pst.Count -eq 0) {
    $pst | Out-File \\Servername\Searchresults\Null
} else {
    $pst | Out-File \\Servername\Searchresults\HasItems
}
Thank you. I wasn't initially planning on using the UNC path, but I changed it up anyway and I think that will make it easier to go through later. I also figured out my issue with naming the generated file after the computer it ran on: I just set a variable $hostname=hostname and then named the files \$hostname.csv.
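To make that naming fix concrete (a sketch, with the share path assumed from the earlier snippet; `$env:COMPUTERNAME` gives the same value as the external `hostname` command without spawning a process):

```powershell
# Name the output file after the computer the script ran on.
$hostname = $env:COMPUTERNAME
$pst | Out-File "\\Servername\Searchresults\$hostname.csv"
```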

Powershell memory exhaustion using NTFSSecurity module on a deep folder traverse

I have been tasked with reporting all of the ACLs on each folder in our shared drive structure. Added to that, I need to do a lookup on the membership of each unique group that gets returned.
I'm using the NTFSSecurity module in conjunction with the Get-ChildItem2 cmdlet to get past the 260-character path length limit. The paths I am traversing are many hundreds of folders deep and long since passed the 260-character limit.
I have been banging on this for a couple of weeks. My first challenge was crafting my script to do the task all at once, but now I'm thinking that's my problem... The issue at hand is resources, specifically memory exhaustion. Once the script gets into one of the deep folders, it consumes all RAM and starts swapping to disk, and I eventually run out of disk space.
Here is the script:
$csvfile = 'C:\users\user1\Documents\acl cleanup\dept2_Dir_List.csv'
foreach ($record in Import-Csv $csvfile)
{
    $Groups = Get-ChildItem2 -Directory -Path $record.FullName -Recurse |
        Get-NTFSAccess | Where-Object -Property AccountType -EQ -Value Group
    $groups2 = $Groups | Where-Object -Property Account -NotMatch -Value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
    $groups3 = $groups2 | Select-Object Account -Unique
    $GroupMembers = foreach ($Group in $groups3) {
        Get-ADGroup $Group.Account.Sid | Get-ADGroupMember |
            Select-Object Name, @{N="GroupName";e={$Group.Account}}
    }
    $groups2 | Select-Object FullName, Account, AccessControlType, AccessRights, IsInherited |
        Export-Csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name).csv"
    $GroupMembers | Export-Csv "C:\Users\user1\Documents\acl cleanup\Dept2\$($record.name)_GroupMembers.csv"
}
NOTE: The directory list it reads in contains the top-level folders, created from Get-ChildItem2 -Directory | Export-Csv filename.csv.
During the run, it appears not to flush memory properly; that's just a guess from observation. At the end of each pass through the loop, I thought the variables would be overwritten, but memory usage never goes down, so it looks to me like the memory isn't being released. Like I said, a guess... I have been reading about runspaces, but I am confused about how to implement that in this script. Is that the right direction for this?
Thanks in advance for any assistance...!
Funny you should post about this, as I just finished a modified version of the script that I think works much better. A friend turned me on to 'function filters', which seem to work well here. I'll test it on the big directories tomorrow to see how much better the memory management is, but so far it looks great.
# Define the function 'filter' here and call it 'GetAcl'. PROCESS is the keyword
# that tells the function to deal with each item in the pipeline one at a time.
Function GetAcl {
    PROCESS {
        Get-NTFSAccess $_ |
            Where-Object -Property AccountType -EQ -Value Group |
            Where-Object -Property Account -NotMatch -Value '^builtin|^NT AUTHORITY\\|^Creator|^AD\\Domain'
    }
}

# Import the directory top-level paths.
$Paths = Import-Csv 'C:\users\rknapp2\Documents\acl cleanup\dept2_Dir_List.csv'

# Process each line from the imported CSV one at a time and run Get-ChildItem2 against it.
# Notice the second part -- the results of Get-ChildItem2 are piped to the function,
# which, because of its PROCESS block, handles each item one at a time.
# When done, the results pass to Export-Csv under a file name based on the path name.
# This puts each dir into its own file.
ForEach ($Path in $Paths) {
    Get-ChildItem2 -Path $Path.FullName -Recurse -Directory |
        GetAcl |
        Export-Csv "C:\Users\rknapp2\Documents\acl cleanup\TestFilter\$($Path.name).csv"
}

Using PowerShell, can I find when a user account was created?

It's a simple question. I know the information is there somewhere. I've been hammering away with PowerShell for 3 days and getting close. I'm runnin' out of time, to be honest.
Here's the situation. A person goes in and creates an account on a local machine (Windows 7). How do I find when the account was created? I understand looking at NTUSER.DAT dates in the profile, but that doesn't quite work. I get information all spread out in the CSV and no easy way to make it readable.
Get-Item -Force "C:\Users\*\ntuser.dat" |
    Where-Object { ((Get-Date) - $_.LastWriteTime).Days } |
    Out-File C:\profiles.csv
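Not part of the original post, but a hedged variant of that snippet: if readable CSV columns are the goal, Export-Csv with calculated properties keeps the user name and timestamp together. Note the caveat that NTUSER.DAT's LastWriteTime reflects the last profile use, and the profile folder's CreationTime reflects the first logon on this machine, not necessarily when the account itself was created:

```powershell
# One row per local profile: the folder name stands in for the user name.
Get-Item -Force "C:\Users\*\ntuser.dat" |
    Select-Object @{ n = 'UserName';       e = { $_.Directory.Name } },
                  @{ n = 'ProfileCreated'; e = { $_.Directory.CreationTime } } |
    Export-Csv C:\profiles.csv -NoTypeInformation
```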
I can see the information in the security log, and I can pull all 4720 events. However, that too is inconsistent, especially if some rascal (like me) cleared out the event log a couple of months ago.
Get-EventLog -LogName Security | Where-Object { $_.EventID -eq 4720 } |
    Export-Csv C:\NewStaff.csv
But that doesn't really get me what I need either. All I need is the username and the date the account was created. I know it's not that simple (although it should be LOL). It's a one time job and I suck at Powershell (although, I've learned a lot over the past couple of days).
Anyway, if someone wouldn't mind throwing me a bone, I'd appreciate it.
Thanks for looking.
You are indeed very close. All you need now is formatting. Example:
Get-EventLog -LogName "Security" | Where-Object { $_.EventID -eq 4720 } |
    Select-Object -Property `
        @{Label="LogonName";Expression={$_.ReplacementStrings[0]}}, `
        @{Label="CreationTime";Expression={$_.TimeGenerated}} |
    Export-Csv C:\NewStaff.csv -NoTypeInformation