Using PowerShell, can I find when a user account was created?

It's a simple question. I know the information is there somewhere. I've been hammering away at PowerShell for 3 days and getting close. I'm running out of time, to be honest.
Here's the situation. A person goes in and creates an account on a local machine (Windows 7). How do I find when the account was created? I understand looking at the NTUSER.DAT dates in the profile, but that doesn't quite work. I get information all spread out in the output file and no easy way to make it readable.
Get-Item -force "C:\Users\*\ntuser.dat" |
Where {((Get-Date)-$_.lastwritetime).days } |
Out-File c:\profiles.csv
I can see the information in the security log, and I can pull all 4720 events. However, that too is inconsistent, especially if some rascal (like me) cleared out the event log a couple of months ago.
Get-EventLog -LogName Security | Where-Object { $_.EventID -eq 4720 } |
Export-Csv C:\NewStaff.csv
But that doesn't really get me what I need either. All I need is the username and the date the account was created. I know it's not that simple (although it should be, LOL). It's a one-time job and I suck at PowerShell (although I've learned a lot over the past couple of days).
Anyway, if someone wouldn't mind throwing me a bone, I'd appreciate it.
Thanks for looking.

You are indeed very close. All you need now is formatting. Example:
Get-EventLog -LogName "Security" | Where-Object { $_.EventID -eq 4720 } `
| Select-Object -Property `
#{Label="LogonName";Expression={$_.ReplacementStrings[0]}}, `
#{Label="CreationTime";Expression={$_.TimeGenerated}} `
| Export-Csv C:\NewStaff.csv -NoTypeInformation
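If the Security log is large, the same report through Get-WinEvent is usually faster, since the filtering happens inside the event log service rather than in the pipeline. A minimal sketch, assuming the 4720 events are still present and the shell is elevated:
# Event 4720 stores the new account's name in Properties[0]
# (the counterpart of ReplacementStrings[0] above):
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4720 } |
    Select-Object @{Label='LogonName';Expression={$_.Properties[0].Value}},
                  @{Label='CreationTime';Expression={$_.TimeCreated}} |
    Export-Csv C:\NewStaff.csv -NoTypeInformation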

Related

Export more than 1000 rows in PowerShell

I am trying to access some data about Unified Groups using PowerShell.
Get-UnifiedGroup | ? {$_.AccessType -eq "Public"}
This is the command I am using; however, I am also trying to export this data to CSV.
So the command becomes
Get-UnifiedGroup | ? {$_.AccessType -eq "Public"} | Export-Csv c:\temp\azureadusers.csv
But it only displays the first 1000 results in the CSV file, and I am trying to get all of the data. I am new to PowerShell, so I am still learning.
How can I achieve this?
You may also want to look at the -Filter parameter. It's always a good thing to filter as far left as possible, mostly because it's a free performance gain:
-Filter {AccessType -eq "Public"}
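As for the 1000-row cap itself, -ResultSize Unlimited is what lifts the default limit. A sketch combining both (untested against a real tenant):
# -ResultSize Unlimited removes the default result cap; the -Filter is
# evaluated server-side before anything reaches the local pipeline:
Get-UnifiedGroup -ResultSize Unlimited -Filter {AccessType -eq "Public"} |
    Export-Csv c:\temp\azureadusers.csv -NoTypeInformation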

Script triggers virus scanner. How can we slow it down?

This script that identifies duplicate files triggers a virus scanner. How can we slow it down?
Get-ChildItem -Recurse -File `
| Group-Object -Property Length `
| ?{ $_.Count -gt 1 } `
| %{ $_.Group } `
| Get-FileHash `
| Group-Object -Property Hash `
| ?{ $_.Count -gt 1 } `
| %{ $_.Group }
| %{ $_.path -replace "$([regex]::escape($(pwd)))",'' }
Is there a way to put like a 2 second pause in between files so it takes a long time to complete?
TIA
Edits for comments:
Don't want to say the antivirus software but it's very advanced.
I got the backticks from Stack Overflow, so garbage in, garbage out :) [seriously, thanks for the tip]
It works flawlessly on network shares with about 100 files on it.
The speed of your script isn't what trips an A/V scanner. My guess is that the combination of [regex]::escape with -replace and Get-FileHash could be something your A/V flags during heuristic analysis. Without knowing the A/V software, it's impossible to say whether others have experienced and resolved the same problem you are having.
The real correct answer is to open a ticket with your A/V vendor about the false positives. Signing your scripts is also known to help with A/V. Some A/Vs allow whitelisting by checksum, which you could use to approve your scripts if your vendor doesn't offer an alternative. Using the checksum of a signed script is even safer, as you can guarantee the code came from your organization at the time the checksum was calculated.
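For reference, signing is a one-liner once a code-signing certificate is in place. A minimal sketch, assuming such a certificate is already in your user store (the script file name is a placeholder):
# Pick the first code-signing cert from the current user's store and
# sign the script with it (Find-Duplicates.ps1 is hypothetical):
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Set-AuthenticodeSignature -FilePath .\Find-Duplicates.ps1 -Certificate $cert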
You can also configure A/V software to whitelist a directory, and you can effectively work around the issue by running scripts out of that directory while you sort things out with your vendor. However, whitelisting by path should not be your permanent solution. Figure out with the vendor why your scripts are getting flagged, then follow their recommendations.
That said, to answer your original question ("Is there a way to put like a 2 second pause in between files...?"): yes. Start-Sleep will achieve what you want (though I have serious doubts it will affect your A/V results). The last block could be a single line but is split up for readability (the semicolon ; is only required in the one-line form).
Note: I've also replaced the backticks with better multi-line formatting. You can end a line with the | operator and continue the expression on the next line; the same works for other operators.
This change also fixes an issue in your original sample, where you forgot the penultimate backtick. Backticks are easy to forget and hard to spot, which is one reason their use is discouraged for multi-line expressions.
Get-ChildItem -Recurse -File |
    Group-Object -Property Length |       # same length = candidate duplicates
    ? { $_.Count -gt 1 } |
    % { $_.Group } |
    Get-FileHash |                        # confirm duplicates by content hash
    Group-Object -Property Hash |
    ? { $_.Count -gt 1 } |
    % { $_.Group } |
    % {
        $_.Path -replace "$([regex]::escape($(pwd)))",'';
        Start-Sleep 2                     # the requested pause between files
    }

Unable to retrieve accessed folders from Event Logs using PowerShell

I am trying to audit access to a specific folder, so I have the Audit Object Access policy enabled and I've also enabled auditing on the folder I want. Now I plan to review these accesses in a CSV file.
I have the following script that is supposed to achieve this
$OutputFileName = "EventsFrom-{0}.csv" -f (Get-Date -Format "MMddyyyy-HHmm")
Get-EventLog -LogName Security |
    Where-Object { $_.EventID -eq 4656 } |
    Select-Object -Property TimeGenerated, MachineName,
                            @{n='AccountName';e={$_.ReplacementStrings[1]}} |
    Export-Csv c:\scripts\$OutputFileName -NoTypeInformation
but the condition
Where-Object {$_.EventID -eq 4656}
causes the resulting CSV file to come out completely empty (not even table headers). But when I change the event ID (from 4656 to something like 4673) or remove the condition altogether, I do get results in the CSV.
Also, when I filter with ID 4656 in Event Viewer, results do show up. Right now I genuinely don't know what to do. Thanks in advance for any help.
I'd appreciate it if anyone could help me track down the cause. I don't have much experience with PS scripting, so a detailed explanation of why this is happening (or the actual solution to my problem) would be very helpful.
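One way to narrow this down is to run the same query through Get-WinEvent, which filters inside the event log service and avoids some quirks of Get-EventLog. A sketch, untested here; note the property names differ from Get-EventLog's:
$OutputFileName = "EventsFrom-{0}.csv" -f (Get-Date -Format "MMddyyyy-HHmm")
# For event 4656, Properties[1] is SubjectUserName, the counterpart of
# ReplacementStrings[1] in the Get-EventLog version:
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4656 } |
    Select-Object TimeCreated, MachineName,
                  @{n='AccountName';e={$_.Properties[1].Value}} |
    Export-Csv c:\scripts\$OutputFileName -NoTypeInformation
If Get-WinEvent returns the 4656 events but Get-EventLog does not, the problem lies with the cmdlet rather than your audit configuration.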

PowerShell, extracting a list of SharePoint sites with owners and users, and making a CSV

I am trying to write a PowerShell script that accomplishes the following tasks, with partial results:
Retrieving a list of SharePoint sites (including OneDrive sites) that contain a certain word in the URL (done);
Using a ForEach-Object to retrieve the list of owners and users of each site (done);
Exporting this information to a CSV file (partially done).
My problem is number 3. I'm trying to make it with one column for the site URLs, one column for the owner, and a third column with all the users inside,
but unfortunately I'm only able to make the CSV with the users list inside. Here is the code that brought me to this point:
$username = "username@domain.onmicrosoft.com"
$password = "password" | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($username,$password)
$Tenanturl = "https://domain-admin.sharepoint.com/"
# Connect to the SharePoint admin account
Connect-SPOService -Url $TenantUrl -Credential $credential
Get-SPOSite -IncludePersonalSite $true -Limit all -Filter "Owner -like 'WordIWantToFilter'" |
ForEach-Object {
Get-SPOUser -Site $_.Url
} |
Select-Object -Property LoginName,UserType |
Export-Csv -Path "C:\Users\User\Desktop\SharePoint_Sites_List.csv" -NoTypeInformation
The result is a CSV file with LoginName,"UserType" in the cell A1, and the related info in the other rows.
What I am trying to accomplish :
First column for the sites URLs,
Second column for the owner of the sites,
and the 3rd column with all the users of each site inside.
I know I am missing a lot of stuff, but I'm not a developer whatsoever :),
these are some of the links I used to make this code,
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/export-csv?view=powershell-7
https://techcommunity.microsoft.com/t5/sharepoint/list-all-site-collection-admins-powershell/m-p/264135
What should I look for? I'm looking for tips or even just little pieces of code.
Thanks
I predict you'll run into woes with this structure here:
Get-SPOSite -IncludePersonalSite $true -Limit all -Filter "Owner -like 'WordIWantToFilter'" |
ForEach-Object {
Get-SPOUser -Site $_.Url
} |
Select-Object -Property LoginName,UserType |
Export-Csv -Path "C:\Users\User\Desktop\SharePoint_Sites_List.csv" -NoTypeInformation
These pipes will make things confusing. Having worked as an automation consultant and engineer for a variety of firms for five years, I would avoid over-reliance on pipes: it makes code tricky to debug and error-prone.
I would rewrite it like this, minimizing those piped statements:
$filteredSites = Get-SPOSite -IncludePersonalSite $true -Limit all -Filter "Owner -like 'WordIWantToFilter'"
"Found $($filteredSites.Count) sites from previous line"
$sitesArray = @() # make an empty array to hold results
ForEach ($site in $filteredSites) {
    $sitesArray += Get-SPOUser -Site $site.Url
}
"Have $($sitesArray.Count) entries from previous line"
# good place to debug the output, btw :)
# prepare to export
"Run in the PowerShell ISE, we will have a left-over object `$sitesArray` we can use to test exporting"
$sitesArray | Select-Object -Property LoginName,UserType |
    Export-Csv -Path "C:\Users\User\Desktop\SharePoint_Sites_List.csv" -NoTypeInformation
Then to use it, open this all in the PowerShell ISE and run it. You'll get some helpful logging to the console, and the ISE leaves the variables behind after the run, which makes it really easy to troubleshoot.
These changes will make it easier to determine where you're losing data. For instance, if you run this, see an empty file, and also have an empty $sitesArray, then your filter in $filteredSites was too exclusive.
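To reach your three-column goal from there, one pattern is to emit one object per site and join the user names into a single cell. A sketch, untested; it assumes the Get-SPOSite results expose Url and Owner and that the Get-SPOUser results expose LoginName:
$report = ForEach ($site in $filteredSites) {
    $users = Get-SPOUser -Site $site.Url
    [PSCustomObject]@{
        Url   = $site.Url                     # first column: site URL
        Owner = $site.Owner                   # second column: site owner
        Users = ($users.LoginName -join '; ') # third column: all users
    }
}
$report | Export-Csv -Path "C:\Users\User\Desktop\SharePoint_Sites_List.csv" -NoTypeInformation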
If you're still stuck, post an update.

Get-ADComputer Lastlogondate / Lastlogon

I'm currently asking myself whether it is possible to determine the last logon time of any user on a computer object that is joined to Active Directory.
I need to find out when any user last logged on to a specific computer that is still online and communicating with the domain, but has not been in use by any user in the last X days.
I've already tried the following queries:
Get-ADComputer $computername -Properties lastLogon |
    Select-Object @{Name="lastLogon";Expression={[datetime]::FromFileTime($_.lastLogon)}}
AND
get-adcomputer za31testvmrobin -Properties lastlogondate
I'm expecting the timestamp of the last logon of a user on the computer object.
Hope you can help me.
I somehow figured it out with help from @boxdog. Thanks for that.
Here is the PowerShell code:
Get-EventLog -LogName Security -InstanceId 4624 -ComputerName $computer |
    Where-Object { $_.Message -match "Kontoname:`tUSERNAME" -and
                   $_.Message -match "Anmeldetyp:`t2" } |
    Select-Object -First 1
Kontoname = account name
Anmeldetyp = logon type (2 means interactive from the console with keyboard & mouse)
The tab character (`t above) between the field name and the value is needed. You can also loosen the pattern with regex wildcards such as .*, since -match takes a regular expression.
I could not find an easier way to get it working, so I had to use the -match comparison operator to find a string to search for within the Message property of the event log entries.
Unfortunately the search takes some time. Remotely it takes up to 5 minutes per computer, which is quite unsatisfying.
Maybe someone has a faster solution, or knows a way to run this in parallel. I don't really know how to do that, because I'm reading the computer list with
get-content c:\data\input.txt
Thanks in advance
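One avenue that may speed this up considerably: Get-WinEvent with an XPath filter evaluates on the remote machine, so only matching records cross the network, and it matches the named event-data fields rather than the localized message text. A sketch, untested; USERNAME is a placeholder as above:
$computers = Get-Content C:\data\input.txt
foreach ($computer in $computers) {
    # Event 4624, interactive logon (type 2), for the given account;
    # the filter runs remotely and only the newest match is returned.
    $xpath = "*[System[EventID=4624] and " +
             "EventData[Data[@Name='TargetUserName']='USERNAME'] and " +
             "EventData[Data[@Name='LogonType']='2']]"
    Get-WinEvent -ComputerName $computer -LogName Security -FilterXPath $xpath -MaxEvents 1
}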