I am trying to write code to parse a log file within a specific date range; the log file content is below.
For example: extracting the date (11/28 07:08:46) and parsing it.
[C79C] ComputerName:BETHGARWICK UserID:A0006 Beth Garwick Station 9 LanId: | (11/28 07:08:46) | Client is disconnected from agent.
[C79C] ComputerName:BETHGARWICK UserID: Logged out Station 0 LanId: | (11/28 07:08:51) | Client is connected to agent.
[EB7C] ComputerName:APT UserID:A0005 Kelley Zajac Station 4 LanId: | (11/28 07:12:08) | Client is disconnected from agent.
[EB7C] ComputerName:APT UserID:A0005 Kelley Zajac Station 4 LanId: | (11/28 07:12:13) | Client is connected to agent.
[EC44] ComputerName:KCUTSHALL-PC UserID:GO kcutshall Station 9900 LanId: | (11/28 07:55:08 - 11/28 07:55:18) | Average limit (300) exceeded while pinging www.google.com [74.125.224.82] 3 times
[EC44] ComputerName:KCUTSHALL-PC UserID:GO kcutshall Station 9900 LanId: | (11/28 07:55:23) | Average limit (300) exceeded while pinging www.google.com [www.google.com]
[EC44] ComputerName:KCUTSHALL-PC UserID:GO kcutshall Station 9900 LanId: | (11/28 07:55:29 - 11/28 07:55:49) | Average limit (300) exceeded while pinging www.google.com [74.125.224.50] 5x
[EC44] ComputerName:KCUTSHALL-PC UserID:GO kcutshall Station 9900 LanId: | (11/28 07:55:54 - 11/28 07:56:45) | Average limit (300) exceeded while pinging www.google.com [74.125.224.50] 11 times
[EC44] ComputerName:KCUTSHALL-PC UserID:GO kcutshall Station 9900 LanId: | (11/28 07:56:50) | Average limit (300) exceeded while pinging www.google.com [www.google.com]
I tried the .NET methods ParseExact() and Parse(), but they did not work.
$patter = 'mm/dd'
$culture = [Globalization.CultureInfo]::InvariantCulture
$logfiles = Get-Content -Path "C:\Users\ABC\Desktop\Temp\HTTPS\QoS_logs\test.logs"
$logfiles | foreach {
$dateasText = $_.ToString().Split("|")[1].Replace("(","").Replace(")","").Trim()
$date = [DateTime]::ParseExact($dateasText,$pattern,$null)
Exception calling "ParseExact" with "3" argument(s): "String was not recognized as a valid DateTime."
At "C:\Users\ABC\Desktop\Temp\HTTPS\QoS_logs\test.logs:23 char:1
+ $date = [DateTime]::ParseExact($dateasText,$pattern,$null)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : FormatException
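(A note on the error itself: in .NET custom date format strings, lowercase 'mm' means minutes while 'MM' means months, and the script assigns $patter but later reads $pattern, so the format passed to ParseExact is probably not what was intended. The extracted text also contains a time, so a direct call would need a format along the lines of the hedged sketch below; the format string is my assumption based on the sample lines.)
$pattern = 'MM/dd HH:mm:ss'   # 'MM' = month, 'HH' = 24-hour clock; the year defaults to the current year
$date = [DateTime]::ParseExact($dateasText, $pattern, [Globalization.CultureInfo]::InvariantCulture)
Lines that contain a range, e.g. '(11/28 07:55:08 - 11/28 07:55:18)', would still fail with this; the regex approach below captures only the first timestamp and avoids that problem.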
Use $pattern as a regex to capture the date in a named capture group dt:
## Q:\Test\2019\09\06\SO_57819852.ps1
$pattern = ' LanId: \| \((?<dt>[01][0-9]/[0-3][0-9] [0-2][0-9]:[0-5][0-9]:[0-5][0-9])'
$logfile = "C:\Users\ABC\Desktop\Temp\HTTPS\QoS_logs\test.logs" # ".\test.logs" #
foreach($Line in Get-Content $logfile) {
if ($Line -match $pattern){
$date = [DateTime]::ParseExact($Matches.dt,'MM/dd HH:mm:ss',$null)
$date
}
}
Sample output in my German locale:
Donnerstag, 28. November 2019 07:08:46
Donnerstag, 28. November 2019 07:08:51
Donnerstag, 28. November 2019 07:12:08
Donnerstag, 28. November 2019 07:12:13
Donnerstag, 28. November 2019 07:55:08
Donnerstag, 28. November 2019 07:55:23
Donnerstag, 28. November 2019 07:55:29
Donnerstag, 28. November 2019 07:55:54
Donnerstag, 28. November 2019 07:56:50
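Since the stated goal was a specific date range, the loop above can be extended with a range check. This is a minimal sketch; the $rangeStart/$rangeEnd boundaries are made-up values for illustration, not from the original post:
$rangeStart = [datetime]'2019-11-28 07:10:00'   # hypothetical lower bound
$rangeEnd   = [datetime]'2019-11-28 07:56:00'   # hypothetical upper bound
foreach ($Line in Get-Content $logfile) {
    if ($Line -match $pattern) {
        $date = [DateTime]::ParseExact($Matches.dt, 'MM/dd HH:mm:ss', $null)
        if ($date -ge $rangeStart -and $date -le $rangeEnd) {
            $Line   # emit the full log line that falls inside the range
        }
    }
}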
2022.10.23 08:27:01.829 | INFO | Request finished in 3.892ms 200 application/json; charset=utf-8
2022.10.23 08:27:05.044 | DBUG | Starting HttpMessageHandler cleanup cycle with 4 items
2022.10.23 08:27:05.044 | DBUG | Ending HttpMessageHandler cleanup cycle after 0.0048ms - processed: 0 items - remaining: 4 items
2022.10.23 23:31:05.097 | INFO | Request finished in 9.0403ms 202
2022.10.23 08:27:15.052 | DBUG | Starting HttpMessageHandler cleanup cycle with 4 items
2022.10.23 08:27:15.052 | DBUG | Ending HttpMessageHandler cleanup cycle after 0.0022ms - processed: 0 items - remaining: 4 items
2022.10.23 00:27:01.544 | INFO | Request finished in 3.8349ms 200 application/json; charset=utf-8
2022.10.23 00:42:01.551 | INFO | Request finished in 4.7531ms 200 application/json; charset=utf-8
I have .log files with lines like this. I need to get the time and date into a table like the one below. I had to filter and count how many lines contain "ms 200", "ms 202", "ms 400" etc. and find the file they are in.
This is my working code:
$logpath = "C:\Users\matus\OneDrive\Desktop\praca\PosybeRestEPService\*.log"
Get-ChildItem $logpath -Filter *.log |
Foreach-Object {
Select-String -Path $_.FullName -Pattern '(?<=\d.*?ms )(2|3|4|5)\d+' |
Group-Object -Property { $_.Matches.Value } |
Select-Object -Property Count, Name, @{
Name = 'Filename'
Expression = { $_.Group[0].Filename }
}
}
I got the right output:
Count Name Filename
----- ---- --------
68 202 PosybeRestEPService20221020.log
96 200 PosybeRestEPService20221020.log
96 200 PosybeRestEPService20221021.log
34 202 PosybeRestEPService20221021.log
96 200 PosybeRestEPService20221022.log
149 202 PosybeRestEPService20221022.log
1 400 PosybeRestEPService20221022.log
96 200 PosybeRestEPService20221023.log
1 202 PosybeRestEPService20221023.log
96 200 PosybeRestEPService20221024.log
165 202 PosybeRestEPService20221024.log
96 200 PosybeRestEPService20221025.log
32 202 PosybeRestEPService20221025.log
96 200 PosybeRestEPService20221026.log
154 202 PosybeRestEPService20221026.log
96 200 PosybeRestEPService20221027.log
92 202 PosybeRestEPService20221027.log
Now I need to output the day and time when these "ms 200", "ms 202", "ms 400" lines occurred.
I tried something like this:
$errors = @($logpath | Where-Object { $_ -match '^\s*(\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}).*Errors:' } |
Where-Object { [datetime]::ParseExact($matches[1], 'dd/MM/yyyy HH:mm:ss', $null) }).Count
Are you perhaps after something more along the lines of:
$logpath = "C:\Users\matus\OneDrive\Desktop\praca\PosybeRestEPService\*.log"
Select-String -Path $logpath -Pattern '(?<=\d.*?ms )(2|3|4|5)\d+' |
Group-Object -Property { $_.Matches.Value } |
Select-Object -Property Count, Name, @{
Name = 'Filename'
Expression = { $_.Group.Filename -join ', ' }
},@{
Name = 'Date'
Expression = {
($_.Group.Line |
ForEach-Object -Process {
$_ -match '^.*?(?=\|)' | Out-Null
[datetime]::ParseExact($Matches[0].Trim(),'yyyy.M.d HH:mm:ss.fff',[cultureinfo]::InvariantCulture)
}
) -join ', '
}
}
The pattern '^.*?(?=\|)' is somewhat of a cheat, as it just matches the text from the beginning of the line up to the first |. Since that is the whole date, you can use the DateTime class method ParseExact to convert it to a DateTime object.
This should output the file names in which those patterns were matched, along with the dates, all separated by commas. The output is something like:
Count Name Filename Date
----- ---- -------- ----
3 200 1.txt, 2.txt, 7.txt 9/20/2020 1:42:01 PM, 10/21/2022 1:42:01 PM, 10/20/2022 1:42:01 PM
2 400 3.txt, 6.txt 11/20/2022 1:42:01 PM, 10/20/2022 1:42:01 PM
1 300 4.txt 10/20/2020 13:42:01
1 500 5.txt 10/20/2022 13:42:01
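If one timestamp per matching line is easier to work with than the comma-joined strings, a hedged variant of the same idea (same pattern and files, but one output object per occurrence instead of one per group) could look like this:
Select-String -Path $logpath -Pattern '(?<=\d.*?ms )(2|3|4|5)\d+' |
    ForEach-Object {
        [pscustomobject]@{
            Status   = $_.Matches.Value
            Filename = $_.Filename
            # the text before the first '|' is the timestamp, as noted above
            Date     = [datetime]::ParseExact(($_.Line -split '\|')[0].Trim(), 'yyyy.MM.dd HH:mm:ss.fff', [cultureinfo]::InvariantCulture)
        }
    }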
I found a similar post regarding the problem in the link below.
How to fetch first column from given powershell array?
I am not able to convert it directly to a table and run operations on it, because some fields are missing.
Customer ID Client Name Computer Name Computer Brand Duration Connection Time Lang
123 first last 127.0.0.1 lenovo 10:00 8/18/2019 6:00 PM Eng
1 lastname 127.0.0.2 apple 2:30:00 8/18/2019 1:00 AM Chn
86 user3 127.0.0.1 dell 8/18/2019 2:00 PM
21 user4 127.0.0.4 apple 30:00 8/17/2019 1:00 PM Eng
I want to first filter for the users who are connected for more than 30 minutes and then list their IDs.
Update
The result should be
1
21
because they are connected for 30 minutes or more.
If the data you show is indeed the output of a fixed-width file, you need to determine the width of each field in order to parse it. A handicap here is that the original header names contain a space character, which we need to replace with an underscore.
For that, you can use the below function:
function ConvertFrom-FixedWith {
[CmdletBinding()]
Param(
[Parameter(Mandatory = $true, Position = 0)]
[string[]]$Content
)
$splitter = '§¤¶' # some unlikely string: Alt-21, [char]164, Alt-20
$needQuotes = '^\s+|[",]|\s+$' # quote the fields if needed
function _FWClean ([string]$field) {
# internal helper function to clean a field value with regards to quoted fields
$field = $field.Trim() -replace '(?<!\\)\\"|""', '§DQUOTE¶'
if ($field -match '^"(.*)"$') { $field = $matches[1] }
if ($field -match $needQuotes) { $field = '"{0}"' -f $field }
return $field -replace '§DQUOTE¶', '""'
}
# try and calculate the field widths using the first header line
# this only works if none of the header names have spaces in them
# and where the headers are separated by at least one space character.
Write-Verbose "Calculating column widths using first row"
$row = ($Content[0] -replace '\s+', ' ').Trim()
$fields = @($row -split ' ' ) # | ForEach-Object { _FWClean $_ })
$ColumnBreaks = for ($i = 1; $i -lt $fields.Length; $i++) {
$Content[0].IndexOf($fields[$i])
}
$ColumnBreaks = $ColumnBreaks | Sort-Object -Descending
Write-Verbose "Splitting fields and generating output"
$Content | ForEach-Object {
if ($null -ne $_ -and $_ -match '\S') {
$line = $_
# make sure lines that are too short get padded on the right
if ($line.Length -le $ColumnBreaks[0]) { $line = $line.PadRight(($ColumnBreaks[0] + 1), ' ') }
# add the splitter string on every column break point
$ColumnBreaks | ForEach-Object {
$line = $line.Insert($_, $splitter)
}
# split on the splitter string, trim, and dedupe possible quotes
# then join using the delimiter character
@($line -split $splitter | ForEach-Object { _FWClean $_ }) -join ','
}
} | ConvertFrom-Csv # the result is an array of PSCustomObjects
}
With that function in place, parsing the text can be done like so:
$text = @"
Customer ID Client Name Computer Name Computer Brand Duration Connection Time Lang
123 first last 127.0.0.1 lenovo 10:00 8/18/2019 6:00 PM Eng
1 lastname 127.0.0.2 apple 2:30:00 8/18/2019 1:00 AM Chn
86 user3 127.0.0.1 dell 8/18/2019 2:00 PM
21 user4 127.0.0.4 apple 30:00 8/17/2019 1:00 PM Eng
"# -split '\r?\n'
# replace the single space characters in the header names by underscore
$text[0] = $text[0] -replace '(\w+) (\w+)', '$1_$2'
# the 'ConvertFrom-FixedWith' function takes a string array as input
$table = ConvertFrom-FixedWith -Content $text
#output on screen
$table | Format-Table -AutoSize
# export to CSV file
$table | Export-Csv -Path 'D:\test.csv' -NoTypeInformation
Output (on screen)
Customer ID Client Name Computer Name Computer Brand Duration Connection Time Lang
----------- ----------- ------------- -------------- -------- --------------- ----
123 first last 127.0.0.1 lenovo 10:00 8/18/2019 6:00 PM Eng
1 lastname 127.0.0.2 apple 2:30:00 8/18/2019 1:00 AM Chn
86 user3 127.0.0.1 dell 8/18/2019 2:00 PM
21 user4 127.0.0.4 apple 30:00 8/17/2019 1:00 PM Eng
If your input $text is already a string array storing all the lines as we see them in your question, then leave out the -split '\r?\n'.
Having parsed the input to a table of PsCustomObjects, you can get the customers that are connected for 30 minutes or more with the help of another small helper function:
function Get-DurationInMinutes ([string]$Duration) {
$h, $m, $s = (('0:{0}' -f $Duration) -split ':' | Select-Object -Last 3)
return [int]$h * 60 + [int]$m
}
($table | Where-Object { (Get-DurationInMinutes $_.Duration) -ge 30 }).Customer_ID
This will output
1
21
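As an alternative, if the Duration column always uses mm:ss or h:mm:ss, you can let [timespan] do the parsing instead of splitting manually. This is only a sketch; the function name Get-DurationAsTimeSpan is made up here and is not part of the original answer:
function Get-DurationAsTimeSpan ([string]$Duration) {
    if ([string]::IsNullOrWhiteSpace($Duration)) { return [timespan]::Zero }
    # pad "mm:ss" to "0:mm:ss" so the cast always sees hours:minutes:seconds
    if ($Duration -notmatch ':.+:') { $Duration = "0:$Duration" }
    return [timespan]$Duration
}
($table | Where-Object { (Get-DurationAsTimeSpan $_.Duration) -ge [timespan]::FromMinutes(30) }).Customer_ID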
Update
Now that we finally know the data is from a TAB delimited CSV file, you don't need the ConvertFrom-FixedWith function.
Simply import the data using Import-Csv if it comes from a file:
$table = Import-Csv -Path 'D:\customers.csv' -Delimiter "`t"
Or, if it comes from the output of another command as string or string array:
$table = $original_output | ConvertFrom-Csv -Delimiter "`t"
Then use the Get-DurationInMinutes helper function just like above to get the Customer IDs that are connected for 30 minutes or more:
function Get-DurationInMinutes ([string]$Duration) {
$h, $m, $s = (('0:{0}' -f $Duration) -split ':' | Select-Object -Last 3)
return [int]$h * 60 + [int]$m
}
($table | Where-Object { (Get-DurationInMinutes $_.Duration) -ge 30 }).'Customer ID'
Uhh. I'm surprised there's not a canonical way to do this. Based on https://www.reddit.com/r/PowerShell/comments/211ewa/how_to_convert_fixedwidth_to_pipedelimited_or/.
# 0 19 38 59 81 97 120 123
# Customer ID Client Name Computer Name Computer Brand Duration Connection Time Lang
# 123 first last 127.0.0.1 lenovo 10:00 8/18/2019 6:00 PM Eng
# 1 lastname 127.0.0.2 apple 2:30:00 8/18/2019 1:00 AM Chn
# 86 user3 127.0.0.1 dell 8/18/2019 2:00 PM
# 21 user4 127.0.0.4 apple 30:00 8/17/2019 1:00 PM Eng
$cols = 0,19,38,59,81,97,120,123 # fake extra column at the end, assumes all rows are that wide
$firstline = get-content columns.txt | select -first 1
$headers = for ($i = 0; $i -lt $cols.count - 1; $i++) {
$firstline.substring($cols[$i], $cols[$i+1]-$cols[$i]).trim()
}
# string Substring(int startIndex, int length)
$lines = Get-Content columns.txt | select -skip 1
$lines | ForEach {
$hash = [ordered]@{}
for ($i = 0; $i -lt $headers.length; $i++) {
$hash += @{$headers[$i] = $_.substring($cols[$i], $cols[$i+1]-$cols[$i]).trim()}
}
[pscustomobject]$hash
}
Output:
PS /Users/js/foo> ./columns | ft
Customer ID Client Name Computer Name Computer Brand Duration Connection Time Lan
----------- ----------- ------------- -------------- -------- --------------- ---
123 first last 127.0.0.1 lenovo 10:00 8/18/2019 6:00 PM Eng
1 lastname 127.0.0.2 apple 2:30:00 8/18/2019 1:00 AM Chn
86 user3 127.0.0.1 dell 8/18/2019 2:00 PM
21 user4 127.0.0.4 apple 30:00 8/17/2019 1:00 PM Eng
I think you have a couple of requirements here. I'm going to describe one way to do it using a generic 'for loop' and regular expression - something you can play with and tweak to your needs. There are better ways of doing this (Powershell shortcuts), but based on the way you asked I'm going to assume that understanding is your goal, so this code should serve well if you have a background in any programming language. Hope this helps!
# Here is your data formatted in an array. Missing values are just empty fields.
# You could have fewer or more fields, but I've broken up your data into nine fields
# (0-8 when counting elements in an array)
# Customer ID, FName, LName, ComputerHostName, Brand, Duration, ConnectionDate, ConnectionTime, Lang
$myarray = @(
('123', 'firstname', 'lastname', '127.0.0.1', 'lenovo', '10:00', '8/18/2019', '6:00 PM', 'Eng'),
('1', 'lastnam', '', '127.0.0.2', 'apple', '2:30:00', '8/18/2019', '1:00 AM', 'Chn'),
('86', 'user3', '', '127.0.0.1', 'dell', '04:33', '8/18/2019', '2:00 PM', ''),
('21', 'user4', '', '127.0.0.4', 'apple', '30:00', '8/17/2019', '1:00 PM', 'Eng')
)
# This is a generic for loop that prints the Duration and the ComputerHostName.
# The ComputerHostName is the 4th column, i.e. element #3 when counting from zero (0,1,2,3).
# A regular expression with the '-match' operator picks out durations of 30 minutes or more
# (it keys on the minutes digits, so it only fits the duration formats shown here).
for ( $i = 0; $i -lt $myarray.Length; $i++ ) {
if ( $myarray[$i][5] -match "[3-5][0-9]:[0-9][0-9]$" ){
"$($myarray[$i][5]) - $($myarray[$i][3])"
}
}
Printed Result:
2:30:00 - 127.0.0.2
30:00 - 127.0.0.4
I have a log file with this:
Wed Oct 17 05:39:27 2018 : Resource = 'test04' cstep= 'titi04' time =18.751s
Wed Oct 17 05:40:31 2018 : Resource = 'test05' cstep= 'titi05' time =58.407s
Wed Oct 17 05:41:31 2018 : Resource = 'test06' cstep= 'titi06' time =3.400s
Wed Oct 17 05:42:31 2018 : Resource = 'test07' cstep= 'titi07' time =4.402s
I want to split the lines and keep only the values greater than 5:
18.751
58.407
My script is in PowerShell and collects all values, not just values greater than 5:
$list = Get-Content "C:\Users\Desktop\slow_trans\log_file.txt"
$results = foreach ($line in $list) {
$line.Split('=')[3].Trim().TrimEnd('s')
}
$results
Results are
18.751
58.407
3.400
4.402
I want only
3.400
4.402
Changing the requirements on the fly is normally a no-go, so you don't deserve it.
Also, the wording "Superior 5" reminds me of a previous question from another user account.
Nevertheless, here is a script with a single pipeline and a datetime conversion.
## Q:\Test\2018\11\06\SO_53170145.ps1
Get-Content .\logfile.txt |
Where-Object {$_ -match '^(.*?) : .*time =([0-9\.]+)s'}|
Select-Object @{n='DT';e={([datetime]::ParseExact($Matches[1],'ddd MMM dd HH:mm:ss yyyy',[cultureinfo]::InvariantCulture).ToString('yyyy-MM-dd HH:mm:ss'))}},
@{n='val';e={[double]$Matches[2]}} |
Where-Object val -le 5
Sample output (decimal comma due to my German locale)
DT val
-- ---
2018-10-17 05:41:31 3,4
2018-10-17 05:42:31 4,402
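If you instead want the values greater than 5 (the question's original wording), a small hedged variation of the same pipeline returns just those numbers:
Get-Content .\logfile.txt |
    Where-Object {$_ -match '^(.*?) : .*time =([0-9\.]+)s'} |
    ForEach-Object {[double]$Matches[2]} |
    Where-Object {$_ -gt 5}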
The following casts the selected string to a double and then returns only those values which are less than 5:
$results = Foreach ($line in $list) {
$val = [double]$line.Split('=')[3].Trim().TrimEnd('s')
if($val -lt 5) {
$val
}
}
Select-String is one option:
(Select-String -Path "TargetLog.txt" -Pattern ".*(?<time>\d+\.\d+)s").Matches |
ForEach-Object {
if([double]$_.Groups['time'].Value -lt 5.0) {$_.Value}
}
This will output the entire matching line:
Wed Oct 17 05:41:31 2018 : Resource = 'test06' cstep= 'titi06' time =3.400s
Wed Oct 17 05:42:31 2018 : Resource = 'test07' cstep= 'titi07' time =4.402s
If you only want the number from each line, change the output inside the if block to this:
{$_.Groups['time'].Value}
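Putting the two snippets together, the complete pipeline that emits only the sub-5 numbers looks like this:
(Select-String -Path "TargetLog.txt" -Pattern ".*(?<time>\d+\.\d+)s").Matches |
    ForEach-Object {
        if ([double]$_.Groups['time'].Value -lt 5.0) { $_.Groups['time'].Value }
    }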
I have a log file of 1,000 lines containing references, like this:
Time Reference Date of start Date of end
12:00 AT001 13 November 2011 15 November 2011
13:00 AT038 15 December 2012 17 December 2012
14:00 AT076 17 January 2013 19 January 2013
$ref1 = AT038
Basically, I want to parse the log file and get an output (line by line) for $ref1 such as:
Time : 13h
Reference : AT038
Date of start : 15 December 2012
Date of end : 17 December 2012
Thanks in advance
try:
$ref1 = "AT038"
$csv = Import-Csv .\myfile.txt -Delimiter ' ' # Import the file as CSV with a space delimiter
$csv | ? { $_.reference -EQ $ref1 } | FL # Pipe each CSV row to Where-Object, keeping only rows where the Reference column equals $ref1, then pipe the result to Format-List (FL) for the output requested in the OP.
Code added after the requirements were changed in the OP:
$ref1 = "AT038"
$txt = gc .\myfile.txt
$txt2 = $txt | % { $b = $_ -split ' '; "$($b[0]) $($b[1]) $($b[2])_$($b[3])_$($b[4]) $($b[5])_$($b[6])_$($b[7])" }
$csv = convertfrom-csv -InputObject $txt2 -Delimiter ' '
$csv | ? { $_.reference -EQ $ref1 } | FL
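Assuming single spaces between the fields, the reshaped lines in $txt2 look roughly like this, which is what makes the space-delimited ConvertFrom-Csv work:
Time Reference Date_of_start Date_of_end
13:00 AT038 15_December_2012 17_December_2012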
How about this:
Get-Content SourceFileName.txt |
% { ($_ -Replace '(\d{2}):\d{2} (\w{2}\d{3})', 'Time : $1h|Reference : $2').Split('|')} |
Out-File TargetFileName.txt
Here is my revised version:
$regex = '(\d{2}):\d{2} (\w{2}\d{3}) (\d{2} \b\w+\b \d{4}) (\d{2} \b\w+\b \d{4})'
$replace = 'Time : $1h|Reference : $2|Date of start : $3|Date of end : $4'
Get-Content SourceFileName.txt |
% { ($_ -Replace $regex, $replace).Split('|')} |
Out-File TargetFileName.txt
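For the AT038 line in the sample data, that replacement splits into the requested layout (note that, as written, it converts every line rather than filtering on $ref1):
Time : 13h
Reference : AT038
Date of start : 15 December 2012
Date of end : 17 December 2012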
I am querying a data source for dates. Depending on the item I am searching for, it may have more than one date associated with it.
get-date ($Output | Select-Object -ExpandProperty "Date")
An example of the output looks like:
Monday, April 08, 2013 12:00:00 AM
Friday, April 08, 2011 12:00:00 AM
I would like to compare these dates and return which one is set further out into the future.
As Get-Date returns a DateTime object you are able to compare them directly. An example:
(get-date 2010-01-02) -lt (get-date 2010-01-01)
will return false.
I wanted to show how powerful this can be, aside from just checking with -lt.
Example: I used it to calculate time differences taken from the Windows Event Viewer Application log.
Get the difference between the two date times:
PS> $Obj = ((get-date "10/22/2020 12:51:1") - (get-date "10/22/2020 12:20:1 "))
Object created:
PS> $Obj
Days : 0
Hours : 0
Minutes : 31
Seconds : 0
Milliseconds : 0
Ticks : 18600000000
TotalDays : 0.0215277777777778
TotalHours : 0.516666666666667
TotalMinutes : 31
TotalSeconds : 1860
TotalMilliseconds : 1860000
Access an item directly:
PS> $Obj.Minutes
31
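When the difference can exceed an hour, the Total* properties are usually what you want rather than the component values; for the interval above, for example:
PS> $Obj.TotalMinutes   # keeps counting past 59, unlike .Minutes
31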
A late but more complete answer regarding how to get the most recent (furthest in the future) date from $Output.
## Q:\test\2011\02\SO_5097125.ps1
## simulate object input with a here string
$Output = @"
"Date"
"Monday, April 08, 2013 12:00:00 AM"
"Friday, April 08, 2011 12:00:00 AM"
"# -split '\r?\n' | ConvertFrom-Csv
## use Get-Date and calculated property in a pipeline
$Output | Select-Object #{n='Date';e={Get-Date $_.Date}} |
Sort-Object Date | Select-Object -Last 1 -Expand Date
## use Get-Date in a ForEach-Object
$Output.Date | ForEach-Object{Get-Date $_} |
Sort-Object | Select-Object -Last 1
## use [datetime]::ParseExact
## the following will only work if your locale is English (for the day and month names)
$Output.Date | ForEach-Object{
[datetime]::ParseExact($_,'dddd, MMMM dd, yyyy hh:mm:ss tt',$Null)
} | Sort-Object | Select-Object -Last 1
## for non English locales
$Output.Date | ForEach-Object{
[datetime]::ParseExact($_,'dddd, MMMM dd, yyyy hh:mm:ss tt',[cultureinfo]::InvariantCulture)
} | Sort-Object | Select-Object -Last 1
## in case the day month abbreviations are in other languages, here German
## simulate object input with a here string
$Output = @"
"Date"
"Montag, April 08, 2013 00:00:00"
"Freidag, April 08, 2011 00:00:00"
"# -split '\r?\n' | ConvertFrom-Csv
$CIDE = New-Object System.Globalization.CultureInfo("de-DE")
$Output.Date | ForEach-Object{
[datetime]::ParseExact($_,'dddd, MMMM dd, yyyy HH:mm:ss',$CIDE)
} | Sort-Object | Select-Object -Last 1
Considering you want to include the time as well, I have included a sample. I am putting the datetime values in ISO 8601 format, so it works in a locale-agnostic manner.
Monday, April 08, 2013 12:00:00 AM
Friday, April 08, 2011 12:00:00 AM
(Get-date "2013-04-08T00:00:00") -lt (Get-Date "2011-04-08T00:00:00")
False