My sample log looks like this:
2022-09-01 23:13:05Z | error | 2022-09-02 02:13:05 - [Task] Id:120 Name:OPT_VIM_1HEAD Exception with index:18 | 18.9251137 | Exception:
ERROR connection to partner '10.19.101.17:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM
ERROR connection to partner '10.19.101.22:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM RunId:7 Task execution failed with error: One or more errors occurred., detail:
ERROR connection to partner '10.19.101.22:3300' broken
I want to extract the job name, OPT_VIM_1HEAD or OPT_VIM_1ITEM (it's dynamic), and also the timestamp that follows the "error" pattern, 2022-09-02 02:13:25 or 2022-09-02 02:13:05, into separate variables.
I have also written this script:
$dir = 'C:\ProgramData\AecorsoftDataIntegrator\logs\'
$StartTime = get-date
$fileList = (Get-ChildItem -Path $dir -Filter '2022-09-02.log' | Sort-Object LastWriteTime -Descending | Select-Object -First 1).fullname
$message = Get-Content $fileList | Where-Object {$_ -like '*error*'}
$message
$details = Select-String -LiteralPath $fileList -Pattern 'error' -Context 0,14 | Select-Object -First 1 | Select-Object Path, FileName, Pattern, Linenumber
$details[0]
But I am not able to retrieve the tokens mentioned above.
Use regex processing via the -match operator to extract the tokens of interest from each line:
# Sample lines from the log file.
$logLines = @'
2022-09-01 23:13:05Z | error | 2022-09-02 02:13:05 - [Task] Id:120 Name:OPT_VIM_1HEAD Exception with index:18 | 18.9251137 | Exception:
ERROR connection to partner '10.19.101.17:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM
ERROR connection to partner '10.19.101.22:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM RunId:7 Task execution failed with error: One or more errors occurred., detail:
ERROR connection to partner '10.19.101.22:3300' broken
'@ -split '\r?\n'
# Process each line...
$logLines | ForEach-Object {
    # ... by matching it ($_) against a regex with capture groups - (...) -
    # using the -match operator.
    if ($_ -match '\| (\d{4}-.+?) - \[.+? Name:(\w+)') {
        # The line matched.
        # Capture groups 1 and 2 in the automatic $Matches variable contain
        # the tokens of interest; assign them to variables.
        $timestamp = $Matches.1
        $jobName   = $Matches.2
        # Sample output, as an object.
        [PSCustomObject] @{
            JobName   = $jobName
            Timestamp = $timestamp
        }
    }
}
Output:
JobName       Timestamp
-------       ---------
OPT_VIM_1HEAD 2022-09-02 02:13:05
OPT_VIM_1ITEM 2022-09-02 02:13:25
OPT_VIM_1ITEM 2022-09-02 02:13:25
For an explanation of the regex and the ability to experiment with it, see this regex101.com page.
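To apply the same approach to the log file your own script already selects, you can feed its lines through the identical test; here is a sketch that assumes the $fileList variable from your script and emits one object per matching line (which you could then export, e.g. with Export-Csv):
# Sketch: run the same regex over the selected log file (assumes $fileList
# from the question's script) and output one object per matching line.
Get-Content $fileList | ForEach-Object {
    if ($_ -match '\| (\d{4}-.+?) - \[.+? Name:(\w+)') {
        [PSCustomObject] @{
            JobName   = $Matches.2
            Timestamp = $Matches.1
        }
    }
}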
I have a command like so:
(($remoteFiles | Where-Object { -not ($_ | Select-String -Quiet -NotMatch -Pattern '^[a-f0-9]{32}( )') }) -replace '^[a-f0-9]{32}( )', '$0= ' -join "`n") | ConvertFrom-StringData
sometimes it throws a
ConvertFrom-StringData : Data item 'a3512c98c9e159c021ebbb76b238707e' in line 'a3512c98c9e159c021ebbb76b238707e = My Pictures/Tony/Automatic Upload/Tony’s iPhone/2022-10-08 21-46-21 (2).mov'
is already defined.
But I believe there are more duplicates, and the error is only thrown for the first occurrence. Is there a way to get all of the errors so I can act upon them?
is there a way to get all of the errors
I'm afraid there is not, because what ConvertFrom-StringData reports on encountering a problem is a statement-terminating error, which means that it aborts its execution instantly, without considering further input.
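As a quick demonstration (a sketch of mine, not from the original question) that the error is statement-terminating and therefore reported only once, wrap the call in try / catch: even with two duplicated keys, only a single error surfaces.
# Minimal sketch: both 'a' and 'b' are duplicated, yet ConvertFrom-StringData
# reports (and lets you catch) only one statement-terminating error.
try {
    "a = 1`nb = 2`na = 10`nb = 20" | ConvertFrom-StringData
} catch {
    "Caught a single error: $($_.Exception.Message)"
}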
You'd have to perform your own analysis of the input in order to detect multiple problems, such as duplicate keys; e.g.:
@'
a = 1
b = 2
a = 10
c = 3
b = 20
'@ | ForEach-Object {
    $_ -split '\r?\n' |
        Group-Object { ($_ -split '=')[0].Trim() } |
        Where-Object Count -gt 1 |
        ForEach-Object {
            Write-Error "Duplicate key: $($_.Name)"
        }
}
Output:
Write-Error: Duplicate key: a
Write-Error: Duplicate key: b
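If you then want to act on the duplicates rather than just report them, one option (a sketch of mine, not part of the original answer) is to keep only the first occurrence of each key before handing the data to ConvertFrom-StringData:
# Sketch: de-duplicate by key (keeping the first occurrence), then convert.
$lines = @'
a = 1
b = 2
a = 10
c = 3
b = 20
'@ -split '\r?\n'

$unique = $lines |
    Group-Object { ($_ -split '=')[0].Trim() } |
    ForEach-Object { $_.Group[0] }

($unique -join "`n") | ConvertFrom-StringData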
I am trying to use the Format-Table command to output an array of hash tables of all files checked out from our TFS repo.
My code thus far:
$arr = @();
# Take the string from the tf command, parse it and build an array of hash tables.
(tf stat /recursive /user:* /format:detailed | Select-String -Pattern '^\$' -NotMatch | Select -SkipLast 3 | Out-String) -split '(\r\n){2}' | ForEach-Object {
    $ht = @{};
    if ($_ -ne '') {
        $str = $_ | Out-String;
        $str -split '\r?\n' | ForEach-Object {
            $key, $value = $_ -split '\s*:\s*';
            # Write-Host $key, $Value;
            try {
                $ht.Add($key, $value);
            } catch [ArgumentException] {
                Write-Host "Caught exception";
            }
        }
        $arr += ($ht);
    }
}
Edit
Looks like I'm erroring out here.
$arr.ForEach({[PSCustomObject]$_}) | Format-Table -AutoSize
Full Error:
Cannot convert value "System.Collections.Hashtable" to type
"System.Management.Automation.LanguagePrimitives+InternalPSCustomObject". Error: "Cannot process argument because the
value of argument "name" is not valid. Change the value of the "name" argument and run the operation again."
At C:\Dev\Tools\powershell\Convert-TfsOutput.ps1:21 char:15
+ $arr.ForEach({[PSCustomObject]$_}) | Format-Table -AutoSize
+ ~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvalidCastConstructorException
Edit2
Here is sample output when I replace the above line with:
$arr.ForEach({ $_ | Out-String })
Name       Value
----       -----
Workspace  work1
Date       {Wednesday, September 5, 2018 1, 38, 48 PM}
Local item file1
File type  Windows-1252
User       user1
Lock       none
Change     edit

Name       Value
----       -----
Workspace  work2
Date       {Monday, September 10, 2018 12, 14, 56 PM}
Local item file2
User       user2
Lock       none
Change     edit
Edit 3
Output of the command below:
Write-Host $str;
User : User1
Date : Wednesday, September 5, 2018 1:38:48 PM
Lock : none
Change : edit
Workspace : Work1
Local item : File1
File type : Windows-1252
User : User2
Date : Monday, September 10, 2018 12:14:56 PM
Lock : none
Change : edit
Workspace : Work2
Local item : File2
I would like the output in a tabular format, with rows below the column names:
Workspace | Date | Local item | File type | User | Lock | Change
I tried to use the code in another answer (Format-Table on Array of Hash Tables), but it does not output correctly.
Convert your hashtables to custom objects before passing them to Format-Table.
... | Where-Object { $_ } | ForEach-Object {
    $ht = @{};
    ($_ | Out-String) -split '\r?\n' | ForEach-Object {
        ...
    }
    New-Object -Type PSObject -Property $ht
} | Format-Table
Edit: It looks like your input data has blank lines, which lead to keys that are empty strings in your hashtables; these then cause the error you observed, because objects can't have a property whose name is an empty string.
Change your hashtable/object creation to something like this:
... | Where-Object { $_ } | ForEach-Object {
    $ht = ($_ | Out-String).Trim() -replace '\s+:\s+', '=' |
          ConvertFrom-StringData
    New-Object -Type PSObject -Property $ht
} | Format-Table
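Alternatively, if you prefer to keep your original Add()-style loop, here is a sketch (assuming the same surrounding pipeline as in your script) that simply skips blank lines so no empty-string key is ever created, and splits on the first colon only so time values stay intact:
... | Where-Object { $_ } | ForEach-Object {
    $ht = @{}
    ($_ | Out-String) -split '\r?\n' |
        Where-Object { $_.Trim() } |               # skip blank lines -> no empty keys
        ForEach-Object {
            $key, $value = $_ -split '\s*:\s*', 2  # split on the first colon only
            $ht[$key.Trim()] = $value
        }
    New-Object -Type PSObject -Property $ht
} | Format-Table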
I am trying to extract error lines from a log file which are defined by two things. The log file line looks like this:
2018-05-22 06:25:35.309 +0200 (Production,S8320,DKMdczmpOXVJtYCSosPS6SfK8kGTSN1E,WwObvwqUw-0AAEnc-XsAAAPR) catalina-exec-12 : ERROR com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised by call target: User 2027 does not have permissions to view comments for view 13086. (errorCode=1)
com.tableausoftware.domain.exceptions.PermissionDeniedException: User 2027 does not have permissions to view comments for view 13086. (errorCode=1)
The error is described across two lines, so I need to filter on the error and on the current hour, and then copy the result into a file.
This code does copy all the errors, but not only those from the current hour.
$hodina = (Get-Date -UFormat "%H").ToString()
$hodina = " " + $hodina +":"
$err = ": ERROR"
$errors = Select-String -Path "D:\..\file.log" -Pattern $hodina, $err -Context 0, 1
echo ($errors).Line >> Errors_file.txt
So I was wondering how to put multiple variables into -Pattern, or whether there is another solution to this problem.
Here is how to get all of the matching lines:
Get-Content "file.log" |
Select-String -Pattern "^(?:(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})).* : ERROR (?:(.*))$" |
ForEach-Object {
[PsCustomObject]#{
TimeStamp=(Get-Date $_.Matches.Groups[1].Value)
LineNumber=$_.LineNumber
Error=$_.Matches.Groups[2].Value
}
}
This will give you output like this:
TimeStamp LineNumber Error
--------- ---------- -----
22/05/2018 06:25:35 1 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
22/05/2018 06:25:35 4 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
22/05/2018 06:25:35 8 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
22/05/2018 06:25:35 10 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
If you only want the items where the hour of the timestamp matches the current hour, modify the code like this:
Get-Content "file.log" |
Select-String -Pattern "^(?:(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})).* : ERROR (?:(.*))$" |
ForEach-Object {
[PsCustomObject]#{
TimeStamp=(Get-Date $_.Matches.Groups[1].Value)
LineNumber=$_.LineNumber
Error=$_.Matches.Groups[2].Value
}
} | Where-Object {$_.TimeStamp.Hour -eq (Get-Date).Hour}
You can then send the output to a file or, better (if you plan to manipulate the results later in PowerShell), to CSV (Export-Csv) or CliXml (Export-Clixml).
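For example (the variable name and output paths below are placeholders of mine), capture the filtered objects in a variable and export them:
# Sketch: collect the current-hour errors and export them for later processing.
$errorsThisHour = Get-Content "file.log" |
    Select-String -Pattern "^(?:(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})).* : ERROR (?:(.*))$" |
    ForEach-Object {
        [PsCustomObject]@{
            TimeStamp  = (Get-Date $_.Matches.Groups[1].Value)
            LineNumber = $_.LineNumber
            Error      = $_.Matches.Groups[2].Value
        }
    } | Where-Object { $_.TimeStamp.Hour -eq (Get-Date).Hour }

$errorsThisHour | Export-Csv -Path "ErrorsThisHour.csv" -NoTypeInformation
$errorsThisHour | Export-Clixml -Path "ErrorsThisHour.xml"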
I have a debug log which spits out certain numbers that follow a preset error message.
For instance
08:29:25.178 [DEBUG] Error lookup ID 2834
08:29:25.179 [DEBUG] Error lookup ID 2834
The main reason I want to do this is to be able to output just the unique instances of this ID (in the above example, that would be just one 2834). That isn't possible otherwise, because the timestamp makes each line unique; therefore I need to output only the ID at the end (in this case 2834).
I currently have the following script, which works, but I am wondering whether there is a more efficient/elegant way to do all this.
$tempfile='tempfile.txt'
$tempfile2='tempfile2.txt'
$tempfile3='tempfile3.txt'
$finalfile='missingIDs.txt'
get-content 20180131.log -ReadCount 1000 |
foreach { $_ -match " Error lookup ID" } > $tempfile
get-content $tempfile | % { $_.Split(' ')[-1] } >$tempfile2
gc $tempfile2 | sort | get-unique > $tempfile3
gc $tempfile3| get-unique > $finalfile
Restating the problem for clarity:
Given lines of input, find " Error lookup ID" followed by a string of digits (the ID). Return all unique IDs found in the input.
$testInput = @(
    "08:29:25.177 [INFO] system started 5342"
    "08:29:25.177 [DEBUG] Error lookup ID 2834"
    "08:29:25.178 [TRACE] entered something"
    "08:29:25.179 [DEBUG] Error lookup ID 2834"
    "08:29:25.179 [DEBUG] Error lookup ID 2836"
)
$testInput | % { if ($_ -match ".*Error lookup ID (\d+)"){$Matches.1} } | Select-Object -Unique
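For the test input above, this yields the two unique IDs:
2834
2836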
Remove the intermediate text files and use the pipeline instead:
$finalfile = 'missingIDs.txt'
Get-Content 20180131.log -ReadCount 1000 |
    foreach { $_ -match " Error lookup ID" } |
    foreach { $_.Split(' ')[-1] } |
    Sort-Object -Unique |
    Out-File $finalfile
This makes your whole process more efficient, as there's no disk writes/reads.
I am running this command to pull the last line of a log file:
Get-Content c:\temp\MigrationJobStatus-20171020-123839-515.log |
Select-Object -Last 1
The results do give me the last line, but now I need to filter the results:
10/20/2017 12:38:56 PM Information [Event]: [JobEnd], [JobId]: [70b82296-b6e2-4539-897d-c46384619059], [Time]: [10/20/2017 12:38:49.074], [FilesCreated]: [0], [BytesProcessed]: [0], [ObjectsProcessed]: [34], [TotalExpectedSPObjects]: [34], [TotalErrors]: [19], [TotalWarnings]: [3], [TotalRetryCount]: [0], [MigrationType]: [None], [MigrationDirection]: [Import], [CreatedOrUpdatedFileStatsBySize]: [{}], [ObjectsStatsByType]: [{"SPUser":{"Count":1,"TotalTime":0,"AccumulatedVersions":0,"ObjectsWithVersions":0},"SPFolder": "Count":4,"TotalTime":629,"AccumulatedVersions":0,"ObjectsWithVersions":0},"SPDocumentLibrary":"Count":1,"TotalTime":68,"AccumulatedVersions":0,"ObjectsWithVersions":0},"SPFile":{"Count":13,"TotalTime":0,"AccumulatedVersions":0,"ObjectsWithVersions":0},"SPListItem":{"Count":16,"TotalTime":2240,"AccumulatedVersions":0,"ObjectsWithVersions":0}}], [CorrelationId]: [7bbf249e-701a-4000-8eee-c4a7ef172063]
I need to be able to pull the following and export to CSV:
[JobId]: [70b82296-b6e2-4539-897d-c46384619059]
[FilesCreated]: [0]
[BytesProcessed]: [0]
[ObjectsProcessed]: [34]
[TotalExpectedSPObjects]: [34]
[TotalErrors]: [19]
[TotalWarnings]: [3]
Can someone give me some ideas on how to accomplish this?
I am doing a OneDrive 4 Business migration and need to pull the results of the Get-SPOMigrationJobProgress log for a few thousand users.
You need to add the other fields there and then save the results using Out-File:
$results = ""
$fields = #("[JobId]", "[FilesCreated]")
$items = get-content c:\temp\MigrationJobStatus-20171020-123839-515.log | select-object -last 1 | %{ $_.Split(",")}
foreach($item in $items)
{
$field = ($item.Split(":")[0]).Trim()
if($fields.Contains($field)) { $results+= "$item`r`n" }
}
Write-Host $results
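To save the collected text as described above (the output path is just an example of mine), replace the final Write-Host with Out-File:
# Sketch: write the collected lines to a file instead of the console.
$results | Out-File "C:\temp\MigrationJobStatus-fields.txt"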
You can use -split and grab the fields you need:
$text = get-content c:\temp\MigrationJobStatus-20171020-123839-515.log | select-object -last 1
$text = ($text -split ",").Trim(" ")
$csvtext = #"
$($text[3])
$($text[4])
$($text[5])
$($text[6])
$($text[7])
$($text[8])
"#
$csvtext | Out-File ".\logfile.csv"
You can get to the fields that you want by using a regular expression and then creating a PSObject from each match:
$regexPattern = '\[([^]]+)\]: \[([^]]+)\]'
$result = Get-Content c:\temp\MigrationJobStatus-20171020-123839-515.log |
    Select-Object -Last 1 |
    Select-String -Pattern $regexPattern -AllMatches |
    ForEach-Object { $_.Matches.Value } |
    ForEach-Object { $_ -match $regexPattern |
        Select-Object @{n='Name';e={$Matches[1]}}, @{n='Value';e={$Matches[2]}} }
You can filter down the resulting object collection with Where-Object and use Export-Csv to get your result into a csv file.
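For instance (the list of field names and the output path below are mine, based on the fields requested in the question), a sketch:
# Sketch: keep only the requested fields and export them to CSV.
$wanted = 'JobId','FilesCreated','BytesProcessed','ObjectsProcessed',
          'TotalExpectedSPObjects','TotalErrors','TotalWarnings'
$result |
    Where-Object { $wanted -contains $_.Name } |
    Export-Csv -Path '.\MigrationJobSummary.csv' -NoTypeInformation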