I am trying to extract error lines from a log file; each error is defined by two lines. A log file line looks like this:
2018-05-22 06:25:35.309 +0200 (Production,S8320,DKMdczmpOXVJtYCSosPS6SfK8kGTSN1E,WwObvwqUw-0AAEnc-XsAAAPR) catalina-exec-12 : ERROR com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised by call target: User 2027 does not have permissions to view comments for view 13086. (errorCode=1)
com.tableausoftware.domain.exceptions.PermissionDeniedException: User 2027 does not have permissions to view comments for view 13086. (errorCode=1)
The error is described in two lines, so I need to filter on the error and the current hour and then copy the result into a file.
This code does copy all the errors, but not only those from the current hour.
$hodina = (Get-Date -UFormat "%H").ToString()
$hodina = " " + $hodina +":"
$err = ": ERROR"
$errors = Select-String -Path "D:\..\file.log" -Pattern $hodina, $err -Context 0, 1
echo ($errors).Line >> Errors_file.txt
So I was wondering, how to put multiple variables into -Pattern, or if there is another solution to this problem.
Here is how to get all of the matching lines:
Get-Content "file.log" |
Select-String -Pattern "^(?:(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})).* : ERROR (?:(.*))$" |
ForEach-Object {
[PsCustomObject]#{
TimeStamp=(Get-Date $_.Matches.Groups[1].Value)
LineNumber=$_.LineNumber
Error=$_.Matches.Groups[2].Value
}
}
This will give you output like this:
TimeStamp LineNumber Error
--------- ---------- -----
22/05/2018 06:25:35 1 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
22/05/2018 06:25:35 4 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
22/05/2018 06:25:35 8 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
22/05/2018 06:25:35 10 com.tableausoftware.api.webclient.remoting.RemoteCallHandler - Exception raised...
If you only want the items where the hour of the timestamp matches the current hour, modify the code like this:
Get-Content "file.log" |
Select-String -Pattern "^(?:(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})).* : ERROR (?:(.*))$" |
ForEach-Object {
[PsCustomObject]#{
TimeStamp=(Get-Date $_.Matches.Groups[1].Value)
LineNumber=$_.LineNumber
Error=$_.Matches.Groups[2].Value
}
} | Where-Object {$_.TimeStamp.Hour -eq (Get-Date).Hour}
You can then send the output to a file or, better (if you plan to manipulate the results later in PowerShell), to CSV (Export-Csv) or CliXml (Export-Clixml).
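For example, a minimal sketch (the output file names are placeholders, and $errorObjects is just an illustrative variable name, not part of the code above):
# Capture the objects produced by the pipeline above (same code, assigned to a variable).
$errorObjects = Get-Content "file.log" |
    Select-String -Pattern "^(?:(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})).* : ERROR (?:(.*))$" |
    ForEach-Object {
        [PSCustomObject]@{
            TimeStamp  = (Get-Date $_.Matches.Groups[1].Value)
            LineNumber = $_.LineNumber
            Error      = $_.Matches.Groups[2].Value
        }
    } | Where-Object { $_.TimeStamp.Hour -eq (Get-Date).Hour }
# Structured formats that round-trip cleanly back into PowerShell:
$errorObjects | Export-Csv -Path "Errors_CurrentHour.csv" -NoTypeInformation
$errorObjects | Export-Clixml -Path "Errors_CurrentHour.xml"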
First, my PS knowledge is very basic, so know that up front.
I'm working on a basic script to search EventIDs in archived .evtx files and kick out "reports". The Where-Object queries are in .txt files stored in .\AuditEvents\ folder. I'm trying to do a ForEach on the .txt files and pass each query to Get-WinEvent.
Here's an example of how the queries appear in the .txt files:
{($_.ID -eq "11")}
The script is:
$ae = Get-ChildItem .\AuditEvents\
ForEach ($f in $ae) {
$qs = Get-Content -Path .\AuditEvents\$f
Get-WinEvent -Path .\AuditReview\*.evtx -MaxEvents 500 | Select-Object TimeCreated, ID, LogName, MachineName, ProviderName, LevelDisplayName, Message | Where-Object $qs | Out-GridView -Title $f.Name
}
This is the error:
Where-Object : Cannot bind argument to parameter 'FilterScript' because it is null.
At C:\Users\######\Desktop\PSAuditReduction\PSAuditReduction.ps1:6 char:177
+ ... e, ProviderName, LevelDisplayName, Message | Where-Object $qs | Out-G ...
+ ~~~
+ CategoryInfo : InvalidData: (:) [Where-Object], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.WhereObjectCommand
Your symptom implies that $qs is $null, which in turn implies that file .\AuditEvents\$f is empty.
However, even if it had content, you couldn't pass the resulting string as-is to the (positionally implied) -FilterScript parameter of Where-Object, because that parameter requires a script block ({ ... }).
You must create a script block from the string explicitly, using [scriptblock]::Create().
A simplified example:
# Simulated input using a literal string instead of file input via Get-Content
$qs = '{ 0 -eq $_ % 2 }' # Sample filter: return $true for even numbers.
# Remove the enclosing { and }, as they are NOT part of the code itself
# (they are only needed to define script-block *literals* in source code).
# NOTE: If you control the query files, you can simplify them
# by omitting { and } to begin with, which makes this
# -replace operation unnecessary.
$qs = $qs.Trim() -replace '^\{(.+)\}$', '$1'
# Construct a script block from the string and pass it to Where-Object
1..4 | Where-Object ([scriptblock]::Create($qs)) # -> 2, 4
Note:
Your code assumes that each .\AuditEvents\$f file contains just one line, and that that line contains valid PowerShell source code suitable for use as a Where-Object filter.
Generally, be sure to only load strings that you'll execute as code from sources you trust.
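Putting it all together with your original loop, a sketch could look like this (not tested against your data; it assumes each query file holds a single { ... } filter on one line):
$ae = Get-ChildItem .\AuditEvents\ -File
ForEach ($f in $ae) {
    # Read the query text and strip the enclosing { and }, if present.
    $qs = (Get-Content -Path $f.FullName -Raw).Trim() -replace '^\{(.+)\}$', '$1'
    # Turn the string into a script block so Where-Object accepts it.
    $filter = [scriptblock]::Create($qs)
    Get-WinEvent -Path .\AuditReview\*.evtx -MaxEvents 500 |
        Select-Object TimeCreated, ID, LogName, MachineName, ProviderName, LevelDisplayName, Message |
        Where-Object $filter |
        Out-GridView -Title $f.Name
}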
Taking a step back:
As Abraham Zinala points out, a much faster way to filter event-log entries is by using Get-WinEvent's -FilterHashtable parameter.
This allows you to save hashtable literals in your query files, which you can read directly into a hashtable with Import-PowerShellDataFile:
# Create a file with a sample filter.
'@{Path=".\AuditEvents\*.evtx";ID=11}' > sample.txt
# Read the file into a hashtable...
$hash = Import-PowerShellDataFile sample.txt
# ... and pass it to Get-WinEvent
Get-WinEvent -MaxEvents 500 -FilterHashtable $hash | ...
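Integrated with the per-file loop from your script, a rough sketch (it assumes each .txt query file now contains one hashtable literal with whatever keys you need, e.g. Path and ID) might be:
$ae = Get-ChildItem .\AuditEvents\ -Filter *.txt -File
ForEach ($f in $ae) {
    # Read the hashtable literal from the query file.
    $hash = Import-PowerShellDataFile -Path $f.FullName
    Get-WinEvent -MaxEvents 500 -FilterHashtable $hash |
        Select-Object TimeCreated, ID, LogName, MachineName, ProviderName, LevelDisplayName, Message |
        Out-GridView -Title $f.Name
}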
My sample log looks like this:
2022-09-01 23:13:05Z | error | 2022-09-02 02:13:05 - [Task] Id:120 Name:OPT_VIM_1HEAD Exception with index:18 | 18.9251137 | Exception:
ERROR connection to partner '10.19.101.17:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM
ERROR connection to partner '10.19.101.22:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM RunId:7 Task execution failed with error: One or more errors occurred., detail:
ERROR connection to partner '10.19.101.22:3300' broken
I want to extract the job name OPT_VIM_1HEAD or OPT_VIM_1ITEM (it's dynamic) and also the timestamp after the "error" pattern (2022-09-02 02:13:25 or 2022-09-02 02:13:05) into different variables.
I have also written this script:
$dir = 'C:\ProgramData\AecorsoftDataIntegrator\logs\'
$StartTime = get-date
$fileList = (Get-ChildItem -Path $dir -Filter '2022-09-02.log' | Sort-Object LastWriteTime -Descending | Select-Object -First 1).fullname
$message = Get-Content $fileList | Where-Object {$_ -like '*error*'}
$message
$details = Select-String -LiteralPath $fileList -Pattern 'error' -Context 0,14 | Select-Object -First 1 | Select-Object Path, FileName, Pattern, Linenumber
$details[0]
But I'm not able to retrieve the tokens mentioned above.
Use regex processing via the -match operator to extract the tokens of interest from each line:
# Sample lines from the log file.
$logLines = @'
2022-09-01 23:13:05Z | error | 2022-09-02 02:13:05 - [Task] Id:120 Name:OPT_VIM_1HEAD Exception with index:18 | 18.9251137 | Exception:
ERROR connection to partner '10.19.101.17:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM
ERROR connection to partner '10.19.101.22:3300' broken
2022-09-01 23:13:25Z | error | 2022-09-02 02:13:25 - [Task] Id:121 Name:OPT_VIM_1ITEM RunId:7 Task execution failed with error: One or more errors occurred., detail:
ERROR connection to partner '10.19.101.22:3300' broken
'@ -split '\r?\n'
# Process each line...
$logLines | ForEach-Object {
    # ... by matching it ($_) against a regex with capture groups - (...) -
    # using the -match operator.
    if ($_ -match '\| (\d{4}-.+?) - \[.+? Name:(\w+)') {
        # The line matched.
        # Capture groups 1 and 2 in the automatic $Matches variable contain
        # the tokens of interest; assign them to variables.
        $timestamp = $Matches.1
        $jobName = $Matches.2
        # Sample output, as an object
        [PSCustomObject] @{
            JobName = $jobName
            Timestamp = $timestamp
        }
    }
}
Output:
JobName Timestamp
------- ---------
OPT_VIM_1HEAD 2022-09-02 02:13:05
OPT_VIM_1ITEM 2022-09-02 02:13:25
OPT_VIM_1ITEM 2022-09-02 02:13:25
For an explanation of the regex and the ability to experiment with it, see this regex101.com page.
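To run the same extraction against the actual log file rather than the sample here-string, you could feed Get-Content into the same loop (a sketch; the path below is taken from your script and may need adjusting):
Get-Content 'C:\ProgramData\AecorsoftDataIntegrator\logs\2022-09-02.log' | ForEach-Object {
    if ($_ -match '\| (\d{4}-.+?) - \[.+? Name:(\w+)') {
        [PSCustomObject] @{
            JobName   = $Matches.2
            Timestamp = $Matches.1
        }
    }
}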
I have the below code to match the patterns and save them in a CSV file. I need to save the regex1 and regex2 matches as col1 and col2 in the CSV instead of saving everything in the 1st column.
$inputfile = ( Get-Content D:\Users\naham1224\Desktop\jil.txt )
$FilePath = "$env:USERPROFILE\Desktop\jil2.csv"
$regex1 = "(insert_job: [A-Za-z]*_*\S*)"
$regex2 = "(machine: [A-Z]*\S*)"
$inputfile |
Select-String -Pattern $regex2,$regex1 -AllMatches |
ForEach-Object {$_.matches.groups[1].value} |
Add-Content $FilePath
The input file (input.txt) contains:
/* ----------------- AUTOSYS_DBMAINT ----------------- */
insert_job: AUTOSYS_DBMAINT job_type: CMD
command: %AUTOSYS%\bin\DBMaint.bat
machine: PWISTASASYS01
owner: svc.autosys@cbs
permission:
date_conditions: 1
days_of_week: su,mo,tu,we,th,fr,sa
start_times: "03:30"
description: "Runs DBmaint process on AE Database - if fails - MTS - will run next scheduled time"
std_out_file: ">$$LOGS\dbmaint.txt"
std_err_file: ">$$LOGS\dbmaint.txt"
alarm_if_fail: 0
alarm_if_terminated: 0
send_notification: 0
notification_msg: "Check DBMaint output in autouser.PD1\\out directory"
notification_emailaddress: jnatal@cbs.com
/* ----------------- TEST_ENV ----------------- */
insert_job: TEST_ENV job_type: CMD
command: set
machine: PWISTASASYS01
owner: svc.autosys@cbs
permission:
date_conditions: 1
days_of_week: su,mo,tu,we,th,fr,sa
start_times: "03:30"
description: "output env"
std_out_file: ">C:\Users\svc.autosys\Documents\env.txt"
std_err_file: ">C:\Users\svc.autosys\Documents\env.txt"
alarm_if_fail: 1
alarm_if_terminated: 1
Current output: (screenshot; all matched values end up in a single column)
Expected output: (screenshot; insert_job and machine as separate columns)
I am trying various ways to do this but with no luck. Any suggestions and help are greatly appreciated.
Here is how I would do this:
$inputPath = 'input.txt'
$outputPath = 'output.csv'
# RegEx patterns to extract data.
$patterns = @(
'(insert_job): ([A-Za-z]*_*\S*)'
'(machine): ([A-Z]*\S*)'
)
# Create an ordered Hashtable to collect columns for one row.
$row = [ordered] @{}
# Loop over all occurrences of the patterns in the input file
Select-String -Path $inputPath -Pattern $patterns -AllMatches | ForEach-Object {
    # Extract key and value from current match
    $key = $_.matches.Groups[ 1 ].Value
    $value = $_.matches.Value
    # Save one column of current row.
    $row[ $key ] = $value
    # If we have all columns of current row, output it as PSCustomObject.
    if( $row.Count -eq $patterns.Count ) {
        # Convert hashtable to PSCustomObject and output (implicitly)
        [PSCustomObject] $row
        # Clear Hashtable in preparation for next row.
        $row.Clear()
    }
} | Export-Csv $outputPath -NoTypeInformation
Output CSV:
"insert_job","machine"
"insert_job: AUTOSYS_DBMAINT","machine: PWISTASASYS01"
"insert_job: TEST_ENV","machine: PWISTASASYS01"
Remarks:
Using Select-String with the -Path parameter, we don't have to read the input file beforehand.
An ordered Hashtable (a dictionary) is used to collect all columns, until we have an entire row to output. This is the crucial step to produce multiple columns instead of outputting all data in a single column.
Converting the Hashtable to a PSCustomObject is necessary because Export-Csv expects objects, not dictionaries.
While the CSV looks like your "expected output" and you possibly have good reason to expect it like that, in a CSV file the values normally shouldn't repeat the column names. To remove the column names from the values, simply replace $value = $_.matches.Value with $value = $_.matches.Groups[ 2 ].Value, which results in output like this:
"insert_job","machine"
"AUTOSYS_DBMAINT","PWISTASASYS01"
"TEST_ENV","PWISTASASYS01"
As for what you have tried:
Add-Content writes only plain text files from string input. While you could use it to create CSV files, you would have to add separators and escape strings all by yourself, which is easy to get wrong and more hassle than necessary. Export-Csv, on the other hand, takes objects as input and takes care of all the CSV format details automatically.
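To illustrate the difference with a toy example (the file names and values here are only for demonstration):
# Add-Content just appends raw strings - you would have to build the CSV syntax yourself:
Add-Content demo1.csv 'AUTOSYS_DBMAINT,PWISTASASYS01'
# Export-Csv takes objects and handles the header, separators and quoting for you:
[PSCustomObject] @{ insert_job = 'AUTOSYS_DBMAINT'; machine = 'PWISTASASYS01' } |
    Export-Csv demo2.csv -NoTypeInformation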
As zett42 mentioned, Add-Content is not the best fit for this. Since you are looking for multiple values separated by commas, Export-Csv is something you can use. Export-Csv will take objects from the pipeline, convert them to lines of comma-separated properties, add a header line, and save them to a file.
I took a little bit of a different approach here with my solution. I've combined the different regex patterns into one, which gives us one match that contains both the job and machine names.
$outputPath = "$PSScriptRoot\output.csv"
# one regex to match both job and machine in separate matching groups
$regex = '(?s)insert_job: (\w+).+?machine: (\w+)'
# Filter for input files
$inputfiles = Get-ChildItem -Path $PSScriptRoot -Filter input*.txt
# Loop through each file
$inputfiles |
    ForEach-Object {
        $path = $_.FullName
        Get-Content -Raw -Path $path | Select-String -Pattern $regex -AllMatches |
            ForEach-Object {
                # Loop through each match found in the file.
                # Should be 2, one for AUTOSYS_DBMAINT and another for TEST_ENV
                $_.Matches | ForEach-Object {
                    # Create objects with the values we want that we can output to csv file
                    [PSCustomObject]@{
                        # remove next line if not needed in output
                        InputFile = $path
                        Job = $_.Groups[1].Value # 1st matching group contains job name
                        Machine = $_.Groups[2].Value # 2nd matching group contains machine name
                    }
                }
            }
    } | Export-Csv $outputPath # Pipe our objects to Export-Csv
Contents of output.csv
"InputFile","Job","Machine"
"C:\temp\powershell\input1.txt","AUTOSYS_DBMAINT","PWISTASASYS01"
"C:\temp\powershell\input1.txt","TEST_ENV","PWISTATEST2"
"C:\temp\powershell\input2.txt","AUTOSYS_DBMAINT","PWISTASAPROD1"
"C:\temp\powershell\input2.txt","TEST_ENV","PWISTATTEST1"
I'm trying to compile log errors from multiple files in a single directory. The error messages are included over the span of two lines. I would like to concatenate both lines into a single line/object and then export all errors into a neat CSV.
I'm attempting to accomplish this with the Select-String utility and the -Context parameter. Prior to piping the results through the Select-Object utility, everything's kosher. However, once I pipe the results through Select-Object or Export-Csv, the -Context line is lost.
$trigger = 'ERROR'
$folderPath = 'C:\Users\test\Desktop\testpath'
$logFiles = gci -Path $folderPath -Filter *.txt -File
$logFiles | Select-String -Pattern $trigger -CaseSensitive -SimpleMatch -Context 0,1 | Select-Object LineNumber, Line, Filename |
Export-Csv -Path .\$(Get-Date -Format yyyymmddhhmmss).csv -Encoding UTF8 -NoTypeInformation
Omitting the Select-Object and Export-Csv cmdlets yields the desired raw results, with the friendly right angle bracket '>' (ASCII 62). The raw results can even be exported via the Out-File cmdlet, no problem.
However, what I would like to do, is combine the Pattern line with the Context line, creating a single object, which would eventually be output as a csv for further analysis.
I would like to apologize if this question seems trivial. I've scoured resources trying to figure this out and unfortunately haven't been able to. Thanks in advance!
Pipe the Select-String output through fl * (Format-List) to see what the properties are.
$a = ls log | select-string error -context 0,1
$a | fl *
IgnoreCase : True
LineNumber : 2
Line : error
Filename : log
Path : /Users/js/log
Pattern : error
Context : Microsoft.PowerShell.Commands.MatchInfoContext
Matches : {0}
$a.context
PreContext PostContext DisplayPreContext DisplayPostContext
---------- ----------- ----------------- ------------------
{} {after } {} {after }
This worked for me:
ls log | select-string error -context 0,1 | select linenumber, line,
    @{n='PostContext'; e={$_.context.postcontext}}, filename
LineNumber Line PostContext Filename
---------- ---- ----------- --------
2 error after log
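If you then want the matched line and its context combined into a single value and written to CSV, one possible variation on the original script (a sketch, not tested against your files) is:
$trigger = 'ERROR'
$folderPath = 'C:\Users\test\Desktop\testpath'
Get-ChildItem -Path $folderPath -Filter *.txt -File |
    Select-String -Pattern $trigger -CaseSensitive -SimpleMatch -Context 0,1 |
    Select-Object LineNumber, Filename,
        @{n='Error'; e={ ($_.Line + ' ' + ($_.Context.PostContext -join ' ')).Trim() }} |
    Export-Csv -Path ".\$(Get-Date -Format yyyyMMddHHmmss).csv" -Encoding UTF8 -NoTypeInformation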
I've been using a PowerShell script that reads a file and extracts error codes. It's quick, simple and does the job that I want it to, but I've now been asked to share it with a wider audience so I need to make it a bit more robust.
The problem I've got is that I need to take the output from my script and use it to look up against a CSV file so that I get a user-friendly table at the end that lists:
A count of how many times each error occurred (in descending order)
The Error code
The corresponding error message that it displays to the end user
This is the line format in the source file; there are normally upwards of 2000 lines:
17-12-2016,10:17:44:487{String=ERROR->(12345678)<inData:{device=printer, formName=blah.frm, eject=FALSE, operation=readForm}><outData:{fields=0, CODE=, field1=}> <outError:{CODE=Error103102, extendedErrorCode=-1, VARS=0}>}
This is my current script:
$WS = Read-Host "Enter computer name"
$date = Read-host "Enter Date"
# Search pattern for select-string (date always at the beginning of the line and the error code somewhere further in)
$Pattern = $date+".*<outError:{CODE="
# This find the lines in the files that contain the search pattern
$lines = select-string -path "\\$WS\c$\folder\folder\file.dat" -pattern $Pattern
# This is the specific Error code pattern that I'm looking for in each line
$regex = [regex] 'Error\d{1,6}'
$Var = @()
# Loops through each line and extracts Error code
foreach ($line in $lines) {
    $a = $line -match $regex
    # Adds each match to variable
    $Var += $matches.Values
}
# Groups and sorts results in to easy to read format
$Var | group | Sort-Object -Property count -descending
And this is the result it gives me:
Count Name Group
----- ---- -----
24 Error106013 {Error106013, Error106013, Error106013, Error106013...}
14 Error106109 {Error106109, Error106109, Error106109, Error106109...}
12 Error203002 {Error203002, Error203002, Error203002, Error203002...}
The CSV that I need to lookup against is as simple as it gets, with just 2 values per line in the format:
Code,Error message
What I need to get to is something like this:
Count Name Error Message
----- ---- -------------
24 Error106013 Error:blah
14 Error106109 Error:blah,blah
12 Error203002 Error:blah,blah,balh
Google has failed me so I'm hoping that there is someone out there that can at the least point me in the right direction.
Not tested but it should work with a simple calculated property - just replace the last line with:
$errorMap = Import-Csv 'your_errorcode.csv'
$Var | Group-Object | Sort-Object -Property count -descending |
    Select-Object Count, Name, @{l='Error Message'; e={($errorMap | Where-Object Code -eq $_.Name)."Error message"}}
Note: You also have to replace the path to your CSV.
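If the CSV is large, a hashtable lookup avoids re-scanning $errorMap for every group; a quick sketch of that alternative (assuming the CSV headers are exactly Code and Error message):
# Build a lookup table once, then index into it per group.
$errorLookup = @{}
Import-Csv 'your_errorcode.csv' | ForEach-Object { $errorLookup[$_.Code] = $_.'Error message' }
$Var | Group-Object | Sort-Object -Property Count -Descending |
    Select-Object Count, Name, @{l='Error Message'; e={ $errorLookup[$_.Name] }}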