How to parse and delete archived event logs in PowerShell

I'm trying to parse archived Security logs to track down an issue with changing permissions. This script greps through .evtx files that are 10+ days old. It currently outputs what I want, but when it goes to clean up the old logs (about 50 GB daily, uncompressed, each of which is archived into its own daily folder via another script that runs at midnight) it begins complaining that the logs are in use and cannot be deleted. The process that seems to have them in use when I try to delete the files through Explorer is alternately DHCP Client or Event Viewer; stopping both of these services works, but clearly I can't run without Event Viewer. DHCP Client is used for networking niceness but is not needed.
The only thing that touches the .evtx files is this script: they're not backed up, they're not monitored by anything else, they're not automatically parsed by the Event Log service, they're just stored on disk waiting.
The script originally deleted things as it went, but since that failed all the deletions were moved to the end, then into the KillLogWithFire() function. Even the timer doesn't seem to help. I've also tried moving the files to a Processed subfolder, but that doesn't work for the same reason.
I assume that there's some way to release any handles that this script opens on any files, but attempting to .close() or .dispose() of the EventLog variable in the loop doesn't work.
$XPath = @'
*[System[Provider/@Name='Microsoft-Windows-Security-Auditing']]
and
*[System/EventID=4670]
'@
$DeletableLogs = @()
$logfile = "L:\PermChanges.txt"
$AdminUsers = ("List","of","Admin","Users")
$today = Get-Date
$marker = "
-------------
$today
-------------
"
write-output $marker >> $logfile
Function KillLogWithFire($log){
    Try {
        remove-item $log
    }
    Catch [writeerror]{
        $Timer += 1
        sleep $timer
        write-output "Killing log $log in $timer seconds"
        KillLogWithFire($log)
    }
}
Function LogPermissionChange($PermChanges){
    ForEach($PermChange in $PermChanges){
        $Change = @{}
        $Change.ChangedBy = $PermChange.properties[1].value.tostring()
        #Filter out normal non-admin users
        if ($AdminUsers -notcontains $Change.ChangedBy){continue}
        $Change.FileChanged = $PermChange.properties[6].value.tostring()
        #Ignore temporary files
        if ($Change.FileChanged.EndsWith(".tmp")){continue}
        elseif ($Change.FileChanged.EndsWith(".partial")){continue}
        $Change.MadeOn = $PermChange.TimeCreated.tostring()
        $Change.OriginalPermissions = $PermChange.properties[8].value.tostring()
        $Change.NewPermissions = $PermChange.properties[9].value.tostring()
        write-output "{" >> $logfile
        write-output ("Changed By : "+ $Change.ChangedBy) >> $logfile
        write-output ("File Changed : "+ $Change.FileChanged) >> $logfile
        write-output ("Change Made : "+ $Change.MadeOn) >> $logfile
        write-output ("Original Permissions :
"+ $Change.OriginalPermissions) >> $logfile
        write-output ("New Permissions :
"+ $Change.NewPermissions) >> $logfile
        "}
" >> $logfile
    }
}
GCI -include Archive-Security*.evtx -path L:\Security\$($today.AddDays(-10)) -recurse | ForEach-Object{
    Try{
        $PermChanges = Get-WinEvent -Path $_ -FilterXPath $XPath -ErrorAction Stop
    }
    Catch [Exception]{
        if ($_.Exception -match "No events were found that match the specified selection criteria."){
        }
        else {
            Throw $_
        }
    }
    LogPermissionChange($PermChanges)
    $PermChanges = $Null
    $DeletableLogs += $_
}
foreach ($log in $DeletableLogs){
    $Timer = 0
    Try{
        remove-item $log
    }
    Catch [IOException]{
        KillLogWithFire($log)
    }
}
UPDATE
Rather than editing the original code, as I've been told not to do, I wanted to post the full code that's now in use as a separate answer. The initial part, which parses the logs and runs every 30 minutes, is mostly the same as above:
$XPath = @'
*[System[Provider/@Name='Microsoft-Windows-Security-Auditing']]
and
*[System/EventID=4670]
'@
$DeletableLogs = @()
$logfile = "L:\PermChanges.txt"
$DeleteList = "L:\DeletableLogs.txt"
$AdminUsers = ("List","Of","Admins")
$today = Get-Date
$marker = "
-------------
$today
-------------
"
write-output $marker >> $logfile
Function LogPermissionChange($PermChanges){
    ForEach($PermChange in $PermChanges){
        $Change = @{}
        $Change.ChangedBy = $PermChange.properties[1].value.tostring()
        #Filter out normal non-admin users
        if ($AdminUsers -notcontains $Change.ChangedBy){continue}
        $Change.FileChanged = $PermChange.properties[6].value.tostring()
        #Ignore temporary files
        if ($Change.FileChanged.EndsWith(".tmp")){continue}
        elseif ($Change.FileChanged.EndsWith(".partial")){continue}
        $Change.MadeOn = $PermChange.TimeCreated.tostring()
        $Change.OriginalPermissions = $PermChange.properties[8].value.tostring()
        $Change.NewPermissions = $PermChange.properties[9].value.tostring()
        write-output "{" >> $logfile
        write-output ("Changed By : "+ $Change.ChangedBy) >> $logfile
        write-output ("File Changed : "+ $Change.FileChanged) >> $logfile
        write-output ("Change Made : "+ $Change.MadeOn) >> $logfile
        write-output ("Original Permissions :
"+ $Change.OriginalPermissions) >> $logfile
        write-output ("New Permissions :
"+ $Change.NewPermissions) >> $logfile
        "}
" >> $logfile
    }
}
GCI -include Archive-Security*.evtx -path L:\Security\ -recurse | ForEach-Object{
    Try{
        $PermChanges = Get-WinEvent -Path $_ -FilterXPath $XPath -ErrorAction Stop
    }
    Catch [Exception]{
        if ($_.Exception -match "No events were found that match the specified selection criteria."){
        }
        else {
            Throw $_
        }
    }
    LogPermissionChange($PermChanges)
    $PermChanges = $Null
    $DeletableLogs += $_
}
foreach ($log in $DeletableLogs){
    write-output $log.FullName >> $DeleteList
}
The second portion does the deletion, including the helper function above graciously provided by TheMadTechnician. The code still loops, since a straight delete is faster than the function, but it is not always successful even long after the files were last touched:
# Log Cleanup script. Works around open log issues caused by PS parsing of
# saved logs in EventLogParser.ps1
$DeleteList = "L:\DeletableLogs.txt"
$DeletableLogs = get-content $DeleteList
Function Close-LockedFile{
    Param(
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)][String[]]$Filename
    )
    Begin{
        $HandleApp = 'C:\sysinternals\Handle.exe'
        If(!(Test-Path $HandleApp)){Write-Host "Handle.exe not found at $HandleApp`nPlease download it from www.sysinternals.com and save it in the aforementioned location.";break}
    }
    Process{
        $HandleOut = Invoke-Expression ($HandleApp+' '+$Filename)
        $Locks = $HandleOut |?{$_ -match "(.+?)\s+pid: (\d+?)\s+type: File\s+(\w+?): (.+)\s*$"}|%{
            [PSCustomObject]@{
                'AppName' = $Matches[1]
                'PID' = $Matches[2]
                'FileHandle' = $Matches[3]
                'FilePath' = $Matches[4]
            }
        }
        ForEach($Lock in $Locks){
            Invoke-Expression ($HandleApp + " -p " + $Lock.PID + " -c " + $Lock.FileHandle + " -y") | Out-Null
            If ( ! $LastexitCode ) { "Successfully closed " + $Lock.AppName + "'s lock on " + $Lock.FilePath}
        }
    }
}
Function KillLogWithFire($log){
    Try {
        Close-LockedFile $Log
    }
    Catch [System.IO.IOException]{
        $Timer += 1
        sleep $timer
        write-host "Killing $Log in $Timer seconds with fire."
        KillLogWithFire($Log)
    }
}
foreach ($log in $DeletableLogs){
    Try {
        remove-item $log -ErrorAction Stop
    }
    Catch [System.IO.IOException]{
        $Timer = 0
        KillLogWithFire($Log)
    }
}
remove-item $DeleteList

One solution would be to get HANDLE.EXE and use it to close any open handles. Here's a function that I use, roughly based on this script. It uses handle.exe, finds what has a file locked, and then closes the handles locking that file open.
Function Close-LockedFile{
    Param(
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)][String[]]$Filename
    )
    Begin{
        $HandleApp = 'C:\sysinternals\Handle.exe'
        If(!(Test-Path $HandleApp)){Write-Host "Handle.exe not found at $HandleApp`nPlease download it from www.sysinternals.com and save it in the aforementioned location.";break}
    }
    Process{
        $HandleOut = Invoke-Expression ($HandleApp+' '+$Filename)
        $Locks = $HandleOut |?{$_ -match "(.+?)\s+pid: (\d+?)\s+type: File\s+(\w+?): (.+)\s*$"}|%{
            [PSCustomObject]@{
                'AppName' = $Matches[1]
                'PID' = $Matches[2]
                'FileHandle' = $Matches[3]
                'FilePath' = $Matches[4]
            }
        }
        ForEach($Lock in $Locks){
            Invoke-Expression ($HandleApp + " -p " + $Lock.PID + " -c " + $Lock.FileHandle + " -y") | Out-Null
            If ( ! $LastexitCode ) { "Successfully closed " + $Lock.AppName + "'s lock on " + $Lock.FilePath}
        }
    }
}
I have handle.exe saved in C:\Sysinternals; you may want to adjust the path in the function, or save the executable there.
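For reference, the function accepts paths from the pipeline, so a hypothetical invocation against the question's L:\Security layout might look like this (only the path comes from the question; the rest is illustrative):

# Close any handles still open on the archived logs before deleting them
Get-ChildItem L:\Security -Recurse -Include Archive-Security*.evtx |
    Select-Object -ExpandProperty FullName |
    Close-LockedFile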

I was having a very similar problem and after lots of searching found this article. Whilst handle.exe worked when I first tried it, I did note that the -c option carries a warning: "Closing handles can cause application or system instability".
I am also using Get-WinEvent and it seems to (sometimes) lock the .evtx file being processed. I have written a loop to wait 5 seconds and retry. Sometimes it takes up to 2 minutes for the file to be released; I have had one run overnight and it had hundreds of retries.
When I used handle.exe the first time it worked perfectly. I then implemented it into the script and later found it to be looping on an "unexplained error". I ended up having to reboot the server to get things working again, so I removed handle.exe from the script and went back to waiting for the file to be closed.
I can reliably release the file by stopping the script and closing down the PowerShell ISE. As soon as the ISE is closed, the file can be deleted without a problem.
Unfortunately I need this script to keep running and not be held up by the file remaining open. I am surprised that I have to resort to Sysinternals to release the file and that PowerShell does not offer an easy way to close it.
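For what it's worth, the wait-and-retry loop described above can be as simple as this sketch, assuming $log holds the path to the locked file (the 5-second interval is what I use; the retry cap is an arbitrary safeguard so it cannot loop forever):

# Retry the delete until it works or we give up; Get-WinEvent seems to
# release the .evtx eventually, sometimes only after many retries.
$retries = 0
while ($retries -lt 1000) {
    Try {
        Remove-Item $log -ErrorAction Stop
        break
    }
    Catch {
        $retries++
        Start-Sleep -Seconds 5
    }
}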

I had the same issue as GTEM, where closing the handles would cause corruption when processing hundreds of event log files. Eventually Get-WinEvent would not work properly: it would either freeze or give me the same "unexplained error".
So I opened a premier case with MS. They led me to discover that the variable I was storing the Get-WinEvent events in was what was locking the file. I guess it doesn't actually unlock the file if you are still using that variable. To resolve this I added some code to my script after I transferred the events to a new variable. You can see the code I added in the third region listed below.
#***************************************************************************
#region *** Get the log entries.
# clear the log entry for each pass
$LogEntry = @()
# Get the events from the log file, export them to the LogEntry variable and output to the screen
Get-WinEvent -Path $NewPath -FilterXPath $XPathFilter -ErrorAction SilentlyContinue | Tee-Object -Variable LogEntry
#endregion *** End get the log entries
#***************************************************************************

#***************************************************************************
#region *** This is where I copy it to the new variable for later output.
# if there are any log entries
if ($LogEntry.Count -gt 0) {
    # Add the log entries to the log file
    $LogEntries += $LogEntry
} # if there are any log entries
#endregion *** End where I copy to the new variable.
#***************************************************************************

#***************************************************************************
#region *** This is where I added code to allow me to remove the file lock.
# Remove the variable to release the evtx file lock
Remove-Variable -Name LogEntry
# Garbage collect to remove any additional memory tied to the file lock.
[GC]::Collect()
# sleep for 1 second
Sleep -Seconds 1
#endregion *** Code to remove the file lock.
#***************************************************************************
After this was done, I no longer had to use Handle.exe to close the file.
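Applied to the parsing loop from the question, the fix might look something like this sketch (variable and function names are the question's; the Remove-Variable/[GC]::Collect() lines are the addition MS suggested):

GCI -include Archive-Security*.evtx -path L:\Security\ -recurse | ForEach-Object{
    $PermChanges = Get-WinEvent -Path $_.FullName -FilterXPath $XPath -ErrorAction SilentlyContinue
    if ($PermChanges) { LogPermissionChange($PermChanges) }
    # Drop the variable that pins the .evtx open, then force a collection
    Remove-Variable -Name PermChanges
    [GC]::Collect()
    $DeletableLogs += $_
}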

Related

Export Critical, Warning and Errors events from Windows Logs

I'm using most of the script from here: https://kb.webspy.com/s/article/windows-event-logs-and-powershell
However, I was wondering if there is a way to only export Critical, Warning and Error events. I know those event levels are 1-3:
Get-WinEvent -FilterHashTable @{LogName = "System"; Level=1,2,3; StartTime=((Get-Date).AddDays(-7))} -ComputerName "server1" #| Out-GridView
I was just wondering where to add the level to this script.
# Logs to extract from server
$logArray = @("System","Security","Application")
# Grabs the server name to append to the log file extraction
$servername = $env:computername
# Provide the path with ending "\" to store the log file extraction.
$destinationpath = "C:\WindowsEventLogs\"
# Checks the last character of the destination path. If it does not end in '\' it adds one.
# '.+?\\$' : .+? matches any characters, \\ matches the backslash, $ is the end-of-line character
if ($destinationpath -notmatch '.+?\\$')
{
    $destinationpath += '\'
}
# If the destination path does not exist it will create it
if (!(Test-Path -Path $destinationpath))
{
    New-Item -ItemType directory -Path $destinationpath
}
# Get the current date in YearMonthDay format
$logdate = Get-Date -format yyyyMMddHHmm
# Start Process Timer
$StopWatch = [system.diagnostics.stopwatch]::startNew()
# Start Code
Clear-Host
Foreach($log in $logArray)
{
    # If using Clear and backup
    $destination = $destinationpath + $servername + "-" + $log + "-" + $logdate + ".evtx"
    Write-Host "Extracting the $log file now."
    # Extract each log file listed in $logArray from the local server.
    wevtutil epl $log $destination
}
# End Code
# Stop Timer
$StopWatch.Stop()
$TotalTime = $StopWatch.Elapsed.TotalSeconds
$TotalTime = [math]::Round($totalTime, 2)
write-host "The Script took $TotalTime seconds to execute."
It seems the code is using wevtutil to export the event logs:
wevtutil epl $log $destination
From the documentation, wevtutil also accepts different options, one of which is /q:<Query>:
Defines the XPath query to filter the events that are read or exported. If this option is not specified, all events will be returned or exported. This option is not available when /sq is true.
So you could create an XPath query to apply a filter based on event levels:
wevtutil epl $log $destination /q:"*[System[(Level=1 or Level=2 or Level=3)]]"
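Folded into the loop from the script above, the only change is that extra argument (level 1 is Critical, 2 is Error, 3 is Warning):

Foreach($log in $logArray)
{
    $destination = $destinationpath + $servername + "-" + $log + "-" + $logdate + ".evtx"
    Write-Host "Extracting the $log file now."
    # Export only Critical (1), Error (2) and Warning (3) events
    wevtutil epl $log $destination /q:"*[System[(Level=1 or Level=2 or Level=3)]]"
}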

Word COM object failing

SCRIPT PURPOSE
The idea behind the script is to recursively extract the text from a large amount of documents and update a field in an Azure SQL database with the extracted text. Basically we are moving away from Windows Search of document contents to an SQL full text search to improve the speed.
ISSUE
When the script encounters an issue opening a file, such as it being password protected, it fails for every single document that follows. Here is the section of the script that processes the files:
foreach ($list in (Get-ChildItem ( join-path $PSScriptRoot "\FileLists\*" ) -include *.txt )) {
    ## Word object
    $word = New-Object -ComObject word.application
    $word.Visible = $false
    $saveFormat = [Enum]::Parse([Microsoft.Office.Interop.Word.WdSaveFormat], "wdFormatText")
    $word.DisplayAlerts = 0
    Write-Output ""
    Write-Output "################# Parsing $list"
    Write-Output ""
    $query = "INSERT INTO tmp_CachedText (tCachedText, tOID)
VALUES "
    foreach ($file in (Get-Content $list)) {
        if ($file -like "*-*" -and $file -notlike "*~*") {
            Write-Output "Processing: $($file)"
            Try {
                $doc = $word.Documents.OpenNoRepairDialog($file, $false, $false, $false, "ttt")
                if ($doc) {
                    $fileName = [io.path]::GetFileNameWithoutExtension($file)
                    $fileName = $filename + ".txt"
                    $doc.SaveAs("$env:TEMP\$fileName", [ref]$saveFormat)
                    $doc.Close()
                    $4ID = $fileName.split('-')[-1].replace(' ', '').replace(".txt", "")
                    $text = Get-Content -raw "$env:TEMP\$fileName"
                    $text = $text.replace("'", "''")
                    $query += "
('$text', $4ID),"
                    Remove-Item -Force "$env:TEMP\$fileName"
                    <# Upload to azure #>
                    $query = $query.Substring(0,$query.Length-1)
                    $query += ";"
                    Invoke-Sqlcmd @params -Query $Query -ErrorAction "SilentlyContinue"
                    $query = "INSERT INTO tmp_CachedText (tCachedText, tOID)
VALUES "
                }
            }
            Catch {
                Write-Host "$($file) failed to process" -ForegroundColor RED;
                continue
            }
        }
    }
    Remove-Item -Force $list.FullName
    Write-Output ""
    Write-Output "Uploading to azure"
    Write-Output ""
    <# Upload to azure #>
    Invoke-Sqlcmd @params -Query $setQuery -ErrorAction "SilentlyContinue"
    $word.Quit()
    TASKKILL /f /IM WINWORD.EXE
}
Basically it parses through a folder of .txt files that each contain a batch of document paths, creates a T-SQL insert statement, and runs it against an Azure SQL database after each file is fully parsed. The files are generated with the following:
if (!($continue)) {
    if ($pdf){
        $files = (Get-ChildItem -force -recurse $documentFolder -include *.pdf).fullname
    }
    else {
        $files = (Get-ChildItem -force -recurse $documentFolder -include *.doc, *.docx).fullname
    }
    $files | Out-File (Join-Path $PSScriptRoot "\documents.txt")
    $i=0; Get-Content $documentFile -ReadCount $interval | %{$i++; $_ | Out-File (Join-Path $PSScriptRoot "\FileLists\documents_$i.txt")}
}
The $interval variable defines how many files are extracted for each given upload to Azure. Initially I had the Word object created outside the loop and never closed until the end. Unfortunately this doesn't seem to work: every time the script hits a file it cannot open, every file that follows fails until it reaches the end of the inner foreach loop, foreach ($file in (Get-Content $list)) {.
This means that to get the expected outcome I have to run this with an interval of 1, which takes far too long.
This is a shot in the dark, but to me it sounds like the reason it's failing is that the Word COM object is prompting you for some action since it cannot open the file, so all following items in the loop also fail. This might explain why it works if you set $interval to 1, because then it closes and reopens the COM object every time, and that takes forever (I did this with Excel).
What you can do is, in your catch statement, close and open a new Word COM object, which should let you continue on with the loop (but it will be a bit slower if it needs to open the COM object a lot); a rough sketch follows below.
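A sketch of that catch block, reusing the question's variable names (the ReleaseComObject call is my assumption about how to drop the old instance cleanly; the Quit call may itself fail if Word is hung, hence the inner try):

Catch {
    Write-Host "$($file) failed to process" -ForegroundColor RED
    # Tear down the (possibly hung) Word instance...
    try { $word.Quit() } catch { }
    [System.Runtime.InteropServices.Marshal]::ReleaseComObject($word) | Out-Null
    # ...and start a fresh one so the remaining files can still be processed
    $word = New-Object -ComObject word.application
    $word.Visible = $false
    $word.DisplayAlerts = 0
    continue
}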
If you want to debug the problem even more, set the COM object to be visible and slowly step through your program without interacting with Word. This will show you what is happening in Word and whether any prompts are causing the application to hang.
Of course, if you want to run it at full speed, you will need to detect which documents you can't open beforehand, or you could multithread it by opening several Word COM objects, which would allow you to load several documents at a time.
As for...
ISSUE
When the script encounters an issue opening the file such as it being password protected, it fails for every single document that follows.
... then test for this as noted here...
How to check if a word file has a password?
$filename = "C:\path\to\your.doc"
$wd = New-Object -COM "Word.Application"
try {
    $doc = $wd.Documents.Open($filename, $null, $null, $null, "")
} catch {
    Write-Host "$filename is password-protected!"
}
... and skip the file to avoid the failure of the remaining files.
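In the context of the question's loop, that test could run just before the real OpenNoRepairDialog call, something like this sketch (the empty password simply forces an exception for protected documents; $word and $file come from the question, $probe is illustrative):

Try {
    # A protected document throws here instead of prompting for a password
    $probe = $word.Documents.Open($file, $null, $null, $null, "")
    $probe.Close()
}
Catch {
    Write-Host "$($file) is password-protected, skipping" -ForegroundColor YELLOW
    continue
}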

PowerShell script suddenly failing

We have a script running daily on two separate servers, there have been no changes made to either copy of the script all year. Last weekend, there was a server outage that interrupted the script running on one of the servers, and ever since the script on that server partially fails each day.
Here is the code which continues to fail. I have broken it out and run it locally without issue.
$rawlineCountFile ="C:\temp\files\test\linecount"
$rawlineCountFile = $rawlineCountFile +'RawlineCount' + 'test' + '.csv'
$filePath = "C:\temp\files\test"
# do line count in files
$bak = Get-ChildItem $filePath | Where-Object { $_.Extension -eq ".dat" }
Try
{
    Write-Output "line_count , file_name"
    foreach ($file in $bak) {
        $name = $file.name
        $measure =(Get-Content $filePath\$file | Measure-Object)
        $lines = $measure.Count
        Write-Output "$lines , $name"
        Write-Output "$lines , $name" >> $rawlineCountFile
    }
} catch [Exception] {
    Write-Output $_.Exception.GetType().FullName
    Write-Output $_.Exception.Message
}
The script above looks at a folder of .dat files and, FOR EACH one, writes the line count within the file ($measure.Count) and the file name ($file.name) into a RawlineCount .csv file.
i.e.
123 , file1.dat
234 , file2.dat
987 , file3.dat
567 , file4.dat
etc. etc.
Each day there are 7 files moved into this folder, then this script runs, so there should be 7 rows added to the rawlinecountfile each day also, then later, after the rest of the process finishes, all the files are cleared out to prepare for the next day.
However, since the outage last week, it only writes 0-2 out of 7 rows into the CSV file each day, not one FOR EACH file (there are still 7).
We are stumped as to why the ForEach doesn't seem to be working anymore, while the script has not been changed and the exact same script still runs as expected on the sister server and on my local machine.
Any thoughts?
When you call Get-Content, I believe you are trying to get the file from C:\Temp\Files\Test\C:\Temp\Files\Test\Filename.dat. You can use $file.FullName to get the full path. I suspect it was throwing an error and not adding the content to the file. This worked in PS 5.1.
$rawlineCountFile ="C:\temp\files\test\linecount"
$rawlineCountFile = $rawlineCountFile +'RawlineCount' + 'test' + '.csv'
$filePath = "C:\temp\files\test"
# do line count in files
$bak = Get-ChildItem $filePath | Where-Object { $_.Extension -eq ".dat" }
Try
{
    Write-Output "line_count , file_name"
    foreach ($file in $bak)
    {
        $name = $file.name
        $lines = @(Get-Content -Path $File.FullName).Count
        Write-Output "$lines , $name"
        "$lines , $name" | Add-Content $rawlineCountFile
    }
}
catch [Exception]
{
    Write-Output $_.Exception.GetType().FullName
    Write-Output $_.Exception.Message
}
Replacing just the single line I mentioned that could be an issue:
$rawlineCountFile ="C:\temp\files\test\linecount"
$rawlineCountFile = $rawlineCountFile +'RawlineCount' + 'test' + '.csv'
$filePath = "C:\temp\files\test"
# do line count in files
$bak = Get-ChildItem $filePath | Where-Object { $_.Extension -eq ".dat" }
Try
{
    Write-Output "line_count , file_name"
    foreach ($file in $bak) {
        $name = $file.name
        $measure =(Get-Content $file.FullName | Measure-Object)
        $lines = $measure.Count
        Write-Output "$lines , $name"
        Write-Output "$lines , $name" >> $rawlineCountFile
    }
} catch [Exception] {
    Write-Output $_.Exception.GetType().FullName
    Write-Output $_.Exception.Message
}

Pipe all Write-Output to the same Out-File in PowerShell

As the title suggests, how do you make it so all of the Write-Outputs - no matter where they appear - automatically append to your defined log file? That way the script will be nicer to read and it removes a tiny bit of work!
A little example is below; I'd like to see none of the "| Out-File" if possible, yet have them still output to that file!
$Author = 'Max'
$Time = Get-Date -Format "HH:mm:ss.fff"
$Title = "Illegal Software Removal"
$LogName = "Illegal_Remove_$($env:COMPUTERNAME).log"
$Log = "C:\Windows\Logs\Software" + "\" + $LogName
$RemoteLog = "\\Server\Adobe Illegal Software Removal"
Set-PSBreakpoint -Variable Time -Mode Read -Action { $global:Time = Get-Date -format "HH:mm:ss.fff" } | Out-Null
If((Test-Path $Log) -eq $False){ New-Item $Log -ItemType "File" -Force | Out-Null }
Else { $Null }
"[$Time][Startup] $Title : Created by $Author" | Out-File $Log -Append
"[$Time][Startup] Configuring initial variables required before run..." | Out-File $Log -Append
EDIT: This needs to work on PS v2.0. I don't want the output to appear on screen at all, only in the log. So I'd have the same functionality, but the script would look like so...
"[$Time][Startup] $Title : Created by $Author"
"[$Time][Startup] Configuring initial variables required before run..."
You have two options. One is to do the redirection at the point the script is invoked, e.g.:
PowerShell.exe -Command "& {c:\myscript.ps1}" > c:\myscript.log
Or you can use the Start-Transcript command to record everything the shell sees (except exe output). After the script is done, call Stop-Transcript.
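A minimal sketch of the transcript approach, reusing the $Log, $Time, $Title and $Author variables from the question (note that a transcript records the console session, so the lines still appear on screen as well):

Start-Transcript -Path $Log -Append
"[$Time][Startup] $Title : Created by $Author"
"[$Time][Startup] Configuring initial variables required before run..."
Stop-Transcript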

Get-Content - Get all content, starting from a specific line number

My first question here, and I just want to say thanks for all the input I've gotten over the years from this site.
I'm also new to PowerShell, so the answer might be very simple.
I'm working on a script that is meant to check a log file every 5 minutes (scheduled from ActiveBatch).
At the moment the script searches for ERROR in a logfile, and it works fine.
But my problem is that the script searches the entire file through every time. So when an ERROR does occur, the check "fails" every 5 minutes for the rest of the day, until a new logfile is generated.
My script:
Write-Host Creating variables...
$file = "${file}"
$errorString = "${errorString}"
Write-Host file variable is: $file
Write-Host errorString variable is: $errorString
Write-Host
Write-Host Select String Results:
$ssResult = Get-Content $file | Select-String $errorString -SimpleMatch
Write-Host
Write-Host There was $ssResult.Count `"$errorString`" statements found...
Write-Host
IF ($ssResult.Count -gt 0) {Exit $ssResult.Count}
So what I would like is to find the ERROR, then remember the line number (perhaps in a file). Then in the next run (5 minutes later) I want to start the search from that line.
For example: an error is found on line 142, so the script exits with error code 142. Five minutes later the script runs again, and it should start from line 143 and go through the rest of the file.
You can remember the number of error strings found in the file:
$ssResult.Count > C:\path\to\file.txt
Then the number of new errors is:
$errorCount = $ssResult.Count - (Get-Content C:\path\to\file.txt)
Remember to set the value in the file to zero on the first run of the script and every time a new logfile is generated.
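Put together, that approach might look like this sketch (C:\path\to\file.txt is the counter file from above; $file and $errorString come from the question):

$countFile = "C:\path\to\file.txt"
# Seed the counter on the first run (and after a logfile rollover)
if (!(Test-Path $countFile)) { 0 > $countFile }

$ssResult = Get-Content $file | Select-String $errorString -SimpleMatch
# Only errors beyond the previously recorded count are new
$errorCount = $ssResult.Count - [int](Get-Content $countFile)

# Persist the running total for the next pass
$ssResult.Count > $countFile
Exit $errorCount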
You basically gave a pretty good description of how it will work:
Read the last line number
if (Test-Path $Env:TEMP\last-line-number.txt) {
    [int]$LastLineNumber = @(Get-Content $Env:TEMP\last-line-number.txt)[0]
} else {
    $LastLineNumber = 0
}
Read the file
$contents = Get-Content $file
Find the first error starting at $LastLineNumber (one of the rare cases where for is appropriate in PowerShell, unless we want to create nicer objects):
for ($i = $LastLineNumber; $i -lt $contents.Count; $i++) {
    if ($contents[$i] -like "*$errorString*") {
        $i + 1 > $Env:TEMP\last-line-number.txt
        exit ($i + 1)
    }
}
Select-String returns MatchInfo objects, which have the line number, so you should be able to do something like this:
$lasterror = Get-Content $lasterrorfile
$newerrors = select-string -Path $file -Pattern $errorstring -SimpleMatch |
    where {$_.LineNumber -gt $lasterror}
Write-Host "$($newerrors.count) found."
if ($newerrors.count)
    {$newerrors[-1].LineNumber | Set-Content $lasterrorfile}
So this is my final script. Thanks, Dano. I'm sure the day-reset logic can be done more elegantly, but this seems to work :)
#logic for Day-Reset
Write-Host checking if its a new day...
$today = Get-Date -format dddd
$yesterday = Get-Content $ENV:TEMP\${adapterName}_yesterday.txt
Write-Host today variable is: $today
Write-Host yesterday variable is: $yesterday
Write-Host
IF ($today.CompareTo($yesterday))
{
    Get-Date -format dddd > $ENV:TEMP\${adapterName}_yesterday.txt
    0 > $ENV:TEMP\${adapterName}_numberofErrors.txt
}
Write-Host Setting variables...
$file = "${file}"
$errorString = "${errorString}"
Write-Host file variable is: $file
Write-Host errorString variable is: $errorString
Write-Host
Write-Host Select String Results:
$ssResult = Get-Content $file | Select-String $errorString -SimpleMatch
Write-Host There was $ssResult.Count `"$errorString`" statements found...
$errorCount = $ssResult.Count - (Get-Content $ENV:TEMP\${adapterName}_numberofErrors.txt)
Write-Host There was $errorCount new `"$errorString`" statements found...
Write-Host
$ssResult.Count > $Env:TEMP\FXAll_numberofErrors.txt
Exit $errorCount