PowerShell - Monitoring live log file Q2

I asked an initial question here, which was answered, but as I move along in my task I'm running into another problem.
Summary: I have a log file that's being written to via a serial device. I want to monitor this log file for particular strings (events), and when they occur I want to write those strings to a separate file.
Executing this one off does what I'm looking for:
$p = @("AC/BATT_PWR","COMM-FAULT")
$fileName = "SRAS_$(Get-Date -format yyyy-MM-dd).log"
$fullPath = "C:\temp\SRAS\$fileName"
Get-Content $fullpath -tail 1 -Wait | Select-String -Pattern $p -SimpleMatch | Out-File -Filepath C:\temp\SRAS\sras_pages.log -Append
The problem is the logfile gets a datestamp, putty saves it as SRAS_yyyy-mm-dd.log. So when the clock passes midnight this will no longer be looking at the correct file.
I found this post on SO which does exactly what I want to do, and the OP claims it works for him. I modified it slightly for my purposes, but it doesn't write events matching the desired strings to sras_pages.log.
This is the 'modified' code:
while($true)
{
$now = Get-Date
$fileName = "SRAS_$(Get-Date -format yyyy-MM-dd).log"
$fullPath = "C:\temp\SRAS\$fileName"
$p = @("AC/BATT_PWR","COMM-FAULT")
Write-Host "[$(Get-Date)] Starting job for file $fullPath"
$latest = Start-Job -Arg $fullPath -ScriptBlock {
param($file)
# wait until the file exists, just in case
while(-not (Test-Path $fullpath)){ sleep -sec 10 }
Get-Content $file -Tail 1 -wait | Select-String -Pattern $p |
foreach { Out-File -Filepath "C:\temp\SRAS\sras_pages.log" -Append }
}
# wait until day changes, or whatever would cause new log file to be created
while($now.Date -eq (Get-Date).Date){ sleep -Sec 10 }
# kill the job and start over
Write-Host "[$(Get-Date)] Stopping job for file $fullPath"
$latest | Stop-Job
}
If I execute just the Get-Content segment of that code it does exactly what I'm looking for. I can't figure out what the issue is.
TIA for advice.

Here are a few suggested changes that should make it work:
$p does not exist within the job; add it as a parameter ($pattern in my example).
You are referring to $fullpath within your job (row 13); it should be $file.
Add the -SimpleMatch parameter to Select-String to search for literal strings instead of regular expressions. (This is not strictly needed, but it will come in handy if you change the search pattern.)
Refer to $pattern instead of $p (see 1).
Skip the foreach on row 16.
Like this:
while($true)
{
$now = Get-Date
$fileName = "SRAS_$(Get-Date -format yyyy-MM-dd).log"
$fullPath = "C:\temp\SRAS\$fileName"
$p = @("AC/BATT_PWR","COMM-FAULT")
Write-Host "[$(Get-Date)] Starting job for file $fullPath"
$latest = Start-Job -Arg $fullPath, $p -ScriptBlock {
param($file,$pattern)
# wait until the file exists, just in case
while(-not (Test-Path $file)){ sleep -sec 10 }
Get-Content $file -Tail 1 -wait | Select-String -Pattern $pattern -SimpleMatch |
Out-File -Filepath "C:\temp\SRAS\sras_pages.log" -Append
}
# wait until day changes, or whatever would cause new log file to be created
while($now.Date -eq (Get-Date).Date){ sleep -Sec 10 }
# kill the job and start over
Write-Host "[$(Get-Date)] Stopping job for file $fullPath"
$latest | Stop-Job
}

Related

How do I ship logs from SharePoint in almost real time to a fileshare using PowerShell?

I've got a SharePoint farm where I'm trying to ship the log files in "real time" to a server that is available for the monitoring team using PowerShell.
I first had it going pretty well using Get-SPLogEvent, until I noticed that using the cmdlet itself produced log entries. And I was polling at 1-second intervals :O
So back to my original idea with using Get-Content -Wait then.
I put the log shipping in a job which is aborted when a newer log file is created.
This works reasonably well, except when the log file I'm trying to ship is too big to start with.
Most often, that is the case: the first log file I try to ship ends up empty, and the shipping starts only with the second log file.
Is there a way to have Get-Content -Wait work correctly in a pipe with files as large as 75 - 100 MB?
$SourceLogFolder = <somewhere local>
$LogShipFolder = <somewhere remote>
while ($true) {
$NewLog = Get-ChildItem $SourceLogFolder | Select -Last 1 #Find the latest log file
if ($NewLog.Name -ne $CurrentLog.Name) {#the current log file has been closed or is new
$CurrentLog = $NewLog
Get-Job | Remove-Job -Force #clear any previous log shippings
Start-Job {#Tail CurrentLog
Get-Content $Using:CurrentLog.FullName -Wait |
Out-File "$($Using:LogShipFolder)\$($Using:CurrentLog.Name)" -Encoding utf8
}#end job
}#end if
sleep -seconds 30
}#end while
OK, for some reason Out-File sometimes creates the new file once before writing to it when piping large files. Using Out-File -Force will have it overwrite the 0-byte shipped log file that was created.
[CmdletBinding()]
param (
[Parameter(Mandatory)][ValidateNotNullOrEmpty()]
[string]$SourceLogFolder,
[Parameter(Mandatory)][ValidateNotNullOrEmpty()]
[string]$LogShipFolder,
[int]$CheckInterval = 5, #seconds
[int]$CleanupTimeSpan = 3 #hours
)
$SourceLogFolder = $SourceLogFolder.TrimEnd('\')
$LogShipFolder = $LogShipFolder.TrimEnd('\')
$ShippedLogName = 'null'
while ($true) {
$NewLog = Get-ChildItem $SourceLogFolder | Select -Last 1
$ShippedLog = Get-ChildItem $ShippedLogName -ErrorAction SilentlyContinue
if (#the current log was closed or is new
$NewLog.Name -ne $CurrentLog.Name -and
$ShippedLog.Length -eq $ShippedLogJustNow.Length
) {
$CurrentLog = $NewLog
$ShippedLogName = "$LogShipFolder\$($CurrentLog.Name)"
Get-Job | Remove-Job -Force #kill previous log shipping
Start-ThreadJob {#Tail Current log
Get-Content ($Using:CurrentLog).FullName -Wait |
Out-File $Using:ShippedLogName -Encoding utf8 -Force
}#end job
Start-ThreadJob {#Cleanup shipped logs
$YoungestFileFound = 1
$OldLogFiles = Get-ChildItem $Using:LogShipFolder -Recurse |
where LastWriteTime -le (get-date).AddHours(-$Using:CleanupTimeSpan) |
Select -SkipLast $YoungestFileFound
$OldLogFiles | Remove-Item -Recurse -Force -Confirm:$false
}#end job
}#end if
$ShippedLogJustNow = Get-ChildItem $ShippedLogName -ErrorAction SilentlyContinue
sleep -Seconds $CheckInterval
[system.gc]::Collect() #Garbage collect to minimize memory leaks
}#end while

Creating a logfile for a PowerShell script day-wise

$dateTime = Get-Date -Format "yyyyMMdd"
$Logfile = $logfile + $dateTime +".log"
if((Get-ChildItem $Logfile).CreationTime.Date -ne (Get-Date).Date)
{
Write-Host "creating new"
New-Item -Path $Logfile -ItemType File -Force
}
else
{
Write-Host "existing"
}
## This function facilitates capturing various events into a log file when the script runs
function WriteLog
{
param([string]$Message)
filter timestamp {"$(Get-Date -Format G) $_"}
$Message = $Message | timestamp
Add-content $Logfile -value $Message
}
I am using this small piece of code, which creates a log file per day if one doesn't exist. It appends log messages whenever the WriteLog function is triggered.
The problem I'm facing: this works as expected for a maximum of 4 consecutive runs. After that the script still runs fine, but it no longer appends any messages to the logfile.
If it were me, I would define your log path variable once and then use it throughout your script. You seem to be trying to access the same name by different methods, which will be prone to errors. Your Write-Log function, for example, should take two parameters: Message and LogFile.
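A sketch of that two-parameter approach, reusing the timestamp idea from the WriteLog function above (the example path is illustrative):

```powershell
function Write-Log {
    param(
        [Parameter(Mandatory)][string]$Message,
        [Parameter(Mandatory)][string]$LogFile
    )
    # Prepend a timestamp, then append to the given file
    "$(Get-Date -Format G) $Message" | Add-Content -Path $LogFile
}

# Usage: build the path once, pass it everywhere
Write-Log -Message 'Script started' -LogFile "C:\Logs\script_$(Get-Date -Format yyyyMMdd).log"
```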
For an easier way to create logs using a Transcript, I have a logs sub folder and I just top and tail my script with the following:
# Start logging
#######################################################################
$log_path = $PSScriptRoot | Join-Path -ChildPath 'Logs'
$script_name = $PSCommandPath -replace '^.*\\([^\\]+)\.ps1$', '$1'
$log_name = '{0}_{1:yyyyMMddhhmmss}.log' -f $script_name, (Get-Date)
$log = $log_path | Join-Path -ChildPath $log_name
Start-Transcript -Path $log -Append -IncludeInvocationHeader
# Do stuff
# Use Write-Host for log comments
# Captures errors without additional code
# Stop logging
#######################################################################
Stop-Transcript

Powershell - How to rename a file after moving it?

I am currently working on a project that requires that I move a file, and then rename it. I am using this code to move it and that is working. However, the rename portion is not taking place as it should. I cannot figure out why this isn't working. What have I goofed up? I have been beating my head against my desk for at least 20 minutes trying to figure this out.
# Variables for Watcher
$folder = "C:\Program Files\Whatever\Connector\Export\JobStatus"
$filter = '*.txt'
$date=(get-date -Format d) -replace("/")
$time=(get-date -Format t) -replace(":")
# Watcher + Settings
$fsw = New-Object IO.FileSystemWatcher $folder, $filter
$fsw.IncludeSubdirectories = $false
$fsw.NotifyFilter = [IO.NotifyFilters]'FileName', 'DirectoryName'
# Register Event (when file is created)
$onCreated = Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
# Foreach file loop
ForEach ($f in $fsw)
{
if (($File = Get-Item $Event.SourceEventArgs.FullPath | select -Expand Extension) -eq ".txt")
{
#Used for file testing - Opens the text file for 10 secs, then kills it.
#Start-Process -FilePath $Event.SourceEventArgs.FullPath | %{ sleep 10; $_ } | kill
# Variables for move
$folderpath = ($Event.SourceEventArgs.FullPath | Split-Path)
$folderfile = ($Event.SourceEventArgs.FullPath | Split-Path -Leaf)
$destination = "C:\Program Files\Whatever\Connector\Staging\"
$newname = "job.import.$date"+"_"+"$time.txt"
}
# Variables for logging
$logpath = 'C:\Program Files\Whatever\Connector\Export\JobStatus\outlog.txt'
# Grab current file and move to "Staging" folder
try
{
Get-ChildItem -Path $folderpath -Filter $folderfile | Move-Item -Destination $destination | sleep 5 | Write-Host Rename-Item $destination$folderfile -NewName $newname | Out-File -FilePath $logpath -Append
Write-Host $destination$newname
#sleep 5
#Rename-Item "$destination $folderfile" -NewName $newname
#Write-Host $destination $folderfile
#"File $folderfile renamed to $newname" | Out-File -FilePath $logpath -Append
# Log the move in logfile
"File $folderfile moved to $destination" | Out-File -FilePath $logpath -Append
}
# Log if errors + clear
catch
{
$error | Out-File -FilePath $logpath -Append
$Error.Clear()
}
}
}
The pipeline is broken when there's no object output. Move-Item doesn't output an object unless the -PassThru parameter is used. Also, Start-Sleep doesn't output anything. So Rename-Item is never reached.
Replace the pipes after Move-Item and sleep with semicolons, and it should work.
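Alternatively, the move-then-rename can stay a single pipeline if -PassThru is used at each step; a sketch using the variables from the question:

```powershell
Get-ChildItem -Path $folderpath -Filter $folderfile |
    Move-Item -Destination $destination -PassThru |   # -PassThru emits the moved file object
    Rename-Item -NewName $newname -PassThru |         # which then emits the renamed file
    Out-File -FilePath $logpath -Append               # which gets logged
```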
I actually fixed this by removing the piped rename and replacing it with a 5 second sleep. I do the rename after the sleep and it works fine now. Still not sure why the rename wasn't working in the piped command, though.

Combine output from two Jobs

I have a Windows command line utility which outputs text to standard output, as well as logging to a file. The two outputs complement each other, and so I want to be able to combine both streams. The standard output will remain untouched. However, I aim to take chunks of the log file, process them, and also send them to standard output.
My first attempt was to run:
[HashTable] $queue = @{};
$roboDealerJob = Start-Job -ScriptBlock {
Param(
[string] $outputFile,
[HashTable] $queue
)
& "utility.exe" $outputFile |
ForEach-Object
{
$queue.Add($_, $_);
}
} -Name "MyUtility" -ArgumentList $outputFile,$queue;
While (!(Test-Path $logFile))
{
Start-Sleep -Milliseconds 100;
}
Get-Content -Path $logFile -Tail 1 -Wait |
ForEach-Object {
Receive-Job $roboDealerJob
ForEach ($item in $queue.Values)
{
Write-Host $item;
$queue.Remove($item);
}
Write-Host $_;
}
However, the job doesn't seem to run.
Any suggestions on how to do this?
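One likely culprit, offered as a hedged note: Start-Job runs the script block in a separate process, and arguments passed via -ArgumentList are serialized, so the job adds entries to a copy of $queue; the caller's hashtable never changes. A sketch of an in-process alternative using a synchronized hashtable with Start-ThreadJob (built into PowerShell 7, available as the ThreadJob module on Windows PowerShell), keeping the utility.exe and $outputFile names from the question:

```powershell
# Shared, thread-safe hashtable; the thread job mutates the caller's instance directly
$queue = [hashtable]::Synchronized(@{})
$job = Start-ThreadJob -ScriptBlock {
    & "utility.exe" $using:outputFile | ForEach-Object {
        ($using:queue)[$_] = $_   # visible to the main runspace, same process
    }
}
```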

PowerShell run script simultaneously

I created a PowerShell script to remove all files and folders older than X days. This works perfectly fine and the logging is also ok. Because PowerShell is a bit slow, it can take some time to delete these files and folders when big quantities are to be treated.
My question: how can I have this script run on multiple directories ($Target) at the same time?
Ideally, we would like to have this in a scheduled task on Win 2008 R2 server and have an input file (txt, csv) to paste some new target locations in.
Thank you for your help/advice.
The script
#================= VARIABLES ==================================================
$Target = "\\share\dir1"
$OlderThanDays = "10"
$Logfile = "$Target\Auto_Clean.log"
#================= BODY =======================================================
# Set start time
$StartTime = (Get-Date).ToShortDateString()+", "+(Get-Date).ToLongTimeString()
Write-Output "`nDeleting folders that are older than $OlderThanDays days:`n" | Tee-Object $LogFile -Append
Get-ChildItem -Directory -Path $Target |
Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$OlderThanDays) } | ForEach {
$Folder = $_.FullName
Remove-Item $Folder -Recurse -Force -ErrorAction SilentlyContinue
$Timestamp = (Get-Date).ToShortDateString()+" | "+(Get-Date).ToLongTimeString()
# If folder can't be removed
if (Test-Path $Folder)
{ "$Timestamp | FAILLED: $Folder (IN USE)" }
else
{ "$Timestamp | REMOVED: $Folder" }
} | Tee-Object $LogFile -Append # Output folder names to console & logfile at the same time
# Set end time & calculate runtime
$EndTime = (Get-Date).ToShortDateString()+", "+(Get-Date).ToLongTimeString()
$TimeTaken = New-TimeSpan -Start $StartTime -End $EndTime
# Write footer to log
Write-Output ($Footer = @"
Start Time : $StartTime
End Time : $EndTime
Total runtime : $TimeTaken
$("-"*79)
"@)
# Create logfile
Out-File -FilePath $LogFile -Append -InputObject $Footer
# Clean up variables at end of script
$Target=$StartTime=$EndTime=$OlderThanDays = $null
One way to achieve this would be to write an "outer" script that passes the directory-to-be-cleaned, into the "inner" script, as a parameter.
For your "outer" script, have something like this:
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList;
foreach ($Directory in $DirectoryList) {
Start-Process -FilePath powershell.exe -ArgumentList ('"{0}\InnerScript.ps1" -Path "{1}"' -f $PSScriptRoot, $Directory);
}
Note: Using Start-Process kicks off a new process that is, by default, asynchronous. If you use the -Wait parameter, then the process will run synchronously. Since you want things to run more quickly and asynchronously, omitting the -Wait parameter should achieve the desired results.
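For contrast, the synchronous variant of the loop body would simply add -Wait, which blocks until each child process exits before the next one starts:

```powershell
# Runs the inner scripts one at a time instead of in parallel
Start-Process -FilePath powershell.exe -Wait -ArgumentList ('"{0}\InnerScript.ps1" -Path "{1}"' -f $PSScriptRoot, $Directory)
```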
Invoke-Command
Alternatively, you could use Invoke-Command to kick off a PowerShell script, using the parameters: -FilePath, -ArgumentList, -ThrottleLimit, and -AsJob. The Invoke-Command command relies on PowerShell Remoting, so that must be enabled, at least on the local machine.
Add a parameter block to the top of your "inner" script (the one you posted above), like so:
param (
[Parameter(Mandatory = $true)]
[string] $Path
)
That way, your "outer" script can pass in the directory path, using the -Path parameter for the "inner" script.
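Putting it together, the Invoke-Command version of the "outer" script might look like this (a sketch, assuming remoting is enabled locally and the same DirList file and InnerScript.ps1 names as above):

```powershell
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList
$jobs = foreach ($Directory in $DirectoryList) {
    # -AsJob returns a job object immediately instead of blocking
    Invoke-Command -ComputerName localhost `
        -FilePath "$PSScriptRoot\InnerScript.ps1" `
        -ArgumentList $Directory -AsJob
}
$jobs | Wait-Job | Receive-Job   # collect the output once all runs finish
```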