I'm trying to put together a script to monitor MSMQ on a server. I found this and it works like a charm, but I also need to get the time of arrival.
When using the Get-Member cmdlet I get a list of properties, but none of them seems to give me what I need.
Does anyone know how to get time of arrival?
/G
Finally figured it out.
[String]$cName = $Env:COMPUTERNAME
[Reflection.Assembly]::LoadWithPartialName("System.Messaging") | out-null
[System.Messaging.MessageQueue[]]$queues = [System.Messaging.MessageQueue]::GetPrivateQueuesByMachine($cName.ToLower())
Foreach ($queue in $queues) {
    # Retrieve all message properties, including ArrivedTime
    $queue.MessageReadPropertyFilter.SetAll()
    try {
        # Peek at the first message without removing it from the queue
        [System.Messaging.Message]$message = $queue.Peek(10)
        Write-Host $queue.QueueName $message.ArrivedTime
    }
    catch {
        #Write-Host "Timeout"
    }
}
The trick is to set the MessageReadPropertyFilter. It is possible to set each property separately but this will do for now.
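For example, to retrieve only the arrival time rather than everything, something like this should work (an untested sketch; note that ClearAll also drops the default properties, so add back anything else you need):
# Request just the ArrivedTime property instead of calling SetAll()
$queue.MessageReadPropertyFilter.ClearAll()
$queue.MessageReadPropertyFilter.ArrivedTime = $true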
$qname = "<YourQueueNameHere>"
[Reflection.Assembly]::LoadWithPartialName("System.Messaging") | out-null
$q = new-object System.Messaging.MessageQueue($qname)
# Make sure ArrivedTime is among the retrieved properties (see the note about MessageReadPropertyFilter above)
$q.MessageReadPropertyFilter.SetAll()
$msgs = $q.GetAllMessages()
foreach ( $msg in $msgs )
{
    Write-Host $msg.ArrivedTime
}
Documentation for Message here.
Documentation for Message.ArrivedTime here.
Here is what worked for me. In my case, the script counts how many messages have been received in the past hour.
$QueueArray = "MyQueue1.ERRORS", "MyQueue2.ERRORS";
foreach ($queue in $QueueArray)
{
    # Create each MSMQ object
    [System.Reflection.Assembly]::LoadWithPartialName("System.Messaging") | Out-Null
    $MessageQueue = New-Object System.Messaging.MessageQueue (".\private$\" + $queue)
    # Count the messages that arrived in the queue within the past hour
    $MessageQueue.MessageReadPropertyFilter.SetAll()
    $PastHour = (Get-Date).AddHours(-1)
    $countFailedMessages = 0
    foreach ($message in $MessageQueue)
    {
        if ($PastHour -lt $message.ArrivedTime)
        {
            #Write-Host $message.ArrivedTime
            $countFailedMessages++
            #Write-Host $countFailedMessages
        }
    }
    Write-Host "$queue`t$countFailedMessages failed messages for the past hour"
}
I have a scenario where I am doing any number of things with a network resource: copying files or folders, deleting a ZIP file after it has been unzipped locally, checking an EXE to see if it was downloaded from the internet and needs to be unblocked, running a software install, etc. All of these tasks can be affected by things like an installer that hasn't yet released its lock on a log file I want to delete, an unzip that hasn't released its lock on the ZIP file, or the network being "unavailable" for a fleeting moment so that a network file is not found.
I have a technique that works well for handling this scenario: I loop, reacting to the specific exception, and when that exception occurs I just sleep for 6 seconds. The loop runs up to 10 times before I give up. Here is the code for dealing with an Autodesk log that is still locked by the Autodesk installer.
$waitRetryCount = 10
:waitForAccess do {
    try {
        Remove-Item $odisLogPath -Recurse -errorAction:Stop
        $retry = $False
    } catch {
        if (($PSItem.Exception -is [System.IO.IOException]) -or ($PSItem.Exception.InnerException -and ($PSItem.Exception.InnerException -is [System.IO.IOException]))) {
            if ($waitRetryCount -eq 0) {
                $invokeODISLogManagement.log.Add('E_Error deleting ODIS logs: retry limit exceeded')
                break waitForAccess
            } else {
                $retry = $True
                $waitRetryCount--
                Start-Sleep -s:6
            }
        } else {
            $invokeODISLogManagement.log.Add('E_Error deleting ODIS logs')
            $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.GetType().FullName)")
            $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.Message)")
            if ($PSItem.Exception.InnerException) {
                $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.InnerException.GetType().FullName)")
                $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.InnerException.Message)")
            }
            $retry = $False
        }
    }
} while ($retry)
The thing is, I would like to convert this to a function, since the pattern is needed in a lot of places. I would need to pass the function the specific exception I am looking for and the code to run in the try block, and get back a log (as a generic List) that I can then add to the actual log list. The first and last parts I have; I am unsure of the best approach for the code to try. In the above example it is a single line, Remove-Item $odisLogPath -Recurse -errorAction:Stop, but I suspect it could be multiple lines.
To start playing with this I verified that this does seem to work, at least with a single line of code.
$code = {Get-Item '\\noServer\folder\file.txt' -errorAction:Stop}
try {
    & $code
} catch {
    Write-Host "$($_.Exception.GetType().FullName)"
}
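The same pattern seems to work when the script block contains multiple statements, for example (a quick sketch, the paths are just placeholders):
# A hypothetical multi-statement script block; each cmdlet uses -ErrorAction Stop
# so that any failure surfaces as a terminating error the catch can see
$code = {
    Get-Item '\\noServer\folder\file.txt' -ErrorAction Stop
    Remove-Item '\\noServer\folder\oldFile.txt' -ErrorAction Stop
}
try {
    & $code
} catch {
    Write-Host "$($_.Exception.GetType().FullName)"
}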
But that error action is going to be duplicated a lot, so I thought I might address it within the function. However,
$code = {Get-Item '\\noServer\folder\file.txt'}
try {
    & $code -errorAction:Stop
} catch {
    Write-Host "$($_.Exception.GetType().FullName)"
}
does NOT work; the exception goes uncaught.
So, my questions are:
1: Is this the right direction? I am pretty sure it is, but perhaps someone sees a gotcha that I am not seeing yet. :)
2: Is there a mechanism to add the -ErrorAction:Stop in the try, so I don't need to do it (or remember to do it) at every use of this new function? (One idea I am exploring is sketched just after this list.)
3: I seem to remember reading about a programming concept of passing code to a function, and I can't remember what it is called, but I would like to know the generic term. Indeed, it would probably help if I could tag it for this post. I had thought it might be "lambda", but a quick search suggests that is not quite the case? Being self-taught sucks sometimes.
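For question 2, one mechanism I have been looking at (and I am not 100% sure of the scoping rules here, so treat this as a sketch) is to set $ErrorActionPreference = 'Stop' inside the function before invoking the script block. Because preference variables are dynamically scoped, cmdlets run from the script block should see it, as long as the function and the script block live in the same session state (no modules involved):
function Invoke-PxWithStop {
    # Hypothetical helper name, just to illustrate the idea
    param (
        [System.Management.Automation.ScriptBlock]$code
    )
    # Local to this function and its child scopes, so the caller
    # does not have to remember -ErrorAction Stop in the script block
    $ErrorActionPreference = 'Stop'
    try {
        & $code
    } catch {
        Write-Host "$($_.Exception.GetType().FullName)"
    }
}

$code = {Get-Item '\\noServer\folder\file.txt'}
Invoke-PxWithStop -code $code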
EDIT:
I have now implemented a function that starts to do what I want.
function Invoke-PxWaitForAccess {
    param (
        [System.Management.Automation.ScriptBlock]$code,
        [String]$path
    )
    try {
        (& $code -path $path)
    } catch {
        Return "$($_.Exception.GetType().FullName)!!"
    }
}

$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Write-Host "$path!"; Get-Item $path}
Invoke-PxWaitForAccess -code $code -path $path
I do wonder if the path couldn't somehow be encapsulated in the $code variable itself (see the closure sketch below), since this implementation means the function can ONLY be used where the code being run takes a single parameter called $path.
And I am still wondering if this really is the best, or even a good, way to proceed, or whether there are arguments for just implementing my loop 50-some-odd times in all the situations where I need this behavior.
Also worth noting that this code does not yet implement the loop or address the fact that different exceptions apply in different situations.
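On the encapsulation question, one option I have been reading about is GetNewClosure(), which bakes the current value of variables such as $path into the script block itself, so the function would not need a -path parameter at all. A sketch (not wired into the function yet):
$path = '\\noServer\folder\file.txt'
# GetNewClosure() captures the current value of $path inside the script block
$code = { Get-Item $path -ErrorAction Stop }.GetNewClosure()
# The block can now be invoked with no extra parameters
& $code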
EDIT #2:
And here is a more complete implementation, though it fails because it seems I am not actually passing a type, even though it looks like I am. So I get an error because what is to the right of -is must be an actual type.
function Invoke-PxWaitForAccess {
    param (
        [System.Management.Automation.ScriptBlock]$code,
        [String]$path,
        [Type]$exceptionType
    )
    $invokeWaitForAccess = @{
        success = $Null
        log = [System.Collections.Generic.List[String]]::new()
    }
    $waitRetryCount = 2
    :waitForAccess do {
        try {
            Write-Host "$path ($waitRetryCount)"
            & $code -path $path
            $retry = $False
        } catch {
            Write-Host "!$($PSItem.Exception.GetType().FullName)"
            if (($PSItem.Exception -is $exceptionType) -or ($PSItem.Exception.InnerException -and ($PSItem.Exception.InnerException -is $exceptionType))) {
                Write-Host "($waitRetryCount)"
                if ($waitRetryCount -eq 0) {
                    $invokeWaitForAccess.log.Add('E_retry limit exceeded')
                    break waitForAccess
                } else {
                    $retry = $True
                    $waitRetryCount--
                    Start-Sleep -s:6
                }
            } else {
                $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.GetType().FullName)")
                $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.Message)")
                if ($PSItem.Exception.InnerException) {
                    $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.InnerException.GetType().FullName)")
                    $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.InnerException.Message)")
                }
                $retry = $False
            }
        }
    } while ($retry)
    if ($invokeWaitForAccess.log.Count -eq 0) {
        $invokeWaitForAccess.success = $True
    } else {
        $invokeWaitForAccess.success = $False
    }
    return $invokeWaitForAccess
}

$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Get-Item $path -errorAction:Stop}
if ($invoke = (Invoke-PxWaitForAccess -code $code -path $path -type ([System.Management.Automation.ItemNotFoundException])).success) {
    Write-Host 'Good'
} else {
    foreach ($line in $invoke.log) {
        Write-Host "$line"
    }
}
EDIT #3: This is what I have now, and it seems to work fine. But the code I am passing will sometimes be something like Remove-Item, where the exception I care about is [System.IO.IOException], while at other times I actually need to return more than an error, as here where the code involves Get-Item. That means defining the code block outside the function with a reference to a variable inside the function, which seems, fugly, to me (one possible alternative is sketched after the examples below). It may be that what I am trying to do is just more complicated than PowerShell is really designed to handle, but it seems MUCH more likely that there is a more elegant way to do what I am trying to do. Without being able to manipulate the script block from within the function I don't see any good options.
For what it is worth, this last example shows a failure where the exception I am accepting for the retry occurs and hits the limit, a failure that is immediate because the exception is not the one I am looping on, and an example where I return something. A fourth condition would be when I am deleting and waiting on [System.IO.IOException]; a success there would return nothing, no item and no error log.
function Invoke-PxWaitForAccess {
    param (
        [System.Management.Automation.ScriptBlock]$code,
        [String]$path,
        [Type]$exceptionType
    )
    $invokeWaitForAccess = @{
        item = $null
        errorLog = [System.Collections.Generic.List[String]]::new()
    }
    $waitRetryCount = 2
    :waitForSuccess do {
        try {
            & $code -path $path
            $retry = $False
        } catch {
            if (($PSItem.Exception -is $exceptionType) -or ($PSItem.Exception.InnerException -and ($PSItem.Exception.InnerException -is $exceptionType))) {
                if ($waitRetryCount -eq 0) {
                    $invokeWaitForAccess.errorLog.Add('E_Retry limit exceeded')
                    break waitForSuccess
                } else {
                    $retry = $True
                    $waitRetryCount--
                    Start-Sleep -s:6
                }
            } else {
                $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.GetType().FullName)")
                $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.Message)")
                if ($PSItem.Exception.InnerException) {
                    $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.InnerException.GetType().FullName)")
                    $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.InnerException.Message)")
                }
                $retry = $False
            }
        }
    } while ($retry)
    return $invokeWaitForAccess
}
CLS
$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Get-Item $path -errorAction:Stop}
$invoke = (Invoke-PxWaitForAccess -code $code -path $path -exceptionType:([System.Management.Automation.ItemNotFoundException]))
if ($invoke.errorLog.count -eq 0) {
    Write-Host "Good $path"
} else {
    foreach ($line in $invoke.errorLog) {
        Write-Host "$line"
    }
}
Write-Host

$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Get-Item $path -errorAction:Stop}
$invoke = (Invoke-PxWaitForAccess -code $code -path $path -exceptionType:([System.IO.IOException]))
if ($invoke.errorLog.count -eq 0) {
    Write-Host "Good $path"
} else {
    foreach ($line in $invoke.errorLog) {
        Write-Host "$line"
    }
}
Write-Host

$path = '\\Mac\iCloud Drive\Px Tools 3.#\# Dev 3.4.5\Definitions.xml'
$code = {param ([String]$path) $invokeWaitForAccess.item = Get-Item $path -errorAction:Stop}
$invoke = (Invoke-PxWaitForAccess -code $code -path $path -exceptionType:([System.Management.Automation.ItemNotFoundException]))
if ($invoke.errorLog.count -eq 0) {
    Write-Host "Good $path !"
    Write-Host "$($invoke.item)"
} else {
    foreach ($line in $invoke.errorLog) {
        Write-Host "$line"
    }
}
Write-Host
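Regarding the fugly part mentioned above, one alternative I am considering (a simplified sketch with a hypothetical Invoke-PxWaitForAccessSketch name and without the retry loop) is to have the function capture whatever the script block writes to the pipeline, so the block never has to reference the function's internal hashtable:
function Invoke-PxWaitForAccessSketch {
    param (
        [System.Management.Automation.ScriptBlock]$code,
        [String]$path
    )
    $result = @{
        item     = $null
        errorLog = [System.Collections.Generic.List[String]]::new()
    }
    try {
        # Whatever the script block writes to the pipeline is captured here;
        # Remove-Item emits nothing, Get-Item emits the item
        $result.item = & $code -path $path
    } catch {
        $result.errorLog.Add("=_$($PSItem.Exception.GetType().FullName)")
    }
    return $result
}

$code = {param ([String]$path) Get-Item $path -errorAction:Stop}
$invoke = Invoke-PxWaitForAccessSketch -code $code -path 'C:\Windows\win.ini'
Write-Host "$($invoke.item)"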
As the title suggests, how do I write an if statement that compares 2 files? This is what I have so far, but even though the files are the same it still keeps prompting me for an update.
Add-Type -AssemblyName 'PresentationFramework'
$continue = 'Yes'
Invoke-WebRequest -Uri "https://<url>/file" -OutFile "C:\temp\file"
if ( "C:\temp\file" -ne "C:\app\file" ) {
    $caption = "Update Available!"
    $message = "There is a new version available. Do you want to update to the latest version?"
    $continue = [System.Windows.MessageBox]::Show($message, $caption, 'YesNo');
} else {
    Write-Output "You have the latest version!"
}
if ($continue -eq 'Yes') {
    Write-Output "Downloading application Please wait..."
    # Do something
} else {
    Write-Output "Cancelled"
}
Any help would be greatly appreciated! Thanks
This is the logic I would personally follow; you can restructure it later for your needs, but in very basic steps:
Get the MD5 sum of my current file.
Query the new file with Invoke-WebRequest and, using a MemoryStream, check if the new hash is the same as the previous one.
If it's not, write that stream to a FileStream.
$ErrorActionPreference = 'Stop'
try {
    $previousFile = "C:\app\file"
    $previousHash = (Get-FileHash $previousFile -Algorithm MD5).Hash
    $webReq = Invoke-WebRequest 'https://<url>/file'
    # Hash the downloaded content in memory, without writing it to disk first
    $memStream = [IO.MemoryStream] $webReq.Content
    $newHash = (Get-FileHash -InputStream $memStream -Algorithm MD5).Hash
    if($previousHash -ne $newHash) {
        # Only write the new file when the hashes differ
        $file = [IO.File]::Create('path\to\mynewfile.ext')
        $memStream.WriteTo($file)
    }
}
finally {
    $file, $memStream | ForEach-Object Dispose
}
The main idea behind it is to do the check in memory, so the downloaded content is only written to disk when it is actually needed.
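If you would rather keep the question's approach of downloading to a temporary file first, the same idea works on the two files directly; a minimal sketch using the paths from the question:
# Compare the two files by hash rather than by their path strings
$newHash = (Get-FileHash "C:\temp\file" -Algorithm MD5).Hash
$oldHash = (Get-FileHash "C:\app\file" -Algorithm MD5).Hash
if ($newHash -ne $oldHash) {
    Write-Output "Update available"
} else {
    Write-Output "You have the latest version!"
}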
I'm trying to check if a process is running on a remote computer (eventually this will be about 100 computers). If the process is not running, I'd like the script to put the computer name/IP into a CSV and then email that out. If the process is running on all machines, I'd like the script not to send an email at all. To do this, I'd like to test the machines first to check they're online (if they're offline, we've either got bigger problems or it's off for a reason, but that's not what this process is checking for).
I'm going to be testing this script on a few machines with just the notepad process for the moment, as it's something I can start and stop on test machines relatively quickly.
I'm a little stuck at the moment, because I don't know how to get the results from the process check into a CSV and then emailed. The snippet below isn't generating the output file yet, but I have left in the variable I was testing with and the path where the attachment would go in Send-MailMessage. Any advice will be appreciated; I'm still learning PowerShell, so I don't know all the tricks and tips yet.
Cheers
# Mail server Configuration
$MailServer = "mail.server.co.uk"
$MailFrom = "MailFrom@server.co.uk"
# Mail Content Configuration
$MailTo = "Recipient@Server.co.uk"
$MailSubjectFail = "INS Process not running on $DAU"
$MailBodyFail = "The INS Process on the DAU $DAU is not running. Please manually start process on DAU $DAU"
# Process Info
$Process = "Notepad"
$ProcessIsRunning = { Get-Process $Process -ErrorAction SilentlyContinue }
#Results Info
$Exportto = "C:\Scripts\Content\INSChecker\Results.csv"
# Get DAU Information
foreach($line in (Get-Content C:\Scripts\Content\INSChecker\INSList.cfg)){
    $line = $line.split(",")
    $DAU = $line[0]
    $DAUIP = $line[1]
    # Test Connection to INS DAU
    write-host "Testing: $DAU / $DAUIP"
    $TestDAU = Test-Connection $DAU -quiet
    $TestDAUIP = Test-Connection $DAUIP -quiet
    write-host "Tests: $TestDAU / $TestDAUIP"
    If($TestDAU -ne 'True'){
        If($TestDAUIP -ne 'True'){
            write-host "DNS Not resolved for $DAU"
            Write-Output "INS $DAU/$DAUIP is OFFLINE" | Out-File C:\Scripts\Content\INSChecker\INSProcessCheck.log -append
        }
    }
    Else{
        # Get Process Running State and Send Email
        if(!$ProcessIsRunning.Invoke()) {
            Send-MailMessage -To $MailTo -From $MailFrom -SmtpServer $MailServer -Subject $MailSubjectFail -Body $MailBodyFail -Attachments C:\Scripts\Content\INSChecker\Results.csv
        } else {
            "Running"
        }
    }
}
Hopefully this gives you a hint on where to begin and how to approach the problem. I have removed the irrelevant parts of the script and only left the logic I would personally follow.
The result of $report should be an object[] (object array) which should be very easy to manipulate and very easy to export to CSV:
@($report).where({ $_.SendMail }) | Export-Csv $exportTo -NoTypeInformation
I'll leave you the remaining tasks (attach the CSV, send the emails, etc) for your own research and design.
$ErrorActionPreference = 'Stop'
# Process Info
$Process = "Notepad"
$ProcessIsRunning = {
    param($computer, $process)
    # On Windows PowerShell -ComputerName is an option,
    # this was removed on PS Core
    try
    {
        $null = Get-Process $process -ComputerName $computer
        # If process is running return 'Running'
        'Running'
    }
    catch
    {
        # else return 'Not Running'
        'Not Running'
        # send a Warning to the console to understand why this
        # failed ( couldn't connect or the process is not running? )
        Write-Warning $_.Exception.Message
    }
}
#Results Info
$ExportTo = "C:\Scripts\Content\INSChecker\Results.csv"
$exportProps = 'Server', 'IP', 'Ping', 'DNSResolution', 'Process', 'SendMail'
# Get DAU Information
$report = foreach($line in Get-Content path/to/file.txt)
{
    $status = [ordered]@{} | Select-Object $exportProps
    $DAU, $DAUIP = $line = $line.split(",")
    $status.SendMail = $false
    $status.Server = $DAU
    $status.IP = $DAUIP
    # Test ICMP Echo Request and DNS Resolution
    $ping = Test-Connection $DAUIP -Quiet
    $dns = Test-Connection $DAU -Quiet
    $status.Ping = ('Failed', 'Success')[$ping]
    $status.DNSResolution = ('Failed', 'Success')[$dns]
    $status.Process = & $ProcessIsRunning -computer $DAUIP -process $Process
    if(-not $ping -or -not $dns -or $status.Process -eq 'Not Running')
    {
        $status.SendMail = $true
    }
    [pscustomobject]$status
}

@($report).where({ $_.SendMail }) # => This is what should be mailed
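If it helps, here is a rough sketch of the export and the conditional mail, reusing $report and $ExportTo from above; the mail values are just the placeholders from your question and this part is untested:
# Only the rows flagged for mailing go into the CSV; if none, no email is sent
$failed = @($report).Where({ $_.SendMail })
if($failed.Count -gt 0)
{
    $failed | Export-Csv $ExportTo -NoTypeInformation
    Send-MailMessage -To 'Recipient@Server.co.uk' -From 'MailFrom@server.co.uk' `
        -SmtpServer 'mail.server.co.uk' -Subject 'INS Process not running' `
        -Body 'See the attached CSV for the affected machines.' -Attachments $ExportTo
}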
I want to know if it's bad form to use try blocks to test if a file is locked. Here's the background.
I need to send the text output of an application to two serial printers simultaneously. My solution was to use MportMon and a PowerShell script. The way it's supposed to work is that the application prints by default to the MportMon virtual printer port, which actually creates a uniquely named file in a "dropbox" folder. The PowerShell script uses a FileSystemWatcher to monitor the folder, and when a new file is created it takes the textual content, pushes it out to the two serial printers, and then deletes the file so the folder doesn't fill up. I was having a problem reading the text from the file the virtual printer created: I was getting errors because the file was still locked. To fix the problem I used an FSM to implement the logic, and instead of checking for a lock every time before attempting to get the content from the file, I used a try block that attempts to read the content; if it fails, the catch block just reaffirms the state the FSM is in, and the process is repeated until it succeeds. It seems to work fine, but I've read somewhere that this is bad practice. Is there any danger in this method, or is it safe and reliable? Below is my code.
$fsw = New-Object system.io.filesystemwatcher
$q = New-Object system.collections.queue
$path = "c:\DropBox"
$fsw.path = $path
$state = "waitforQ"
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName created -Action {
    $q.enqueue( $event.sourceeventargs.fullpath )
}
while($true) {
    switch($state)
    {
        "waitforQ" {
            echo "waitforQ"
            if ($q.count -gt 0 ) {$state = "retrievefromQ"}
        }
        "retrievefromQ" {
            echo "retrievefromQ"
            $tempPath = $q.dequeue()
            $state = "servicefile"
        }
        "servicefile" {
            echo " in servicefile "
            try
            {
                $text = Get-Content -ErrorAction stop $tempPath
                #echo "in try"
                $text | out-printer db1
                $text | out-printer db2
                echo " $text "
                $state = "waitforQ"
                rm $tempPath
            }
            catch
            {
                #echo "in catch"
                $state = "servicefile"
            }
        }
        Default { $state = "waitforQ" }
    }
}
I wouldn't say it's bad practice to test a file to see if it's locked, but it's not as clean as checking the handles used by other processes. Personally I'd test the file like you do, but I'd adjust a few parts to make it safer/better.
That switch statement looks way too complicated (to me); I'd replace it with a simple if-test: "If there are files in the queue, proceed; if not, wait."
You need to slow down. You will try to read the file as many times as possible while it's locked, which is a waste of resources, since it takes some time for the other application to let go of the file and flush the data to disk. Add some pauses; you won't notice them, but your CPU will love them. The same applies when there are no files in the queue.
You might benefit from adding a timeout, like max 50 attempts to read the file, to avoid the script getting stuck if one specific file is never released.
Try:
$fsw = New-Object system.io.filesystemwatcher
$q = New-Object system.collections.queue
$path = "c:\DropBox"
$fsw.path = $path
$MaxTries = 50    #50 tries * 0.2s sleep = 10 sec timeout
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName created -Action {
    $q.enqueue( $event.sourceeventargs.fullpath )
}
while($true) {
    if($q.Count -gt 0) {
        #Get next file in queue
        $tempPath = $q.dequeue()
        #Read file
        $text = $null
        $i = 0
        while($text -eq $null) {
            #If locked, wait and try again
            try {
                $text = Get-Content -Path $tempPath -ErrorAction Stop
            } catch {
                $i++
                if($i -eq $MaxTries) {
                    #Max attempts reached. Stops script
                    Write-Error -Message "Script is stuck on locked file '$tempPath'" -ErrorAction Stop
                } else {
                    #Wait
                    Start-Sleep -Milliseconds 200
                }
            }
        }
        #Print file
        $text | Out-Printer db1
        $text | Out-Printer db2
        echo " $text "
        #Remove temp-file
        Remove-Item $tempPath
    }
    #Relax..
    Start-Sleep -Milliseconds 500
}
I have a small PowerShell program that starts a few threads to do parallel calculations, and when they are finished they append a line with the results to a text file and proceed to do some more. This worked fine in development and testing, but occasionally in production it hangs, and it seems the file is "jammed open". I have the writes wrapped in try blocks, but that does not help. I have written a toy application to illustrate the problem; it usually hangs after about 10-15 minutes (and about 3000 lines written).
It seems to me I would have been better off with a Python solution using mutexes or something, but I am pretty far down this road now. I'm looking for ideas on how I can easily fix this. I really thought Add-Content would have been atomic...
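For what it is worth, the mutex idea does not require leaving PowerShell: a named system mutex can be shared by the job processes to serialize the appends. A rough sketch of what each writer could do (the mutex name is arbitrary and this is untested in this exact setup):
# Every job opens the same named mutex; the 'Global\' prefix makes it visible across sessions
$mutex = New-Object System.Threading.Mutex($false, 'Global\FileCollideLogMutex')
$null = $mutex.WaitOne()
try
{
    # Replace with the real path and line (e.g. $fileappendout and $sout in childjob.ps1 below)
    Add-Content -Path 'c:\transfer\filecollide\outfile.txt' -Value 'example line'
}
finally
{
    $mutex.ReleaseMutex()
    $mutex.Dispose()
}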
Parentjob.ps1
# Start a bunch of jobs
$curdir = "c:\transfer\filecollide"
$tokens = "tok00","tok01","tok02",
          "tok03","tok04","tok05",
          "tok06","tok07","tok08"
$jobs = @()
foreach ($tok in $tokens)
{
    $job = Start-Job -FilePath ".\childjob.ps1" -ArgumentList "${curdir}",$tok,2,1000
    Start-Sleep -s 3 # stagger things a bit
    Write-Output " Starting:${tok} job"
    $jobs += ,$job
}
foreach ($job in $jobs)
{
    wait-job $job
    $out = receive-job $job
    Write-Output($out)
}
childjob.ps1
param(
    [string]$curdir = ".",
    [string]$tok = "tok?",
    [int]$interval = 10,
    [int]$ntodo = 1
)
$nwritefails = 0
$nwritesuccess = 0
$nwrite2fails = 0

function singleLine
{
    param(
        [string]$tok,
        [string]$fileappendout = "",
        [int]$timeout = 3
    )
    $curdatetime = (Get-Date)
    $sout = "${curdatetime},${tok},${global:nwritesuccess},${global:nwritefails},${global:nwrite2fails}"
    $global:nwritesuccess++
    try
    {
        Add-Content -Path $fileappendout -Value "${sout}"
    }
    catch
    {
        $global:nwritefails++
        try
        {
            Start-Sleep -s 1
            Add-Content -Path $fileappendout -Value "${sout}"
        }
        catch
        {
            $global:nwrite2fails++
            Write-Output "Failed to write to ${fileappendout}"
        }
    }
}

Write-Output "Starting to process ${tok}"

#Start of main code
cd "${curdir}"
$ndone = 0
while ($true)
{
    singleLine $tok "outfile.txt"
    $ndone++
    if ($ndone -gt $ntodo){ break }
    Start-Sleep -s $interval
}
Write-Output "Successful ${tok} appends:${nwritesuccess} failed:${nwritefails} failed2:${nwrite2fails}"
Why not have the jobs write the results to the output stream, and use Receive-Job in the main thread to collect the results and update the file? You can do this while the jobs are still running. What you're writing to the out stream now looks like it might be more appropriately written to the Progress stream.
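A rough sketch of that idea, using the $jobs array from Parentjob.ps1 above and assuming the child script writes its result lines to the output stream instead of calling Add-Content itself; the file name is the one from the question:
# Single writer: only the parent appends to the file, so there is no contention
$outFile = "c:\transfer\filecollide\outfile.txt"
while ($jobs | Where-Object { $_.State -eq 'Running' })
{
    # Drain whatever the running jobs have written to their output streams so far
    $lines = $jobs | Receive-Job
    if ($lines) { Add-Content -Path $outFile -Value $lines }
    Start-Sleep -s 2
}
# Pick up anything emitted between the last poll and job completion
$jobs | Receive-Job | Add-Content -Path $outFile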