Multiple IO.FileSystemWatchers in parallel - PowerShell

I have three different tasks that I wish to outsource to filesystem watchers in PowerShell. I have the code all set up to initialize two watchers and to check every ten seconds to make sure they are running. However, the tasks they perform take under a minute and about 5 minutes, respectively. The third task I wish to outsource to a watcher takes about an hour. I am concerned that if I have all of them running simultaneously, tasks that the first two should watch for will not get done at all while the third watcher is executing its change action. Is there a way to implement or run them such that the change actions can be executed in parallel?

You can use the Start-ThreadJob cmdlet to run your file-watching tasks in parallel.
Start-ThreadJob comes with the ThreadJob module and offers a lightweight, thread-based alternative to the child-process-based regular background jobs.
It comes with PowerShell [Core] v6+; in Windows PowerShell it can be installed on demand with, e.g., Install-Module ThreadJob -Scope CurrentUser.
In most cases, thread jobs are the better choice, both for performance and type fidelity - see the bottom section of this answer for why.
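For instance, here is a quick way to see the type-fidelity difference (a minimal sketch; the commands are just illustrative): a thread job runs in the same process and can return live .NET objects, whereas a regular background job runs in a child process and returns deserialized emulations of them.
$threadJob = Start-ThreadJob { Get-Item $HOME }
$childJob  = Start-Job       { Get-Item $HOME }
# The thread job returns the live object...
($threadJob | Receive-Job -Wait).pstypenames[0]   # -> System.IO.DirectoryInfo
# ...whereas the child-process-based job returns a deserialized emulation.
($childJob  | Receive-Job -Wait).pstypenames[0]   # -> Deserialized.System.IO.DirectoryInfo
$threadJob, $childJob | Remove-Job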
The following self-contained sample code uses thread jobs to run 2 distinct file-monitoring and processing tasks in parallel, which block neither each other nor the caller.
Note:
Each task creates its own System.IO.FileSystemWatcher instance in the code below, though creating too many of them can put a significant load on the system, possibly resulting in events getting missed.
An alternative is to share instances, such as creating a single one in the caller's context, which the thread jobs can access (see the comments in the source code below, and the shared-watcher sketch after the sample output).
[This is in part speculative; do tell us if I got things wrong] Direct FileSystemWatcher .NET event-handler delegates should be kept short, but subscribing to the events from PowerShell via an event job created by Register-ObjectEvent queues the events on the PowerShell side, which PowerShell then dispatches to the -Action script blocks, so the fact that these blocks perform long-running operations below shouldn't be an immediate concern (the tasks may take a long time to catch up, though).
# Make sure that the ThreadJob module is available.
# In Windows PowerShell, it must be installed first.
# In PowerShell [Core], it is available by default.
Import-Module ThreadJob -ea Stop
try {
# Use the system's temp folder in this example.
$dir = (Get-Item -EA Ignore temp:).FullName; if (-not $dir) { $dir = $env:TEMP }
# Define the tasks as an array of custom objects that specify the dir.
# and file name pattern to monitor as well as the action script block to
# handle the events.
$tasks = # array of custom objects that describe the tasks
[pscustomobject] @{
DirToMonitor = $dir
FileNamePattern = '*.tmp1'
Action = {
# Print status info containing the event data to the host, synchronously.
Write-Host -NoNewLine "`nINFO: Event 1 raised:`n$($EventArgs | Format-List | Out-String)"
# Sleep to simulate blocking the thread with a long-running task.
Write-Host "INFO: Event 1: Working for 4 secs."
Start-Sleep 4
# Create output, which Receive-Job can collect.
"`nEvent 1 output: " + $EventArgs.Name
}
},
[pscustomobject] @{
DirToMonitor = $dir
FileNamePattern = '*.tmp2'
Action = {
# Print status info containing the event data to the host, synchronously
Write-Host -NoNewLine "`nINFO: Event 2 raised:`n$($EventArgs | Format-List | Out-String)"
# Sleep to simulate blocking the thread with a long-running task.
Write-Host "INFO: Event 2: Working for 2 secs"
Start-Sleep 2
# Create output, which Receive-Job can collect.
"`nEvent 2 output: " + $EventArgs.Name
}
}
# Start a separate thread job for each action task.
$threadJobs = $tasks | ForEach-Object {
Start-ThreadJob -ArgumentList $_ {
param([pscustomobject] $task)
# Create and initialize a thread-specific watcher.
# Note: To keep system load low, it's generally better to use a *shared*
# watcher, if feasible. You can define it in the caller's scope
# and access here via $using:watcher
$watcher = [System.IO.FileSystemWatcher] [ordered] @{
Path = $task.DirToMonitor
Filter = $task.FileNamePattern
EnableRaisingEvents = $true # start watching.
}
# Subscribe to the watcher's Created events, which returns an event job.
# This indefinitely running job receives the output from the -Action script
# block whenever the latter is called after an event fires.
$eventJob = Register-ObjectEvent -ea stop $watcher Created -Action $task.Action
Write-Host "`nINFO: Watching $($task.DirToMonitor) for creation of $($task.FileNamePattern) files..."
# Indefinitely wait for output from the action blocks and relay it.
try {
while ($true) {
Receive-Job $eventJob
Start-Sleep -Milliseconds 500 # sleep a little
}
}
finally {
# !! This doesn't print, presumably because this is killed by the
# !! *caller* being killed, which then doesn't relay the output anymore.
Write-Host "Cleaning up thread for task $($task.FileNamePattern)..."
# Dispose of the watcher.
$watcher.Dispose()
# Remove the event job (and with it the event subscription).
$eventJob | Remove-Job -Force
}
}
}
$sampleFilesCreated = $false
$sampleFiles = foreach ($task in $tasks) { Join-Path $task.DirToMonitor ("tmp_$PID" + ($task.FileNamePattern -replace '\*')) }
Write-Host "Starting tasks...`nUse Ctrl-C to stop."
# Indefinitely wait for and display output from the thread jobs.
# Use Ctrl+C to stop.
$dtStart = [datetime]::UtcNow
while ($true) {
# Receive thread job output, if any.
$threadJobs | Receive-Job
# Sleep a little.
Write-Host . -NoNewline
Start-Sleep -Milliseconds 500
# A good while after startup, create sample files that trigger all tasks.
# NOTE: The delay must be long enough for the task event handlers to already be
# in place. How long that takes can vary.
# Watch the status output to make sure the files are created
# *after* the event handlers became active.
# If not, increase the delay or create files manually once
# the event handlers are in place.
if (-not $sampleFilesCreated -and ([datetime]::UtcNow - $dtStart).TotalSeconds -ge 10) {
Write-Host
foreach ($sampleFile in $sampleFiles) {
Write-Host "INFO: Creating sample file $sampleFile..."
$null > $sampleFile
}
$sampleFilesCreated = $true
}
}
}
finally {
# Clean up.
# Clean up the thread jobs.
Remove-Job -Force $threadJobs
# Remove the temp. sample files
Remove-Item -ea Ignore $sampleFiles
}
The above creates output such as the following (sample from a macOS machine):
Starting tasks...
Use Ctrl-C to stop.
.
INFO: Watching /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/ for creation of *.tmp1 files...
INFO: Watching /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/ for creation of *.tmp2 files...
.........
INFO: Creating sample file /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp1...
INFO: Creating sample file /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp2...
.
INFO: Event 1 raised:
ChangeType : Created
FullPath : /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp1
Name : tmp_91418.tmp1
INFO: Event 1: Working for 4 secs.
INFO: Event 2 raised:
ChangeType : Created
FullPath : /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp2
Name : tmp_91418.tmp2
INFO: Event 2: Working for 2 secs
....
Event 2 output: tmp_91418.tmp2
....
Event 1 output: tmp_91418.tmp1
.................
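As an aside, here is a rough sketch of the shared-watcher alternative mentioned in the notes above. It is partly speculative and assumes the same $dir and $tasks definitions as in the sample; also note that a single watcher can only have one Filter, so a broader filter is used and each -Action block would then have to filter by file name itself.
# Create ONE watcher in the caller's scope, with a filter broad enough for all tasks.
$watcher = [System.IO.FileSystemWatcher] [ordered] @{
    Path                = $dir
    Filter              = '*.tmp*'
    EnableRaisingEvents = $true
}
$threadJobs = $tasks | ForEach-Object {
    Start-ThreadJob -ArgumentList $_ {
        param([pscustomobject] $task)
        # Access the single, shared watcher created in the caller's scope.
        # (Thread jobs run in-process, so $using: passes the live object.)
        $watcher = $using:watcher
        $eventJob = Register-ObjectEvent $watcher Created -Action $task.Action
        # ... same receive/relay loop as in the sample above, except that the shared
        #     watcher must NOT be disposed of here - the caller owns it.
    }
}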

Related

Do threads still execute using -asjob with wait-job?

Hello all and good afternoon!
I had a quick question regarding -asjob running with invoke-command.
If I run 2 Invoke-Command's using -asjob, does it run simultaneously when I try to receive the output? Does this mean wait-job waits till the first job specified is finished running to get the next results?
Write-Host "Searching for PST and OST files. Please be patient!" -BackgroundColor White -ForegroundColor DarkBlue
$pSTlocation = Invoke-Command -ComputerName localhost -ScriptBlock {Get-Childitem "C:\" -Recurse -Filter "*.pst" -ErrorAction SilentlyContinue | % {Write-Host $_.FullName,$_.lastwritetime}} -AsJob
$OSTlocation = Invoke-Command -ComputerName localhost -ScriptBlock {Get-Childitem "C:\Users\me\APpdata" -Recurse -Filter "*.ost" -ErrorAction SilentlyContinue | % {Write-Host $_.FullName,$_.lastwritetime} } -AsJob
$pSTlocation | Wait-Job | Receive-Job
$OSTlocation | Wait-Job | Receive-Job
Also, another question: can I save the output of the jobs to a variable without it showing in the console? I'm trying to make it check whether there's any return, and if there is, output it, but if there's not, do something else.
I tried:
$job1 = $pSTlocation | Wait-Job | Receive-Job
if(!$job1){write-host "PST Found: $job1"} else{ "No PST Found"}
$job2 = $OSTlocation | Wait-Job | Receive-Job
if(!$job2){write-host "OST Found: $job2"} else{ "No OST Found"}
No luck, it outputs the following:
Note: This answer does not directly answer the question - see the other answer for that; instead, it shows a reusable idiom for waiting for multiple jobs to finish in a non-blocking fashion.
The following sample code uses the child-process-based Start-Job cmdlet to create local jobs, but the solution equally works with local thread-based jobs created by Start-ThreadJob as well as jobs based on remotely executing Invoke-Command -ComputerName ... -AsJob commands, as used in the question.
It shows a reusable idiom for waiting for multiple jobs to finish in a non-blocking fashion that allows for other activity while waiting, along with collecting per-job output in an array.
Here, the output is only collected after each job completes, but note that collecting it piecemeal, as it becomes available, is also an option, using (potentially multiple) Receive-Job calls even before a job finishes.
# Start two jobs, which run in parallel, and store the objects
# representing them in array $jobs.
# Replace the Start-Job calls with your
# Invoke-Command -ComputerName ... -AsJob
# calls.
$jobs = (Start-Job { Get-Date; sleep 1 }),
(Start-Job { Get-Date '1970-01-01'; sleep 2 })
# Initialize a helper array to keep track of which jobs haven't finished yet.
$remainingJobs = $jobs
# Wait iteratively *without blocking* until any job finishes and receive and
# output its output, until all jobs have finished.
# Collect all results in $jobResults.
$jobResults =
while ($remainingJobs) {
# Check if at least 1 job has terminated.
if ($finishedJob = $remainingJobs | Where State -in Completed, Failed, Stopped, Disconnected | Select -First 1) {
# Output the just-finished job's results as part of custom object
# that also contains the original command and the
# specific termination state.
[pscustomobject] @{
Job = $finishedJob.Command
State = $finishedJob.State
Result = $finishedJob | Receive-Job
}
# Remove the just-finished job from the array of remaining ones...
$remainingJobs = @($remainingJobs) -ne $finishedJob
# ... and also as a job managed by PowerShell.
Remove-Job $finishedJob
} else {
# Do other things...
Write-Host . -NoNewline
Start-Sleep -Milliseconds 500
}
}
# Output the jobs' results
$jobResults
Note:
It's tempting to try $remainingJobs | Wait-Job -Any -Timeout 0 to momentarily check for termination of any one job without blocking execution, but as of PowerShell 7.1 this doesn't work as expected: even already completed jobs are never returned - this appears to be a bug, discussed in GitHub issue #14675.
If I run 2 Invoke-Command's using -asjob, does it run simultaneously when I try to receive the output?
Yes, PowerShell jobs always run in parallel, whether they're executing remotely, as in your case (with Invoke-Command -AsJob, assuming that localhost in the question is just a placeholder for the actual name of a different computer), or locally (using Start-Job or Start-ThreadJob).
However, by using (separate) Wait-Job calls, you are synchronously waiting for each job to finish (in a fixed sequence, too). That is, each Wait-Job call blocks further execution until the target job terminates.[1]
Note, however, that both jobs continue to execute while you're waiting for the first one to finish.
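A quick way to convince yourself of this (a small sketch using local Start-Job jobs): because both jobs run concurrently, the total wait is roughly the duration of the longest job, not the sum of both.
$j1 = Start-Job { Start-Sleep 4; 'job 1 done' }
$j2 = Start-Job { Start-Sleep 2; 'job 2 done' }
# Waiting for both takes about 4 seconds (the longest job), not about 6.
(Measure-Command { $j1, $j2 | Wait-Job | Receive-Job }).TotalSeconds
$j1, $j2 | Remove-Job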
If, instead of waiting in a blocking fashion, you want to perform other operations while you wait for both jobs to finish, you need a different approach, detailed in the other answer.
can I save the output of the jobs to a variable without it showing in the console?
Yes, but the problem is that in your remotely executing script block ({ ... }) you're mistakenly using Write-Host in an attempt to output data.
Write-Host is typically the wrong tool to use, unless the intent is to write to the display only, bypassing the success output stream and with it the ability to send output to other commands, capture it in a variable, or redirect it to a file. To output a value, use it by itself; e.g., $value instead of Write-Host $value (or use Write-Output $value, though that is rarely needed); see this answer.
Therefore, your attempt to collect the job's output in a variable failed, because the Write-Host output bypassed the success output stream that variable assignments capture and went straight to the host (console):
# Because the job's script block uses Write-Host, its output goes to the *console*,
# and nothing is captured in $job1
$job1 = $pSTlocation | Wait-Job | Receive-Job
(Incidentally, the command could be simplified to
$job1 = $pSTlocation | Receive-Job -Wait).
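For example, here is a sketch of the fix applied to the first command from the question (the if test is also flipped, since a non-empty $job1 means PST files were found): emit the objects themselves instead of passing them to Write-Host, so that the job output can be captured.
$pSTlocation = Invoke-Command -ComputerName localhost -AsJob -ScriptBlock {
    Get-ChildItem C:\ -Recurse -Filter *.pst -ErrorAction SilentlyContinue |
        Select-Object FullName, LastWriteTime   # output objects; don't Write-Host them
}
$job1 = $pSTlocation | Receive-Job -Wait
if ($job1) { "PST Found:"; $job1 } else { "No PST Found" }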
[1] Note that Wait-Job has an optional -Timeout parameter, which allows you to limit waiting to at most a given number of seconds and return without output if the target job hasn't finished yet. However, as of PowerShell 7.1, -Timeout 0 for non-blocking polling for whether jobs have finished does not work - see GitHub issue #14675.

PowerShell - Loop: run console app > wait for Ctrl+C

I have a Windows console app that runs to process some files; at the end of the run, if successful, it starts a Windows service and I get the output "xxx service is now running, press control_c to exit".
The console app looks at a config file to pull some parameters. I need to be able to re-run this multiple times, changing the parameters in the config file each time. To do this manually I'd do the following:
change config file
run the app from powershell
wait for the message above to appear
click ctrl + c to terminate
change config file and run again
I thought it makes sense to automate this in a PS script where I can just pass the config values for all the runs, then the script loops through the values, edit the config file and run the exe.
The issue I have is that the loop gets "stuck" at the first run because the application is waiting for the Ctrl+C command, so the script never progresses through the loop.
What I have at the moment looks like this:
foreach ($dt in $datesarr)
{
##edit config values with stuff in $dt
$output=(<path to app here>)
while ($output[-1] -notlike "*Control-C*")
{
Start-Sleep -Seconds 10
}
}
The problem I have is that the script never reaches the while loop, as it's just stuck after running the app, waiting for Ctrl+C... What I want it to do is launch the app, wait for it to get to the Ctrl+C bit, then exit the loop and pick the second value in the parameter.
Any thoughts would be hugely appreciated!
Try the following approach, which is based on direct use of the following, closely related .NET APIs:
System.Diagnostics.ProcessStartInfo
System.Diagnostics.Process
Instead of trying to programmatically press Ctrl-C, the process associated with the external program is simply killed (terminated).
# Remove the next line if you don't want to see verbose output.
$VerbosePreference = 'Continue'
$psi = [System.Diagnostics.ProcessStartInfo] @{
UseShellExecute = $false
WindowStyle = 'Hidden'
FileName = '<path to app here>'
Arguments = '<arguments string here>' # only if args must be passed
RedirectStandardOutput = $true
RedirectStandardError = $true # optional - if you also want to examine stderr
}
Write-Verbose "Launching $($psi.FileName)..."
$ps = [System.Diagnostics.Process]::Start($psi)
Write-Verbose "Waiting for launched process $($ps.Id) to output the line of interest..."
$found = $false
while (
-not $ps.HasExited -and
-not ($found = ($line = $ps.StandardOutput.ReadLine()) -match 'Control-C')
) {
Write-Verbose "Stdout line received: $line"
}
if ($found) {
Write-Verbose "Line of interest received; terminating process $($ps.Id)."
# Note: If the process has already terminated, this will be a no-op.
# .Kill() kills only the target process itself.
# In .NET Core / .NET 5+, you can use .Kill($true) to also
# kill descendants of the process, i.e. all processes launched
# by it, directly and via its children.
$ps.Kill()
} else {
Write-Error "Process $($ps.Id) terminated before producing the expected output."
}
$ps.Dispose()
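To tie this back to the loop from the question, here is a sketch of how it could be embedded (it assumes the $datesarr variable and the config-editing step from the original code):
foreach ($dt in $datesarr) {
    # ... edit config values with stuff in $dt here ...
    $psi = [System.Diagnostics.ProcessStartInfo] @{
        UseShellExecute        = $false
        WindowStyle            = 'Hidden'
        FileName               = '<path to app here>'
        RedirectStandardOutput = $true
    }
    $ps = [System.Diagnostics.Process]::Start($psi)
    # Read stdout line by line until the Ctrl+C prompt appears or the process exits.
    while (-not $ps.HasExited -and ($ps.StandardOutput.ReadLine() -notmatch 'Control-C')) { }
    if (-not $ps.HasExited) { $ps.Kill() }
    $ps.Dispose()
}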

How can I speed up a PowerShell foreach loop

I have a PowerShell script that connects to a database and pulls a list of user data. I take this data and create a foreach loop to run a script for the data.
This is working, but it's slow, as the results could be 1000+ entries, and it has to complete Script.bat for user A before it can start user B. The Script.bat run for a single user is independent of the others and takes ~30s per user.
Is there a way to speed this up at all? I've been playing with -Parallel, ForEach-Object and workflow but I can't get it to work, likely due to me being a noob in PS.
foreach ($row in $Dataset.tables[0].rows)
{
$UserID=$row.value
$DeviceID=$row.value1
$EmailAddress=$row.email_address
cmd.exe /c "`"$PSScriptRoot`"\bin\Script.bat -c `" -Switch $UserID`" >> `"$PSScriptRoot`"\${FileName3}_REST_${DateTime}.txt 2> nul";
}
You said it yourself, your bottleneck is with the batch file in your script, not the loop itself. foreach (as opposed to ForEach-Object) is already the faster foreach loop mechanism in PowerShell. Investigate your batch file to find out why it takes 30 seconds to complete, and optimize it where you can.
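For instance, a rough, machine-dependent comparison of the two loop mechanisms (a sketch; the absolute numbers don't matter, only the relative difference):
$data = 1..100000
(Measure-Command { foreach ($i in $data) { $null = $i * 2 } }).TotalMilliseconds
(Measure-Command { $data | ForEach-Object { $null = $_ * 2 } }).TotalMilliseconds
# The pipeline-based ForEach-Object variant is typically noticeably slower.
That said, with a ~30-second external script per user, the loop overhead itself is negligible; parallelizing the work, as shown below, is what helps.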
Using Jobs
Note: Start-Job will run the job under another process. If you have PowerShell Core you can make use of the Start-ThreadJob cmdlet in lieu of Start-Job. This will start your job as part of another thread of the same process instead of starting another process.
If you can't optimize your batch script or optimize it to meet your needs, then you can consider using Start-Job to kick off the job to execute asynchronously, and then check the result and get any output from it using Receive-Job. For example:
# Master list of jobs you need to check the result of later
$jobs = New-Object System.Collections.Generic.List[System.Management.Automation.Job]
# Run your script for each row
foreach ($row in $Dataset.tables[0].rows)
{
$UserID=$row.value
$DeviceID=$row.value1
$EmailAddress=$row.email_address
# Use Start-Job here to kick off the script and store the job information
# for later retrieval.
# The $using: scope modifier allows you to make use of variables that were
# defined in the session calling Start-Job
$job = Start-Job -ScriptBlock { cmd.exe /c "`"${using:PSScriptRoot}`"\bin\Script.bat -c `" -Switch ${using:UserID}`" >> `"${using:PSScriptRoot}`"\${using:FileName3}_REST_${using:DateTime}.txt 2> nul"; }
# Add the execution to the $jobs list to check the result of later
# Casting to void here prevents the Add method from returning the object
# we've added.
[void]$jobs.Add($job)
}
# Wait for the jobs to be done
Write-Host 'Waiting for all jobs to complete...'
while( $jobs | Where-Object { $_.State -eq 'Running' } ){
Start-Sleep -s 10
}
# Retrieve the output of the jobs
foreach( $j in $jobs ) {
Receive-Job $j
}
Note: Since you need to execute this script ~1000 times, you may want to consider writing your logic to only run a certain number of jobs at a time. My example above starts all necessary jobs without regard to the number that may execute at once.
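For instance, a simple throttling sketch (here: at most 5 concurrent jobs; note that Get-Job -State Running counts all jobs in the session, so this assumes no unrelated jobs are running):
$maxConcurrent = 5
foreach ($row in $Dataset.tables[0].rows)
{
    # Block until a slot frees up.
    while (@(Get-Job -State Running).Count -ge $maxConcurrent) {
        Start-Sleep -Seconds 5
    }
    # Start the job exactly as shown above, then track it.
    $job = Start-Job -ScriptBlock { <# same script block as above #> }
    [void]$jobs.Add($job)
}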
For more information about jobs and the properties you can inspect on a running/completed job, check the links below:
About Jobs
Job Class
Using Scope*
* The documentation states that the using scope can only be declared when working with remote sessions, but this seems to work fine with Start-Job even if the job is local.

InvocationStateChanged Event Raised Twice in PowerShell Script

I have been writing a script in PowerShell V4 on Windows 8.1 which makes use of background processes and events. I have found something which is a little strange. Rather than post the 2,500 lines or so of my script I have included a much shorter program which exhibits the odd behaviour. I expect it is something I am doing wrong but I cannot see what the problem is. The code is as follows:
# Scriptblocks to simulate the background task and the action to
# take when an event is raised
[scriptblock] $MyScript = {
for ($i = 0;$i -lt 30;$i++)
{
[console]::writeline("This is a test - $i")
start-sleep -m 500
}
}
[scriptblock] $StateChanged = {
[console]::writeline("The state changed")
}
# Create a runspace pool
$RunspacePool = [RunspaceFactory]::CreateRunspacePool(1, [int] $env:NUMBER_OF_PROCESSORS + 1)
$RunspacePool.ApartmentState = "MTA"
$RunspacePool.Open()
# Create and start the background task to run
$PS = [powershell]::Create()
[void] $PS.AddScript($MyScript)
$PS.RunspacePool = $RunspacePool
$Asyncresult = $PS.BeginInvoke()
# Register an interest in the InvocationStateChanged event for
# the background task. Should the event happen (which it will)
# run the $StateChanged scriptblock
Register-ObjectEvent -InputObject $PS -EventName InvocationStateChanged -Action $StateChanged
# The loop that simulates the main purpose of the script
[int] $j = 0
while ($PS.InvocationStateInfo.State -eq [System.Management.Automation.PSInvocationState]::Running)
{
if ($j -eq 2)
{
[void] $PS.BeginStop($NULL, $NULL)
}
"Running: $j" | out-host
sleep -m 400
$j = $j + 1
}
sleep 10
Essentially all it does is create a runspace to run a PowerShell scriptblock and, while that is running, something else happens in the foreground. I simulate someone pressing a button or, for whatever reason, a BeginStop method being executed to stop the background process. That all works and the background process duly stops. However, I have registered an event for the background PowerShell script which runs a scriptblock when the background job changes state. The strange thing is that the scriptblock gets invoked twice and I cannot work out why.
Here is some output from running the script:
E:\Test Programs>powershell -file .\strange.ps1
This is a test - 0
Running: 0
This is a test - 1
Running: 1
The state changed
Running: 2
The state changed
E:\Test Programs>
As you can see it displays "The state changed" twice. They are a fraction of a second apart. I put a sleep 10 at the end to eliminate the possibility that it is the script stopping that is causing the second "The state changed" message.
If anyone can explain what is wrong I would be very grateful.
The InvocationStateChanged event is likely being raised both when the state changes to Running and when it changes to Completed.
If you change $StateChanged to also include the state from the $Sender automatic variable ($Sender.InvocationStateInfo.State), like this:
[scriptblock] $StateChanged = {
[console]::writeline("The state changed to: $($Sender.InvocationStateInfo.State)")
}
Your output will probably look like:
This is a test - 0
Running: 0
This is a test - 1
Running: 1
The state changed to: Running
Running: 2
The state changed to: Completed
I am very sorry that I didn't reply a year and a half ago as I should have done. For some reason I never had a notification that there was an answer and, to be honest, I just forgot to check.
Anyway, thanks for the answer. I still have my little test script (the thing I was actually writing has since been dumped and rewritten), and when I check with it, what I get is:
Running: 0
Running: 1
This is a test - 1
The state changed to: Stopped
The state changed to: Stopped
Running: 2
This is now on Windows 10.
Oh well, thanks for the suggestion anyway. As I said, I've rewritten the original script, so this is now just for interest.
Best wishes........
Colin

Web service call, if application is running

I'm looking for a way to execute a web form submittal if an application is running. I'm not sure of the best approach, but I did create a PowerShell script that accomplishes what I want.
while($true) {
(Invoke-WebRequest -Method post 'Http://website.com').Content;
Start-Sleep -Seconds 600;
}
Now what I'd like to do is run this only when an application is running and then quit if the application is no longer running. I suspect maybe a Windows service would be the answer? If so, any idea how I could accomplish this?
I had also thought about running this as a Google Chrome extension, but then my googlefu was exhausted. For Chrome, I would just need the script and no need to check on the .exe.
Any thoughts or help would be appreciated. Again, I'm way out of my depth here but have found a need to create something so dummy steps would be much desired.
If you know the name of the process that runs for the application, you can do the following:
$processname = "thing"
# Wait until the process is detected
Do {
Sleep 60
} Until (Get-Process $processName -ErrorAction SilentlyContinue)
# Once it is detected, run the script
# < SCRIPT RUN CODE HERE >
While (1) {
# Monitor the process to make sure it is still running
If (Get-Process $processName -ErrorAction SilentlyContinue) {
Continue
}
Else {
# Stop the script, because the process isn't running.
# < SCRIPT STOP CODE HERE >
# Wait until the process is detected again
Do {
Sleep 60
} Until (Get-Process $processName -ErrorAction SilentlyContinue)
# Once it is detected again, run the script
# < SCRIPT RUN CODE HERE >
}
# You can add in a delay here to slow down the loop
# Sleep 60
}
I think what you're looking for might be WMI eventing. You can register for (and respond to) events that occur within WMI, such as:
When a process starts or stops
When a service starts or stops
When a process exceeds a certain amount of memory usage
When a new version of a device driver is installed
When a computer is assigned to a new organizational unit
When a user logs on or off
When an environment variable changes
When a laptop battery drops below a certain threshold
Thousands of other cases
To register for WMI events, use the Register-WmiEvent cmdlet. You can use the -Action parameter to declare what PowerShell statements to execute when a matching event is detected. Here is a simple example:
# 1. Start notepad.exe
notepad;
# 2. Register for events when Notepad disappears
# 2a. Declare the WMI event query
$WmiEventQuery = "select * from __InstanceDeletionEvent within 5 where TargetInstance ISA 'Win32_Process' and TargetInstance.Name = 'notepad.exe'";
# 2b. Declare the PowerShell ScriptBlock that will execute when event is matched
$Action = { Write-Host -ForegroundColor Green -Object ('Process stopped! {0}' -f $event.SourceEventArgs.NewEvent.TargetInstance.Name) };
# 2c. Register for WMI events
Register-WmiEvent -Namespace root\cimv2 -Query $WmiEventQuery -Action $Action -SourceIdentifier NotepadStopped;
# 3. Stop notepad.exe
# Note: For some reason, if you terminate the process as part of the same thread, the event
# doesn't seem to fire correctly. So, wrap the Stop-Process command in Start-Job.
Start-Job -ScriptBlock { Stop-Process -Name notepad; };
# 4. Wait for event consumer (action) to fire and clean up the event registration
Start-Sleep -Seconds 6;
Unregister-Event -SourceIdentifier NotepadStopped;
FYI: I developed a PowerShell module called PowerEvents, which is hosted on CodePlex. The module includes the ability to register permanent WMI event subscriptions, and includes a 30+ page PDF document that helps you to understand WMI eventing. You can find this open-source project at: http://powerevents.codeplex.com.
If I were to adapt your code to something that is more practical for you, it might look something like the example below. You could invoke the code on a periodic basis using the Windows Task Scheduler.
# 1. If process is not running, then exit immediately
if (-not (Get-Process -Name notepad -ErrorAction SilentlyContinue)) { throw 'Process is not running!'; return; }
# 2. Register for events when Notepad disappears
# 2a. Declare the WMI event query
$WmiEventQuery = "select * from __InstanceDeletionEvent within 5 where TargetInstance ISA 'Win32_Process' and TargetInstance.Name = 'notepad.exe'";
# 2b. Declare the PowerShell ScriptBlock that will execute when event is matched
# In this case, it simply appends the value of the $event automatic variable to a
# new, global variable named NotepadEvent.
$Action = { $global:NotepadEvent += $event; };
# 2c. Register for WMI events
Register-WmiEvent -Namespace root\cimv2 -Query $WmiEventQuery -Action $Action -SourceIdentifier NotepadStopped;
# 3. Wait indefinitely, or until $global:NotepadEvent variable is NOT $null
while ($true -and -not $global:NotepadEvent) {
Start-Sleep -Seconds 600;
(Invoke-WebRequest -Method post 'Http://website.com').Content;
}