Why does Write-Host not work when run in a powershell job? - powershell

Sorry if I'm being a dumb powershell noob, but what's wrong with jobs apparently being unable to write to the terminal? And how can I fix that?
# test.ps1
function myjob {
Write-Host "Hello, World!" # doesn't show
}
Start-Job -Name MyJob -ScriptBlock ${function:myjob}
Wait-Job MyJob
Remove-Job MyJob

It sounds like you're trying to use Write-Host to directly, synchronously write to the console (terminal) from a background job.
However, PowerShell jobs do not allow direct access to the caller's console. Any output - even output to the PowerShell host (which, in foreground use, is the console, if run in one) - is routed through PowerShell's system of output streams (see the conceptual about_Redirection help topic).
Therefore, you always need the Receive-Job cmdlet in order to receive output from a PowerShell job.
The following example receives the job output synchronously, i.e. it blocks execution until the job finishes (-Wait) and then removes it (-AutoRemoveJob); see the bottom section for an asynchronous (polling, non-blocking) approach.
$null = Start-Job -Name MyJob -ScriptBlock { Write-Host "Hello, World!" }
Receive-Job -Wait -AutoRemoveJob -Name MyJob
Caveat re use of Write-Host in jobs:
In foreground use, Write-Host output - even though primarily designed to go to the host (console) - can be redirected or captured via the information stream (whose number is 6, available in PSv5+); e.g.:
# OK - no output
Write-Host 'silence me' 6>$null
Write-Host output received via a (child-process-based) background job, however, cannot be redirected or captured, as of PowerShell 7.2.1:
# !! `silence me` still prints.
Start-Job { Write-Host 'silence me' } | Receive-Job -Wait -AutoRemoveJob 6>$null
By contrast, it can be redirected/captured when using a (generally preferable) thread-based background job (as opposed to a child-process-based background job), via Start-ThreadJob:
# OK - no output
Start-ThreadJob { Write-Host 'silence me' } | Receive-Job -Wait -AutoRemoveJob 6>$null
Waiting for a job to complete in a non-blocking fashion, passing job output through as it becomes available:
# Start a simple job that writes a "." to the host once a second,
# for 5 seconds
$job = Start-Job -ScriptBlock {
    1..5 | ForEach-Object { Write-Host -NoNewLine '.'; Start-Sleep 1 }
}
"Waiting for job $($job.Id) to terminate while passing its output through..."
do {
    $job | Receive-Job # See if job output is available (non-blocking) and pass it through
    Start-Sleep 1      # Do other things or sleep a little.
} while (($job | Get-Job).State -in 'NotStarted', 'Running')
"`nJob terminated with state '$($job.State)'."
$job | Remove-Job # Clean up.
Note: In this simple case, the expected termination states are Completed (either no or only non-terminating errors occurred) or Failed (a script-terminating error was generated with throw (and not caught inside the job)).
See the [System.Management.Automation.JobState] enumeration for the complete list of possible states.
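For example, you can list all defined job states with:
# Enumerate the values of the JobState enumeration:
[Enum]::GetNames([System.Management.Automation.JobState])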
Note that the job object returned by Start-Job - rather than a self-chosen name passed to the -Name parameter - is used to interact with the job. This eliminates the ambiguity of possibly multiple jobs existing with a given name, all of which a -Name argument would target.
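As a quick demonstration of that ambiguity (illustrative job names and script blocks):
# Job names need not be unique, so targeting by -Name can be ambiguous:
$null = Start-Job -Name MyJob -ScriptBlock { 'first' }
$null = Start-Job -Name MyJob -ScriptBlock { 'second' }
(Get-Job -Name MyJob).Count   # -> 2: a -Name argument targets *both* jobs

# The job *object*, by contrast, identifies exactly one job:
$job = Start-Job -Name MyJob -ScriptBlock { 'third' }
$job | Receive-Job -Wait -AutoRemoveJob   # -> 'third'

Get-Job -Name MyJob | Remove-Job -Force   # clean up the demo jobs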

Related

Return test result from pester test run start-job

I'm using pester to test my custom cmdlets. The tests are end-to-end tests that spin up an environment; compile and load assemblies and execute tests on the dynamically compiled code.
I need to do the actual tests in a new PowerShell process (so the assemblies can be compiled and loaded each time), so I use start-job to run the tests in the background and wait for the results with receive-job -wait. Something like this:
Describe 'Run' {
    It 'StartJobTest' {
        start-job -script {invoke-pester .\MyTests.Tests.ps1} | receive-job -wait
    }
}
Everything works fine, except I can't get any test success or failure status returned from the job, so that the invoking pester test can be marked as success or failure.
Does anyone know how to achieve this? I tried setting $global:PesterPreference.Run.Exit = $true but this didn't make any difference.
Any help or suggestions gratefully received,
David
I think you need to use the -PassThru switch:
Returns a custom object (PSCustomObject) that contains the test results. By default, Invoke-Pester writes to the host program, not to the output stream (stdout). If you try to save the result in a variable, the variable is empty unless you use the PassThru parameter.
Describe 'Run' {
    It 'StartJobTest' {
        $Results = Start-Job -Name Invoke-Pester -ScriptBlock {
            Invoke-Pester -Path '.\MyTests.Tests.ps1' -PassThru
        } | Receive-Job -Wait -AutoRemoveJob
        $Results.FailedCount | Should -BeExactly 0
    }
}
P.S. In Pester v5, this switch is replaced with the configuration property Run.PassThru: https://pester.dev/docs/migrations/v4-to-v5
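For reference, a rough Pester v5 equivalent of the above might look like this (a sketch assuming Pester v5.2+, where New-PesterConfiguration is available; the test file path is taken from the question):
Describe 'Run' {
    It 'StartJobTest' {
        $Results = Start-Job -ScriptBlock {
            # Build a v5 configuration object and enable PassThru on it.
            $config = New-PesterConfiguration
            $config.Run.Path = '.\MyTests.Tests.ps1'   # note: resolved relative to the job's location
            $config.Run.PassThru = $true
            Invoke-Pester -Configuration $config
        } | Receive-Job -Wait -AutoRemoveJob
        $Results.FailedCount | Should -BeExactly 0
    }
}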

Do threads still execute using -asjob with wait-job?

Hello all and good afternoon!
I had a quick question regarding -asjob running with invoke-command.
If I run 2 Invoke-Command's using -asjob, does it run simultaneously when I try to receive the output? Does this mean wait-job waits till the first job specified is finished running to get the next results?
Write-Host "Searching for PST and OST files. Please be patient!" -BackgroundColor White -ForegroundColor DarkBlue
$pSTlocation = Invoke-Command -ComputerName localhost -ScriptBlock {Get-Childitem "C:\" -Recurse -Filter "*.pst" -ErrorAction SilentlyContinue | % {Write-Host $_.FullName,$_.lastwritetime}} -AsJob
$OSTlocation = Invoke-Command -ComputerName localhost -ScriptBlock {Get-Childitem "C:\Users\me\APpdata" -Recurse -Filter "*.ost" -ErrorAction SilentlyContinue | % {Write-Host $_.FullName,$_.lastwritetime} } -AsJob
$pSTlocation | Wait-Job | Receive-Job
$OSTlocation | Wait-Job | Receive-Job
Also, another question: can I save the output of the jobs to a variable without it showing to the console? I'm trying to make it check if there's any return, and if there is, output it, but if there's not, do something else.
I tried:
$job1 = $pSTlocation | Wait-Job | Receive-Job
if(!$job1){write-host "PST Found: $job1"} else{ "No PST Found"}
$job2 = $OSTlocation | Wait-Job | Receive-Job
if(!$job2){write-host "OST Found: $job2"} else{ "No OST Found"}
No luck, it outputs the following:
Note: This answer does not directly answer the question - see the other answer for that; instead, it shows a reusable idiom for waiting for multiple jobs to finish in a non-blocking fashion.
The following sample code uses the child-process-based Start-Job cmdlet to create local jobs, but the solution works equally well with local thread-based jobs created by Start-ThreadJob as well as with jobs based on remotely executing Invoke-Command -ComputerName ... -AsJob commands, as used in the question.
It shows a reusable idiom for waiting for multiple jobs to finish in a non-blocking fashion that allows for other activity while waiting, along with collecting per-job output in an array.
Here, the output is only collected after each job completes, but collecting it piecemeal, as it becomes available, is also an option, using (potentially multiple) Receive-Job calls even before a job finishes; a sketch of that variant follows the notes below.
# Start two jobs, which run in parallel, and store the objects
# representing them in array $jobs.
# Replace the Start-Job calls with your
# Invoke-Command -ComputerName ... -AsJob
# calls.
$jobs = (Start-Job { Get-Date; sleep 1 }),
(Start-Job { Get-Date '1970-01-01'; sleep 2 })
# Initialize a helper array to keep track of which jobs haven't finished yet.
$remainingJobs = $jobs
# Wait iteratively *without blocking* until any job finishes and receive and
# output its output, until all jobs have finished.
# Collect all results in $jobResults.
$jobResults =
    while ($remainingJobs) {
        # Check if at least 1 job has terminated.
        if ($finishedJob = $remainingJobs | Where State -in Completed, Failed, Stopped, Disconnected | Select -First 1) {
            # Output the just-finished job's results as part of a custom object
            # that also contains the original command and the
            # specific termination state.
            [pscustomobject] @{
                Job    = $finishedJob.Command
                State  = $finishedJob.State
                Result = $finishedJob | Receive-Job
            }
            # Remove the just-finished job from the array of remaining ones...
            $remainingJobs = @($remainingJobs) -ne $finishedJob
            # ... and also as a job managed by PowerShell.
            Remove-Job $finishedJob
        } else {
            # Do other things...
            Write-Host '.' -NoNewline
            Start-Sleep -Milliseconds 500
        }
    }
# Output the jobs' results
$jobResults
Note:
It's tempting to try $remainingJobs | Wait-Job -Any -Timeout 0 to momentarily check for termination of any one job without blocking execution, but as of PowerShell 7.1 this doesn't work as expected: even already completed jobs are never returned - this appears to be a bug, discussed in GitHub issue #14675.
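A minimal sketch of the piecemeal-collection variant mentioned at the top of this answer: on every polling pass, whatever output has accumulated so far is drained from all jobs (Receive-Job without -Keep only returns output not yet received):
$jobs = (Start-Job { 1..5 | ForEach-Object { $_; Start-Sleep 1 } }),
        (Start-Job { 1..3 | ForEach-Object { $_ * 10; Start-Sleep 2 } })

while ($jobs.State -contains 'NotStarted' -or $jobs.State -contains 'Running') {
    $jobs | Receive-Job             # pass through output available so far
    Start-Sleep -Milliseconds 500   # do other things or sleep a little
}
$jobs | Receive-Job                 # drain any output produced since the last pass
$jobs | Remove-Job                  # clean up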
If I run 2 Invoke-Command's using -asjob, does it run simultaneously when I try to receive the output?
Yes, PowerShell jobs always run in parallel, whether they're executing remotely, as in your case (with Invoke-Command -AsJob, assuming that localhost in the question is just a placeholder for the actual name of a different computer), or locally (using Start-Job or Start-ThreadJob).
However, by using (separate) Wait-Job calls, you are synchronously waiting for each job to finish (in a fixed sequence, too). That is, each Wait-Job call blocks further execution until the target job terminates.[1]
Note, however, that both jobs continue to execute while you're waiting for the first one to finish.
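If you merely want to avoid imposing a fixed waiting order, note that Wait-Job also accepts multiple jobs in a single call; a sketch using the question's variables:
# Wait for *both* jobs to finish (in whatever order they complete),
# then receive each job's output separately.
$null = Wait-Job -Job $pSTlocation, $OSTlocation
$pstOutput = Receive-Job -Job $pSTlocation
$ostOutput = Receive-Job -Job $OSTlocation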
If, instead of waiting in a blocking fashion, you want to perform other operations while you wait for both jobs to finish, you need a different approach, detailed in the other answer.
can I save the output of the jobs to a variable without it showing to the console?
Yes, but the problem is that in your remotely executing script block ({ ... }) you're mistakenly using Write-Host in an attempt to output data.
Write-Host is typically the wrong tool to use, unless the intent is to write to the display only, bypassing the success output stream and with it the ability to send output to other commands, capture it in a variable, or redirect it to a file. To output a value, use it by itself; e.g., $value instead of Write-Host $value (or use Write-Output $value, though that is rarely needed); see this answer.
Therefore, your attempt to collect the job's output in a variable failed, because the Write-Host output bypassed the success output stream that variable assignments capture and went straight to the host (console):
# Because the job's script block uses Write-Host, its output goes to the *console*,
# and nothing is captured in $job1
$job1 = $pSTlocation | Wait-Job | Receive-Job
(Incidentally, the command could be simplified to
$job1 = $pSTlocation | Receive-Job -Wait).
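For instance, a sketch of how the first command could be rewritten so that data objects - rather than display-only Write-Host text - are emitted and can therefore be captured (the Select-Object property list is illustrative, and the if test is corrected so that a non-empty result means PSTs were found):
# Emit objects with the properties of interest instead of Write-Host text,
# so the output travels via the success stream and can be captured.
$pSTlocation = Invoke-Command -ComputerName localhost -AsJob -ScriptBlock {
    Get-ChildItem 'C:\' -Recurse -Filter '*.pst' -ErrorAction SilentlyContinue |
        Select-Object FullName, LastWriteTime
}
$job1 = $pSTlocation | Receive-Job -Wait
if ($job1) { "PST Found:"; $job1 } else { "No PST Found" }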
[1] Note that Wait-Job has an optional -Timeout parameter, which allows you to limit waiting to at most a given number of seconds and return without output if the target job hasn't finished yet. However, as of PowerShell 7.1, -Timeout 0 for non-blocking polling for whether jobs have finished does not work - see GitHub issue #14675.

Multiple io.filesystemwatchers in parallel

I have three different tasks that I wish to outsource to filesystem watchers in PowerShell. I have the code all set up to initialize two watchers and to check every ten seconds to make sure they are running. However, the tasks that they perform last under a minute and 5 minutes, respectively. The third task I wish to outsource to a watcher takes about an hour. I am concerned that if I have all of them running simultaneously, tasks that the first two should watch for will not get done at all if the third watcher is executing its change action. Is there a way to implement or run them such that the change actions can be executed in parallel?
You can use the Start-ThreadJob cmdlet to run your file-watching tasks in parallel.
Start-ThreadJob comes with the ThreadJob module and offers a lightweight, thread-based alternative to the child-process-based regular background jobs.
It comes with PowerShell [Core] v6+ and in Windows PowerShell can be installed on demand with, e.g., Install-Module ThreadJob -Scope CurrentUser.
In most cases, thread jobs are the better choice, both for performance and type fidelity - see the bottom section of this answer for why.
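A quick illustration of the type-fidelity difference (a small sketch; the type names in the comments are what you would typically see): a child-process-based job returns deserialized, method-less copies of objects, whereas a thread job returns the live objects.
# Child-process job: output crosses a process boundary and is *deserialized*,
# i.e. rehydrated as a property-only snapshot.
(Start-Job { Get-Process -Id $PID } | Receive-Job -Wait -AutoRemoveJob).pstypenames[0]
# -> typically: Deserialized.System.Diagnostics.Process

# Thread job: runs in-process, so the live object is returned as-is.
(Start-ThreadJob { Get-Process -Id $PID } | Receive-Job -Wait -AutoRemoveJob).pstypenames[0]
# -> System.Diagnostics.Process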
The following self-contained sample code:
uses thread jobs to run 2 distinct file-monitoring and processing tasks in parallel,
which neither block each other nor the caller.
Note:
Each task creates its own System.IO.FileSystemWatcher instance in the code below, though creating too many of them can put a significant load on the system, possibly resulting in events getting missed.
An alternative is to share instances, such as creating a single one in the caller's context, which the thread jobs can access (see comments in source code below).
[This is in part speculative; do tell us if I got things wrong] Direct FileSystemWatcher .NET event-handler delegates should be kept short, but subscribing to the events from PowerShell via an event job created by Register-ObjectEvent queues the events on the PowerShell side, which PowerShell then dispatches to the -Action script blocks, so the fact that these blocks perform long-running operations (as they do below) shouldn't be an immediate concern (the tasks may take a long time to catch up, though).
# Make sure that the ThreadJob module is available.
# In Windows PowerShell, it must be installed first.
# In PowerShell [Core], it is available by default.
Import-Module ThreadJob -ea Stop
try {
# Use the system's temp folder in this example.
$dir = (Get-Item -EA Ignore temp:).FullName; if (-not $dir) { $dir = $env:TEMP }
# Define the tasks as an array of custom objects that specify the dir.
# and file name pattern to monitor as well as the action script block to
# handle the events.
$tasks = # array of custom objects describing the tasks
[pscustomobject] @{
DirToMonitor = $dir
FileNamePattern = '*.tmp1'
Action = {
# Print status info containing the event data to the host, synchronously.
Write-Host -NoNewLine "`nINFO: Event 1 raised:`n$($EventArgs | Format-List | Out-String)"
# Sleep to simulate blocking the thread with a long-running task.
Write-Host "INFO: Event 1: Working for 4 secs."
Start-Sleep 4
# Create output, which Receive-Job can collect.
"`nEvent 1 output: " + $EventArgs.Name
}
},
[pscustomobject] @{
DirToMonitor = $dir
FileNamePattern = '*.tmp2'
Action = {
# Print status info containing the event data to the host, synchronously
Write-Host -NoNewLine "`nINFO: Event 2 raised:`n$($EventArgs | Format-List | Out-String)"
# Sleep to simulate blocking the thread with a long-running task.
Write-Host "INFO: Event 2: Working for 2 secs"
Start-Sleep 2
# Create output, which Receive-Job can collect.
"`nEvent 2 output: " + $EventArgs.Name
}
}
# Start a separate thread job for each action task.
$threadJobs = $tasks | ForEach-Object {
Start-ThreadJob -ArgumentList $_ {
param([pscustomobject] $task)
# Create and initialize a thread-specific watcher.
# Note: To keep system load low, it's generally better to use a *shared*
# watcher, if feasible. You can define it in the caller's scope
# and access here via $using:watcher
$watcher = [System.IO.FileSystemWatcher] [ordered] @{
Path = $task.DirToMonitor
Filter = $task.FileNamePattern
EnableRaisingEvents = $true # start watching.
}
# Subscribe to the watcher's Created events, which returns an event job.
# This indefinitely running job receives the output from the -Action script
# block whenever the latter is called after an event fires.
$eventJob = Register-ObjectEvent -ea stop $watcher Created -Action $task.Action
Write-Host "`nINFO: Watching $($task.DirToMonitor) for creation of $($task.FileNamePattern) files..."
# Indefinitely wait for output from the action blocks and relay it.
try {
while ($true) {
Receive-Job $eventJob
Start-Sleep -Milliseconds 500 # sleep a little
}
}
finally {
# !! This doesn't print, presumably because this is killed by the
# !! *caller* being killed, which then doesn't relay the output anymore.
Write-Host "Cleaning up thread for task $($task.FileNamePattern)..."
# Dispose of the watcher.
$watcher.Dispose()
# Remove the event job (and with it the event subscription).
$eventJob | Remove-Job -Force
}
}
}
$sampleFilesCreated = $false
$sampleFiles = foreach ($task in $tasks) { Join-Path $task.DirToMonitor ("tmp_$PID" + ($task.FileNamePattern -replace '\*')) }
Write-Host "Starting tasks...`nUse Ctrl-C to stop."
# Indefinitely wait for and display output from the thread jobs.
# Use Ctrl+C to stop.
$dtStart = [datetime]::UtcNow
while ($true) {
# Receive thread job output, if any.
$threadJobs | Receive-Job
# Sleep a little.
Write-Host . -NoNewline
Start-Sleep -Milliseconds 500
# A good while after startup, create sample files that trigger all tasks.
# NOTE: The delay must be long enough for the task event handlers to already be
# in place. How long that takes can vary.
# Watch the status output to make sure the files are created
# *after* the event handlers became active.
# If not, increase the delay or create files manually once
# the event handlers are in place.
if (-not $sampleFilesCreated -and ([datetime]::UtcNow - $dtStart).TotalSeconds -ge 10) {
Write-Host
foreach ($sampleFile in $sampleFiles) {
Write-Host "INFO: Creating sample file $sampleFile..."
$null > $sampleFile
}
$sampleFilesCreated = $true
}
}
}
finally {
# Clean up.
# Clean up the thread jobs.
Remove-Job -Force $threadJobs
# Remove the temp. sample files
Remove-Item -ea Ignore $sampleFiles
}
The above creates output such as the following (sample from a macOS machine):
Starting tasks...
Use Ctrl-C to stop.
.
INFO: Watching /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/ for creation of *.tmp1 files...
INFO: Watching /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/ for creation of *.tmp2 files...
.........
INFO: Creating sample file /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp1...
INFO: Creating sample file /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp2...
.
INFO: Event 1 raised:
ChangeType : Created
FullPath : /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp1
Name : tmp_91418.tmp1
INFO: Event 1: Working for 4 secs.
INFO: Event 2 raised:
ChangeType : Created
FullPath : /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp2
Name : tmp_91418.tmp2
INFO: Event 2: Working for 2 secs
....
Event 2 output: tmp_91418.tmp2
....
Event 1 output: tmp_91418.tmp1
.................
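As an aside, here is a minimal sketch of the shared-watcher alternative mentioned in the notes above: a single FileSystemWatcher created in the caller's scope and accessed from a thread job via $using: (the '*.tmp1' filtering inside the handler is illustrative, and $dir is assumed to be defined as in the sample):
# One watcher, created in the caller's scope...
$watcher = [System.IO.FileSystemWatcher] [ordered] @{
    Path                = $dir        # dir. to monitor, as in the sample above
    Filter              = '*.tmp*'    # broad filter; refine per task in the handlers
    EnableRaisingEvents = $true
}
# ... shared with a thread job via $using: - thread jobs run in-process,
# so this is the *live* object, not a copy.
$job = Start-ThreadJob {
    $eventJob = Register-ObjectEvent ($using:watcher) Created -Action {
        # Per-task filtering happens here instead of in the watcher's Filter.
        if ($EventArgs.Name -like '*.tmp1') { "task 1 saw $($EventArgs.Name)" }
    }
    # Relay the action block's output as this thread job's output.
    while ($true) { Receive-Job $eventJob; Start-Sleep -Milliseconds 500 }
}
# The caller then periodically calls: $job | Receive-Job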

PowerShell Receive-Job Output To Variable Or File But Not Screen

I am trying to get the job details without outputting the data to the screen. However, regardless of what option I try, the job logs always get sent to the console. Any ideas on how to save the logs in a variable or file without outputting that data to console?
Receive-Job -Id $id -Keep -ErrorAction Continue > C:\Temp\Transcript-$VM.txt
$info = Receive-Job -Id $id -Keep -ErrorAction Continue
You state that your job uses Write-Host output and that you're running Windows PowerShell v5.1.
In order to also capture Write-Host output - which in v5+ is sent to the information stream (stream number 6) - use redirection 6>&1:
# Capture both success output and information-stream output
# (Write-Host) output in $info.
$info = Receive-Job -Id $id -Keep -ErrorAction Continue 6>&1
Unfortunately, due to a known bug, you'll still get console output as well (the bug is still present in PowerShell Core 7.0.0-preview.5).
Catch-all redirection *>&1 normally routes all streams through the success output stream.
Unfortunately, due to the bug mentioned above, the following streams cannot be captured or redirected at all when using background jobs or remoting:
verbose messages (4)
debug messages (5)
The only workaround is to capture the streams inside the job and save them to a file from there, and then access the files from the caller later.
Of course, this requires that you have control over how the jobs are created.
A simplified example:
# Redirect all output streams *inside* the job to a file...
Start-Job {
& {
# The job's commands go here.
# Note that for any *verbose* output to be captured,
# verbose output must explicitly turned on, such as with
# the -Verbose common parameter here.
# You can also set $VerbosePreference = 'Continue', which
# cmdlets (including advanced functions/scripts) will honor.
'success'; write-verbose -Verbose 'verbose'; write-host 'host'
} *> $HOME/out.txt
} | Receive-Job -Wait -AutoRemoveJob
# ... then read the resulting file.
Get-Content $HOME/out.txt
Note that I've used a full path as the redirection target because, unfortunately, in v6- versions of PowerShell, script blocks executed in background jobs do not inherit the caller's current location. This will change in PowerShell Core v7.0.
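A sketch of a workaround for those older versions, if you do want relative paths to resolve against the caller's location: pass the location into the job explicitly (the $callerDir variable is just an illustrative name):
$callerDir = (Get-Location).Path                 # plain string, easy to pass via $using:
Start-Job {
    Set-Location -LiteralPath $using:callerDir   # adopt the caller's location
    & { 'success'; Write-Host 'host' } *> ./out.txt
} | Receive-Job -Wait -AutoRemoveJob
Get-Content (Join-Path $callerDir out.txt)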
Try placing it in a pipeline, and see if that works:
Receive-Job -Id $id -Keep -ErrorAction Continue | Set-Content "C:\Temp\Transcript-$VM.txt"

Assign a variable inside of a scriptblock while running a job

Related to Terminate part of powershell script and continue.
Partially related to Powershell Job Always Shows Complete.
My script runs locally and accesses the registry hive of a remote PC. I need the value of registry keys to be written into a $RegHive variable. And I want to monitor it as a job, so that if some PC freezes, I can terminate the command and move on to another PC.
My original code would be:
$global:RegHive = $null
$job = Start-Job -ScriptBlock {
$RegHive = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey("SomeKeyName", "SomePCName")
}
But no matter what I do, the variable $RegHive is empty.
If I do $RegHive = (Get-Job | Receive-Job), some value gets assigned to $RegHive that, on the one hand, looks exactly as if I had run it normally without a job/scriptblock, i.e.:
$RegHive = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey("SomeKeyName", "SomePCName")
and even has the same $RegHive.SubKeyCount.
But the "normal" one has the $RegHive.GetSubKeyNames() method and the one from the job doesn't.
How do I avoid assigning the variable via Receive-Job and instead do the assignment directly inside the scriptblock, which is run as a job?
In simple words:
$job = Start-Job -ScriptBlock {$a = 1 + 2}
How to get $a be equal to 3 without $a = (Get-job | Receive-job)?
This might be helpful for you. The job is sort of like a variable.
What you can do is name the job and then call it by name with -Keep to keep its value stored - i.e., it will store all final output inside itself until you call it (the output can be kept, but the default is to remove it once received).
$global:RegHive = $null
Start-Job -Name "RegHive" -ScriptBlock {
[Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey("SomeKeyName", "SomePCName")
}
Receive-Job -Name "RegHive" -Keep
Obviously, calling Receive-Job immediately afterwards defeats the purpose of jobs: they add a lot of overhead and are only efficient when you need to do multiple things at once. If you start hundreds or thousands of jobs at once, you can do get-job | wait-job and, once they're finished, start using their outputs; wait-job also accepts job names or can wait on your entire list of jobs, as sketched below.
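A sketch of that batch pattern:
# Start several jobs, then wait for *all* of them before receiving their output.
$jobs = 1..3 | ForEach-Object {
    Start-Job -Name "Job$_" -ArgumentList $_ -ScriptBlock {
        param($i)
        Start-Sleep 1
        "job $i done"
    }
}
$jobs | Wait-Job | Receive-Job   # blocks until every job has finished
$jobs | Remove-Job               # clean up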
Another option to set the variable is:
$RegHive = Receive-Job -Name "RegHive"
And finally, you can do this to use the value:
get-<insert command> -value "$(Receive-Job -Name 'RegHive' -Keep)" -argument2 "YADA YADA"
Remember, -Keep will not delete the value, so it can be "received" again later.