I want to run 15 instances of a script that pipelines 5 scripts together, and so far I'm missing the pixie dust. I've boiled the problem down to a test case with a master script that calls a slave script that in turn calls a sub_slave script (no pipelines are necessary to reproduce the failure).
If master.ps1 calls slave.ps1, each background job hangs indefinitely at the point slave.ps1 calls sub_slave.ps1. If I comment out the call in slave.ps1 to sub_slave.ps1, master.ps1 runs to completion. And if I start slave.ps1 directly, it runs the call to sub_slave.ps1 just fine; the problem appears only with the double chain, and there it seems intractable. I've not seen anything in the docs saying you cannot daisy-chain these scripts past some arbitrary depth, but maybe I'm not reading deeply enough?
You'll see I'm tracking progress through the scripts by writing to a simple text file with Add-Content.
d:\jobs\master.ps1
$indexes = @(0,1,2)
$initString = "cd $pwd"
$initCmd = [scriptblock]::create($initString)
set-content -path out.txt -value "Master started"
foreach ($index in $indexes)
{
add-content -path out.txt -value "job starting"
start-job -filepath slave.ps1 -InitializationScript $initCmd -ArgumentList @("master index $index originated")
}
add-content -path out.txt -value "master completed"
d:\jobs\slave.ps1
#requires -version 2.0
param (
[parameter(Mandatory=$false)]
[string]$message = "slave_originated"
)
begin
{
Set-StrictMode -version Latest
add-content -path out.txt -value "slave beginning in $pwd"
}
process
{
add-content -path out.txt -value "slave processing $message"
invoke-expression "D:\jobs\sub_slave.ps1 -message $message"
}
end
{
add-content -path out.txt -value "slave ending"
}
d:\jobs\sub_slave.ps1
#requires -version 2.0
param (
[parameter(Mandatory=$false)]
[string]$message = "sub_slave originated"
)
begin
{
Set-StrictMode -version Latest
add-content -path out.txt -value "sub_slave beginning"
}
process
{
add-content -path out.txt -value "sub_slave processing $message"
}
end
{
add-content -path out.txt -value "sub_slave ending"
}
When the code is run as written, without anything commented out, I get this in out.txt:
d:\jobs\out.txt
Master started
job starting
job starting
job starting
master completed
slave beginning in D:\jobs
slave processing master index 0 originated
slave beginning in D:\jobs
slave processing master index 1 originated
slave beginning in D:\jobs
slave processing master index 2 originated
Note that the very next command after each "slave processing" message is the call to sub_slave.ps1, and that no message from sub_slave.ps1 appears at all. The system will hang with 3 powershell.exe processes running happily forever. Get-Job reports the 3 jobs as still Running and HasMoreData.
The only clue I've got is that if I crash dump the hung processes, I see:
Number of exceptions of this type: 2
Exception MethodTable: 79330e98
Exception object: 012610fc
Exception type: System.Threading.ThreadAbortException
Message: <none>
InnerException: <none>
StackTrace (generated):
<none>
StackTraceString: <none>
HResult: 80131530
-----------------
Number of exceptions of this type: 2
Exception MethodTable: 20868118
Exception object: 01870344
Exception type: System.Management.Automation.ParameterBindingException
Message: System error.
InnerException: <none>
StackTrace (generated):
<none>
StackTraceString: <none>
HResult: 80131501
-----------------
Perhaps I'm having some sort of parameter issue? If I remove all parameters from sub_slave.ps1 the behaviour is unchanged, so I kind of doubt that, but it's possible. I'm open to any ideas.
You are calling sub_slave.ps1 wrong. This:
invoke-expression "D:\jobs\sub_slave.ps1 -message $message"
Should be this:
D:\jobs\sub_slave.ps1 -message $message
The message was getting evaluated and became multiple arguments.
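To see the failure mode, trace what Invoke-Expression does with the expanded string (a minimal sketch; this also plausibly explains the ParameterBindingException in the crash dump):

# $message expands to: master index 0 originated
# Invoke-Expression therefore parses and runs:
#   D:\jobs\sub_slave.ps1 -message master index 0 originated
# Only "master" binds to -message; "index", "0", and "originated" have no
# positional parameter to bind to.

# Direct invocation passes $message as a single argument:
D:\jobs\sub_slave.ps1 -message $message

# If Invoke-Expression were unavoidable, the value would need to be
# re-quoted inside the string:
Invoke-Expression "D:\jobs\sub_slave.ps1 -message '$message'"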
Just taking a guess here, but having multiple threads/jobs write to the same file out.txt could be causing issues.
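If write contention were the issue, one quick way to rule it out (a hypothetical variation, not part of the original repro) is to give each job its own log file:

# In slave.ps1 / sub_slave.ps1, log to a per-process file instead:
Add-Content -Path "out_$PID.txt" -Value "slave beginning in $pwd"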
Context
On a build server, a PowerShell 7 script, script.ps1, will be started and will run in the background on the remote computer.
What I want
A safety net to ensure that at most 1 instance of the script.ps1 script is running at once on the build server or remote computer, at all times.
What I tried:
I tried meddling with PowerShell 7 background jobs (by executing the script.ps1 as a job inside a wrapper script wrapper.ps1), however that didn't solve the problem, as jobs do not carry over to (and can't be accessed from) other PowerShell sessions.
What I tried looks like this:
# inside wrapper.ps1
$running_jobs = $(Get-Job -State Running) | Where-Object {$_.Name -eq "ImportantJob"}
if ($running_jobs.count -eq 0) {
Start-Job .\script.ps1 -Name "ImportantJob" -ArgumentList @($some_variables)
} else {
Write-Warning "Could not start new job; Existing job detected must be terminated beforehand."
}
To reiterate, the problem with that is that $running_jobs only returns the jobs running in the current session, so this code only limits one job per session, allowing multiple instances to run if multiple sessions are mistakenly opened.
What I also tried:
I tried to look into Get-CimInstance:
$processes = Get-CimInstance -ClassName Win32_Process | Where-Object {$_.Name -eq "pwsh.exe"}
While this does return the current running PowerShell instances, these elements carry no information on the script that is being executed, as shown after I run:
foreach ($p in $processes) {
$p | Format-List *
}
I'm therefore lost and I feel like I'm missing something.
I appreciate any help or suggestions.
I like to define a config path in the $env:ProgramData location using a CompanyName\ProjectName scheme so I can put "per system" configuration.
You could use a similar scheme with a defined location to store a lock file that is created when the script runs and deleted at the end of it (as suggested already within the comments).
Then, it is up to you to add additional checks if needed (what happens if the script exits prematurely while the lock is still present?); one possible mitigation is sketched after the example below.
Example
# Define default path (Not user specific)
$ConfigLocation = "$Env:ProgramData\CompanyName\ProjectName"
# Create path if it does not exist
New-Item -ItemType Directory -Path $ConfigLocation -EA 0 | Out-Null
$LockFilePath = "$ConfigLocation\Instance.Lock"
$Locked = $null -ne (Get-Item -Path $LockFilePath -EA 0)
if ($Locked) {Exit}
# Lock
New-Item -Path $LockFilePath
# Do stuff
# Remove lock
Remove-Item -Path $LockFilePath
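To address the premature-exit concern, one mitigation (a sketch; the 2-hour staleness threshold is an arbitrary assumption) is to guarantee lock removal with try / finally and to reclaim a lock that is suspiciously old:

$LockFilePath = "$Env:ProgramData\CompanyName\ProjectName\Instance.Lock"
$Lock = Get-Item -Path $LockFilePath -EA 0
# Treat a lock older than 2 hours as stale; otherwise honor it and exit.
if ($Lock -and ((Get-Date) - $Lock.CreationTime).TotalHours -lt 2) { Exit }
New-Item -ItemType File -Path $LockFilePath -Force | Out-Null
try {
    # Do stuff
}
finally {
    # Runs even if the work above throws a terminating error.
    Remove-Item -Path $LockFilePath -EA 0
}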
Alternatively, on Windows, you could also use a scheduled task without a schedule and with the setting "If the task is already running, then the following rule applies: Do not start a new instance". From there, instead of calling the original script, you call a proxy script that just launches the scheduled task.
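A minimal proxy script might look like this (the task name and path are assumptions; the task itself would be created beforehand, pointing at script.ps1, with the "Do not start a new instance" setting):

# proxy.ps1 - the Task Scheduler enforces single-instancing for us.
Start-ScheduledTask -TaskPath '\CompanyName\ProjectName\' -TaskName 'RunScript'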
Sorry if I'm being a dumb powershell noob, but what's wrong with jobs apparently being unable to write to the terminal? And how can I fix that?
# test.ps1
function myjob {
Write-Host "Hello, World!" # doesn't show
}
Start-Job -Name MyJob -ScriptBlock ${function:myjob}
Wait-Job MyJob
Remove-Job MyJob
It sounds like you're trying to use Write-Host to directly, synchronously write to the console (terminal) from a background job.
However, PowerShell jobs do not allow direct access to the caller's console. Any output - even output directed at the PowerShell host (which, in foreground use, is the console, if run in one) - is routed through PowerShell's system of output streams (see the conceptual about_Redirection help topic).
Therefore, you always need the Receive-Job cmdlet in order to receive output from a PowerShell job.
The following example receives the job output synchronously, i.e. it blocks execution until the job finishes (-Wait) and then removes it (-AutoRemoveJob); see the bottom section for an asynchronous (polling, non-blocking) approach.
$null = Start-Job -Name MyJob -ScriptBlock { Write-Host "Hello, World!" }
Receive-Job -Wait -AutoRemoveJob -Name MyJob
Caveat re use of Write-Host in jobs:
In foreground use, Write-Host output - even though primarily designed to go to the host (console) - can be redirected or captured via the information stream (whose number is 6, available in PSv5+); e.g.:
# OK - no output
Write-Host 'silence me' 6>$null
Write-Host output received via a (child-process-based) background job, however, can not be redirected or captured, as of PowerShell 7.2.1:
# !! `silence me` still prints.
Start-Job { Write-Host 'silence me' } | Receive-Job -Wait -AutoRemoveJob 6>$null
By contrast, it can be redirected/captured when using a (generally preferable) thread-based background job (as opposed to a child-process-based background job), via Start-ThreadJob:
# OK - no output
Start-ThreadJob { Write-Host 'silence me' } | Receive-Job -Wait -AutoRemoveJob 6>$null
Waiting for a job to complete in a non-blocking fashion, passing job output through as it becomes available:
# Start a simple job that writes a "." to the host once a second,
# for 5 seconds
$job = Start-Job -ScriptBlock {
1..5 | ForEach-Object { Write-Host -NoNewLine '.'; Start-Sleep 1 }
}
"Waiting for job $($job.Id) to terminate while passing its output through..."
do {
$job | Receive-Job # See if job output is available (non-blocking) and pass it through
Start-Sleep 1 # Do other things or sleep a little.
} while (($job | Get-Job).State -in 'NotStarted', 'Running')
"`nJob terminated with state '$($job.State)'."
$job | Remove-Job # Clean up.
Note: In this simple case, the expected termination states are Completed (either no or only non-terminating errors occurred) or Failed (a script-terminating error was generated with throw (and not caught inside the job)).
See the [System.Management.Automation.JobState] enumeration for the complete list of possible states.
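To list them from PowerShell:

[enum]::GetNames([System.Management.Automation.JobState])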
The job object returned by Start-Job - rather than a self-chosen name via the -Name parameter - is used to interact with the job. This eliminates the ambiguity of possibly multiple jobs being present with a given -Name, all of which would be targeted.
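A minimal illustration of this object-based approach:

$job = Start-Job { 'some work' }        # capture the specific job object
$job | Receive-Job -Wait -AutoRemoveJob # targets exactly that job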
Hello all and good afternoon!
I had a quick question regarding -asjob running with invoke-command.
If I run 2 Invoke-Command calls using -AsJob, do they run simultaneously when I try to receive the output? Does this mean Wait-Job waits till the first job specified has finished running before getting the next results?
Write-Host "Searching for PST and OST files. Please be patient!" -BackgroundColor White -ForegroundColor DarkBlue
$pSTlocation = Invoke-Command -ComputerName localhost -ScriptBlock {Get-Childitem "C:\" -Recurse -Filter "*.pst" -ErrorAction SilentlyContinue | % {Write-Host $_.FullName,$_.lastwritetime}} -AsJob
$OSTlocation = Invoke-Command -ComputerName localhost -ScriptBlock {Get-Childitem "C:\Users\me\APpdata" -Recurse -Filter "*.ost" -ErrorAction SilentlyContinue | % {Write-Host $_.FullName,$_.lastwritetime} } -AsJob
$pSTlocation | Wait-Job | Receive-Job
$OSTlocation | Wait-Job | Receive-Job
Also, another question: can I save the output of the jobs to a variable without it showing in the console? I'm trying to make it check whether there's any return; if there is, output it, but if there's not, do something else.
I tried:
$job1 = $pSTlocation | Wait-Job | Receive-Job
if(!$job1){write-host "PST Found: $job1"} else{ "No PST Found"}
$job2 = $OSTlocation | Wait-Job | Receive-Job
if(!$job2){write-host "OST Found: $job2"} else{ "No OST Found"}
No luck, it outputs the following:
Note: This answer does not directly answer the question - see the other answer for that; instead, it shows a reusable idiom for waiting for multiple jobs to finish in a non-blocking fashion.
The following sample code uses the child-process-based Start-Job cmdlet to create local jobs, but the solution equally works with local thread-based jobs created by Start-ThreadJob as well as jobs based on remotely executing Invoke-Command -ComputerName ... -AsJob commands, as used in the question.
It shows a reusable idiom for waiting for multiple jobs to finish in a non-blocking fashion that allows for other activity while waiting, along with collecting per-job output in an array.
Here, the output is only collected after each job completes, but note that collecting it piecemeal, as it becomes available, is also an option, using (potentially multiple) Receive-Job calls even before a job finishes.
# Start two jobs, which run in parallel, and store the objects
# representing them in array $jobs.
# Replace the Start-Job calls with your
# Invoke-Command -ComputerName ... -AsJob
# calls.
$jobs = (Start-Job { Get-Date; sleep 1 }),
(Start-Job { Get-Date '1970-01-01'; sleep 2 })
# Initialize a helper array to keep track of which jobs haven't finished yet.
$remainingJobs = $jobs
# Wait iteratively *without blocking* until any job finishes and receive and
# output its output, until all jobs have finished.
# Collect all results in $jobResults.
$jobResults =
while ($remainingJobs) {
# Check if at least 1 job has terminated.
if ($finishedJob = $remainingJobs | Where State -in Completed, Failed, Stopped, Disconnected | Select -First 1) {
# Output the just-finished job's results as part of custom object
# that also contains the original command and the
# specific termination state.
[pscustomobject] @{
Job = $finishedJob.Command
State = $finishedJob.State
Result = $finishedJob | Receive-Job
}
# Remove the just-finished job from the array of remaining ones...
$remainingJobs = @($remainingJobs) -ne $finishedJob
# ... and also as a job managed by PowerShell.
Remove-Job $finishedJob
} else {
# Do other things...
Write-Host . -NoNewline
Start-Sleep -Milliseconds 500
}
}
# Output the jobs' results
$jobResults
Note:
It's tempting to try $remainingJobs | Wait-Job -Any -Timeout 0 to momentarily check for termination of any one job without blocking execution, but as of PowerShell 7.1 this doesn't work as expected: even already completed jobs are never returned - this appears to be a bug, discussed in GitHub issue #14675.
If I run 2 Invoke-Command calls using -AsJob, do they run simultaneously when I try to receive the output?
Yes, PowerShell jobs always run in parallel, whether they're executing remotely, as in your case (with Invoke-Command -AsJob, assuming that localhost in the question is just a placeholder for the actual name of a different computer), or locally (using Start-Job or Start-ThreadJob).
However, by using (separate) Wait-Job calls, you are synchronously waiting for each job to finish (in a fixed sequence, too). That is, each Wait-Job call blocks further execution until the target job terminates.[1]
Note, however, that both jobs continue to execute while you're waiting for the first one to finish.
If, instead of waiting in a blocking fashion, you want to perform other operations while you wait for both jobs to finish, you need a different approach, detailed in the other answer.
Can I save the output of the jobs to a variable without it showing in the console?
Yes, but the problem is that in your remotely executing script block ({ ... }) you're mistakenly using Write-Host in an attempt to output data.
Write-Host is typically the wrong tool to use, unless the intent is to write to the display only, bypassing the success output stream and with it the ability to send output to other commands, capture it in a variable, or redirect it to a file. To output a value, use it by itself; e.g., $value instead of Write-Host $value (or use Write-Output $value, though that is rarely needed); see this answer.
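A quick demonstration of the difference (a minimal sketch):

# The implicit output ('data') is captured; the Write-Host text prints
# straight to the console during the assignment and is NOT captured.
$captured = & { 'data'; Write-Host 'display only' }
$captured   # -> data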
Therefore, your attempt to collect the job's output in a variable failed, because the Write-Host output bypassed the success output stream that variable assignments capture and went straight to the host (console):
# Because the job's script block uses Write-Host, its output goes to the *console*,
# and nothing is captured in $job1
$job1 = $pSTlocation | Wait-Job | Receive-Job
(Incidentally, the command could be simplified to
$job1 = $pSTlocation | Receive-Job -Wait).
[1] Note that Wait-Job has an optional -Timeout parameter, which allows you to limit waiting to at most a given number of seconds and return without output if the target job hasn't finished yet. However, as of PowerShell 7.1, -Timeout 0 for non-blocking polling for whether jobs have finished does not work - see GitHub issue #14675.
I have three different tasks that I wish to outsource to filesystem watchers in powershell. I have the code all set up to initialize two watchers and to check every ten seconds to make sure they are running. However the tasks that they perform last under a minute, and 5 minutes respectively. The third task I wish to outsource to a watcher takes about an hour. I am concerned that if I have all of them running simultaneously, tasks that the first two should watch for will not get done at all if the third watcher is executing its change action. Is there a way to implement or run them such that the change actions can be executed in parallel?
You can use the Start-ThreadJob cmdlet to run your file-watching tasks in parallel.
Start-ThreadJob comes with the ThreadJob module and offers a lightweight, thread-based alternative to the child-process-based regular background jobs.
It comes with PowerShell [Core] v6+ and in Windows PowerShell can be installed on demand with, e.g., Install-Module ThreadJob -Scope CurrentUser.
In most cases, thread jobs are the better choice, both for performance and type fidelity - see the bottom section of this answer for why.
The following self-contained sample code:
uses thread jobs to run 2 distinct file-monitoring and processing tasks in parallel,
which neither block each other nor the caller.
Note:
Each task creates its own System.IO.FileSystemWatcher instance in the code below, though creating too many of them can put a significant load on the system, possibly resulting in events getting missed.
An alternative is to share instances, such as creating a single one in the caller's context, which the thread jobs can access (see comments in source code below).
[This is in part speculative; do tell us if I got things wrong] Direct FileSystemWatcher .NET event-handler delegates should be kept short, but subscribing to the events from PowerShell via an event job created by Register-ObjectEvent queues the events on the PowerShell side, which PowerShell then dispatches to the -Action script blocks, so the fact that the blocks below perform long-running operations shouldn't be an immediate concern (the tasks may take a long time to catch up, though).
# Make sure that the ThreadJob module is available.
# In Windows PowerShell, it must be installed first.
# In PowerShell [Core], it is available by default.
Import-Module ThreadJob -ea Stop
try {
# Use the system's temp folder in this example.
$dir = (Get-Item -EA Ignore temp:).FullName; if (-not $dir) { $dir = $env:TEMP }
# Define the tasks as an array of custom objects that specify the dir.
# and file name pattern to monitor as well as the action script block to
# handle the events.
$tasks = # array of custom objects describing the monitoring tasks
[pscustomobject] @{
DirToMonitor = $dir
FileNamePattern = '*.tmp1'
Action = {
# Print status info containing the event data to the host, synchronously.
Write-Host -NoNewLine "`nINFO: Event 1 raised:`n$($EventArgs | Format-List | Out-String)"
# Sleep to simulate blocking the thread with a long-running task.
Write-Host "INFO: Event 1: Working for 4 secs."
Start-Sleep 4
# Create output, which Receive-Job can collect.
"`nEvent 1 output: " + $EventArgs.Name
}
},
[pscustomobject] @{
DirToMonitor = $dir
FileNamePattern = '*.tmp2'
Action = {
# Print status info containing the event data to the host, synchronously
Write-Host -NoNewLine "`nINFO: Event 2 raised:`n$($EventArgs | Format-List | Out-String)"
# Sleep to simulate blocking the thread with a long-running task.
Write-Host "INFO: Event 2: Working for 2 secs"
Start-Sleep 2
# Create output, which Receive-Job can collect.
"`nEvent 2 output: " + $EventArgs.Name
}
}
# Start a separate thread job for each action task.
$threadJobs = $tasks | ForEach-Object {
Start-ThreadJob -ArgumentList $_ {
param([pscustomobject] $task)
# Create and initialize a thread-specific watcher.
# Note: To keep system load low, it's generally better to use a *shared*
# watcher, if feasible. You can define it in the caller's scope
# and access here via $using:watcher
$watcher = [System.IO.FileSystemWatcher] [ordered] @{
Path = $task.DirToMonitor
Filter = $task.FileNamePattern
EnableRaisingEvents = $true # start watching.
}
# Subscribe to the watcher's Created events, which returns an event job.
# This indefinitely running job receives the output from the -Action script
# block whenever the latter is called after an event fires.
$eventJob = Register-ObjectEvent -ea stop $watcher Created -Action $task.Action
Write-Host "`nINFO: Watching $($task.DirToMonitor) for creation of $($task.FileNamePattern) files..."
# Indefinitely wait for output from the action blocks and relay it.
try {
while ($true) {
Receive-Job $eventJob
Start-Sleep -Milliseconds 500 # sleep a little
}
}
finally {
# !! This doesn't print, presumably because this is killed by the
# !! *caller* being killed, which then doesn't relay the output anymore.
Write-Host "Cleaning up thread for task $($task.FileNamePattern)..."
# Dispose of the watcher.
$watcher.Dispose()
# Remove the event job (and with it the event subscription).
$eventJob | Remove-Job -Force
}
}
}
$sampleFilesCreated = $false
$sampleFiles = foreach ($task in $tasks) { Join-Path $task.DirToMonitor ("tmp_$PID" + ($task.FileNamePattern -replace '\*')) }
Write-Host "Starting tasks...`nUse Ctrl-C to stop."
# Indefinitely wait for and display output from the thread jobs.
# Use Ctrl+C to stop.
$dtStart = [datetime]::UtcNow
while ($true) {
# Receive thread job output, if any.
$threadJobs | Receive-Job
# Sleep a little.
Write-Host . -NoNewline
Start-Sleep -Milliseconds 500
# A good while after startup, create sample files that trigger all tasks.
# NOTE: The delay must be long enough for the task event handlers to already be
# in place. How long that takes can vary.
# Watch the status output to make sure the files are created
# *after* the event handlers became active.
# If not, increase the delay or create files manually once
# the event handlers are in place.
if (-not $sampleFilesCreated -and ([datetime]::UtcNow - $dtStart).TotalSeconds -ge 10) {
Write-Host
foreach ($sampleFile in $sampleFiles) {
Write-Host "INFO: Creating sample file $sampleFile..."
$null > $sampleFile
}
$sampleFilesCreated = $true
}
}
}
finally {
# Clean up.
# Clean up the thread jobs.
Remove-Job -Force $threadJobs
# Remove the temp. sample files
Remove-Item -ea Ignore $sampleFiles
}
The above creates output such as the following (sample from a macOS machine):
Starting tasks...
Use Ctrl-C to stop.
.
INFO: Watching /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/ for creation of *.tmp1 files...
INFO: Watching /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/ for creation of *.tmp2 files...
.........
INFO: Creating sample file /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp1...
INFO: Creating sample file /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp2...
.
INFO: Event 1 raised:
ChangeType : Created
FullPath : /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp1
Name : tmp_91418.tmp1
INFO: Event 1: Working for 4 secs.
INFO: Event 2 raised:
ChangeType : Created
FullPath : /var/folders/19/0lxcl7hd63d6fqd813glqppc0000gn/T/tmp_91418.tmp2
Name : tmp_91418.tmp2
INFO: Event 2: Working for 2 secs
....
Event 2 output: tmp_91418.tmp2
....
Event 1 output: tmp_91418.tmp1
.................
I'm relatively new to PowerShell scripting, so I have been coding based on multiple examples that I have seen online.
I have a script that executes multiple batch files in parallel, and each batch file contains a bcp command to execute. I'm trying to catch any errors that may occur when running the batch files, but it's not working as expected. I specifically forced an error in product.bat by using invalid SELECT syntax.
workflow Test-Workflow
{
Param ([string[]] $file_names)
$file_names = Get-Content "D:/EDW/data/informatica/ming/Powersh/bcplist.lst"
foreach -parallel ($line in $file_names)
{
try
{
Write-Output ("processing... " + $line + ".bat")
start-process D:/EDW/data/informatica/ming/Powersh/$line.bat -ErrorAction Stop -wait
}
catch
{
$ErrorMessage = $_.Exception.Message
$FailedItem = $_.Exception.ItemName
Write-Output $line : $ErrorMessage $FailedItem
}
}
}
bcplist.lst:
ing_channel
ing_product
ing_channel:
bcp "SELECT * FROM CHANNEL" queryout ing_channel.txt -T -S99.999.999.9,99999 -t"\t" -c -q
ing_product:
bcp "SELT * FROM PRODUCT" queryout ing_product.txt -T -S99.999.999.9,99999 -t"\t" -c -q
Any help or suggestion would be greatly appreciated.
Exceptions are only thrown/caught when terminating errors occur, and those are only raised by cmdlets, .NET libraries, or native code when P/Invoke is in play. External commands such as .bat or .exe files never throw PowerShell exceptions; to handle their failures, you need to check $LASTEXITCODE yourself. $LASTEXITCODE is the PowerShell equivalent of %ERRORLEVEL% in cmd.exe. Here is an example of some basic boilerplate code to check this for the ping command:
&ping nonexistant.domain.tld
if( $LASTEXITCODE -ne 0 ){
# Handle the error here
# This example writes to the error stream and throws a terminating error
Write-Error "Unable to ping server, ping returned $LASTEXITCODE" -EA Stop
}
Note that the -ErrorAction argument has a shorthand of -EA, so either the long or short form will work.
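Applied to the batch files in the question (a sketch, ignoring the workflow wrapper for brevity; it assumes bcp's exit code propagates as each .bat file's exit code, which holds when the bcp call is the last command in the file):

foreach ($line in $file_names) {
    & "D:/EDW/data/informatica/ming/Powersh/$line.bat"
    if ($LASTEXITCODE -ne 0) {
        Write-Error "$line.bat failed with exit code $LASTEXITCODE"
    }
}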