Test-Path timeout for PowerShell

I'm trying to routinely check the presence of particular strings in text files on hundreds of computers on our domain.
foreach ($computer in $computers) {
    $hostname = $computer.DNSHostName
    if (Test-Connection $hostname -Count 2 -Quiet) {
        $FilePath = "\\" + $hostname + "\c$\SomeDirectory\SomeFile.txt"
        if (Test-Path -Path $FilePath) {
            # Check for string
        }
    }
}
For the most part, the pattern of Test-Connection and then Test-Path is effective and fast. There are certain computers, however, that ping successfully but Test-Path takes around 60 seconds to resolve to FALSE. I'm not sure why, but it may be a domain trust issue.
For situations like this, I would like to have a timeout for Test-Path that defaults to FALSE if it takes more than 2 seconds.
Unfortunately the solution in a related thread (How can I wrap this Powershell cmdlet into a timeout function?) does not apply to my situation. The proposed do-while loop gets hung up in the code block.
I've been experimenting with Jobs but it appears even this won't force quit the Test-Path command:
Start-Job -ScriptBlock {param($Path) Test-Path $Path} -ArgumentList $Path | Wait-Job -Timeout 2 | Remove-Job -Force
The job continues to hang in the background. Is this the cleanest way to achieve my requirements above? Is there a better way to time out Test-Path so the script doesn't hang, besides spawning asynchronous activities? Many thanks.

Wrap your code in a [powershell] object and call BeginInvoke() to execute it asynchronously, then use the associated WaitHandle to wait for it to complete only for a set amount of time.
$sleepDuration = Get-Random 2,3
$ps = [powershell]::Create().AddScript("Start-Sleep -Seconds $sleepDuration; 'Done!'")

# Execute it asynchronously
$handle = $ps.BeginInvoke()

# Wait 2500 milliseconds for it to finish
if (-not $handle.AsyncWaitHandle.WaitOne(2500)) {
    throw "timed out"
}

# WaitOne() returned $true, let's fetch the result
$result = $ps.EndInvoke($handle)
return $result
In the example above, we randomly sleep for either 2 or 3 seconds, but set a 2 and a half second timeout - try running it a couple of times to see the effect :)
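Applied to the original problem, a minimal sketch might look like this (Test-PathWithTimeout is a hypothetical helper name, not a built-in), treating a timeout as FALSE per the requirement:
function Test-PathWithTimeout {
    param(
        [string] $Path,
        [int] $TimeoutMs = 2000
    )
    $ps = [powershell]::Create().AddCommand('Test-Path').AddParameter('Path', $Path)
    $handle = $ps.BeginInvoke()
    if (-not $handle.AsyncWaitHandle.WaitOne($TimeoutMs)) {
        # Timed out: request a stop asynchronously (a hung SMB call may not
        # honor it until the process exits) and report the path as absent.
        $null = $ps.BeginStop($null, $null)
        return $false
    }
    $result = $ps.EndInvoke($handle)
    $ps.Dispose()
    return [bool] $result[0]
}

# Usage in the original loop:
# if (Test-PathWithTimeout -Path $FilePath) { <# Check for string #> }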


Copy-item using invoke-async in Powershell

This article shows how to use Invoke-Async in PowerShell: https://sqljana.wordpress.com/2018/03/16/powershell-sql-server-run-in-parallel-collect-sql-results-with-print-output-from-across-your-sql-farm-fast/
I want to run the Copy-Item cmdlet in PowerShell in parallel, because the alternative is to use FileSystemObject via Excel and copy one file at a time, out of a total of millions of files.
I have cobbled together the following:
<#
.SYNOPSIS
<Brief description>
For examples type:
Get-Help .\<filename>.ps1 -examples
.DESCRIPTION
Copies files from one path to another
.PARAMETER FileList
e.g. C:\path\to\list\of\files\to\copy.txt
.PARAMETER NumCopyThreads
default is 8 (but can be 100 if you want to stress the machine to maximum!)
.EXAMPLE
.\CopyFilesToBackup -filelist C:\path\to\list\of\files\to\copy.txt
.NOTES
#>
[CmdletBinding()]
Param(
    [String] $FileList = "C:\temp\copytest.csv",
    [int] $NumCopyThreads = 8
)

$filesToCopy = New-Object "System.Collections.Generic.List[fileToCopy]"
$csv = Import-Csv $FileList
foreach ($item in $csv)
{
    $file = New-Object fileToCopy
    $file.SrcFileName = $item.SrcFileName
    $file.DestFileName = $item.DestFileName
    $filesToCopy.add($file)
}

$sb = [scriptblock] {
    param($file)
    Copy-Item -Path $file.SrcFileName -Destination $file.DestFileName
}

$results = Invoke-Async -Set $filesToCopy -SetParam file -ScriptBlock $sb -Verbose -Measure:$true -ThreadCount 8
$results | Format-Table

Class fileToCopy {
    [String]$SrcFileName = ""
    [String]$DestFileName = ""
}
the csv input for which looks like this:
SrcFileName,DestFileName
C:\Temp\dummy-data\101438\101438-0154723869.zip,\\backupserver\Project Archives\101438\0154723869.zip
C:\Temp\dummy-data\101438\101438-0165498273.xlsx,\\backupserver\Project Archives\101438\0165498273.xlsx
What am I missing to get this working? When I run .\CopyFiles.ps1 -FileList C:\Temp\test.csv, nothing happens. The files exist in the source path, but the file objects aren't being pulled from the -Set collection. (Unless I have misunderstood how the collection is used?)
No, I can't use robocopy to do this because there are millions of files which resolve to different paths depending upon their original location.
I have no explanation for your symptom based on the code in your question (see bottom section), but I suggest basing your solution on the (now) standard Start-ThreadJob cmdlet (comes with PowerShell Core; in Windows PowerShell, install it with Install-Module ThreadJob -Scope CurrentUser, for instance[1]):
Such a solution is more efficient than use of the third-party Invoke-Async function, which as of this writing is flawed in that it waits for jobs to finish in a tight loop, which creates unnecessary processing overhead.
Start-ThreadJob jobs are a lightweight, thread-based alternative to the process-based Start-Job background jobs, yet they integrate with the standard job-management cmdlets, such as Wait-Job and Receive-Job.
Here's a self-contained example based on your code that demonstrates its use:
Note: Whether you use Start-ThreadJob or Invoke-Async, you won't be able to explicitly reference custom classes such as [fileToCopy] in the script block that runs in separate threads (runspaces; see bottom section), so the solution below simply uses [pscustomobject] instances with the properties of interest, for simplicity and brevity.
# Create sample CSV file with 10 rows.
$FileList = Join-Path ([IO.Path]::GetTempPath()) "tmp.$PID.csv"
@'
Foo,SrcFileName,DestFileName,Bar
1,c:\tmp\a,\\server\share\a,baz
2,c:\tmp\b,\\server\share\b,baz
3,c:\tmp\c,\\server\share\c,baz
4,c:\tmp\d,\\server\share\d,baz
5,c:\tmp\e,\\server\share\e,baz
6,c:\tmp\f,\\server\share\f,baz
7,c:\tmp\g,\\server\share\g,baz
8,c:\tmp\h,\\server\share\h,baz
9,c:\tmp\i,\\server\share\i,baz
10,c:\tmp\j,\\server\share\j,baz
'@ | Set-Content $FileList
# How many threads at most to run concurrently.
$NumCopyThreads = 8
Write-Host 'Creating jobs...'
$dtStart = [datetime]::UtcNow
# Import the CSV data and transform it to [pscustomobject] instances
# with only .SrcFileName and .DestFileName properties - they take
# the place of your original [fileToCopy] instances.
$jobs = Import-Csv $FileList | Select-Object SrcFileName, DestFileName |
    ForEach-Object {
        # Start the thread job for the file pair at hand.
        Start-ThreadJob -ThrottleLimit $NumCopyThreads -ArgumentList $_ {
            param($f)
            $simulatedRuntimeMs = 2000 # How long each job (thread) should run for.
            # Delay output for a random period.
            $randomSleepPeriodMs = Get-Random -Minimum 100 -Maximum $simulatedRuntimeMs
            Start-Sleep -Milliseconds $randomSleepPeriodMs
            # Produce output.
            "Copied $($f.SrcFileName) to $($f.DestFileName)"
            # Wait for the remainder of the simulated runtime.
            Start-Sleep -Milliseconds ($simulatedRuntimeMs - $randomSleepPeriodMs)
        }
    }
Write-Host "Waiting for $($jobs.Count) jobs to complete..."
# Synchronously wait for all jobs (threads) to finish and output their results
# *as they become available*, then remove the jobs.
# NOTE: Output will typically NOT be in input order.
Receive-Job -Job $jobs -Wait -AutoRemoveJob
Write-Host "Total time lapsed: $([datetime]::UtcNow - $dtStart)"
# Clean up the temp. file
Remove-Item $FileList
The above yields something like:
Creating jobs...
Waiting for 10 jobs to complete...
Copied c:\tmp\b to \\server\share\b
Copied c:\tmp\g to \\server\share\g
Copied c:\tmp\d to \\server\share\d
Copied c:\tmp\f to \\server\share\f
Copied c:\tmp\e to \\server\share\e
Copied c:\tmp\h to \\server\share\h
Copied c:\tmp\c to \\server\share\c
Copied c:\tmp\a to \\server\share\a
Copied c:\tmp\j to \\server\share\j
Copied c:\tmp\i to \\server\share\i
Total time lapsed: 00:00:05.1961541
Note that the output received does not reflect the input order, and that the overall runtime is roughly 2 times the per-thread runtime of 2 seconds (plus overhead), because 2 "batches" have to be run due to the input count being 10, whereas only 8 threads were made available.
If you upped the thread count to 10 or more (50 is the default), the overall runtime would drop to 2 seconds plus overhead, because all jobs then run concurrently.
Caveat: The above numbers stem from running in PowerShell Core on Microsoft Windows 10 Pro (64-bit; Version 1903), using version 2.0.1 of the ThreadJob module.
Inexplicably, the same code is much slower in Windows PowerShell, v5.1.18362.145.
However, for performance and memory consumption it is better to use batching (chunking) in your case, i.e., to process multiple file pairs per thread.
The following solution demonstrates this approach; tweak $chunkSize to find a batch size that works for you.
# Create sample CSV file with 10 rows.
$FileList = Join-Path ([IO.Path]::GetTempPath()) "tmp.$PID.csv"
@'
Foo,SrcFileName,DestFileName,Bar
1,c:\tmp\a,\\server\share\a,baz
2,c:\tmp\b,\\server\share\b,baz
3,c:\tmp\c,\\server\share\c,baz
4,c:\tmp\d,\\server\share\d,baz
5,c:\tmp\e,\\server\share\e,baz
6,c:\tmp\f,\\server\share\f,baz
7,c:\tmp\g,\\server\share\g,baz
8,c:\tmp\h,\\server\share\h,baz
9,c:\tmp\i,\\server\share\i,baz
10,c:\tmp\j,\\server\share\j,baz
'@ | Set-Content $FileList
# How many threads at most to run concurrently.
$NumCopyThreads = 8
# How many files to process per thread
$chunkSize = 3
# The script block to run in each thread, which now receives a
# $chunkSize-sized *array* of file pairs.
$jobScriptBlock = {
    param([pscustomobject[]] $filePairs)
    $simulatedRuntimeMs = 2000 # How long each job (thread) should run for.
    # Delay output for a random period.
    $randomSleepPeriodMs = Get-Random -Minimum 100 -Maximum $simulatedRuntimeMs
    Start-Sleep -Milliseconds $randomSleepPeriodMs
    # Produce output for each pair.
    foreach ($filePair in $filePairs) {
        "Copied $($filePair.SrcFileName) to $($filePair.DestFileName)"
    }
    # Wait for the remainder of the simulated runtime.
    Start-Sleep -Milliseconds ($simulatedRuntimeMs - $randomSleepPeriodMs)
}
Write-Host 'Creating jobs...'
$dtStart = [datetime]::UtcNow
$jobs = & {
    # Process the input objects in chunks.
    $i = 0
    $chunk = [pscustomobject[]]::new($chunkSize)
    Import-Csv $FileList | Select-Object SrcFileName, DestFileName | ForEach-Object {
        $chunk[$i % $chunkSize] = $_
        if (++$i % $chunkSize -ne 0) { return }
        # Note the need to wrap $chunk in a single-element helper array (, $chunk)
        # to ensure that it is passed *as a whole* to the script block.
        Start-ThreadJob -ThrottleLimit $NumCopyThreads -ArgumentList (, $chunk) -ScriptBlock $jobScriptBlock
        $chunk = [pscustomobject[]]::new($chunkSize) # we must create a new array
    }
    # Process any remaining objects.
    # Note: $chunk -ne $null returns those elements in $chunk, if any, that are non-null.
    if ($remainingChunk = $chunk -ne $null) {
        Start-ThreadJob -ThrottleLimit $NumCopyThreads -ArgumentList (, $remainingChunk) -ScriptBlock $jobScriptBlock
    }
}
Write-Host "Waiting for $($jobs.Count) jobs to complete..."
# Synchronously wait for all jobs (threads) to finish and output their results
# *as they become available*, then remove the jobs.
# NOTE: Output will typically NOT be in input order.
Receive-Job -Job $jobs -Wait -AutoRemoveJob
Write-Host "Total time lapsed: $([datetime]::UtcNow - $dtStart)"
# Clean up the temp. file
Remove-Item $FileList
While the output is effectively the same, note how only 4 jobs were created this time, each of which processed (up to) $chunkSize (3) file pairs.
As for what you tried:
The screen shot you show suggests that the problem is that your custom class, [fileToCopy], isn't visible to the script block run by Invoke-Async.
Since Invoke-Async invokes the script block via the PowerShell SDK in separate runspaces that know nothing about the caller's state, it is to be expected that these runspaces don't know your class (this equally applies to Start-ThreadJob).
However, it is unclear why that is a problem in your code, because your script block doesn't make an explicit reference to your class: your script-block parameter $file is not type-constrained (it is implicitly [object]-typed).
Therefore, simply accessing the properties of your custom-class instance inside the script block should work, and indeed does in my tests on Windows PowerShell v5.1.18362.145 on Microsoft Windows 10 Pro (64-bit; Version 1903).
However, if your real script-block code were to explicitly reference custom class [fileToCopy] - such as by defining the parameter as param([fileToCopy] $file) - you would see the symptom.
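For example, a sketch of the failure mode described above (assuming the ThreadJob module is available):
class fileToCopy {
    [String]$SrcFileName = ""
    [String]$DestFileName = ""
}
$pair = [fileToCopy] @{ SrcFileName = 'c:\tmp\a'; DestFileName = '\\server\share\a' }

# Works: the parameter is untyped, and only properties are accessed.
Start-ThreadJob { param($file) $file.SrcFileName } -ArgumentList $pair |
    Receive-Job -Wait -AutoRemoveJob

# Fails: the type literal [fileToCopy] cannot be resolved in the job's runspace.
Start-ThreadJob { param([fileToCopy] $file) $file.SrcFileName } -ArgumentList $pair |
    Receive-Job -Wait -AutoRemoveJob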
[1] In Windows PowerShell v3 and v4, which do not come with the PowerShellGet module, Install-Module isn't available by default. However, the module can be installed on demand, as described in Installing PowerShellGet.

Why is the command in ScriptBlock not working?

I want to measure the elapsed time of a cmdlet:
Invoke-ASCmd
I am using it the following way:
$elapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
$j = Start-Job -ScriptBlock {
    Invoke-ASCmd -InputFile $XMLF -Server "$Server[-1]" >$output_file
}
do {
    Write-Progress -Activity "Syncing..." -Status "$([string]::Format("Time Elapsed: {0:d2}:{1:d2}:{2:d2}", $elapsedTime.Elapsed.Hours, $elapsedTime.Elapsed.Minutes, $elapsedTime.Elapsed.Seconds))"
    #-percentcomplete ($_/10);
    Start-Sleep -Milliseconds 250
} while ($j.State -eq 'Running')
Receive-Job -Job $j
$elapsedTime.Stop()
However, all I see on the console is a flashing blue progress bar that doesn't appear to be elapsing the time at all... and frankly, I don't even think the script block is being executed at all (the Invoke-ASCmd cmdlet).
Why is that?
It also appears to last only 1 second.
I know that the script block is not working, because the syncing is supposed to take at least 20 seconds, so something is wrong.
Also, I would like to get the percentage (circles animation/progress); this is not working:
-percentcomplete ($_/10);
One last thing: I would like to save the final elapsed time to a variable $FinalTime. Would I do it inside the loop or outside?
I am combining these two answers here and modifying for my needs:
https://stackoverflow.com/a/9813370/8397835
https://stackoverflow.com/a/8468024/8397835
Yes, the progress is quick because it takes PowerShell 1 second to load the module before erroring out. We can see the error message with Receive-Job:
PS C:\> Receive-Job $j
InputFile "" not found
+ CategoryInfo : InvalidData: (:) [Invoke-ASCmd], FileNotFoundException
+ FullyQualifiedErrorId : DataValidation,Microsoft.AnalysisServices.PowerShell.Cmdlets.ExecuteScriptCommand
+ PSComputerName : localhost
InputFile "" not found indicates that the variables were empty. They are empty because you can't reference variables directly inside of the Script Block. Using Start-Job, you must pass it into the Script Block as an argument, and receive it as a parameter inside the Script Block. Something like this:
$j = Start-Job -Arg $XMLF, $Server, $output_file -ScriptBlock {
    Param($XMLF, $Server, $output_file)
    Invoke-ASCmd -InputFile $XMLF -Server "$Server" >$output_file
}
As for progress, since there is no "Direct" way to measure how far the progress is to 100%, we "fake it". Since we know that it takes about 20 seconds to execute, we simply have our progress do some math using the time from 0 to 20 as our 0 to 100 progress:
[Math]::Min(100*($elapsedTime.Elapsed.Seconds / 20),100)
Essentially use $elapsedTime for 0 to 100 percent over 20 seconds. That 20 seconds can be changed to any number that is close to the approximate execution time. Using [Math]::Min we ensure that if it takes longer than 20 seconds, the progress will show 100 percent, but the status will continue to show the time. So it would look like this:
do {
    Write-Progress -Activity "Syncing..." -Status "$($elapsedTime.Elapsed.ToString())" -PercentComplete ([Math]::Min(100 * ($elapsedTime.Elapsed.Seconds / 20), 100))
    Start-Sleep -Milliseconds 250
} while ($j.State -eq 'Running')
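Two small caveats worth adding (my notes, not from the linked answers): $elapsedTime.Elapsed.Seconds wraps back to 0 at every full minute, so if the job can run longer than a minute, use TotalSeconds instead; and to save the final elapsed time to $FinalTime, capture it once, outside the loop, after the job has finished:
# Use TotalSeconds so the percentage keeps climbing past the one-minute mark.
$pct = [Math]::Min(100 * ($elapsedTime.Elapsed.TotalSeconds / 20), 100)

# Capture the final elapsed time once, outside the loop, after the job ends.
Receive-Job -Job $j
$elapsedTime.Stop()
$FinalTime = $elapsedTime.Elapsed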

RDS User logoff Script Slow

With the help of several online articles I was able to compile a PowerShell script that logs off all users on each of my RD Session Hosts. I wanted something really gentle about logging users off and writing their profiles back to the roaming profile location on the storage system. However, it is too gentle and takes around four hours to complete with the number of users and RDS servers I have.
The script is designed to set each RDS server to drain mode while still allowing redirection if a server is available, the idea being that within the first 15 minutes I would have the first few servers ready for users to log back into.
All of this works, but I would like to see if there are any suggestions for speeding it up a little.
Here is the loop that goes through each server and logs users out and then sets the server logon mode to enabled:
ForEach ($rdsserver in $rdsservers){
    try {
        query user /server:$rdsserver 2>&1 | select -skip 1 | ? {($_ -split "\s+")[-5]} | % {logoff ($_ -split "\s+")[-6] /server:$rdsserver /V}
        Write-Host "Giving the RDS Server time"
        Write-Progress "Pausing Script" -status "Giving $rdsserver time to settle" -perc (5/(5/100))
        Start-Sleep -Seconds 5
        $RDSH = Get-WmiObject -Class "Win32_TerminalServiceSetting" -Namespace "root\CIMV2\terminalservices" -ComputerName $rdsserver -Authentication PacketPrivacy -Impersonation Impersonate
        $RDSH.SessionBrokerDrainMode = 0
        $RDSH.put() > $null
        Write-Host "$rdsserver is set to:"
        switch ($RDSH.SessionBrokerDrainMode) {
            0 {"Allow all connections."}
            1 {"Allow incoming reconnections but until reboot prohibit new connections."}
            2 {"Allow incoming reconnections but prohibit new connections."}
            default {"The user logon state cannot be determined."}
        }
    }
    catch {}
}
Not sure how many servers you have, but if it's fewer than 50 or so you can do this in parallel with PSJobs. You'll have to wrap your code in a script block, launch each server as a separate job, then wait for them to complete and retrieve any data returned. You won't be able to use Write-Host when doing this, so I've swapped those to Out-File calls. I also didn't parse out your code for collecting your list of servers, but I'm going to assume that works and that you can have it return a formatted list to a variable $rdsservers. You'll probably also want to modify the messages a bit so you can tell which server is which in the log file, or write a different log for each server. If you want anything other than the names of jobs to hit the console, you'll have to output it with Write-Output or a return statement.
$SB = {
    param($rdsserver)
    Start-Sleep -Seconds 5
    $RDSH = Get-WmiObject -Class "Win32_TerminalServiceSetting" -Namespace "root\CIMV2\terminalservices" -ComputerName $rdsserver -Authentication PacketPrivacy -Impersonation Impersonate
    $RDSH.SessionBrokerDrainMode = 0
    $RDSH.put() > $null
    # Set $LogPath to a concrete path (or pass it in as a second argument);
    # the job runs in a separate process and can't see the caller's variables.
    # -Append keeps successive writes from overwriting the log.
    "$rdsserver is set to:" | Out-File $LogPath -Append
    switch ($RDSH.SessionBrokerDrainMode) {
        0 {"Allow all connections." | Out-File $LogPath -Append}
        1 {"Allow incoming reconnections but until reboot prohibit new connections." | Out-File $LogPath -Append}
        2 {"Allow incoming reconnections but prohibit new connections." | Out-File $LogPath -Append}
        default {"The user logon state cannot be determined." | Out-File $LogPath -Append}
    }
}

foreach ($server in $rdsservers){
    Start-Job -ScriptBlock $SB -ArgumentList $server
}
Get-Job | Wait-Job | Receive-Job
The foreach loop launches the jobs and then the last line waits for all of them to complete before getting any data that was output. You can also set a timeout on the wait (see the sketch below) if there is a chance your script never completes. If you've got a ton of boxes you may want to look into runspaces over jobs, as they have better performance but take more work to use. This link can help you out if you decide to go that way. I don't have an RDS deployment at the moment to test on, so if you get any errors or have trouble getting it to work, just post a comment and I'll see what I can do.
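A hedged sketch of that timeout variant (600 seconds is just an example value):
# Wait at most 10 minutes; Wait-Job emits the jobs that finished in time.
$finished = Get-Job | Wait-Job -Timeout 600
$finished | Receive-Job
# Clean up everything, including any jobs that never completed.
Get-Job | Remove-Job -Force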
I have something ready for testing but it may break fantastically. You wizards out there may look at this and laugh. If I did this wrong, please let me know.
$Serverperbatch = 2
$start = 0
$batch = 1
While ($start -lt $rdsservers.Count) {
    # Slice the next batch out of the server list, clamping the end index
    # so the final (possibly short) batch is handled too.
    $end = [Math]::Min($start + $Serverperbatch - 1, $rdsservers.Count - 1)
    $serverbatch = $rdsservers[$start .. $end]
    $jobname = "batch$batch"
    Start-Job -Name $jobname -ScriptBlock {
        param ([string[]]$rdsservers)
        Foreach ($rdsserver in $rdsservers) {
            try {
                query user /server:$rdsserver 2>&1 | select -skip 1 | ? {($_ -split "\s+")[-5]} | % {logoff ($_ -split "\s+")[-6] /server:$rdsserver /V}
                $RDSH = Get-WmiObject -Class "Win32_TerminalServiceSetting" -Namespace "root\CIMV2\terminalservices" -ComputerName $rdsserver -Authentication PacketPrivacy -Impersonation Impersonate
                $RDSH.SessionBrokerDrainMode = 0
                $RDSH.put() > $null
            }
            catch {}
        }
    } -ArgumentList (, $serverbatch) # (, ...) passes the batch as one array argument
    $batch += 1
    $start += $Serverperbatch
}
Get-Job | Wait-Job | Receive-Job

Limited powershell start-jobs

I'm curious if you can answer this or point me in the right direction.
I've written a script that tests/monitors URLs. I'm not posting the code (unless you want me to) because there is no error in the code; it works great. I can even run it as a script block via Start-Job. The issue I have seems to be that I cannot run more than 3 jobs at a time, or they hang. I'm not sure why this is. I can run it for a total of 15 URLs throttled to 3 and it's great. If I try to run it on 15 URLs with 4 as my run limit, they will hang, and I can kill them one at a time until only 3 remain, and those will finish. So it seems that I can only start a total of 3 PowerShell instances or they hang. Can anyone explain why this is? All my searches lead me to pages that show how to throttle, and that's not really my issue.
Watching the processes, each consumes about 25 MB of memory and sits there idle... If I kill one, the other 3 will start using CPU, grow to maybe 30 MB of memory, and terminate completed. The system has 8 GB of memory and a quad-core i5-2400 CPU @ 3.10GHz. As requested...
Param(
    $file
)

$testscript = {
    Param(
        [string]$url,
        #[ValidateSet('InternetExplorer','Chrome','Firefox','Safari','Opera', IgnoreCase = $true)]
        [string]$browser = "InternetExplorer",
        [string]$teststring = "Solution Center",
        [int]$timeout = 20,
        [int]$retry
    )
    $i = 0
    do {
        $userAgent = [Microsoft.PowerShell.Commands.PSUserAgent]::$browser
        $data = Invoke-WebRequest $url -UserAgent $userAgent -TimeoutSec $timeout
        $data.Content
        $findit = $data.Content.Contains($teststring)
        $i++
        If ($findit) {
            break
        }
    }
    while ($i -lt $retry)
    if (!$findit) {
        Echo "opcmsg a=PSURLCheck o=NHTSA msg_t='$teststring was not found on $url or $url failed to load'"
    }
}

# Note: -ArgumentList binds positionally, so a value for $timeout has to be
# supplied before $retry; otherwise $_.retry would land in $timeout.
$urls = Import-Csv $file | % {
    Start-Job -ScriptBlock $testscript -ArgumentList $_.url, $_.browser, $_.teststring, 20, $_.retry
}

While (@(Get-Job | Where { $_.State -eq "Running" }).Count -ne 0) {
    Write-Host "Processing URLs..."
    Get-Job
    Start-Sleep -Seconds 5
}

$Data = ForEach ($Job in (Get-Job)) {
    Receive-Job $Job
    Remove-Job $Job
}
$Data | select *
So I've used New-Object System.Net.WebClient and I've even tried doing this with [System.Collections.Queue]... but all three methods use jobs, so it appears I cannot run more than three Start-Job instances at any one time.
Are you sure your code is fine? If you're spawning separate PowerShell sessions multiple times, memory can be consumed very quickly. Check Process Monitor for high CPU or memory usage and ensure your script blocks are terminating. Or post the code.
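If you want to check that without Process Monitor, a couple of quick probes from another console can help (each Start-Job in Windows PowerShell runs in its own powershell.exe process):
# See which jobs are stuck and whether they have produced output yet.
Get-Job | Select-Object Id, Name, State, HasMoreData

# Watch the per-job powershell.exe processes' working set and CPU to spot
# the idle, hung instances.
Get-Process powershell | Select-Object Id, WS, CPU | Format-Table -AutoSize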

Powershell command timeout

I am trying to execute a function or a scriptblock in powershell and set a timeout for the execution.
Basically I have the following (translated into pseudocode):
function query {
    # query remote system for something
}

$computerList = Get-Content "C:\scripts\computers.txt"
foreach ($computer in $computerList) {
    $result = query
    # do something with $result
}
The query can range from a WMI query using Get-WmiObject to an HTTP request, and the script has to run in a mixed environment, which includes Windows and Unix machines that do not all have an HTTP interface.
Some of the queries will therefore necessarily hang or take a VERY long time to return.
In my quest for optimization I have written the following:
$blockofcode = {
    # query remote system for something
}

foreach ($computer in $computerList) {
    $Job = Start-Job -ScriptBlock $blockofcode -ArgumentList $computer
    Wait-Job $Job.ID -Timeout 10 | Out-Null
    $result = Receive-Job $Job.ID
    # do something with result
}
But unfortunately jobs seem to carry a LOT of overhead. In my tests a query that executes in 1.066 seconds (according to timers inside $blockofcode) took 6.964 seconds to return a result when executed as a Job. Of course it works, but I would really like to reduce that overhead. I could also start all jobs together and then wait for them to finish, but the jobs can still hang or take ridiculous amounts to time to complete.
So, on to the question: is there any way to execute a statement, function, script block or even a script with a timeout that does not involve the kind of overhead that comes with jobs? If possible I would like to run the commands in parallel, but that is not a deal-breaker.
Any help or hints would be greatly appreciated!
EDIT: running PowerShell v3 in a mixed Windows/Unix environment
Today, I ran across a similar question, and noticed that there wasn't an actual answer to this question. I created a simple PowerShell class, called TimedScript. This class provides the following functionality:
Method: Start() method to kick off the job, when you're ready
Method: GetResult() method, to retrieve the output of the script
Constructor: A constructor that takes two parameters:
ScriptBlock to execute
[int] timeout period, in milliseconds
It currently lacks:
Passing in arguments to the PowerShell ScriptBlock
Other useful features you think up
Class: TimedScript
class TimedScript {
    [System.Timers.Timer] $Timer = [System.Timers.Timer]::new()
    [powershell] $PowerShell
    [runspace] $Runspace = [runspacefactory]::CreateRunspace()
    [System.IAsyncResult] $IAsyncResult

    TimedScript([ScriptBlock] $ScriptBlock, [int] $Timeout) {
        $this.PowerShell = [powershell]::Create()
        $this.PowerShell.AddScript($ScriptBlock)
        $this.PowerShell.Runspace = $this.Runspace
        $this.Timer.Interval = $Timeout
        Register-ObjectEvent -InputObject $this.Timer -EventName Elapsed -MessageData $this -Action ({
            $Job = $event.MessageData
            $Job.PowerShell.Stop()
            $Job.Runspace.Close()
            $Job.Timer.Enabled = $False
        })
    }

    ### Method: Call this when you want to start the job.
    [void] Start() {
        $this.Runspace.Open()
        $this.Timer.Start()
        $this.IAsyncResult = $this.PowerShell.BeginInvoke()
    }

    ### Method: Once the job has finished, call this to get the results
    [object[]] GetResult() {
        return $this.PowerShell.EndInvoke($this.IAsyncResult)
    }
}
Example Usage of TimedScript Class
# EXAMPLE: The timeout period is set longer than the execution time of the script, so this will succeed
$Job1 = [TimedScript]::new({ Start-Sleep -Seconds 2 }, 4000)
# EXAMPLE: This script will fail. Even though Get-Process returns quickly, the Start-Sleep call will cause it to be terminated by its Timer.
$Job2 = [TimedScript]::new({ Get-Process -Name s*; Start-Sleep -Seconds 3 }, 2000)
# EXAMPLE: This job will fail, because the timeout is less than the script execution time.
$Job3 = [TimedScript]::new({ Start-Sleep -Seconds 3 }, 1000)
$Job1.Start()
$Job2.Start()
$Job3.Start()
Code is also hosted on GitHub Gist.
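To see the outcome, fetch the results once the jobs have had time to run. One likely caveat (my note, not part of the gist): calling GetResult() on a job whose pipeline was stopped by its timer will throw a PipelineStoppedException, so wrap that call in try/catch when probing timed-out jobs:
Start-Sleep -Seconds 5   # give all three jobs time to finish or be stopped

$Job1.GetResult()        # succeeds: the 2-second sleep beat the 4-second timer

try   { $Job2.GetResult() }
catch { "Job2 timed out: $($_.Exception.Message)" }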
I think you might want to investigate using Powershell runspaces:
http://learn-powershell.net/2012/05/13/using-background-runspaces-instead-of-psjobs-for-better-performance/
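For reference, here is a minimal sketch of that approach, reusing $blockofcode and $computerList from the question: one [powershell] instance per computer on a shared runspace pool, each given a 10-second deadline (the details are my assumptions, not taken from the linked article).
# Runspace pool with a per-item timeout - much less overhead than jobs.
$pool = [runspacefactory]::CreateRunspacePool(1, 8)   # 1..8 concurrent threads
$pool.Open()

# Kick everything off asynchronously.
$work = foreach ($computer in $computerList) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    $null = $ps.AddScript($blockofcode).AddArgument($computer)
    [pscustomobject] @{ Computer = $computer; PowerShell = $ps; Handle = $ps.BeginInvoke() }
}

# Harvest results; items run in parallel, so the total wait is far less
# than 10 seconds per computer.
foreach ($item in $work) {
    if ($item.Handle.AsyncWaitHandle.WaitOne(10000)) {
        $result = $item.PowerShell.EndInvoke($item.Handle)
        # do something with $result
        $item.PowerShell.Dispose()
    }
    else {
        Write-Warning "$($item.Computer) timed out"
        $null = $item.PowerShell.BeginStop($null, $null)   # abandon the pipeline
    }
}
$pool.Close()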