Capture Verbose Stream from Job - powershell

I am trying to be a good PowerShell user and use Write-Verbose as per best practices, but I have no way to get the Verbose stream from a running job.
$Job = Start-Job -Name "Scanning Work Item" -ScriptBlock{
Write-Verbose "Write-Verbose"
Write-Host "Write-Host"
}
while ($Job.HasMoreData -or $Job.State -eq "Running") {
Receive-Job -Job $Job -Verbose
Start-Sleep -Seconds 1
}
The output for this is
Write-Host
Please only answer with tested code, as I have spent hours trying various permutations of PowerShell script.

First of all, you're not getting any verbose output because you haven't changed the default $VerbosePreference inside the job's session.
As for reading verbose output while the job is running, you can read each of the output stream buffers from the associated child job individually, without doing a Receive-Job and without affecting later output when you do call Receive-Job:
$Job = Start-Job -Name "Scanning Work Item" -ScriptBlock{
$VerbosePreference = 'Continue'
Write-Verbose "Write-Verbose"
Write-Host "Write-Host"
Start-Sleep -Seconds 10
}
Start-sleep -Seconds 2
$Verbose = $Job.ChildJobs[0].Verbose.ReadAll()
$Verbose
while ($Job.HasMoreData -or $Job.State -eq "Running") {
Receive-Job -Job $Job
Start-Sleep -Seconds 1
}
Output:
Write-Verbose
VERBOSE: Write-Verbose
Write-Host
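If you want the verbose messages surfaced continuously while the job runs, rather than in a single read, a minimal sketch (assuming the same job shape as above; whether a later Receive-Job repeats the drained records can vary by PowerShell version) is to poll the child job's Verbose buffer inside the wait loop:
$Job = Start-Job -Name "Scanning Work Item" -ScriptBlock {
    $VerbosePreference = 'Continue'
    1..5 | ForEach-Object {
        Write-Verbose "Pass $_"
        Start-Sleep -Seconds 1
    }
}
$verboseBuffer = $Job.ChildJobs[0].Verbose
while ($Job.HasMoreData -or $Job.State -eq 'Running') {
    # Relay and drain any verbose records that have arrived so far
    while ($verboseBuffer.Count -gt 0) {
        Write-Host "VERBOSE: $($verboseBuffer[0])" -ForegroundColor Yellow
        $verboseBuffer.RemoveAt(0)
    }
    Receive-Job -Job $Job    # success/host output, as in the original loop
    Start-Sleep -Seconds 1
}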

Powershell running invoke-command on several servers using -Asjob but log completion log locally

I am trying to run a powershell script that installs some software on a bunch of remote servers.
I am using the -Asjob option to run them asynchronously.
I'm also using a foreach loop to run the remote commands on each server, but I want to write a "Done" log file locally, where I am running the script, to tell me exactly when each server completes the commands.
This is the sample code I am testing. The script runs fine, but the "Done" log files get generated immediately, not as each server finishes.
$VerbosePreference = 'Continue'
$servers = Get-Content -Path f:\temp\servers.txt
foreach($server in $servers) {
Write-Verbose "Start batch file as a job on $server"
Start-Sleep -Seconds 3
Invoke-Command -ComputerName $server -ScriptBlock {
echo testfile1 > f:\temp\testfile1.txt
Start-Sleep -Seconds 20
echo testfile2 > f:\temp\testfile2.txt
Start-Sleep -Seconds 20
echo testfile3 > f:\temp\testfile3.txt
echo DONE} > f:\temp\$server.done.txt -Asjob
}
thanks
Remove the redirection operator after Invoke-Command { ... }; otherwise you'll be redirecting the resulting job objects to a file rather than the output from the jobs. Instead, collect all the job objects in a variable, $jobs:
$VerbosePreference = 'Continue'
$servers = Get-Content -Path f:\temp\servers.txt
$jobs = foreach($server in $servers) {
Write-Verbose "Start batch file as a job on $server"
Start-Sleep -Seconds 3
Invoke-Command -ComputerName $server -ScriptBlock {
echo testfile1 > f:\temp\testfile1.txt
Start-Sleep -Seconds 20
echo testfile2 > f:\temp\testfile2.txt
Start-Sleep -Seconds 20
echo testfile3 > f:\temp\testfile3.txt
echo DONE} -Asjob
}
Now that we've kicked off all the remoting jobs and collected the corresponding job objects, we simply need to wait for them and then collect the output:
foreach($job in $jobs){
# Wait for jobs to finish, retrieve their output
$jobOutput = $job |Receive-Job -Wait
# Grab the remote server name from the output
$server = $jobOutput |ForEach-Object PSComputerName |Select -First 1
# Write output to file
$jobOutput > f:\temp\$server.done.txt
}
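The loop above receives the jobs in the order they were started, so a fast server may still wait on a slower one before its "Done" file is written. If you want each log file created the moment its own server finishes, one hedged alternative (a sketch, not tested against remote servers) is to repeatedly wait for whichever job completes first with Wait-Job -Any:
$pending = @($jobs)
while ($pending.Count -gt 0) {
    # Block until at least one of the remaining jobs finishes
    $finished = @(Wait-Job -Job $pending -Any)
    foreach ($job in $finished) {
        # Collect its output and derive the server name, as above
        $jobOutput = Receive-Job -Job $job
        $server    = $jobOutput | ForEach-Object PSComputerName | Select-Object -First 1
        # Write that server's log as soon as its job completes
        $jobOutput > "f:\temp\$server.done.txt"
        Remove-Job -Job $job
    }
    $pending = @($pending | Where-Object { $_.Id -notin $finished.Id })
}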

How to modify the output of PowerShell Jobs if they take too long

I have a relatively long/complex script that blindly runs a batch of commands against several devices (as jobs). Once in a while, a couple of these jobs continue to run indefinitely. I’ve added a Wait-Job -Timeout command to my script (see below) in order to force-stop jobs that are taking too long to run.
I’d like to change the output for these hung jobs to read “This device is busy or unstable”. How can I do this? I'm guessing I need to add something to the tail-end of the pipeline (in the last line of code below).
$Jobs += Get-Job
$jobs | Wait-Job -Timeout 5 | out-null
Get-Job | ? {$_.State -eq 'Running'} | Stop-Job -PassThru
One way is to iterate over the jobs that are currently running and write your message for each one:
$Jobs += Get-Job
$Jobs | Wait-Job -Timeout 5 | Out-Null
Get-Job | ? { $_.State -eq 'Running' } | Stop-Job -PassThru | % { Write-Host "This device is busy or unstable" }
You can also add info from the jobs that are being stopped, like the job ID for example:
Get-Job | ? { $_.State -eq 'Running' } | Stop-Job -PassThru | % { Write-Host "This device is busy or unstable: $($_.Id)" }
UPDATE: You can use a hashtable to store the IDs of the jobs that were force-stopped. Then iterate through the jobs, using Receive-Job to get the output, and check whether each job is in the $ForceStoppedIds table. If it is, write your custom message; otherwise just output the job's result. Here's a simple test I ran.
Start-Job -ScriptBlock { Write-Output "Starting 1."; Start-Sleep -Seconds 3; Write-Output "1 complete."; } | Out-Null
Start-Job -ScriptBlock { Write-Output "Starting 2."; Start-Sleep -Seconds 60; Write-Output "2 complete."; } | Out-Null
Start-Job -ScriptBlock { Write-Output "Starting 3."; Start-Sleep -Seconds 2; Write-Output "3 complete."; } | Out-Null
$Jobs += Get-Job
$Jobs | Wait-Job -Timeout 5 | Out-Null
$ForceStoppedIds = @{}
$Jobs | ? { $_.State -eq 'Running' } | Stop-Job -PassThru | % { $ForceStoppedIds[$_.Id] = $true }
foreach ($job in $Jobs) {
$jobOutput = Receive-Job $job -Wait
if ($ForceStoppedIds.Contains($job.Id)) {
Write-Host "Custom message about stopped job: $($jobOutput)"
}
else {
Write-Host $jobOutput
}
}
One thing to be cautious of is how jobs output information (i.e. Write-Host, Write-Output, return, etc.). If you're not getting the results you expect, double-check the job's ScriptBlock to see how the information is being written/returned. I'm sure there are much more elegant ways of doing this, but hopefully this will help.
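To illustrate that caution, here is a small, hedged comparison (display behaviour can differ slightly between PowerShell versions): text written with Write-Output is what Receive-Job returns as objects you can store, while Write-Host text is sent to the host/information stream and is not captured by the assignment.
$j = Start-Job -ScriptBlock {
    Write-Output "captured by Receive-Job"
    Write-Host   "written to the host, not captured"
}
Wait-Job $j | Out-Null
$result = Receive-Job $j    # the Write-Host text is displayed here, but not stored
Remove-Job $j
"Captured output: $result"  # contains only the Write-Output line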

Using Powershell To Distribute Script Level Jobs On Remote Servers

More of a theory question...
I have a powershell script that exists on three servers. In this example the three servers are:
server1
server2
server3
I am using another machine, server4, to call script C:\ExampleScript.ps1 remotely using Invoke-Command while specifying the remote machine via the ComputerName parameter. The ultimate goal of the script is to detect whether powershell is running; if it is not, the computer is "not busy" and can run the script being called remotely. If the computer is "busy", move on to the next server, and continue through the three machines until all the parameter values have been exhausted. If all machines are busy, it would be ideal if there was a way to periodically check the processes and see if they are still open. In this way, execution of the script can be balanced across the various machines, in an albeit primitive fashion.
Consider the following code:
$servers = "server1","server2","server3"
$data = "param1", "param2", "param3", "param4", "param5", "param6"
#somehow loop through the different servers/data using the above arrays
$job = Invoke-Command $servers[0] {
$ProcessActive = Get-Process powershell -ErrorAction SilentlyContinue
if($ProcessActive -eq $null)
{
"Running"
Invoke-Command -ComputerName $env:computername -FilePath C:\ExampleScript.ps1 -ArgumentList $data[0]
}
else
{
"Busy go to next machine"
}
} -AsJob
Wait-Job $job
$r = Receive-Job $job
$r
The expected result is to load-balance the script across the machines based on whether there is an active powershell process: if a machine is busy, move on to the next machine and repeat the same test and, where a machine is free, the execution. The script should work through all the values specified in the $data array (or whatever).
I found this question interesting, so I wanted to give it a try.
$servers = "server1","server2","server3"
$data = New-Object System.Collections.ArrayList
$data.AddRange(@("param1", "param2", "param3", "param4", "param5", "param6"))
$jobs = New-Object System.Collections.ArrayList
do
{
Write-Host "Checking job states." -ForegroundColor Yellow
$toremove = @()
foreach ($job in $jobs)
{
if ($job.State -ne "Running")
{
$result = Receive-Job $job
if ($result -ne "ScriptRan")
{
Write-Host " Adding data back to que >> $($job.InData)" -ForegroundColor Green
$data.Add($job.InData) | Out-Null
}
$toremove += $job
}
}
Write-Host "Removing completed/failed jobs" -ForegroundColor Yellow
foreach ($job in $toremove)
{
Write-Host " Removing job >> $($job.Location)" -ForegroundColor Green
$jobs.Remove($job) | Out-Null
}
# Check if there is room to start another job
if ($jobs.Count -lt $servers.Count -and $data.Count -gt 0)
{
Write-Host "Checking servers if they can start a new job." -ForegroundColor Yellow
foreach ($server in $servers)
{
$job = $jobs | ? Location -eq $server
if ($job -eq $null)
{
Write-Host " Adding job for $server >> $($data[0])" -ForegroundColor Green
# No active job was found for the server, so add new job
$job = Invoke-Command $server -ScriptBlock {
param($data, $hostname)
$ProcessActive = Get-Process powershell -ErrorAction SilentlyContinue
if($ProcessActive -eq $null)
{
# This will block the thread on the server, so the JobState will not change till it's done or fails.
Invoke-Command -ComputerName $hostname -FilePath C:\ExampleScript.ps1 -ArgumentList $data
Write-Output "ScriptRan"
}
} -ArgumentList $data[0], $env:computername -AsJob
$job | Add-Member -MemberType NoteProperty -Name InData -Value $data[0]
$jobs.Add($job) | Out-Null
$data.Remove($data[0])
}
}
}
# Just a manual check of $jobs
Write-Output $jobs
# Wait a bit before checking again
Start-Sleep -Seconds 10
} while ($data.Count -gt 0)
Basically I create a job list and keep it constantly populated with one job for each server.
Data is removed from the list when a new job starts, and is added back if a job fails. This is to avoid servers running the script with the same data/params.
I lack a proper environment to test this properly at the moment, but will give it a whirl at work tomorrow and update my answer with any changes if needed.
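The requeue logic hinges on tagging each job object with the data it was started with. As a small, hedged illustration of that pattern (a local stand-in job rather than the remoting call above), Add-Member attaches the value as a NoteProperty so it can be read back if the job needs to be retried:
# Requeue list, as in the answer above
$data = New-Object System.Collections.ArrayList
# Hypothetical local job, just to show the tagging pattern
$job = Start-Job -ScriptBlock { param($p) "processing $p" } -ArgumentList "param1"
# Remember which piece of data this job is working on
$job | Add-Member -MemberType NoteProperty -Name InData -Value "param1"
Wait-Job $job | Out-Null
# In the real script, "ScriptRan" is only emitted when the remote script actually executed
if ((Receive-Job $job) -ne "ScriptRan") {
    # Put the data back so another server can pick it up later
    $data.Add($job.InData) | Out-Null
}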

Wait for process to end

I have a powershell script that will automatically print all .pdf files in a folder with Foxit. I have it set to wait until Foxit has exited before the script continues. Every once in a while, the script will pause and wait even though Foxit has already exited. Is there a way to make it time out after a certain amount of time?
Here is the code I have:
Start-Process $foxit -ArgumentList $argument -Wait
Move-Item $filePath $printed[$location]
Add-Content "$printLogDir\$logFileName" $logEntry
I've tried the recommendations here and they don't seem to work. For example if I do:
$proc = Start-Process $foxit -ArgumentList $argument
$proc.WaitForExit()
Move-Item $filePath $printed[$location]
Add-Content "$printLogDir\$logFileName" $logEntry
I get:
You cannot call a method on a null-valued expression.
Any help would be greatly appreciated.
I think I figured it out. If I start it with Invoke-WmiMethod, I can get the process ID, wait for it, and give the wait a timeout.
$proc = Invoke-WmiMethod -Class win32_process -Name create -ArgumentList "$foxit $argument"
Wait-Process -Id $proc.ProcessId -Timeout 120
Move-Item $filePath $printed[$location]
Add-Content "$printLogDir\$logFileName" $logEntry
This seems to work pretty consistently.
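One caveat worth noting: if the timeout elapses while the process is still running, Wait-Process writes an error rather than returning silently, so you may want to handle that case explicitly. A hedged sketch, reusing the variables from the snippet above:
$proc = Invoke-WmiMethod -Class win32_process -Name create -ArgumentList "$foxit $argument"
try {
    Wait-Process -Id $proc.ProcessId -Timeout 120 -ErrorAction Stop
}
catch {
    # The timeout elapsed and Foxit is still alive; stop it and carry on
    Write-Warning "Foxit did not exit within 120 seconds, stopping it."
    Stop-Process -Id $proc.ProcessId -Force -ErrorAction SilentlyContinue
}
Move-Item $filePath $printed[$location]
Add-Content "$printLogDir\$logFileName" $logEntry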
One way to implement a timeout is to use a background job:
$Job = start-job -ScriptBlock { write-output 'start';Start-Sleep -Seconds 15 }
$timeout = New-TimeSpan -Seconds 10
$timer = [diagnostics.stopwatch]::StartNew()
While ($timer.Elapsed -le $timeout)
{
if ($Job.State -eq 'Completed' )
{
Receive-Job $Job
Remove-Job $Job
Return
}
Start-Sleep -Seconds 1
}
write-warning 'Job timed out. Stopping job'
Stop-Job $Job
Receive-Job $Job
Remove-Job $Job
I ran into this same problem and found a slightly simpler solution that also gives you a handle to the child process. The -PassThru argument is key, as it returns a process object for each process that the cmdlet starts.
$proc = Start-Process $foxit -ArgumentList $argument -PassThru
Wait-Process -Id $proc.Id -Timeout 120
Move-Item $filePath $printed[$location]
Add-Content "$printLogDir\$logFileName" $logEntry
I prefer this style because it gives better control over what to do while the external process is running:
$waitTime = 60
$dism = Start-Process "$env:windir\system32\dism.exe" -ArgumentList "/Online /Cleanup-Image /ScanHealth" -PassThru -WindowStyle Hidden
while (!$dism.HasExited -and $waitTime -gt 0) {sleep -Seconds 1; $waitTime--}
"done."
By checking the HasExited property I can continue with any other tasks in my code in parallel.
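If you use this pattern, you may also want to handle the case where the loop ends because the wait budget ran out rather than because the process finished. A small, hedged sketch continuing from the snippet above:
if (-not $dism.HasExited) {
    # Time ran out; terminate the process before moving on
    Write-Warning "dism.exe did not finish within the allotted time, stopping it."
    Stop-Process -Id $dism.Id -Force -ErrorAction SilentlyContinue
}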

Equivalent for tail -f output\worker*.log in Powershell

I need to output several logfiles - while they are written - to the shell.
In the unix version of my script this is achieved by tail -f output\worker*.log. Note the wildcard.
In Powershell I tried Get-Content -Path "output\worker*.log" -Wait, but this only prints the first logfile it can find to the shell.
For completeness, here is the code where I call my program:
foreach ($worker in $workers)
{
Write-Host " Start $worker in background"
$block = {& $args[0] $args[1] $args[2] $args[3] $args[4] $args[5] $args[6] 2> $args[7] > $args[8]}
start-job -name $worker -scriptblock $block -argumentlist `
"$strPath\worker\bin\win32\php.exe", `
"-q", `
"-c", `
"$strPath\worker\conf\php_win32.ini", `
"$strPath\worker\bin\os-independant\logfilefilter\logfilefilter.php", `
"-f", `
"$strPath\worker\$worker\conf\logfilefilter-$worker.xml", `
"$strPath\output\$worker-error.log", `
"$strPath\output\$worker.log"
}
Get-Content -Path "output\worker*.log" -Wait
In my test case there are 8 workers and log files (output\worker01.log, output\worker02.log, output\worker03.log, output\worker04.log, output\worker05.log, output\worker06.log, output\worker07.log, output\worker08.log).
Is there a workaround to output all these logfiles? Or is it possible to duplicate the stdout stream from the background process to print it in the shell?
You can duplicate the output streams by reading them directly from the output buffers of the child jobs.
Here's a demo script. Note that for the Verbose output, I'm removing output from the buffer as it's read, so that it doesn't get re-displayed on subsequent reads. This doesn't appear to have any effect on the Receive-Job buffers for the job. If you do a Receive-Job on it after the script completes, you'll still get the Verbose output all over again.
$sb = {
$VerbosePreference = 'Continue'
for ($i = 1; $i -le 100; $i++ )
{
start-sleep -Milliseconds 150
Write-Verbose "$(get-date)"
write-progress -activity "Search in Progress" -status "$i% Complete:" -percentcomplete $i
}
}
$job = start-job -ScriptBlock $sb
$verbose = ($job.ChildJobs[0].Verbose)
While ($job.State -ne 'Completed')
{
$job.ChildJobs |
foreach {
Start-Sleep -seconds 1
$Pct_Complete = $_.Progress | select -last 1 | select -ExpandProperty PercentComplete
Write-Host "`rBackground job $($_.ID) is $Pct_Complete percent completed." -ForegroundColor Cyan
While ($verbose.count){
Write-Host $verbose[0] -ForegroundColor Gray
$verbose.removeat(0)}
}
}
write-host "`nDone!"
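As a more direct workaround for the wildcard itself, a hedged sketch (it assumes the worker log files already exist when the tails start, and you would add your own exit condition in place of the endless loop) is to start one Get-Content -Wait job per log file and relay whatever each job produces in a polling loop:
# One tail job per existing log file
$tailJobs = Get-ChildItem "output\worker*.log" | ForEach-Object {
    Start-Job -Name "tail-$($_.Name)" -ArgumentList $_.FullName -ScriptBlock {
        param($path)
        Get-Content -Path $path -Wait
    }
}
# Relay new lines from all tails to the shell (stop with Ctrl+C)
while ($true) {
    foreach ($tail in $tailJobs) {
        Receive-Job -Job $tail | ForEach-Object {
            Write-Host "[$($tail.Name)] $_"
        }
    }
    Start-Sleep -Seconds 1
}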