How to capture external command progress in PowerShell? - powershell

I'm using a PowerShell script to synchronize files between network directories; Robocopy does the actual copying in the background.
To capture the output and give statistics to the user, I'm currently doing something like:
$out = (robocopy $src $dst $options)
Once that is done, a custom Windows form is presented with a multi-line text box containing the output string.
However, doing it this way halts script execution until the file copy is done. Since all the other input screens are presented to the user as graphical dialogues, I would like to show the copy progress graphically as well.
Is there a way to capture the stdout from robocopy on the fly?
The next question would then be:
How do I pipe that output into a form with a text box?

You can run the robocopy job in the background and keep checking on its progress.
Start-Job will start the process in the background.
Receive-Job will give you whatever output the job has produced since the last time you received it.
$job = Start-Job -ScriptBlock { robocopy $using:src $using:dst $using:options }
$out = ""
while (($job | Get-Job).HasMoreData -or ($job | Get-Job).State -eq 'Running') {
    $chunk = Receive-Job $job
    if ($chunk) {
        $out += ($chunk | Out-String)   # keep the full transcript for the summary form
        Write-Output $chunk             # show only the new lines on each pass
    }
    Start-Sleep 1
}
Remove-Job $job
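For the second part of the question (streaming that output into a form), here is a minimal sketch. It assumes $src, $dst and $options are already defined; the form layout, the control variables and the one-second polling interval are illustrative choices, not anything prescribed by robocopy or Start-Job. A WinForms timer polls the job so the UI stays responsive while ShowDialog runs its message loop.
Add-Type -AssemblyName System.Windows.Forms

$form = New-Object System.Windows.Forms.Form
$form.Text = 'Robocopy progress'

$textBox = New-Object System.Windows.Forms.TextBox
$textBox.Multiline  = $true
$textBox.ScrollBars = 'Vertical'
$textBox.Dock       = 'Fill'
$form.Controls.Add($textBox)

$job = Start-Job -ScriptBlock { robocopy $using:src $using:dst $using:options }

$timer = New-Object System.Windows.Forms.Timer
$timer.Interval = 1000
$timer.Add_Tick({
    # Append any new robocopy output to the text box.
    $chunk = Receive-Job $job
    if ($chunk) {
        $textBox.AppendText(($chunk -join [Environment]::NewLine) + [Environment]::NewLine)
    }
    # Stop polling once the job has finished and been fully drained.
    if ($job.State -ne 'Running' -and -not $job.HasMoreData) {
        $timer.Stop()
        Remove-Job $job
    }
})
$timer.Start()

[void]$form.ShowDialog()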

Related

Print PDFs to specific printers based on filename

I would just like to preface this by saying I am brand new to PowerShell and have been trying to learn by picking things up here and there. I'm currently trying to automate a process within my company using strictly PowerShell and Adobe Reader.
Our company currently prints individual sets of records and a separate cover page manually, binds them, and sends them off. An idea to automate this process was to fill a folder with a zipped set of PDFs for the day. This zip file would then be extracted and its contents moved to another folder, with the normal set of records named "WO-xxxxxx Set" and the cover page named "WO-xxxxxx Cover". All I would need is a simple script that prints these out in order, so that "WO-000001 Cover" sits on top of "WO-000001 Set", and then prints the next set in the same order.
The complication I've run into is that Start-Process -FilePath $File.FullName -Verb Print only lets me target the default printer. Our covers need to be printed on thicker paper, so I thought the best course of action would be to create two printers on the network with the required settings. If the script could swap between the two printers based on file name, that would solve my issue.
This script sends the documents to the printer in order but doesn't actually swap the default printer. I'm sure it's something I've done wrong in my If/Else block and would appreciate an expert's eye on this.
Function UnZipEverything($src, $dest)
{
    [System.Reflection.Assembly]::LoadWithPartialName("System.IO.Compression.FileSystem") | Out-Null
    $zps = Get-ChildItem $src -Filter *.zip
    foreach ($zp in $zps)
    {
        $all = $src + $zp
        [System.IO.Compression.ZipFile]::ExtractToDirectory($all, $dest)
    }
}
UnZipEverything -src "C:\Users\admin\Desktop\Zip Test\" -dest 'C:\Users\admin\Desktop\UnZip Test\'
Remove-Item "C:\Users\admin\Desktop\Zip Test\*.zip"
$files = Get-ChildItem "C:\Users\admin\Desktop\UnZip Test\*.*" -Recurse
ForEach ($file in $files){
    If ($files -eq '*Cover*') {
        (New-Object -ComObject WScript.Network).SetDefaultPrinter('Test')
        Start-Process -FilePath $File.FullName -Verb Print -PassThru | %{ sleep 10; $_ } | kill
        (New-Object -ComObject WScript.Network).SetDefaultPrinter('\\RFC-Print01\Collections Tray 6')
    }
    Else {
        Start-Process -FilePath $File.FullName -Verb Print -PassThru | %{ sleep 10; $_ } | kill
    }
}
Any help would be greatly appreciated.
If you use the verb PrintTo instead of Print, you can specify the printer:
Start-Process -FilePath $File.FullName -Verb PrintTo -ArgumentList '"\\RFC-Print01\Collections Tray 6"' -PassThru
This would allow you to remove the SetDefaultPrinter calls from the script.
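For illustration only, the print loop from the question could then be reshaped along these lines (untested sketch; it reuses the printer names from the question and switches the file-name test from -eq to a -like wildcard match):
$files = Get-ChildItem 'C:\Users\admin\Desktop\UnZip Test\*.*' -Recurse
ForEach ($file in $files) {
    # Covers go to the heavy-paper queue, everything else to the normal tray.
    $printer = If ($file.Name -like '*Cover*') { 'Test' } Else { '\\RFC-Print01\Collections Tray 6' }
    Start-Process -FilePath $file.FullName -Verb PrintTo -ArgumentList "`"$printer`"" -PassThru |
        ForEach-Object { Start-Sleep 10; $_ } | Stop-Process
}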

Exit PowerShell script in controlled fashion using keyboard shortcut

I have a PowerShell script that performs numerous file management tasks, and occasionally I need to terminate the script before it has finished processing. In order to terminate it cleanly and not leave files dotted around, I have the script re-read a variable from a config file on every iteration of the foreach over the Get-ChildItem results, as below:
Get-ChildItem $fileDir -Filter *.doc | Foreach-Object {
    Get-Content $confFile | Foreach-Object {
        $var = $_.Split('=')
        New-Variable -Name $var[0] -Value $var[1] -Force
    }
    if ($process -eq "TRUE") {
        <process File operations>
    }
}
This allows me to change the value of process in the config file to anything other than TRUE, and the script will then skip the processing (although it still loops through until complete).
Is there any way to use a keyboard shortcut to exit the script in a controlled fashion, i.e. after an iteration of the foreach loop has completed? For example, pressing Ctrl+Q would exit cleanly, as opposed to Ctrl+C.
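One possible approach, not from the original post and assuming the script runs in an interactive console host, is to poll the console for a pending key press at the top of each iteration:
foreach ($doc in Get-ChildItem $fileDir -Filter *.doc) {
    # [Console]::KeyAvailable requires a real console (it will not work in the ISE
    # or when input is redirected).
    if ([Console]::KeyAvailable) {
        $key = [Console]::ReadKey($true)
        if (($key.Modifiers -band [ConsoleModifiers]::Control) -and $key.Key -eq [ConsoleKey]::Q) {
            Write-Host 'Ctrl+Q detected - exiting after the current iteration.'
            break
        }
    }
    # <process file operations>
}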

powershell invoke-command returns upon completion

I would like to copy in parallel using robocopy and have each machine's log data returned when it is done.
If I use Invoke-Command server1,server2,server3 { robocopy }, it spits out the logs from server1, server2 and server3 interleaved while the copies are running, which gets very confusing and hard to read. Is there a way to have the log or progress returned only once copying is done?
for example:
server3 done.. display all copied files
server1 done.. display all copied files
server2 done.. display all copied files
instead of
server3 display copied files
server1 display copied files
server2 display copied files
server2 display copied files
server1 display copied files
server2 display copied files
I know I could use runspaces to multithread and check the handles for completion before displaying the data (which is what I'm currently doing), but I would like to know whether the same is possible with Invoke-Command, since it is easier :).
Thanks
To expand on my comment, you could create your desired action as a scriptblock, use the -AsJob parameter to make background jobs for the copy actions, then wait for the jobs to finish and output the results when they're done.
$SB = {
    & robocopy \\FileServer\Share\KittenPictures C:\webroot\MySite\Images *.*
}
$Servers = 'ServerA','ServerB','ServerC'
$Jobs = Invoke-Command -ScriptBlock $SB -ComputerName $Servers -AsJob

# Invoke-Command -AsJob returns one parent job with a child job per computer.
While ($Jobs.ChildJobs.State -contains 'Running') {
    # Wait until at least one of the still-running copies finishes...
    $Jobs.ChildJobs | Where-Object { $_.State -eq 'Running' } | Wait-Job -Any | Out-Null
    # ...then output every completed child job that still has unread data.
    $Jobs.ChildJobs | Where-Object { $_.State -eq 'Completed' -and $_.HasMoreData } | ForEach-Object {
        $_ | Receive-Job
    }
}
$Jobs | Receive-Job    # pick up anything produced by the last copies to finish
Remove-Job $Jobs
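If you also want the "serverX done" heading from the question, each child job's Location property holds the computer name, so the receive step above could (hypothetically) be extended like this:
$Jobs.ChildJobs | Where-Object { $_.State -eq 'Completed' -and $_.HasMoreData } | ForEach-Object {
    Write-Host "$($_.Location) done.."
    $_ | Receive-Job
}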

Count number of scripts running and wait for them to finish

I'm looking for the best way to count the number of PowerShell scripts that are currently running.
I run .ps1 scripts from Windows batch files. The script I am working on now is launched when a particular email is received from a client, but I want this script to first check that no other scripts are currently running; if any are, it must wait for them to finish before it continues.
I'm sure there are a few ways to go about this, but what would be the safest? I am still learning.
If it is possible to move away from batch files to launch PowerShell, then I would suggest using Start-Process to launch your scripts. This allows you to wait for your processes to exit, using Where-Object and Measure-Object to filter and count the scripts that have not yet completed.
So your script might look something like this:
# create a loop
$procs = @()
foreach ($item in $reasontoloop) {
    $arguments = "define script names and arguments"
    # Start the powershell script and keep the process object
    $procs += Start-Process powershell -PassThru -ArgumentList $arguments
}
Write-Host "Waiting for processes to complete"
while ($procs | Where-Object { $_.HasExited -eq $false }) {
    # Display progress
    $running = $procs | Where-Object { $_.HasExited -eq $false } | Measure-Object
    Write-Host "$($running.Count) of $($procs.Count) still running"
    Start-Sleep 1
}
Write-Host "Processes complete"
If you are simply interested in the number of PowerShell instances currently executing, then the following one-liner using Get-Process will help:
@(Get-Process | Where-Object { $_.ProcessName -like 'powershell' }).Count
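Putting those two ideas together, a minimal sketch of the "wait before continuing" check might look like this; it assumes the other scripts run in their own powershell.exe processes and uses $PID to ignore the process running the check itself:
# Block until no other powershell.exe instance is running.
while (@(Get-Process powershell -ErrorAction SilentlyContinue |
         Where-Object { $_.Id -ne $PID }).Count -gt 0) {
    Start-Sleep -Seconds 5
}
# ...continue with the rest of the script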

Powershell pipe into exe and wait

I am piping an array of data into an executable program, but I need it to block after every call in the foreach loop. As it stands, it leaves the loop before it has even opened the program from the first call.
Set-Alias program "whatever.exe"
foreach ($data in $all_data)
{
    $data | %{ program /command:update /path:"$_" /closeonend:2 }
}
I like PowerShell, but I never really learned Invoke-Command. So whenever I need to run an EXE I always use cmd. If you type cmd /? you get its help; look at the /c switch. I'd do something like this:
foreach ($data in $all_data){
    $data |
        Foreach-Object{
            cmd /c "whatever.exe" /command:update /path:"$_" /closeonend:2
        }
}
If you don't like the cmd /c thing you could use Jobs.
foreach ($data in $all_data){
    $data |
        Foreach-Object{
            $job = Start-Job -InitializationScript {Set-Alias program "whatever.exe"} -ScriptBlock {program /command:update /path:"$($args[0])" /closeonend:2} -ArgumentList $_
            while($job.State -eq 'Running'){
                Start-Sleep -Seconds 3
                # Could make it more robust and add some error checking.
            }
        }
}
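As a side note, Wait-Job can do that polling for you; inside the same Foreach-Object block, the while loop could be replaced with something like this sketch:
$job | Wait-Job | Receive-Job   # blocks until the job finishes, then returns its output
Remove-Job $job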
I can think of two ways to tackle this:
pipe your executable call to Out-Null
shell out the call to cmd.exe /c (as shown in #BobLobLaw's answer)
I made your sample code a little more specific so I could run and test my solutions; hopefully it'll translate. Here's what I started with, equivalent to your sample code in that the script executes without waiting for the executable to finish.
# I picked a specific program
Set-Alias program "notepad.exe"
# And put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# This opens each file in notepad; three instances of notepad are running
# when the script finishes executing.
$all_data | %{ program "$_" }
Here's the same code as above, but piping to Out-Null forces the script to wait on each iteration of the loop.
# I picked a specific program
Set-Alias program "notepad.exe"
# And put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# Piping the executable call to out-null forces the script execution to wait
# for the program to complete. So in this example, the first document opens
# in notepad, but the second won't open until the first one is closed, and so on.
$all_data | %{ program "$_" | Out-Null}
And, lastly, the same code (more or less) using cmd /c to call the executable and make the script wait.
# Still using notepad, but I couldn't work out the correct call for
# cmd.exe using Set-Alias. We can do something similar by putting
# the program name in a plain old variable, though.
#Set-Alias program "notepad.exe"
$program = "notepad.exe"
# Put some values in $all_data, specifically the paths to three text files.
$all_data = Get-Item B:\matt\Documents\*.txt
# This forces script execution to wait until the call to $program
# completes. Again, the first document opens in notepad, but the second
# won't open until the first one is closed, and so on.
$all_data | %{ cmd /c $program "$_" }
Depending on your scenario, Wait-Job might be overkill. If you have a programmatic way to know that whatever.exe has done its thing, you could try something like:
do { Start-Sleep -Seconds 2 } until ($done -eq $true)
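For instance, if "done" simply means the process has exited, one hypothetical way to set that flag is to poll for the process by name:
# Poll every 2 seconds until no whatever.exe process is left running.
do {
    Start-Sleep -Seconds 2
    $done = -not (Get-Process -Name whatever -ErrorAction SilentlyContinue)
} until ($done -eq $true)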