I have a bunch of PDF files that I would like to print in sequence on a Windows 7 computer using PowerShell.
get-childItem "*.pdf" | sort lastWriteTime | foreach-object {start-process $_.Name -verb 'print'}
The printed files are sometimes out of order, like 1) A.pdf, 2) C.pdf, 3) B.pdf, 4) D.pdf.
Different trials printed a different sequence of files, so I suspect the error is related to the print queue or the Start-Process command. My guess is that each print process is fired without waiting for the previous one to complete.
Is there a way to consistently print out PDF files in a sequence that I specify?
You are starting the processes in order, but by default Start-Process does not wait until the command completes before it starts the next one. Since the commands take different amounts of time to complete based on the PDF file size, the files print in whatever order they finish. Try adding the -Wait switch to your Start-Process call, which will force it to wait until the command completes before starting the next one.
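For example, applied to the pipeline from the question (a sketch; note the corrected $_.FullName, and that -Wait only helps if the application handling the Print verb actually exits when printing is done):
Get-ChildItem "*.pdf" | Sort-Object LastWriteTime | ForEach-Object {
    Start-Process -FilePath $_.FullName -Verb Print -Wait
}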
EDIT: Found an article elsewhere on Stack which addresses this. Maybe it will help. https://superuser.com/questions/1277881/batch-printing-pdfs
Additionally, there are a number of non-Adobe PDF solutions out there, and some of them are much better for automation than the standard Reader. Adobe has licensed .DLL files you can use, and the professional version of Acrobat has hooks into the back-end .DLLs as well.
If you must use Acrobat Reader DC (closed system or some such) then I would try opening the file to print and getting a pointer to the process, then waiting some length of time, and forcing the process closed. This will work well if your PDF sizes are known and you can estimate how long it takes to finish printing so you're not killing the process before it finishes. Something like this:
ForEach ($PDF in (gci "*.pdf"))
{
    # Open the PDF and keep a handle to the spawned process.
    $proc = Start-Process $PDF.FullName -PassThru
    # Give the application your estimated time to finish printing.
    Start-Sleep -Seconds $NumberOfSeconds
    $proc | Stop-Process
}
EDIT #2: One possible (but untested) optimization is that you might be able to use the ProcessorTime counters $proc.PrivilegedProcessorTime and $proc.UserProcessorTime to see when the process goes idle. Of course, this assumes that the program goes completely idle after printing. I would try something like this:
ForEach ($PDF in (gci "*.pdf"))
{
    $proc = Start-Process $PDF.FullName -PassThru
    # Reset the counters for each new process.
    $LastPrivTime = 0
    $LastUserTime = 0
    Do
    {
        Start-Sleep -Seconds 1
        # Process property values are cached after the first read; refresh to get current CPU times.
        $proc.Refresh()
        $PrivTimeElapsed = $proc.PrivilegedProcessorTime - $LastPrivTime
        $UserTimeElapsed = $proc.UserProcessorTime - $LastUserTime
        $LastPrivTime = $proc.PrivilegedProcessorTime
        $LastUserTime = $proc.UserProcessorTime
    }
    Until ($PrivTimeElapsed -eq 0 -and $UserTimeElapsed -eq 0)
    $proc | Stop-Process
}
If the program still ends too soon, you might need to increase the # of seconds to sleep inside the inner Do loop.
I tried the following PowerShell command, but then 1000 windows opened and the PowerShell ISE crashed. Is there a way to run the batch file 1000 times in the background? And is there a smarter way to get the average execution time?
This is the code I tried:
cd C:\scripts
Measure-Command {
    for ($i = 0; $i -lt 1000; $i++) {
        Start-Process -FilePath "C:\scripts\open.bat"
    }
}
Start-Process by default runs programs asynchronously, in a new console window.
Since you want to run your batch file synchronously, in the same console window, invoke it directly (which, since the path is double-quoted - though it doesn't strictly have to be in this case - requires &, the call operator for syntactic reasons):
Measure-Command {
    foreach ($i in 1..1000) {
        & "C:\scripts\open.bat"
    }
}
Note: Measure-Command discards the success output from the script block being run; if you do want to see it in the console, use the following variation, though note that it will slow down processing:
Measure-Command {
    & {
        foreach ($i in 1..1000) {
            & "C:\scripts\open.bat"
        }
    } | Out-Host
}
This answer explains in more detail why Start-Process is typically the wrong tool for invoking console-based programs and scripts.
Measure-Command is the right tool for performance measurement in PowerShell, but it's important to note that such measurements are far from an exact science, given PowerShell's dynamic nature, which involves many caches and on-demand compilation behind the scenes.
Averaging multiple runs generally makes sense, especially when calling external programs; by contrast, if PowerShell code is executed repeatedly and the repeat count exceeds 16, on-demand compilation occurs and speeds up subsequent executions, which can skew the result.
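For instance, a minimal sketch of averaging several separate runs (the run count of 5 is illustrative; the batch-file path is taken from the question):
# Measure 5 separate runs and average them.
$times = 1..5 | ForEach-Object {
    (Measure-Command { & "C:\scripts\open.bat" }).TotalMilliseconds
}
'Average: {0:N1} ms' -f ($times | Measure-Object -Average).Average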
Time-Command is a friendly wrapper around Measure-Command, available from this MIT-licensed Gist[1]; it can be used to simplify your tests.
# Download and define function `Time-Command` on demand (will prompt).
# To be safe, inspect the source code at the specified URL first.
if (-not (Get-Command -ea Ignore Time-Command)) {
    $gistUrl = 'https://gist.github.com/mklement0/9e1f13978620b09ab2d15da5535d1b27/raw/Time-Command.ps1'
    if ((Read-Host "`n====`n OK to download and define benchmark function ``Time-Command`` from Gist ${gistUrl}?`n=====`n(y/n)?").Trim() -notin 'y', 'yes') { Write-Warning 'Aborted.'; exit 2 }
    Invoke-RestMethod $gistUrl | Invoke-Expression
    if (-not ${function:Time-Command}) { exit 2 }
}
Write-Verbose -Verbose 'Running benchmark...'
# Omit -OutputToHost to run the commands quietly.
Time-Command -Count 1000 -OutputToHost { & "C:\scripts\open.bat" }
Note that while Time-Command is a convenient wrapper even for measuring a single command's performance, it also allows you to compare the performance of multiple commands, passed as separate script blocks ({ ... }).
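For example (a sketch; it assumes, per the Gist's documentation, that multiple script blocks can be passed positionally, and the cmd /c variant is purely illustrative):
# Compare two ways of invoking the same batch file, 100 times each.
Time-Command -Count 100 { & "C:\scripts\open.bat" }, { cmd /c "C:\scripts\open.bat" }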
[1] Assuming you have looked at the linked Gist's source code to ensure that it is safe (which I can personally assure you of, but you should always check), you can install it directly as follows:
irm https://gist.github.com/mklement0/9e1f13978620b09ab2d15da5535d1b27/raw/Time-Command.ps1 | iex
I'm just wondering if I can clear Out-Gridview on every loop like I can in the console:
while (1) { ps | select -first 5; sleep 1; clear-host }
Unfortunately this doesn't clear out-gridview every time:
& { while (1) { ps | select -first 5; sleep 1; clear-host } } | out-gridview
Clear-Host clears the display of the host, which is the console window's content in a regular PowerShell console.
By contrast, Out-GridView is a separate GUI window, over which PowerShell offers no programmatic control once it is being displayed.
Notably, you can neither clear nor refresh the window's content after it is displayed with the initial data.
The best approximation of this functionality is to close the old window and open a new one with the new data in every iteration - but note that this will be visually disruptive.
In the simplest case, move the Out-GridView call into the loop and invoke it with -Wait; however, this requires you to close the window manually in every iteration:
# NOTE: Doesn't move to the next iteration until you manually close the window.
while (1) { ps | select -first 5 | Out-GridView -Wait }
This answer shows how to implement an auto-closing Out-GridView window, but it is a nontrivial effort - and with a sleep period as short as 1 second it will be too visually disruptive.
Ultimately, what you're looking for is a GUI version of the Unix watch utility (or, more task-specifically, the top utility).
However, since you're not looking to interact with the Out-GridView window, there's little benefit to using Out-GridView in this case.
Instead, you could just spawn a new console window that uses Clear-Host to display the output in the same screen position periodically:
The following defines helper function Watch-Output to facilitate that:
# Simple helper function that opens a new console window and runs
# the given command periodically, clearing the screen beforehand every time.
function Watch-Output ([scriptblock] $ScriptBlock, [double] $Timeout = 1) {
    $watchCmd = @"
while (1) {
  Clear-Host
  & { $($ScriptBlock -replace '"', '\"') } | Out-Host
  Start-Sleep $Timeout
}
"@
    Start-Process powershell.exe "-command $watchCmd"
}
# Invoke with the command of interest and a timeout.
Watch-Output -ScriptBlock { ps | Select -First 5 } -Timeout 1
Note that this will still flicker every time the window content is refreshed.
Avoiding that would require substantially more effort.
The PowerShellCookbook module offers the sophisticated Watch-Command cmdlet, which not only avoids the flickering but also offers additional features.
The big caveat is that - as of version 1.3.6 - the module has several cmdlets that conflict with built-in ones (Format-Hex, Get-Clipboard, New-SelfSignedCertificate, Send-MailMessage, Set-Clipboard), and the only way to import the module is to allow the module's commands to override the built-in ones (Import-Module PowerShellCookbook -AllowClobber).
I am invoking an executable in a counted for loop over an array called $files, so on every iteration it does:
for ($i = 0; $i -lt $files.Count; $i++) {
    & $executable $files[$i].FullName -ErrorAction Continue | Out-Null
}
So $executable is the exe I'm running with every file in $files.
But during this call, I sometimes get a popup of a help-center-looking thing called WerFault, and I only need to kill it. This works:
Get-Process -Name WerFault -ErrorAction SilentlyContinue | Stop-Process
BUT I can't get my script to run that command during the call operation. It only runs it before or after the call, and the popup appears during it. If I run the code and the executable opens and the popup appears, I can open a new PowerShell window and kill it there, but I can't figure out how to do that from within my script.
I tried a while loop, like while (call operation) { get the process and kill it }, but that doesn't work. And I can't use Start-Process, because I need the Out-Null (each file has to open and close one at a time, since every opened file closes itself on its own).
Is there any way to check for this process and kill it, say, every second of this for loop?
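One hedged approach (a sketch, not tested against WerFault specifically): run a watchdog in a background job that polls for and kills WerFault every second, while the main loop keeps its synchronous & call:
# Watchdog job: polls every second and kills any WerFault instance it finds.
$watchdog = Start-Job -ScriptBlock {
    while ($true) {
        Get-Process -Name WerFault -ErrorAction SilentlyContinue | Stop-Process -Force
        Start-Sleep -Seconds 1
    }
}
try {
    # The original synchronous loop, unchanged.
    for ($i = 0; $i -lt $files.Count; $i++) {
        & $executable $files[$i].FullName | Out-Null
    }
}
finally {
    # Always tear the watchdog down, even if the loop throws.
    $watchdog | Stop-Job
    $watchdog | Remove-Job
}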
I am recovering files from a hard drive wherein some number of the files are unreadable. I'm unable to change the hardware level timeout / ERC, and it's extremely difficult to work around when I have several hundred thousand files, any tens of thousands of which might be unreadable.
The data issues were the result of a controller failure. Having bought a matching drive (all the way down), I've been able to access the drive and can copy huge swaths of it without issues. However, there are unreadable files dotted throughout the drive that, when accessed, cause the SATA bus to hang. I've used various resumable file-copy applications like robocopy, RichCopy, and a dozen others, but they all have the same issue: they have a RETRY count that is based on actually getting an error reported from the drive. The problem is that the drive takes an extremely long time to report the error, which means a single file may take up to an hour to fail officially.
I know how fast each file SHOULD copy, so I'd like to build a PowerShell cmdlet or similar that takes a source and destination file name and tries to copy the file. If, after 5 seconds, the file hasn't copied (or even if it has - this can be a dumb process), it should quit. I'll write a script that fires off each copy individually, waiting for the previous one to finish, but so far I've been unable to find a good way of putting a time limit on the process.
Any suggestions you might have would be greatly appreciated!
Edit: I would be happy with spawning a Copy-Item in a new thread, with a new PID, then counting down, then killing that PID. I'm just a novice at PowerShell and have seen so many conflicting methods for imposing timers that I'm lost on what the best-practice approach would be.
Edit 2: Please note that applications like robocopy will utterly hang when encountering the bad regions of the disk. These are not simple hangs, but bus hangs that Windows will try to preserve in order to not lose data. In these instances Task Manager is unable to kill the process, but Process Explorer IS. I'm not sure what the difference in methodology is, but regardless, it seems relevant.
I'd say the canonical way of doing things like this in PowerShell is background jobs.
$timeout = 300 # seconds
$job = Start-Job -ScriptBlock { Copy-Item ... }
Wait-Job -Job $job -Timeout $timeout
Stop-Job -Job $job
Receive-Job -Job $job
Remove-Job -Job $job
Replace Copy-Item inside the scriptblock with whatever command you want to run. Beware though, that all variables you want to use inside the scriptblock must be either defined inside the scriptblock, passed in via the -ArgumentList parameter, or prefixed with the using: scope qualifier.
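For example, a minimal sketch of passing paths into the job via -ArgumentList (the $src and $dst values are purely illustrative):
$src = 'D:\damaged\file.bin'   # illustrative source path
$dst = 'E:\rescued\file.bin'   # illustrative destination path
$job = Start-Job -ScriptBlock {
    param($Source, $Destination)
    Copy-Item -LiteralPath $Source -Destination $Destination
} -ArgumentList $src, $dst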
An alternative to Wait-Job would be a loop that waits until the job is completed or the timeout is reached:
$timeout = (Get-Date).AddMinutes(5)
do {
    Start-Sleep -Milliseconds 100
} while ($job.State -eq 'Running' -and (Get-Date) -lt $timeout)
I want to monitor a log file which is constantly being added to (every few seconds) over the course of a 2-hour period. I am currently using Get-Content file.txt -Wait, which displays the content to the screen and allows me to see what's being added to the file, but I need to take this a step further and actually watch for specific messages: if something I'm looking for appears in the log file, then do something. Initially I used a .NET file reader with a for loop, as shown below:
try {
    for (;;) {
        $line = $log_reader.ReadLine()
        if ($line -match "something")
        {
            Write-Host "we have a match"
            Break
        }
    }
}
finally {
    $log_reader.Close()  # release the file handle
}
The issue with this, however, is that it was causing the process that generates the log file to fall over - it throws an error because another process is using the log file it's creating. I thought this was odd, because I assumed the .NET stream reader would just be 'reading' the file. I don't have any control over the process that generates the log file, so I don't know exactly what it's doing (I'm guessing it has the file open in read/write mode with some kind of lock that gets upset when I try to read the file using a .NET stream reader). Doing a Get-Content on the file doesn't seem to cause this issue, however.
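(As an aside, a hedged guess at the cause: the plain StreamReader constructor opens the file without FileShare.ReadWrite, which can conflict with a writer that keeps the file open. Opening the underlying FileStream with an explicit share mode may avoid upsetting the writer - an assumption, not something verified against this particular logging process:)
# Open the log read-only while allowing the writer to keep reading/writing it.
$stream = New-Object System.IO.FileStream -ArgumentList @(
    'c:\log.txt',
    [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite
)
$log_reader = New-Object System.IO.StreamReader -ArgumentList $stream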
The question is, how can I use something like Get-Content (or another process) to monitor the log file but move onto another part of the script if a message I’m looking for in the log appears?
If you constantly want to monitor the log file and catch the desired pattern as soon as it appears:
while (!(Select-String -Path 'c:\log.txt' -Pattern 'something' -Quiet)) {}
## when the pattern is found, the loop exits and the script proceeds to the next part
Write-Host 'we have a match'
You may want to change the path of the file to yours.
Though this would work, there is a lot of processing your computer has to do, since the while loop runs constantly. If you can afford to introduce some delay - say, if it is OK to detect the message 30 seconds (or whatever your threshold is) after it appears - then you can consider introducing a sleep:
while (!(Select-String -Path 'c:\log.txt' -Pattern 'something' -Quiet)) { Start-Sleep -Seconds 30 }
## when the pattern is found, the loop exits and the script proceeds to the next part
Write-Host 'we have a match'
You can add a small piece of logic to terminate the script after two hours; otherwise it becomes an infinite loop if 'something' never gets written to the log file.
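For example, a minimal sketch of such a guard (the two-hour deadline mirrors the monitoring window from the question):
$deadline = (Get-Date).AddHours(2)   # stop watching after two hours
while (!(Select-String -Path 'c:\log.txt' -Pattern 'something' -Quiet)) {
    if ((Get-Date) -ge $deadline) { Write-Warning 'Timed out waiting for the pattern.'; exit 1 }
    Start-Sleep -Seconds 30
}
Write-Host 'we have a match'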
Edit 1:
If you also want to print the new lines at the console, you can try something like this:
$b = 0   # number of lines already printed
while (!(Select-String -Path 'c:\log.txt' -Pattern 'something' -Quiet))
{
    $a = Get-Content 'c:\log.txt'
    if ($a.Count -gt $b)
    {
        # print only the lines added since the last check
        $a[$b..($a.Count - 1)]
    }
    $b = $a.Count
    Start-Sleep -Seconds 30
}
## to print the line containing the pattern once the while loop exits
(Get-content 'c:\log.txt')[-1]