I am making a call operation in a counted FOR loop over an array called $files, so on every iteration it does:
for ($i = 0; $i -lt $files.Count; $i++) {
    & $executable $files[$i].FullName -ErrorAction Continue | Out-Null
}
So $executable is the exe I'm running with every file in $files.
But during this call, I sometimes get a popup of a help-center-looking thing called WerFault, and I just need to kill it. This works:
Get-Process -Name WerFault -ErrorAction SilentlyContinue | Stop-Process
BUT I can't get it to run this during the call operation. It only runs before or after the call, and the popup appears during it. If I run the code and the executable opens and this thing pops up, I can open a new PowerShell window and kill it from there, but I can't figure out how to do that in my script.
I tried doing a WHILE loop, like while (call operation) { get the process and kill it }, but that doesn't work. And I can't use Start-Process, because I need the Out-Null (each file has to open and close one at a time, since every file opened closes itself on its own).
Is there a way to check for this process and kill it every second or so while the for loop is running?
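Something like this untested sketch is what I have in mind, using a background job as a watcher so it keeps polling even while the main loop is blocked on the call (the one-second interval is a guess):

$watcher = Start-Job -ScriptBlock {
    while ($true) {
        # Kill any WerFault window as soon as it appears
        Get-Process -Name WerFault -ErrorAction SilentlyContinue | Stop-Process -Force
        Start-Sleep -Seconds 1
    }
}

for ($i = 0; $i -lt $files.Count; $i++) {
    & $executable $files[$i].FullName | Out-Null
}

# Clean up the watcher once the loop is done
Stop-Job $watcher
Remove-Job $watcher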
Related
I have a bunch of PDF files that I would like to print in sequence on a windows 7 computer using Powershell.
get-childItem "*.pdf" | sort lastWriteTime | foreach-object {start-process $_.Name -verb 'print'}
The printed files are sometimes out of order, like 1) A.pdf, 2) C.pdf, 3) B.pdf, 4) D.pdf.
Different trials printed a different sequence of files, so I fear the error is related to the print queue or the Start-Process command. My guess is that each printing process is fired off without waiting for the previous one to complete.
Is there a way to consistently print out PDF files in a sequence that I specify?
You are starting the processes in order, but by default Start-Process does not wait until the command completes before it starts the next one. Since the commands take different amounts of time to complete depending on the .PDF file size, the files print in whatever order they finish in. Try adding the -Wait switch to your Start-Process call, which will force it to wait until the command completes before starting the next one.
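A minimal sketch of the corrected pipeline (untested; note that some PDF viewers keep running after printing, which is what the rest of this answer works around):

Get-ChildItem "*.pdf" | Sort-Object LastWriteTime | ForEach-Object {
    # -Wait blocks until the print process exits before the next file starts
    Start-Process $_.FullName -Verb Print -Wait
}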
EDIT: Found an article elsewhere on Stack which addresses this. Maybe it will help. https://superuser.com/questions/1277881/batch-printing-pdfs
Additionally, there are a number of PDF solutions out there which are not Adobe, and some of them are much better for automation than the standard Reader. Adobe has licensed .DLL files you can use, and the professional version of Acrobat has hooks into the back end .DLLs as well.
If you must use Acrobat Reader DC (closed system or some such) then I would try opening the file to print and getting a pointer to the process, then waiting some length of time, and forcing the process closed. This will work well if your PDF sizes are known and you can estimate how long it takes to finish printing so you're not killing the process before it finishes. Something like this:
ForEach ($PDF in (gci "*.pdf"))
{
    $proc = Start-Process $PDF.FullName -PassThru
    Start-Sleep -Seconds $NumberOfSeconds
    $proc | Stop-Process
}
EDIT #2: One possible (but untested) optimization is that you might be able to use the processor time counters $proc.PrivilegedProcessorTime and $proc.UserProcessorTime to see when the process goes idle. Of course, this assumes that the program goes completely idle after printing. I would try something like this:
ForEach ($PDF in (gci "*.pdf"))
{
    $proc = Start-Process $PDF.FullName -PassThru
    # Reset the counters for each new process
    $LastPrivTime = 0
    $LastUserTime = 0
    Do
    {
        Start-Sleep -Seconds 1
        # If no CPU time has been used since the last check, assume the process is idle
        $PrivTimeElapsed = $proc.PrivilegedProcessorTime - $LastPrivTime
        $UserTimeElapsed = $proc.UserProcessorTime - $LastUserTime
        $LastPrivTime = $proc.PrivilegedProcessorTime
        $LastUserTime = $proc.UserProcessorTime
    }
    Until ($PrivTimeElapsed -eq 0 -and $UserTimeElapsed -eq 0)
    $proc | Stop-Process
}
If the program still ends too soon, you might need to increase the number of seconds to sleep inside the inner Do loop.
I need to play all videos in a playlist so I came up with this code.
Foreach ($line in get-content playlist)
{ $line | invoke-item }
Which should go through the file and play each line using the default player.
The problem is that this creates a race condition: the processes start and end so fast that one only ever sees the last file.
How do I overcome this? I tried /path/to/player $line | out-null, but the player stops and needs user intervention in order to proceed.
I would not call this a race condition; the problem is that Invoke-Item does not wait for the process to finish. Try using Start-Process -Wait instead.
Update: since it is hard to make Windows Media Player exit after the movie finishes, you could try to invoke vlc instead:
Foreach ($line in get-content pl.txt)
{ Start-Process -wait "C:\...\VideoLAN\VLC\vlc.exe" -ArgumentList "$line vlc://quit" }
Maybe you should approach the problem differently and use the command-line parameters for Media Player Classic. I don't have it installed here, but this should give you the general idea:
Make sure all your files are in the same folder.
This will play all the files in that folder:
mpc-hc64.exe "c:\mpcfiles" /play
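From PowerShell, that would look something like this (the install path below is a guess; adjust it to wherever MPC-HC lives on your machine):

# Hand the whole folder to MPC-HC and let it play the files in sequence
& "C:\Program Files\MPC-HC\mpc-hc64.exe" "c:\mpcfiles" /play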
I am calling an external .ps1 file which contains a break statement in certain error conditions. I would like to somehow catch this scenario, allow any externally printed messages to show as normal, and continue on with subsequent statements in my script. If the external script has a throw, this works fine using try/catch. Even with trap in my file, I cannot stop my script from terminating.
For answering this question, assume that the source code of the external .ps1 file (authored by someone else and pulled in at run time) cannot be changed.
Is what I want possible, or was the author of the script just not thinking about playing nice when called externally?
Edit: providing the following example.
In badscript.ps1:
if((Get-Date).DayOfWeek -ne "Yesterday"){
    Write-Warning "Sorry, you can only run this script yesterday."
    break
}
In myscript.ps1:
.\badscript.ps1
Write-Host "It is today."
The result I would like to achieve is to see the warning from badscript.ps1 and then continue on with the further statements in myscript.ps1. I understand why the break statement causes "It is today." to never be printed, but I wanted to find a way around it, as I am not the author of badscript.ps1.
Edit: Updating title from "powershell try/catch does not catch a break statement" to "how to prevent external script from terminating your script with break statement". The mention of try/catch was really more about one failed solution to the actual question which the new title better reflects.
Running a separate PowerShell process from within my script to invoke the external file has ended up being a solution good enough for my needs:
powershell -File .\badscript.ps1 will execute the contents of badscript.ps1 up until the break statement, including any Write-Host or Write-Warning output, and then let my own script continue afterwards.
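So myscript.ps1 becomes (a minimal sketch; -NoProfile is optional and just speeds up the child shell):

# The break in badscript.ps1 only terminates the child PowerShell process
powershell -NoProfile -File .\badscript.ps1
Write-Host "It is today."

The warning still prints, and "It is today." now appears afterwards.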
I get where you're coming from. Probably the easiest way would be to push the script off as a job, and wait for the results. You can even echo the results out with Receive-Job after it's done if you want.
So considering the bad script you have above, and this script file calling it:
$path = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
$start = Start-Job -ScriptBlock { . "$using:Path\badScript.ps1" } -Name "BadScript"
$wait = Wait-Job -Name "BadScript" -Timeout 100
Receive-Job -Name "BadScript"
Get-Command -Name "Get-ChildItem"
This will execute the bad script in a job, wait for the results, echo the results, and then continue executing the script it's in.
This could be wrapped in a function for any scripts you might need to call (just to be on the safe side).
Here's the output:
WARNING: Sorry, you can only run this script yesterday.
CommandType Name Version Source
----------- ---- ------- ------
Cmdlet Get-ChildItem 3.1.0.0 Microsoft.PowerShell.Management
In the about_Break documentation it says
PowerShell does not limit how far labels can resume execution. The
label can even pass control across script and function call
boundaries.
This got me thinking, "How can I trick this stupid language design choice?". And the answer is to create a little switch block that will trap the break on the way out:
.\NaughtyBreak.ps1:

Write-Host "NaughtyBreak about to break"
break

.\OuterScript.ps1:

switch ('dummy') { default { .\NaughtyBreak.ps1 } }
Write-Host "After switch() {NaughtyBreak}"
.\NaughtyBreak.ps1
Write-Host "After plain NaughtyBreak"
Then when we call OuterScript.ps1 we get
NaughtyBreak about to break
After switch() {NaughtyBreak}
NaughtyBreak about to break
Notice that OuterScript.ps1 correctly resumed after the call to NaughtyBreak.ps1 embedded in the switch, but was unceremoniously killed when calling NaughtyBreak.ps1 directly.
Putting break back inside a loop (including switch), where it belongs:
foreach($i in 1) { ./badscript.ps1 }
'done'
Or
switch(1) { 1 { ./badscript.ps1 } }
'done'
So I'm performing some automated testing using PowerShell in Jenkins. I'm testing a web application where I must fill out forms, retrieve values, etc.
It all works fine, but the web app contains some pop-up messages that appear every now and then, which cause the main script to freeze until they are manually closed in the application. Below is a link to a Stack Overflow thread with a similar problem.
Powershell Website Automation: Javascript Popup Box Freeze
I followed the advice of the first answer. I created a separate PowerShell script that runs constantly and can tell when a pop-up window is present (the popups have their own process IDs, so if there is more than one iexplore process ID present, one must be a popup) and then uses SendKeys to close it.
Main Script example:
#start application
start C:\Users\Webapp
#start the monitor script
Start-Process Powershell.exe -Argumentlist "-file C:\Users\Monitor.ps1"
#Get app as object
$app = New-Object -ComObject Shell.Application
$ClientSelectPage = $app.Windows() | where {$_.LocationURL -like "http:webapp.aspx"}
#Input value to cause popup message
$MemNumberInput = $ClientSelectPage.Document.getElementByID("MemNum")
$MemNumberInput.Select()
$MemNumberInput.value = "22"
$FindBtn.click()
It is at this point that my script will freeze (as a pop-up window appears to tell me info about the client I've entered). If this popup can be seen as a process, the monitor code will close it.
Example of the monitor script:
$i = 0
while($i -eq 0)
{
    #Check what processes are currently running under the webapp's name
    $Mainprocid = Get-Process | where {$_.mainWindowTitle -like "*webapp*"} | select -expand id
    $Mainprocid.Count
    $integer = [int]$Mainprocid
    #If there is only one process, no action
    if($Mainprocid.count -eq 1)
    {
        echo "no popup"
    }
    else
    {
        if($integer -eq '0')
        {
            #If there are no processes, close the script
            $i = 1
            echo "close process"
        }
        else
        {
            #If there are two processes, one must be a popup; send 'enter' to the app
            echo "POP UP!"
            $title = Get-Process | where {$_.mainWindowTitle -like "*webapp*"}
            #Code to sendkeys 'ENTER' to the application to close the popup follows here
        }
    }
}
However, for whatever reason, some pop-ups cannot be found as processes, and the monitor script is useless against them. These are few and far between, so I figured the best way was for the monitor script to check whether the main script has been frozen for a certain amount of time. If so, it can use the SendKeys method it uses for the other popups.
Is there a way for me to check and see if the main script has frozen, from the monitor script? I understand I could pass a parameter from the main script every now and then, to let the monitor script know it is still active, but this seems like a messy way of doing it, and an alternative method would be preferable.
Both scripts are saved as .ps1 files.
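One untested idea for the hang check: Get-Process returns System.Diagnostics.Process objects, which expose a Responding property that is $false when a window's message loop has stopped responding, so the monitor script could poll for that:

$main = Get-Process | where {$_.mainWindowTitle -like "*webapp*"}
if ($main | where { -not $_.Responding })
{
    echo "main window appears hung"
    #Sendkeys 'ENTER' logic could go here, as with the other popups
}

This only detects a hung window rather than a script that is merely waiting, though, so I'm not sure it covers every case.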
I want to monitor a log file which is constantly being added to (every few seconds) over the course of a 2-hour period. I am currently using Get-Content file.txt -Wait, which displays the content to the screen and lets me see what's being added to the file, but I need to take this a step further and actually watch for specific messages: if something I'm looking for appears in the log file, then do something. Initially I used a .NET file reader with a for loop, as shown below:
try {
    for(;;) {
        $line = $log_reader.ReadLine()
        if ($line -match "something")
        {
            Write-Host "we have a match"
            break
        }
    }
}
catch {
    # handle read errors here
}
The issue with this, however, is that it was causing the process that generates the log file to fall over: it throws an error because another process is using the log file it's creating. I thought this was odd, because I assumed the .NET stream reader would just be 'reading' the file. I don't have any control over the process which generates the log file, so I don't know exactly what it's doing (I'm guessing it has the file open in read/write mode with some kind of lock which gets upset when I try to read the file using a .NET stream reader). Doing a Get-Content on the file doesn't seem to cause this issue, however.
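Following that guess a bit further: a StreamReader built straight from a path opens the file with a share mode that denies writers, whereas a stream opened explicitly with FileShare.ReadWrite should coexist with the writing process. An untested sketch, reusing the $log_reader variable from above:

# Open the log without denying the writer its write access
$fs = [System.IO.File]::Open('file.txt', [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read, [System.IO.FileShare]::ReadWrite)
$log_reader = New-Object System.IO.StreamReader($fs)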
The question is, how can I use something like Get-Content (or another process) to monitor the log file but move onto another part of the script if a message I’m looking for in the log appears?
If you constantly want to monitor the log file and catch the desired pattern as soon as it appears:
while(!(Select-String -path 'c:\log.txt' -pattern 'something' -quiet)){};
## when finds the pattern, comes out of the loop and proceed to next part of the script
Write-host 'we have a match'
You may want to change the path of the file to yours.
Though this would work, it will make your computer do a lot of processing, since the while loop runs constantly. If you can afford to introduce some delay, for example if it is OK to find the pattern up to 30 seconds (or whatever your threshold is) after it appears, then you can consider introducing sleep:
while(!(Select-String -path 'c:\log.txt' -pattern 'something' -quiet)){start-sleep -seconds 30};
## when finds the pattern, comes out of the loop and proceed to next part of the script
Write-host 'we have a match'
You can write a small piece of logic to terminate the script after two hours; otherwise this becomes an infinite loop if 'something' never gets written to the log file.
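For example, something like this (a sketch; the two-hour window comes from the question):

$deadline = (Get-Date).AddHours(2)
while(!(Select-String -path 'c:\log.txt' -pattern 'something' -quiet))
{
    # Give up once the two-hour monitoring window has passed
    if((Get-Date) -gt $deadline) { break }
    start-sleep -seconds 30
}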
Edit 1:
If you also want to print the new lines at the console, you can try something like this:
$b = 0
while(!(Select-String -path 'c:\log.txt' -pattern 'something' -quiet))
{
    $a = get-content 'c:\log.txt'
    # Print any lines added since the last check
    if(($a.count) -gt $b)
    {
        $a[$b..($a.count - 1)]
    }
    $b = ($a.count)
    start-sleep -Seconds 30
}
## To print the line containing the pattern when the while loop exits
(Get-content 'c:\log.txt')[-1]