How can you keep a Powershell script running continuously with TaskScheduler? - powershell

I have a (dumbed down for here) PowerShell script that runs when a user logs in (actually through auto-login) and that I want to keep running at all times.
$i = 0
$max = 5
$f = "D:\VirtualBox-Powershell\out.txt"
do {
    $i += 1
    Add-Content -Path $f -Value $((Get-Date).ToString())
    Start-Sleep -Seconds 1
}
while ($i -lt $max)
exit $i
Yes, I’m aware this will stop after 5 seconds. My actual code is in a continuous loop. I’m using TaskScheduler to start my script after login. Theoretically that should be enough. But the script stops for some unknown reason. It’s either a bug in the script or something else is killing the process.
So I added a trigger to run the script every minute (and since the test script only runs for 5 seconds, that's plenty). When I right-click my task and select Run, the script sure enough modifies the file. The task shows Running, and after 5 seconds I refresh and it shows Ready.
But when the trigger fires (the Last Run Time shows it started at the correct time) and I refresh, it immediately shows Ready.
In both runs the exit code is 0x1.
I’ve tried starting the script by running the ps1 file directly, as well as using C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe and passing in the script as a parameter.
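For reference, the Task Scheduler action I mean looks something like this (the script path is illustrative):

```powershell
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "D:\VirtualBox-Powershell\script.ps1"
```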
I’ve had some form of this issue for years, and have never found a decent workaround. I’m hoping someone here does.
Windows 10, btw

Related

Run octave script from powershell and wait for it to finish

I'm trying to run a very simple Octave .m file from PowerShell, but I can't manage to make PowerShell wait for it to finish. In fact, it launches the script execution but then immediately starts executing the next line.
As I said the script is really simple, just a test
a=100
% Saving just to be sure that the script executed
save test.mat a
% Pausing to be sure that the execution is not too fast
pause(10)
disp("no way")
And in powershell I simply run
octave --persist test.m
but the prompt doesn't wait for Octave to finish execution. It seems it somehow runs it asynchronously in another process.
I've tried running the script from batch with the wait option
START /W octave --persist test.m
but the result is still the same.
Thanks in advance
EDIT 1
Thanks to @Trey Nuckolls I'm using this patch:
$Donefilename = 'done'
if (Test-Path $Donefilename) {
    Remove-Item $Donefilename
    Write-Host "Last execution $Donefilename has been deleted"
}
else {
    Write-Host "$Donefilename doesn't exist"
}
octave --persist test.m
$i = 0
do {
    $i++
    Start-Sleep -Seconds 2
    $OctaveComplete = Test-Path -Path $Donefilename
}
until ($OctaveComplete -or ($i -eq 30))
The Octave script writes an empty "done" file at the end of its execution. Not the best solution, though; I'm not able to redirect the execution output, for example.
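As a sketch of one way to also capture the output (not from the original post; the octave executable name and file names are assumptions), Start-Process can wait for the child process and redirect its stdout to a file:

```powershell
# Hypothetical: run octave synchronously and capture stdout to a file
$proc = Start-Process -FilePath 'octave-cli.exe' -ArgumentList 'test.m' `
    -NoNewWindow -Wait -PassThru `
    -RedirectStandardOutput 'octave-out.txt'
Write-Host "octave exited with code $($proc.ExitCode)"
```

-Wait makes the call synchronous, and -PassThru returns the process object so the exit code can be inspected afterwards.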
EDIT 2
So, I managed to find the problem thanks to all your responses and comments. It seems that when I was calling octave from Windows it wasn't calling the executable but something else. Getting the right path and executing:
& "C:/Program Files/GNU Octave/Octave-6.4.0/mingw64/bin/octave-cli.exe" test.m
works perfectly (you just need to add exit at the end of the script). So, it was a matter of path.
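To confirm what the bare octave command actually resolves to (a quick diagnostic, not from the original post):

```powershell
# Shows whether 'octave' resolves to an .exe, a .bat wrapper, an alias, etc.
Get-Command octave | Select-Object CommandType, Source
```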
You might consider putting a wait loop into your invoking script, like...
$i = 0
do {
    $i++
    Start-Sleep -Seconds 10 #Assuming that 30 seconds is way too long
    $OctaveComplete = <**Boolean-returning function that becomes 'True' when the run is complete**>
}
until ($OctaveComplete -or ($i -eq 4))
This would go directly after the line that invokes Octave.

Powershell remote installation of exe files but it never finishes

Ok, so I have a problem where I am trying to remotely deploy and run exe files. I have a Ninite installer of a simple 7-Zip, which I want to install, and for now I have gotten so close that it even executes the file, but it never finishes the installation. I can see it in the Task Manager, but it never makes any progress, nor does it use any CPU, only a small bit of memory.
The code looks like this:
$session = New-PSSession -ComputerName MyPc
Copy-Item -Path C:\windows\temp\installer.exe -Destination 'c:\users\administrator\desktop\installer' -ToSession $session
Start-Sleep -Seconds 2
Invoke-Command -ComputerName MyPc -ScriptBlock { Start-Process 'c:\users\administrator\desktop\installer\installer.exe' -ArgumentList '-silent' -Wait -PassThru | Select-Object ExitCode }
$session | Remove-PSSession
This runs, but it never stops running. The script just runs without ever finishing. If I go and look in the Task Manager, I can clearly see both(?) installation processes sitting there, but they don't make any progress and never finish. I have now tried to let it sit for about 15 minutes (the installation takes about 2-3 minutes at most) and still, no progress.
Any ideas as to what could cause this? Or at least how to fix it and let it finish. This is the last thing I need to get done in my project, and I am so tired of this already.
NOTE: I have found out that if I have ANY kind of parameters (-Wait, -PassThru, etc.) then it will become stuck forever with 0 CPU usage; even if I just have -ArgumentList attached with 0 arguments it will do the same. But if I have nothing but the command in the scriptblock, then it will go through, say "done master", and then be the biggest liar, because the software has not been installed; it just closed the installer instead.

running a command repeatedly on a remote server

I wrote the following script to run a command on a remote server at a 5-second interval. The command inside the $LogrCmd variable runs on a remote server to check whether a particular service is up or down. I expect the script to poll the service every 5 seconds until the service is completely down. However, the script exits immediately even if the service is up.
$LogrCmd = get-content 'c:\temp\info.cfg' | select-string -Pattern cheetahdev
while (-not (Invoke-Command -ScriptBlock { & cmd.exe /c "$LogrCmd" })) {
    ## Wait a specific interval
    Start-Sleep -Seconds 5
}
Here's the contents of the info.cfg file which runs against the remote host.
"C:\PWX\pwxcmd displaystatus -sv cheetahdev"
You would do better to use a do / while loop here instead of a while loop:
$LogrCmd = Get-Content 'c:\temp\info.cfg' | Select-String -Pattern cheetahdev
do {
    cmd.exe /c "$LogrCmd"
    ## Wait a specific interval
    Start-Sleep -Seconds 5
} while ( $LASTEXITCODE -eq 0 )
This will always run the command once, sleep, then check to see if the command succeeded. If the command succeeded it will continue the loop. Of course, you can tailor the while condition to check for other exit codes and conditions as well.
Note that for external commands it's best to rely on $LASTEXITCODE most of the time to check for command success, unless you need to parse the output of the command or something else less common.
Also note that reading the full command from a file like that opens you up to code-injection attacks by anyone familiar with how to manipulate your info.cfg.
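A minimal illustration of the $LASTEXITCODE behavior the answer relies on (the exit codes here are made up):

```powershell
cmd.exe /c "exit 3"   # external command fails with code 3
$LASTEXITCODE         # 3
cmd.exe /c "exit 0"   # external command succeeds
$LASTEXITCODE         # 0
```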

PowerShell Self-Updating Script

We have a PowerShell script to continually monitor a folder for new JSON files and upload them to Azure. We have this script saved on a shared folder so that multiple people can run this script simultaneously for redundancy. Each person's computer has a scheduled task to run it at login so that the script is always running.
I wanted to update the script, but then I would have had to ask each person to stop their running script and restart it. This is especially troublesome since we eventually want to run this script in "hidden" mode so that no one accidentally closes out the window.
So I wondered if I could create a script that updates itself automatically. I came up with the code below, and when this script is run and a new version of the script is saved, I expected the running PowerShell window to close when it hit the Exit command and then reopen a new window to run the new version of the script. However, that didn't happen.
It continues along without a blip. It doesn't close the current window and it even keeps the output from old versions of the script on the screen. It's as if PowerShell doesn't really Exit, it just figures out what's happening and keeps going on with the new version of the script. I'm wondering why this is happening? I like it, I just don't understand it.
#Place at top of script
$lastWriteTimeOfThisScriptWhenItFirstStarted = [datetime](Get-ItemProperty -Path $PSCommandPath -Name LastWriteTime).LastWriteTime
#Continuous loop to keep this script running
While($true) {
    Start-Sleep 3 #seconds
    #Run this script, change the text below, and save this script
    #and the PowerShell window stays open and starts running the new version without a hitch
    "Hi"
    $lastWriteTimeOfThisScriptNow = [datetime](Get-ItemProperty -Path $PSCommandPath -Name LastWriteTime).LastWriteTime
    if($lastWriteTimeOfThisScriptWhenItFirstStarted -ne $lastWriteTimeOfThisScriptNow) {
        . $PSCommandPath
        Exit
    }
}
Interesting Side Note
I decided to see what would happen if my computer lost its connection to the shared folder the script was running from. It continues to run, but presents an error message every 3 seconds, as expected. But it will often revert to an older version of the script when the network connection is restored.
So if I change "Hi" to "Hello" in the script and save it, "Hello" starts appearing as expected. If I unplug my network cable for a while, I soon get error messages as expected. But then when I plug the cable back in, the script will often start outputting "Hi" again even though the newly saved version has "Hello" in it. I guess this is a negative side-effect of the fact that the script never truly exits when it hits the Exit command.
. $PSCommandPath is a blocking (synchronous) call, which means that Exit on the next line isn't executed until the dot-sourced script has itself exited.
Given that $PSCommandPath here is your script, which never exits (even though it seemingly does), the Exit statement is never reached (assuming that the new version of the script keeps the same fundamental while-loop logic).
While this approach works in principle, there are caveats:
You're using ., the "dot-sourcing" operator, which means the script's new content is loaded into the current scope (and generally you always remain in the same process, as you always do when you invoke a *.ps1 file, whether with . or (the implied) regular call operator, &).
While variables / functions / aliases from the new script then replace the old ones in the current scope, old definitions that you've since removed from the new version of the script would linger and potentially cause unwanted side-effects.
As you observe yourself, your self-updating mechanism will break if the new script contains a syntax error that causes it to exit, because the Exit statement then is reached, and nothing is left running.
That said, you could use that as a mechanism to detect failure to invoke the new version:
Use try { . $PSCommandPath } catch { Write-Error $_ } instead of just . $PSCommandPath
and instead of the Exit command, issue a warning (or do whatever is appropriate to alert someone of the failure) and then keep looping (continue), which means the old script stays in effect until a valid new one is found.
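Putting both suggestions together, the reload branch of the question's loop might look like this (a sketch using the question's own variable names):

```powershell
if ($lastWriteTimeOfThisScriptWhenItFirstStarted -ne $lastWriteTimeOfThisScriptNow) {
    try {
        . $PSCommandPath   # blocks here while the new version runs
    }
    catch {
        # New version failed to load (e.g. syntax error): alert and keep the old one
        Write-Warning "Failed to load new script version: $_"
        continue
    }
    Exit   # only reached if the dot-sourced new version itself returns
}
```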
Even with the above, the fundamental constraint of this approach is that you may exceed the maximum call-recursion depth. The nested . invocations pile up, and when the nesting limit is reached, you won't be able to perform another, and you're stuck in a loop of futile retries.
That said, as of Windows PowerShell v5.1 this limit appears to be around 4900 nested calls, so if you never expect the script to be updated that frequently while a given user session is active (a reboot / logoff would start over), this may not be a concern.
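If the recursion depth worries you, it can be monitored at runtime (a sketch; the warning threshold is arbitrary):

```powershell
# Each self-reload adds a call-stack frame; warn well before the engine's hard limit
$depth = (Get-PSCallStack).Count
if ($depth -gt 1000) {
    Write-Warning "Dot-source nesting is $depth levels deep; consider a full restart."
}
```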
Alternative approach:
A more robust approach would be to create a separate watchdog script whose sole purpose is to monitor for new versions, kill the old running script and start the new one, with an alert mechanism for when starting the new script fails.
Another option is to have the main script have "stages" where it runs commands based on the name of the highest-revision script in a folder. I think mklement0's watchdog is a genius idea though.
But what I'm referring to is doing what you do, but using variables as your command, and having those variables get updated with the highest-numbered script name. This way you just drop 10.ps1 into the folder and it will ignore 9.ps1. And the function in that script would be named mainfunction10, etc.
Something like
$command = ((Get-ChildItem c:\path\to\scriptfolder\).BaseName)[-1]
& "C:\path\to\scriptfolder\$command.ps1"
The files would have to be named alphabetically from oldest to newest. Otherwise you'll have to sort-object by date.
$command = ((Get-ChildItem c:\path\to\scriptfolder\ | Sort-Object -Property LastWriteTime).BaseName)[-1]
& "C:\path\to\scriptfolder\$command.ps1"
Or dot-source it instead of using it as a command, and then have the later code call the functions, like function$command, where the function is named after the script.
I still like the watchdog idea more.
The watchdog would look sort of like
While ($true) {
    $new = ((Get-ChildItem c:\path\to\scriptfolder\ | Sort-Object -Property LastWriteTime).FullName)[-1]
    If ($old -ne $new) {
        Kill $old
        Sleep 10
        & $new
    }
    $old = $new
    Sleep 600
}
Mind you, I'm not certain how the scripts are run, and you may need to seek instances of PowerShell based on the command used to start them.
$kill = ((WMIC path win32_process get Caption,Processid,Commandline).where({$_.commandline -contains $command})).processid
Kill $kill
Would replace kill $old
This command is an educated guess and untested.
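A Get-CimInstance variant of the same idea (also untested here; WMIC is deprecated on newer Windows builds):

```powershell
# Find processes whose command line mentions the script, then stop them
$kill = Get-CimInstance Win32_Process |
    Where-Object { $_.CommandLine -like "*$command*" } |
    Select-Object -ExpandProperty ProcessId
$kill | ForEach-Object { Stop-Process -Id $_ -Force }
```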
Other tricks would be running the main script from the watchdog as a job, getting the job ID, and then checking for file changes. If a new file comes in, the watchdog could kill the job ID and repeat the whole process.
You could also just have the script end, and have a Windows scheduled job rerun the script every 10 minutes. That way whatever script is in the folder just runs every ten minutes. This is more intense per startup, though.
Instead of exit you could use break to kill the loop, and the script will exit naturally.
You can use Test-Connection to check for the server. But if it's every 3 seconds, that's a lot of pings from a lot of computers.

stop looped powershell script and kill remote processes initiated by it

I have a script that kicks off a remote job via Invoke-Command -AsJob on 20 servers to run a legacy command-line application. It might take a few hours to run, so I used a foreach loop until I have 0 instances of (Get-Job -State Running). Inside the loop I put Write-Progress to inform the user about how many servers are still running this process.
All works well but I need to add a small feature now to allow users running this script to stop the script and kill all running remote jobs.
Now I know how to kill them, but I have no idea how to allow user to provide feedback back to my loop. Originally I was thinking about popping up a GUI button using async runspace (like described here http://www.vistax64.com/powershell/16998-howto-create-windows-form-without-stopping-script-processing.html) but it looks like I cannot execute functions on the thread with my script from the UI thread.
Is there a way to allow my users to stop the script and kill remote processes?
Here is an example of breaking out of a loop.
Write-Host "Press 'q' to quit"
while(1) {
    if([console]::KeyAvailable) {
        $key = [console]::ReadKey($true)
        if($key.Key -eq 'Q') { break }
    }
}
Just place the if statement into your loop.
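Dropped into the question's polling loop, it might look like this (a sketch; the jobs are assumed to have been started earlier with Invoke-Command -AsJob):

```powershell
Write-Host "Press 'q' to stop polling and kill the remote jobs"
while ((Get-Job -State Running).Count -gt 0) {
    if ([console]::KeyAvailable) {
        $key = [console]::ReadKey($true)
        if ($key.Key -eq 'Q') {
            Get-Job -State Running | Stop-Job   # stops the remote jobs
            break
        }
    }
    Start-Sleep -Milliseconds 250
}
```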