running a command repeatedly on a remote server - powershell

I wrote the following script to run a command on a remote server at a 5-second interval. The command in the $LogrCmd variable runs on a remote server to check whether a particular service is up or down. I expect the script to poll the service every 5 seconds until the service is completely down. However, the script exits immediately even if the service is up.
$LogrCmd = get-content 'c:\temp\info.cfg' | select-string -Pattern cheetahdev
while (-not (Invoke-Command -ScriptBlock {& cmd.exe /c "$LogrCmd"})) {
## Wait a specific interval
Start-Sleep -Seconds 5
}
Here's the contents of the info.cfg file which runs against the remote host.
"C:\PWX\pwxcmd displaystatus -sv cheetahdev"

You would do better to use a do / while loop here instead of a while loop:
$LogrCmd = Get-Content 'c:\temp\info.cfg' | Select-String -Pattern cheetahdev
do {
cmd.exe /c "$LogrCmd"
## Wait a specific interval
Start-Sleep -Seconds 5
} while ( $LASTEXITCODE -eq 0 )
This will always run the command once, sleep, then check to see if the command succeeded. If the command succeeded it will continue the loop. Of course, you can tailor the while condition to check for other exit codes and conditions as well.
Note that for external commands it's best to rely on $LASTEXITCODE most of the time to check for command success, unless you need to parse the output of the command or something else less common.
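If you do end up needing to parse the output here (for instance, if pwxcmd turns out to always exit with 0), a variant of the loop can match on the status text instead; the 'stopped' marker below is purely an assumption about pwxcmd's output:
do {
    # Capture the command's output instead of relying on the exit code.
    $status = cmd.exe /c "$LogrCmd"
    ## Wait a specific interval
    Start-Sleep -Seconds 5
} until ($status -match 'stopped')  # hypothetical 'service down' marker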
Also note that reading the full command from the file like that opens you up to code injection attacks by anyone who can manipulate your info.cfg.
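One way to reduce that risk is to validate the line before running it; a sketch, assuming the only part that should ever vary is the service name:
# Reject anything that doesn't match the one command shape we expect.
if ("$LogrCmd" -notmatch '^"C:\\PWX\\pwxcmd displaystatus -sv \w+"$') {
    throw "Unexpected command in info.cfg: $LogrCmd"
}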

Related

PowerShell: waiting for output redirect file to unlock after killing process

I have a PowerShell script that:
Starts a new process and redirects the output to two files
Waits with a timeout for the process to complete
Kills the process if it timed out
Reads the contents from the redirected output files
It looks something like this:
$_timeout = 30
$_stdoutfile = "./stdout"
$_stderrfile = "./stderr"
# Start the process
$_process = Start-Process powershell.exe -ArgumentList "-file ""$_cmdfile""" -PassThru -NoNewWindow -RedirectStandardError "$_stderrfile" -RedirectStandardOutput "$_stdoutfile"
# Wait for it to complete, or time out
$_timeout_error = $null
$_process | Wait-Process -Timeout $_timeout -ErrorAction SilentlyContinue -ErrorVariable _timeout_error
# Check if it timed out
if ($_timeout_error) {
# Kill it
$_process | Stop-Process -Force
# (1)
# Wait for the process to exit after the kill command
$_kill_timeout_error = $null
$_process | Wait-Process -Timeout 10 -ErrorAction SilentlyContinue -ErrorVariable _kill_timeout_error
# If the process is still running 10 seconds after the kill command, die with an error
if ($_kill_timeout_error) {
Write-Error "Failed to terminate process (waited for 10 seconds after initiating termination)."
Exit 1
}
}
# (2)
# Read the stdout and stderr content that was output
$_stdout = [System.IO.File]::ReadAllText("$_stdoutfile")
$_stderr = [System.IO.File]::ReadAllText("$_stderrfile")
# Delete the files after reading them
Remove-Item -Force "$_stdoutfile"
Remove-Item -Force "$_stderrfile"
The majority of this works properly; the process is killed as expected if it runs too long. The problem I'm having is with the ReadAllText functions. They work fine if the process exits on its own, but if they were killed due to a timeout, these functions fail with the error:
The process cannot access the file
'C:\myfilename' because it is being used by another process.
I figured that perhaps it takes the OS a couple seconds to unlock the files after the process is killed, so I inserted a sleep-poll loop at # (2), but often several minutes later, they're still locked.
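The sleep-poll loop was along these lines (a reconstruction for illustration, not my exact code):
# Sleep-poll until the file can be read, up to a limit.
$_stdout = $null
foreach ($i in 1..60) {
    try {
        $_stdout = [System.IO.File]::ReadAllText("$_stdoutfile")
        break
    } catch [System.IO.IOException] {
        Start-Sleep -Seconds 1  # still locked; wait and retry
    }
}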
@Santiago Squarzon suggested (in the comments) a way to read the file while it's still locked, which may work, but I also need to be able to delete them after reading them.
So my questions are:
Is there a way to get these files to naturally unlock more quickly after killing the process?
Is there a way to force-unlock these files with a separate PowerShell command/function?
Unrelated, but is the part in my code around the comment # (1) necessary (waiting for the process to stop after the kill command), or will Stop-Process block until the process is actually stopped?

How to execute separate Jmeter test plans one at a time with powershell?

We have received 20 JMeter test plans, each testing one endpoint, which we need to run. On some tests we need to pass parameters and on others we don't.
My idea was to create a powershell script that loops through the directories and runs a test, waits until finished and then runs the next test. When we develop a new endpoint we just create a new test plan and save it in the appropriate folder and the powershell script will include it next time we loop through tests.
I need the tests to finish before starting the next plan, so I'm looking at something like:
Write-Output "Running Test 1"
$proc = Start-Process -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc.WaitForExit()
Write-Output "Proc 1 Done"
Write-Output "Running Proc 2"
$proc2 = Start-Process -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc2.WaitForExit()
This just launches both tests simultaneously.
My question is then how to make Powershell wait for the previous test to finish.
Your immediate problem is that your Start-Process call is missing the -PassThru switch, which is required for the call to return a System.Diagnostics.Process instance representing the newly launched process.
# ...
# Note the use of -PassThru
$proc = Start-Process -PassThru -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc.WaitForExit()
# ...
Alternatively, if you don't need to examine the process exit code (which $proc.ExitCode in the above command would give you), you can simply use the -Wait switch, which makes Start-Process itself wait for the process to terminate:
# ...
# Note the use of -Wait
Start-Process -Wait -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
# ...
Taking a step back:
To synchronously execute console applications or batch files in the current console window, call them directly; do not use Start-Process (or the System.Diagnostics.Process API it is based on).
Aside from being syntactically easier and less verbose, this has two key advantages:
You can directly capture their output.
It allows you to examine the process exit code via the automatic $LASTEXITCODE variable afterwards.
Assuming that jmeter is a console application (the docs suggest it runs as one when invoked with arguments):
# ...
# Direct invocation in the current window.
# Stdout and stderr output will print to the console by default,
# but can be captured or redirected.
# Note: &, the call operator, isn't strictly needed here,
# but would be if your executable path were quoted
# or contained variable references.
& C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter -n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10
# Use $LASTEXITCODE to examine the process exit code.
# ...
See this answer for more information.
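To loop over a whole folder of test plans, as described in the question, a minimal sketch using direct invocation (the folder layout and the shared -Jduration argument are assumptions; the jmeter path mirrors the question):
$jmeter = 'C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter'
# Run every .jmx test plan found under the test folder, one at a time.
Get-ChildItem -Path 'C:\JmeterLoadTests\test' -Filter *.jmx -Recurse | ForEach-Object {
    Write-Output "Running $($_.Name)"
    # Direct invocation blocks until jmeter exits.
    & $jmeter -n -t $_.FullName -Jduration=10
    if ($LASTEXITCODE -ne 0) { Write-Warning "$($_.Name) exited with code $LASTEXITCODE" }
}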
It might be the case that you're suffering from Out-Default cmdlet execution; the easiest fix is separating the commands with semicolons, like:
cmd1; cmd2; cmd3; etc.
This way PowerShell will wait for the previous command to complete before starting the next one.
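As a stand-in for the original screenshot demo (a minimal sketch; the ping merely simulates a long-running first command):
# cmd2 starts only after cmd1 (a ~3-second ping) has exited.
cmd /c "echo cmd1 & ping -n 4 127.0.0.1 > nul"; Write-Output 'cmd2'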
It might be a better idea to consider switching to the Maven JMeter Plugin, which by default executes all tests it finds under the src/test/jmeter folder relative to the pom.xml file.

How to get background task status from parent window in PowerShell

I have the following PowerShell script to start two background tasks. I am able to fetch the status of a background task only if I use the -Wait parameter.
$TestResult1=start .\TestFile1.bat -NoNewWindow -PassThru -ErrorAction Stop
$TestResult2=start .\TestFile2.bat -NoNewWindow -PassThru -Wait -ErrorAction Stop
if($TestResult1.ExitCode -gt 0){
throw 'Exceptions in TestFile1.bat'
}
if($TestResult2.ExitCode -gt 0){
throw 'Exceptions in TestFile2.bat'
}
Is there any way to fetch the status of a background task without using the -Wait parameter? In the above example, I can fetch the status only from TestFile2.bat.
If you don't use -Wait, you can use Wait-Process with your $TestResult1 and $TestResult2 variables, which, thanks to -PassThru, contain System.Diagnostics.Process instances representing the processes launched:
# Waits synchronously for both processes to terminate.
$TestResult1, $TestResult2 | Wait-Process
# Now you can inspect the exit codes.
# NOTE: The .ExitCode property is only available after a process
# has *terminated*. Before that, it effectively returns `$null`
# (the underlying .NET exception that occurs is swallowed by PowerShell).
$TestResult1.ExitCode, $TestResult2.ExitCode
If you want to perform other operations while waiting for the processes to terminate, you can use the .HasExited property in a loop to periodically test whether the processes have terminated:
$leftToMonitor = $TestResult1, $TestResult2
do {
# Perform foreground operations...
Write-Host . -NoNewLine; Start-Sleep 1
# Check for processes that have already terminated.
$exited, $leftToMonitor = $leftToMonitor.Where({ $_.HasExited }, 'Split')
foreach ($ps in $exited) {
# Output the command line and the exit code as properties of a custom object.
[pscustomobject] @{
CommandLine = $ps.CommandLine
ExitCode = $ps.ExitCode
}
}
} while ($leftToMonitor)
Note that Wait-Process also has a -Timeout parameter, and you can use -TimeOut 0 to momentarily test if processes have exited, but note that for (each) process that hasn't exited, a non-terminating error is reported, which makes checking the .HasExited property more convenient (and doing so is also faster).
That said, for invisible background tasks I recommend using PowerShell jobs rather than Start-Process -NoNewWindow, either via Start-Job or, preferably, via the faster and lighter-weight Start-ThreadJob (which comes with PowerShell (Core) 7+ and is installable with Install-Module ThreadJob in Windows PowerShell), because they:
avoid the problem of a Start-Process -NoNewWindow-launched process printing output to the console that cannot be captured and that, without -Wait, arrives with unpredictable timing.
instead allow you to collect output in a controlled manner on demand via the Receive-Job cmdlet.
Waiting for jobs to finish, optionally with a timeout, is done via the Wait-Job cmdlet.
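For example (a minimal sketch):
# Blocks until all jobs in $jobs finish, or until 30 seconds elapse.
$jobs | Wait-Job -Timeout 30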
Note:
Start-Job creates a hidden PowerShell child process in which to run given commands, which is what makes it slow, whereas Start-ThreadJob uses a thread in the current process.
As of PowerShell 7.1, background jobs do not automatically capture the exit code of the most recent external program executed by them, unlike in foreground execution, where the automatic $LASTEXITCODE variable reflects this information. Therefore, unfortunately, $LASTEXITCODE must be reported as part of each job's output, which is cumbersome - see below.
GitHub proposal #5422 suggests adding a .LastExitCode property to job objects to address this limitation.
Examples:
Note:
Instead of calling a batch file, the examples below call a cmd.exe command directly, with /c, but the principle is the same.
As stated, the exit code of the cmd.exe call must be reported as part of the job's output, hence the extra ; $LASTEXITCODE statement after the call.
Simplistic example: Wait synchronously for all jobs to terminate, and report the output, which comprises all stdout and stderr output from cmd.exe followed by the process exit code reported via $LASTEXITCODE:
# Start two thread jobs that call cmd.exe, with different outputs
# and different exit code.
# Note: If you don't have Start-ThreadJob, you can use Start-Job
$jobs =
(Start-ThreadJob { cmd /c 'echo ONE'; $LASTEXITCODE }),
(Start-ThreadJob { cmd /c 'echo TWO & exit /b 1'; $LASTEXITCODE })
$jobs | Receive-Job -Wait -AutoRemoveJob
The above yields (note that the output order isn't guaranteed):
ONE
0
TWO
1
Example with continued foreground operation while waiting:
# Start two thread jobs that call cmd.exe, with different outputs
# and different exit code.
# Note: If you don't have Start-ThreadJob, you can use Start-Job
$jobs =
(Start-ThreadJob { cmd /c echo ONE; $LASTEXITCODE }),
(Start-ThreadJob { cmd /c 'echo TWO & exit /b 1'; $LASTEXITCODE })
do {
# Perform foreground operations...
Write-Host . -NoNewLine; Start-Sleep 1
# Note: You can also capture *ongoing* job output via repeated Receive-Job calls.
# Find all jobs that have finished.
$finished, $jobs = $jobs.Where({ $_.State -in 'Completed', 'Failed', 'Stopped' }, 'Split')
# Process all finished jobs.
foreach ($job in $finished) {
# Get the job's output and separate it into the actual output
# and the exit code, which is the *last* object.
$output = $job | Receive-Job
$i = 0
$lastExitCode, $actualOutput = $output.Where({ ++$i -eq $output.Count }, 'Split')
# Output a custom object that reflects the original command, the output, and the exit code.
[pscustomobject] @{
Command = $job.Command
Output = $($actualOutput) # unwrap a single-object output collection
ExitCode = $lastExitCode
}
# Remove the job
Remove-Job $job
}
} while ($jobs)
Note:
The above uses the fairly cumbersome $_.State -in 'Completed', 'Failed', 'Stopped' to momentarily test for finished jobs, without waiting.
Ideally, Wait-Job -Timeout 0 could more simply be used, but as of PowerShell 7.1 that doesn't work as expected (the minimum wait period is therefore -Timeout 1, i.e. 1 second) - see GitHub issue #14675.

Returning success and then restarting computer

I'm integrating Jenkins with a bunch of stuff and I'm using PowerShell to do this. I have a script on a remote machine that is executed after a build succeeds on Jenkins. This script does a bunch of stuff and then restarts the machine.
What I need to do is:
Return to Jenkins that the script was successful (meaning that it will end the job as SUCCESS)
Then restart the machine
So far I have not managed to send 'EXIT 0' to Jenkins and then restart the machine. Is there any way to do this?
Thanks in advance.
Code example:
Write-Host "Code example"
Exit 0 #for jenkins success
Restart-Computer -Force
This will start a separate command prompt that runs asynchronously from the PowerShell script and restarts the computer in 3 seconds, enough time for PowerShell to return the exit code to Jenkins.
Start-Process -FilePath "cmd.exe" -ArgumentList '/c "timeout /t 3 /nobreak && shutdown -r -f -t 0"' -WindowStyle Hidden
Exit 0
As noted in a comment by @Avshalom, your problem is that the Exit statement will unconditionally exit your script without ever executing the Restart-Computer command placed after it.
Restart-Computer, when executed locally, is invariably asynchronous, so your script will continue to execute, at least for a while.
You can therefore try to call Restart-Computer first, and exit 0 afterwards:
Write-Host "Code example"
Restart-Computer -Force
exit 0 # for Jenkins success
However, there's no guarantee that control will return to Jenkins in time and that Jenkins itself will have time to process the successful exit before it is shut down itself.
You can improve the likelihood of that with a delay, via a separate, asynchronously launched PowerShell instance[1], similar to the approach in Evilcat's answer:
Write-Host "Code example"
# Asynchronously start a separate, hidden PowerShell instance
# that sleeps for 5 seconds before initiating the shutdown.
Start-Process -WindowStyle Hidden powershell.exe -Args '-command',
'Start-Sleep 5; Restart-Computer -Force'
exit 0 # for Jenkins success
This still isn't a fully robust solution, however; a fully robust solution requires changing your approach:
Let your script indicate success only, without initiating a restart.
Make Jenkins test for success and, if so, call another script that unconditionally initiates a shutdown.
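For instance (a hypothetical sketch; the script names and the second Jenkins step are assumptions about your setup):
# build-step.ps1 - runs as the main Jenkins build step; signals success only.
Write-Host "Code example"
exit 0

# restart-step.ps1 - wired up as a follow-up step that Jenkins invokes only
# after it has recorded the job as SUCCESS.
Restart-Computer -Force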
[1] As Evilcat points out, using a background job with Start-Job does not work, because on exiting the calling PowerShell session with exit the background jobs are terminated too.

How to keep a powershell script running 24/7

I have a PowerShell script that needs to be restarted when it dies for whatever reason, be it a crash, a self exit or after a system reboot...
How can I, from a bat or another powershell script, see to it that if it is not running, it will be started again...
i.e. how can I find out if it is already running from another script?
I know I can make one powershell script start the active one and simply have it loop a new start as long as it doesn't exit with a specific error... but then THAT script needs to be seen to :D So we are back to the original question: how do I keep THAT script running 24/7?
do
{
$date = Get-Date -format "yyyy-MM-ddTHH:mm:ss"
"$($date) Repeat : Restarting Worker, LastExitCode $($LastExitCode)." | Out-File D:\IDM\Worker\Worker.LOG -width 180 -append
powershell -File "D:\IDM\Scripts\Worker.ps1"
sleep 10
}
while ($LastExitCode -ne $null)
I would just use scheduled tasks. There are plenty of options in there to help you do what you want. You can run the script every five minutes and have it do enough loops to take up that time and quit:
$now = Get-Date
while ($now.AddMinutes(5) -gt (Get-Date)){
...work...
}
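To wire that up from PowerShell, a sketch using the ScheduledTasks cmdlets (the task name is illustrative; the script path is taken from the question):
# Re-launch Worker.ps1 every 5 minutes, starting now.
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-File D:\IDM\Scripts\Worker.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5)
Register-ScheduledTask -TaskName 'RestartWorker' -Action $action -Trigger $trigger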
Or you could even have it write a flag file every time the loop runs and have any new process check that file to see whether there has been recent activity. If there's been no activity:
$workFlag = Get-Item C:\work.flg
$cutOff = (Get-Date).AddMinutes(-5)
if ($workFlag.LastWriteTime -lt $cutOff){
New-Item -force -path C:\work.flg
...work loop..
}