Task Scheduler can't kill a process properly using PowerShell script

I want to kill a process using Task Scheduler and a PowerShell script.
When a specific process starts, Task Scheduler triggers the PS script. The script gets the process Id and tries to stop it.
My issue is, the script can't kill the process until the process finishes its job. However, I want to kill the process as soon as the script triggers, without waiting for anything. As a note, the process I want to kill also runs with admin privileges and runs in windowed mode (not in the background).
Scheduled Task settings: running as SYSTEM with highest privileges. I also used the -ExecutionPolicy Bypass parameter as below:
powershell -ExecutionPolicy Bypass -File C:\Scripts\KillProcess.ps1
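For reference, a minimal sketch of how such a task could be registered from PowerShell (the task name and paths are assumptions, and the event-based trigger that fires when the target process starts still has to be configured in the Task Scheduler UI or via task XML, since New-ScheduledTaskTrigger does not expose event triggers):
# Assumed task name and script path; register the kill script to run as SYSTEM with highest privileges
$action    = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -File C:\Scripts\KillProcess.ps1'
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName 'KillProcess' -Action $action -Principal $principal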
In the script, I basically have the following code:
$process = Get-Process -Id $pid   # $pid is the current PowerShell process; raise its own priority
$process.PriorityClass = 'High'

$events = Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = <eventId>; StartTime = [datetime]::Now.AddMinutes(-5) } |
    Where-Object -Property Message -Match '<string to search for>' |
    Where-Object -Property Message -Match '<string to search for>' -ErrorAction Stop

if (!$events[0].Message) {
    Exit
}
else {
    $processes = @()
    # For each event, get the process Id and kill it.
    # This is because the process can spawn multiple processes.
    foreach ($event in $events) {
        # Parse the process Id (logged as a hex value).
        $processId = [int][regex]::Match($event.Message, 'Process\sID\:\s+(0x.+)\s').Captures.Groups[1].Value
        $processes += $processId
    }
    $processes = $processes | Select-Object -Unique
    foreach ($proc in $processes) {
        Stop-Process -Id $proc -Force -ErrorAction SilentlyContinue
    }
}
When I run PowerShell ISE as Admin and run the script there manually, it kills the process immediately. However, when Task Scheduler triggers the script, it waits for the process to finish its job. Am I doing something wrong with the Task Scheduler?

I don't know what the issue with Stop-Process was, but I changed it to the process.Kill() method by first getting the process object with Get-Process -Id $proc. Now it works without issue.
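For illustration, a minimal sketch of that change (the $processes variable matches the script above; the existence check is an assumption):
foreach ($proc in $processes) {
    # Fetch the process object and call .Kill() on it directly
    $p = Get-Process -Id $proc -ErrorAction SilentlyContinue
    if ($p) {
        $p.Kill()
    }
}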

Related

Get Ctrl-C to work better on invoke-expression for powershell?

I'm doing an invoke-expression of an old console command written in C++ and I don't have control over the C++ source code... But, I think it's trapping the Ctrl-C to prevent the command line from interrupting it... Thus, when I invoke-expression from my powershell script there's no way to break the execution using ctrl-C and it locks my terminal and I need to keep killing and restarting my terminal window... which is super annoying...
Is there a way to make sure that powershell gets the ctrl-C instead of the C++ program when starting the C++ program using invoke-expression?
Like maybe, I can refuse to give stdin to the C++ program, and let powershell have it instead? or maybe there's some solution where stdin goes to a different powershell thread that waits for a ctrl-c and then kills the invoke-expression thread..
Example:
PS> $cmd = <path_to_misbehaving_cpp_program_that_doesnt_Allow_ctrl_c_To_break_it>
PS> invoke-expression $cmd
# now here if I ctrl-C I can't break out of the invoke-expression....
# But I need this capability to ctrl-C
# to break the script and terminate the
# invoke expression of the C++ program.
Starting and stopping (killing) a process from within powershell:
Use the Start-Process cmdlet to start the other program. If you use the -PassThru switch, you get back information about which process was started.
$Proc = Start-Process powershell.exe -ArgumentList '-command "sleep 60" ' -PassThru
This process can easily be killed with Stop-Process even if it is supposed to run another 60 seconds:
$Proc | Stop-Process
Edit:
Now with exit code (Thx @mklement0 for the Wait)
$Proc.WaitForExit()
$Proc.ExitCode
The trick to get it to respond to Ctrl-C was to pipe $null into the command run via Start-Process powershell... see below. Also, note that at the end the finally block's "kill -force" kills the remaining child processes left over from the initial powershell.exe Start-Process: my program creates child processes that keep running after Ctrl-C kills powershell.exe, so I need to kill again to take down the entire process tree.
$script:Proc = $null

# Run Command as a separate process
function x1_process {
    write-host "`nx1_process: $args"
    $posh = (get-command powershell.exe).Source
    $iproc = Start-Process $posh -ArgumentList "`$null `| $args" -PassThru -NoNewWindow
    $script:Proc = $iproc
    if ($iproc -eq $null) {
        throw "null proc"
    }
    Wait-Process -InputObject $iproc
    $code = $iproc.ExitCode
    if ($code -ne 0) {
        throw "return-code:$code"
    }
    write-host "ok`n"
    return
}

try {
    x1_process mycommand.exe arg1 arg2 ...
}
finally {
    if ($script:Proc) {
        if (-not $script:Proc.HasExited) {
            write-host -ForegroundColor Red "=> Killing X1_PROCESS <="
            Stop-Process -InputObject $script:Proc -Force -ErrorAction SilentlyContinue
        }
    }
}

How to continue powershell script after child generated powershell scripts have finished?

I'm currently struggling with an issue and would appreciated any help or input.
I have a powershell script that, during its execution, spawns other powershell scripts that run various commands. All of these generated powershell scripts exit automatically after their commands have been executed.
What I'm trying to do is run another command within the initial powershell script after all of these spawned powershell instances have exited. Basically, wait for all generated powershell instances to exit and then run a command.
Any ideas on how to implement that?
Thanks
Edit: The code that spawns the powershell scripts looks like this:
foreach ($var in $filters) {
    start-process powershell.exe -Verb Runas -argument "-nologo -noprofile -command .\$var"
}
You can use Wait-Process. Use the -PassThru switch to return the started processes. This should work:
$processes = foreach ($var in $filters) {
    Start-Process powershell.exe -PassThru -Verb RunAs -Arg "-nologo -noprofile -command .\$var"
}
# wait for processes to finish
$processes | Wait-Process
# now, run your other commands
# ...
You could also use background jobs like this (using the $PWD automatic variable):
# run all scripts as background jobs
$jobs = foreach ($var in $filters) {
    Start-Job -FilePath "$PWD\$var"
}
# wait for all jobs to finish
$jobs | Wait-Job
# now, run your other commands
# ...
or in a one-liner:
$filters | foreach { Start-Job -FilePath "$PWD\$_" } | Wait-Job
You can try this using powershell jobs, as @jeorenmostert said:
$j = $filters | % {
    start-job -filepath "$($_)"
}
$j | wait-job
Here we first iterate over $filters using %, the alias of ForEach-Object,
then start a job for each item ($_ is the pipeline variable),
collect them in the $j variable,
and pipe $j to Wait-Job to wait until the jobs have finished (you can pipe to Receive-Job too to get pass-through-like results).
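For example, a small sketch of collecting each job's output once they are done (it reuses the $filters and $j names from the snippet above):
$j = $filters | ForEach-Object { Start-Job -FilePath "$($_)" }
# Wait for every job, then collect whatever each script wrote to the pipeline
$results = $j | Wait-Job | Receive-Job
$results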

Stop the Running process of taskmgr from PowerShell

I wrote this command for PowerShell. I want to stop the running taskmgr process from PowerShell.
Get-Service | Where-Object {$_.Status -eq "Running"}
And how do I stop those running processes?
Task Manager is a process, not a service.
So, in order to stop it from PowerShell you need to run:
Stop-Process -Name Taskmgr -Force
PS: You will need to run PowerShell as administrator in order to stop the process.
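As an aside, a small sketch that only attempts the kill when Task Manager is actually running (the existence check is an addition for illustration, not part of the answer above):
# Check whether Task Manager is running before trying to stop it
$tm = Get-Process -Name Taskmgr -ErrorAction SilentlyContinue
if ($tm) {
    $tm | Stop-Process -Force
}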

Powershell build step, fire and forget?

I am running the following powershell command in a build step using TFS 2018.
Start-Job -ScriptBlock {
    Invoke-Command -FilePath \\MyServer\run.ps1 -ComputerName MyServer -ArgumentList arg1, arg2
}
Since I don't want the script to affect the build step, it should simply fire and forget the script. Hence I am using Start-Job. But it seems that once the step is done, the process is killed. Is there a way to maintain the process lifetime even after the build step is done or the build process is finished?
Additional information... the powershell script should run on the remote server. The script itself triggers an .exe with parameters.
To simply fire and forget, invoke the script with Invoke-Command -AsJob:
Invoke-Command -AsJob -FilePath \\MyServer\run.ps1 -ComputerName MyServer -Args arg1, arg2
Start-Sleep 1 # !! Seemingly, this is necessary, as @doorman has discovered.
This should kick off the script remotely, asynchronously, with a job getting created in the local session to monitor its execution.
Caveat: The use of Start-Sleep - possibly with a longer wait time - is seemingly necessary in order for the remote process to be created before the calling script exits, but such a solution may not be fully robust, as there is no guaranteed timing.
Since you're not planning to monitor the remote execution, the local session terminating - and along with it the monitoring job - shouldn't matter.
When do you want the script to stop running? You could use a do-while loop and come up with a <condition> that meets your needs.
Start-Job -ScriptBlock {
    do {
        Invoke-Command -FilePath \\MyServer\run.ps1 -ComputerName MyServer -ArgumentList arg1, arg2
        Start-Sleep 2
    } while (<condition>)
}
Alternatively, you could use the condition $true so it executes forever. You will have to stop the job later in the script when you no longer need it.
$job = Start-Job -ScriptBlock {
    do {
        Invoke-Command -FilePath \\MyServer\run.ps1 -ComputerName MyServer -ArgumentList arg1, arg2
        Start-Sleep 2
    } while ($true)
}

Stop-Job $job
Remove-Job $job
I've added a Start-Sleep 2 so it doesn't lock up your CPU, as I have no idea what the script is doing - remove it if not required.
Why not something like this:
Invoke-Command -Filepath \\MyServer\Run.ps1 -Computername MyServer -Argumentlist Arg1,Arg2 -AsJob
$JobCount = (Get-Job).Count
Do {
    Start-Sleep -Seconds 1
    $totalJobCompleted = (Get-Job | Where-Object {$_.State -eq "Completed"} | Where-Object {$_.Command -like "NAMEOFCOMMAND*"}).Count
} Until ($totalJobCompleted -ge $JobCount)
@doorman -
PowerShell is natively a single-threaded application. In almost all cases, this is a huge benefit. Even when forcing multiple threads, you can see that the child threads are always dependent on the main thread. If this wasn't the case, it would be very easy to create memory leaks. This is almost always a good thing, as when you close the main thread, .NET will clean up all the other threads you may have forgotten about for you. You just happened to run across a case where this behaviour is not beneficial to your situation.
There are a few ways to tackle the issue, but the easiest is probably to use the good ol' command prompt to launch an independent new instance that is not based at all on your original script. To do this, you can use invoke-expression in conjunction with 'cmd /c'. See below:
invoke-expression 'cmd /c start powershell -NoProfile -windowstyle hidden -Command {
    $i = 0
    while ($true) {
        if ($i -gt 30) {
            break
        }
        else {
            $i | Out-File C:\Temp\IndependentSessionTest.txt -Append
            Start-Sleep -Seconds 1
            $i++
        }
    }
}
'
This will start a new session, run the script you want, not show a window, and not use your powershell profile when the script gets run. You will be able to see that even if you kill the original PowerShell session, this one will keep running. You can verify this by looking at the IndependentSessionTest.txt file after you close the main powershell window and seeing that the file keeps getting new numbers appended.
Hopefully this points you in the right direction.
Here are some source links:
PowerShell launch script in new instance
How to run a PowerShell script without displaying a window?

powershell loop for periodical checking running process

I'm trying to create a PowerShell script (loop) that periodically checks whether some other PowerShell script with a specific command line is running. If not, I start it.
This is what I have:
$processInfo = Get-WmiObject Win32_Process -Filter "name = 'powershell.exe'" | select CommandLine | Out-String -width 200

while ($true) {
    if ($processInfo -NotLike '*specific_path*') {
        Write-Host 'process not running'
        Start-Process powershell -argument '-f X:\specific_path\other_script.ps1'
    }
    else { Write-Host 'process is running' }
    Start-Sleep -Seconds 60
}
The script correctly detects whether the process is running when it starts, but when the situation changes while the script is running, it does not detect it.
Also, when the script starts and detects that the other script is not running, it starts it correctly, but then it does not see it already running and starts it again and again.
So the only problem I have (I believe) is how to get "fresh" data about running processes. Any ideas? Many thanks!
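A minimal sketch of one way to get fresh data on every pass, moving the process query inside the loop (this is an illustration based on the code above, not an answer from the thread):
while ($true) {
    # Re-query the process list on every iteration so the data is current
    $processInfo = Get-WmiObject Win32_Process -Filter "name = 'powershell.exe'" |
        Select-Object CommandLine | Out-String -Width 200

    if ($processInfo -notlike '*specific_path*') {
        Write-Host 'process not running'
        Start-Process powershell -ArgumentList '-f X:\specific_path\other_script.ps1'
    }
    else {
        Write-Host 'process is running'
    }
    Start-Sleep -Seconds 60
}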