I have a script that should execute code at shutdown, but the code never gets executed because Windows closes the PowerShell session first.
Is there a way to prevent PowerShell from closing when Windows shuts down?
To track the shutdown I use:
Add-Type -AssemblyName System.Windows.Forms   # needed for MessageBox
$sysevent = [Microsoft.Win32.SystemEvents]    # $sysevent was not defined in the original snippet; SystemEvents is the usual source for these events
Register-ObjectEvent -InputObject $sysevent -EventName "SessionEnding" -Action { [Windows.Forms.MessageBox]::Show("Shutdown!", "", [Windows.Forms.MessageBoxButtons]::OK, [Windows.Forms.MessageBoxIcon]::Warning) }
Register-ObjectEvent -InputObject $sysevent -EventName "SessionEnded" -Action { [Windows.Forms.MessageBox]::Show("Shutdown!", "", [Windows.Forms.MessageBoxButtons]::OK, [Windows.Forms.MessageBoxIcon]::Warning) }
Edit:
I have a script that runs in the background. This script is used to activate a product on a server. If the user shuts down the PC, the script should deactivate the product on the server.
Situation now: the script runs normally and the product is activated on the server. On shutdown the script is closed and does nothing.
Situation expected: the script runs normally and the product is activated on the server. On shutdown the script runs its last command and then the machine shuts down.
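For the normal-exit and Ctrl+C cases, a try/finally around the script's main loop guarantees the cleanup call runs. This is only a sketch: Enable-Product and Disable-Product are hypothetical stand-ins for the real activation calls, and, as discussed further down, the finally block will not run if Windows kills the console outright at shutdown.
try {
    Enable-Product -Server $server               # hypothetical activation call
    while ($true) { Start-Sleep -Seconds 5 }     # idle in the background
}
finally {
    # Runs on normal exit and on Ctrl+C, but not when the console process is killed.
    Disable-Product -Server $server              # hypothetical deactivation call
}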
Related
I've put together a PowerShell workflow script that runs on the local machine and calls some other scripts. The other scripts include Write-Output messages to let operators know what's happening on the system.
This is important because if an operator uses the computer while the workflow is running, things might not work properly. It's also helpful in debugging the scripts if something goes wrong.
Some of the scripts reboot the computer and suspend the workflow. I have a scheduled task to restart the suspended workflow after the user logs back on.
The problem is that when this happens the workflow resumes running in the background and the write-output messages are no longer visible on screen.
Here's my code to schedule the workflow restart:
# Set up a job trigger to go off 15-20 seconds after login
$delayTime = New-TimeSpan -Start 00:0:15 -End 00:0:20
$AtLogOn = New-JobTrigger -AtLogOn -RandomDelay $delayTime
# Set up the task action to start PowerShell and run this code
$resumeScript = 'Get-Job -State Suspended | Resume-Job; pause'
$action = New-ScheduledTaskAction -Execute powershell -Argument $resumeScript
# Actually registering the scheduled task with the above inputs
Register-ScheduledTask -TaskName ResumeSetup -Trigger $AtLogOn -Action $action -RunLevel Highest
How can I have PowerShell resume a workflow in an open PowerShell window so the Write-Output messages are visible?
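One avenue that may help (a sketch, not tested against this exact setup): register the task with an interactive principal for the logged-on user, so the task's PowerShell process starts in the user's session, where its console window is visible, instead of in the background.
$principal = New-ScheduledTaskPrincipal -UserId $env:USERNAME -LogonType Interactive -RunLevel Highest
Register-ScheduledTask -TaskName ResumeSetup -Trigger $AtLogOn -Action $action -Principal $principal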
I am working with a script that uses Excel as a COM application, so whenever I close my script using the X-button, it leaves an Excel task open in the background.
I have a function that closes the Excel task whenever the user enters "exit", but I want to be able to close the console and Excel with the console's X-button as well. Is there any way to change the behavior of the X-button so it first triggers a function? To give an idea about the function:
function Close {
    # Close the workbook without saving, then release the COM reference to Excel
    if ($workbook) { $workbook.Close($false) }
    if ($excel) { [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$excel) }
    # Force garbage collection so the released COM objects are finalized now
    [gc]::Collect()
    [gc]::WaitForPendingFinalizers()
    Remove-Variable excel -ErrorAction SilentlyContinue
    exit
}
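For context, the "exit" input mentioned above presumably comes from a prompt loop along these lines (a hypothetical sketch; the real script's loop may differ):
while ($true) {
    $cmd = Read-Host "command"
    if ($cmd -eq "exit") { Close }   # user-initiated cleanup path
    # ... handle other commands here ...
}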
EDIT
It seems that exiting via the X-button will remain an issue for me, as this thread (GitHub) suggests.
For anyone interested, the reason is:
"The event never kicks in if PowerShell is not in control of its own termination: Thus, closing the window / quitting the terminal emulator will not run the event handler."
which is exactly what happens when you press the X-button. This issue is being considered for PS 7.0. Furthermore, this thread, which discusses the same problem, also helped me find the thread on GitHub.
You can run actions when the form closes:
$form.Add_FormClosing({
    # Actions to carry out when the form is closing.
})
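For context, a minimal sketch assuming the script hosts its own WinForms window (a form's X-button does raise FormClosing, unlike the console's):
Add-Type -AssemblyName System.Windows.Forms
$form = New-Object System.Windows.Forms.Form
$form.Add_FormClosing({
    if ($workbook) { $workbook.Close($false) }   # release Excel before the window closes
})
[void]$form.ShowDialog()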
Edit:
$Null suppresses output - you can remove it to see what's going on.
The following code works on my PC:
Register-EngineEvent -SourceIdentifier PowerShell.Exiting -SupportEvent -Action { $null = New-Item -Path c:\temp -Name text.txt }
The file is created when the PowerShell window is shut.
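The same event could carry the Excel cleanup itself. A sketch, assuming $excel and $workbook are set in the global scope by the main script:
Register-EngineEvent -SourceIdentifier PowerShell.Exiting -SupportEvent -Action {
    if ($global:workbook) { $global:workbook.Close($false) }
    if ($global:excel) { [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($global:excel) }
}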
I have a series of 15 batch files I need to run as part of a PowerShell script. I currently have a working version of this, but I execute each batch file as below:
Start-Process -FilePath "Script1.bat" -WorkingDirectory "C:\Root" -Wait
Start-Process -FilePath "Script2.bat" -WorkingDirectory "C:\Root" -Wait
Start-Process -FilePath "Script3.bat" -WorkingDirectory "C:\Root" -Wait
and so on.
The batch files are lumping thousands of SQL scripts into one big script, and some of them are much slower than others (the SQL populates 5 different databases and 1 is much larger than the other 4)
I'm trying to improve the speed at which the 15 scripts run, and I thought a good idea would be to run the batch files in parallel so that all of the SQL files are created at the same time, rather than in sequence. To do this I removed the "-Wait"s, and can see all of the command line windows opening simultaneously.
The problem I'm having is that after the SQL files are created, I use Copy-Item to copy the scripts to the desired location. Because the Copy-Item commands no longer wait for the batch files, they attempt to copy the SQL files from the folder they are created in before the batch files have finished creating them.
So basically I'm trying to come up with a way to "wait" for the collection of batch files, so I can be sure they have finished running before I start copying the files. I've looked online for ages and have tried PowerShell jobs with Wait-Job, but that only waits until every batch file has been launched, not until each one has finished. Does anyone have any ideas on how this can be achieved?
I'm thinking: put all the process objects in an array and wait for all of them to have exited:
$p = 1..15 | ForEach-Object {
    Start-Process -FilePath "Script$_.bat" -WorkingDirectory "C:\Root" -PassThru
}
# Poll every 100 ms until no process in the array is still running
while (($p.HasExited -eq $false).Count) {
    Start-Sleep -Milliseconds 100
}
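Since -PassThru yields System.Diagnostics.Process objects, the polling loop could also be replaced with the built-in Wait-Process cmdlet:
$p | Wait-Process   # blocks until every process in $p has exited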
I'd use an event here.
Instead of triggering the batch file with what you have, try the following:
$bat3 = Start-Process -FilePath "Script3.bat" -WorkingDirectory "C:\Root" -PassThru
Notice you're saving the process object to a variable and adding the -PassThru parameter (so an object is returned by the cmdlet); the -Wait from the original call is dropped, because it would block until the process had already exited, leaving the Exited event nothing to react to. Next, create a scriptblock with whatever you want to happen when the batch script finishes:
$thingsToDoWhenBatchFileCompletes = {
    Copy-Item -Path ... -Destination ...
    Get-EventSubscriber | Unregister-Event
}
Make sure to end the block with Get-EventSubscriber | Unregister-Event. Next, create an event handler:
$job = Register-ObjectEvent -InputObject $bat3 `
    -EventName Exited `
    -SourceIdentifier Batch3Handler `
    -Action $thingsToDoWhenBatchFileCompletes
Now, when the batch file finishes running, the handler will execute the actions in $thingsToDoWhenBatchFileCompletes.
This should let you kick off the batch files, and when they finish running, you execute the file copy you were after.
Microsoft has a blog post on the technique in the piece Weekend Scripter: Playing with PowerShell Processes and Events.
I have a PowerShell script that registers certain events and logs them to a file.
I want to be able to also log to that file the moment the script is forcefully stopped, for example by closing the window via the X-button.
How can I do this?
We can include in the equation the following languages: PowerShell via ConEmu console, Perl via ConEmu console, AutoIt.
This does the trick.
Register-EngineEvent PowerShell.Exiting -SupportEvent -Action {
    # actions to run when the session exits
}
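A concrete version for the logging case (a sketch; the log path is a placeholder):
Register-EngineEvent PowerShell.Exiting -SupportEvent -Action {
    Add-Content -Path C:\temp\events.log -Value "$(Get-Date -Format o) script stopped"   # placeholder path
}
Keep in mind the caveat quoted earlier from the GitHub thread: the event only fires when PowerShell controls its own termination, so killing the window outright may bypass it.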
I need to complete a series of tasks across several Windows 2008 servers that need elevated permissions, so I've had to create a series of scheduled tasks that I run via psexec. Since they have to run in sequence, I found and modified a PowerShell script that 'stalls' until the scheduled tasks are completed on the remote machines. The problem is that when I launch the script with psexec on the remote machine, PowerShell.exe doesn't exit cleanly once the script finishes (indicated by a message in the console output); it stays resident and holds up the entire process. I need PowerShell to quit after the delay script exits, but even with an exit keyword at the end it stays in memory and prevents the process from finishing. I'm not terribly experienced with PowerShell, so I'll attach my script in case I'm doing something foolish in it:
while ($true) {
    $status = schtasks /query /tn "AutoDeploy" | Select-String -Pattern "AutoDeploy"
    if ($status.ToString().Substring(64,7) -eq "Running") { "Waiting for task to complete..." } else { break }
    Start-Sleep -Seconds 5
}
"Task complete."
exit
Thanks in advance for any insight.
This works for me (using a different task name) and doesn't hang psexec. Instead of relying on a fixed substring offset, it extracts the task's status with a regex capture group:
$taskName = "AutoDeploy"
while (1)
{
    $stat = schtasks /query /tn $taskName |
        Select-String "$taskName.*?\s(\w+)\s*$" |
        Foreach-Object { $_.Matches[0].Groups[1].Value }
    if ($stat -ne 'Running')
    {
        "Task completed"
        break
    }
    "Waiting for task to complete"
    Start-Sleep 5
}