Background jobs end prematurely - PowerShell

I have a problem understanding the use of a background process.
I have to port processing logic from Linux to Windows. One of my scripts launches another script in the background and then continues its own processing. The launching script does not need to care about the result of the background script, so it should be able to end even while the background job is still executing.
What I observed in PowerShell is that when my first script ends, the processing of the script launched in the background seems to break off.
Here is an example of the tests I made in my environment (PowerShell version 2.0).
Here's the simple code of my first script, script_1.ps1:
start-job -filepath d:\script\script_2.ps1
sleep 3
get-item d:\log\log*
And my second script script_2.ps1:
sleep 1
"test" > d:\log\log-1.log
sleep 1
"test" > d:\log\log-2.log
sleep 1
"test" > d:\log\log-3.log
sleep 1
"test" > d:\log\log-4.log
sleep 1
"test" > d:\log\log-5.log
Here's what happens when I submit it:
C:\Documents and Settings\user1>Powershell -command "D:\script\script_1.ps1"
Id Name State HasMoreData Location Command
-- ---- ----- ----------- -------- -------
1 Job1 Running True localhost sleep 1...
LastWriteTime : 2013-10-10 10:19:13
Length : 14
Name : log-1.log
LastWriteTime : 2013-10-10 10:19:14
Length : 14
Name : log-2.log
The result shows me that when the first script terminates (after 3 seconds), the second script stops as well, because only the first 2 log files were created.
I thought it would be like on Linux (&), where the background job continues anyway.
Is this the way it is supposed to be with PowerShell, or is there something I am doing wrong?

When you use the -Command switch to start PowerShell it runs the supplied command, and then exits the session. Jobs in PowerShell are specific to the session they're started in. You can see this for yourself if you want by starting two instances (sessions) of PowerShell. Create a job in one, e.g. start-Job {sleep 600}, and do get-Job, you'll see that a job is Running. If you type get-Job into the other PowerShell instance you'll see that there are no jobs.
The start-Job cmdlet creates a job, but it doesn't wait for that job to complete; it simply starts it on its merry way. In your case what's happening is that start-Job creates the job, you wait three seconds, then list the contents of the log directory. Then you reach the end of the script, and once that happens the session is terminated. This is why only some of your log files are created: the session is ended because you've reached the end of the script, before the created job has had a chance to finish.
To use jobs in the manner you describe, you need the PowerShell session to remain open for the entirety of the job's lifetime. For that, you'll need the -NoExit parameter on the PowerShell command that runs the job. You may want to browse powershell /? for other switches you might find useful. If you get extra creative with the script that starts the job, you could eventually make a hidden, noninteractive session that automatically quits once the job is done. But baby steps.
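A minimal sketch of both approaches (paths are taken from the question; -NoExit keeps the calling session, and therefore its jobs, alive, while Wait-Job makes the script itself block until the job finishes):

```powershell
# Option 1: keep the outer session open after the command completes
Powershell -NoExit -Command "D:\script\script_1.ps1"

# Option 2: have script_1.ps1 wait for its own background jobs
# before reaching the end of the script
Start-Job -FilePath d:\script\script_2.ps1
sleep 3
Get-Item d:\log\log*
Get-Job | Wait-Job | Out-Null   # block until all jobs in this session finish
```

With option 2 the session still closes on its own once all five log files have been written, which is closer to the Linux behavior the question describes.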

On Linux, a background job keeps running after the shell exits only if it was launched with the nohup command, if the shell's huponexit option is unset, or if the process is later disowned with the eponymous command.
Hints on how to do the same with PS can be found in this email thread on FreeLists.
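For completeness, one rough PowerShell analogue of nohup (my own suggestion, not taken from that thread) is to start a separate, detached process rather than a job, since a process started this way is not tied to the caller's session:

```powershell
# Start-Process spawns an independent powershell.exe; it keeps running
# even after the script (and session) that launched it exits.
Start-Process powershell `
    -ArgumentList '-NoProfile -File d:\script\script_2.ps1' `
    -WindowStyle Hidden
```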

Related

Trying to get a Powershell Script that will run in a 2nd window and monitor in real time other running scripts / report all Errors / ExitCodes

I am fairly new to writing code in PowerShell. For my job I have to write multiple PowerShell scripts to make changes in hardware and software settings, as well as the Registry and the Group Policy Editor, to get certain applications to run. These applications are a little older, and upgrading them or the hardware they run on is NOT an option. As an example, when Microsoft releases new patches, such as on Patch Tuesday, there is a high probability that applying them will change something, which is where I come in to write a script to fix the issue. I have multiple scripts that I run. When those scripts are run they may end up terminating because of an error code or an exit code, and a large part of the time I do not know immediately that a script has failed.
I am trying to figure out a script that I can run in a 2nd PowerShell console window. Its only purpose would be to sit there on the screen, wait, and monitor. Then, when I execute a script or application (the only file extensions I am worried about are EXE, BAT, CMD, and PS1), if the script/application that I just ran ends with an exit code or an error code, it would output that to the screen, in REAL TIME.
Below I have a small piece of code that kind of works, but it is not what I want.
I have researched online and read tons of material, but I just can't seem to find what I am looking for.
Could someone please help me with a script that will do what I want?
Thank you for your help!
$ExitErrorCode = "C:\ThisFolder\ThatFolder\AnotherFolder\SomeApplication.EXE" # this would
                                           # either be an EXE or CMD or BAT or PS1
$proc = Start-Process $ExitErrorCode -PassThru
$handle = $proc.Handle # cache proc.Handle
$proc.WaitForExit()
if ($proc.ExitCode -ne 0) {
    Write-Warning "$_ exited with status code $($proc.ExitCode)"
}
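For what it's worth, a slightly corrected sketch of the snippet above (the $_ automatic variable is only populated inside pipeline script blocks, so the warning should reference the path directly; names are otherwise taken from the question):

```powershell
$app = "C:\ThisFolder\ThatFolder\AnotherFolder\SomeApplication.EXE"  # EXE, CMD, BAT or PS1
$proc = Start-Process $app -PassThru
$handle = $proc.Handle          # cache the handle so ExitCode stays readable
$proc.WaitForExit()
if ($proc.ExitCode -ne 0) {
    Write-Warning "$app exited with status code $($proc.ExitCode)"
}
```

This still only watches one process that the monitor itself started; watching arbitrary scripts launched elsewhere needs one of the approaches in the linked answers.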
Possible duplicate of the approaches shown here:
Monitoring jobs in a PowerShell session from another PowerShell session
PowerShell script to monitor a log and output progress to another

Run synchronous powershell script on startup

I'm aiming to run a PowerShell script at startup on my Windows 10 machine. This tutorial explains that perfectly: https://devblogs.microsoft.com/scripting/use-powershell-to-create-job-that-runs-at-startup/
But I expect the job to keep my script running, since my app is listening for events; yet when I check the status of the job using the Get-Job cmdlet, it shows as Completed. I think it's treating my script as an asynchronous one, and eventually my app doesn't listen for anything. Any idea how I can make it synchronous, i.e. keep my script running forever? This is the content of my ps1 file:
C:\Users\m\Desktop\Scripts\Activate.ps1
python C:\Users\m\Desktop\app.py
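One way to keep the job in the Running state (a sketch, assuming the Python process is the thing that should stay alive) is to start the app as a child process and block on it, so the script never reaches its end while the app runs:

```powershell
# Activate the environment, then start the app and wait on its process.
& C:\Users\m\Desktop\Scripts\Activate.ps1
$p = Start-Process python -ArgumentList 'C:\Users\m\Desktop\app.py' -PassThru -NoNewWindow
Wait-Process -Id $p.Id   # the script (and so the job) stays Running until app.py exits
```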

Can a PowerShell script be dependent of another script's execution?

I have a situation where I want to make the execution of my scripts smarter. I have a set of scripts that execute at a given time, but because the input files are sometimes not posted at the correct times, the scripts run into errors and produce unexpected results. One of the solutions I was thinking of is to make the execution of the scripts dependent on each other. Here is what I mean:
script 1 runs at 6 pm
validates that the file is there
if it's there, set a flag
the flag is active so execute script 2 at 9 pm
if it's NOT there, the flag is not set
the flag is not set so script 2 is not executed
Right now script 1 and script 2 are scheduled at those times with the Task Scheduler. I checked the Scheduler for those types of conditions but didn't find anything.
You can set triggers in Task Scheduler for when an event happens: basically anything you can see in Event Viewer.
I would suggest using Write-EventLog from the script that works on the file; depending on the result, the scheduled task would get triggered.
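A minimal sketch of that idea (the event source name, event IDs, and file path are hypothetical; the source has to be registered once, with admin rights, before it can be written to):

```powershell
# Register the event source once (ignore the error if it already exists).
New-EventLog -LogName Application -Source "FileCheckScript" -ErrorAction SilentlyContinue

if (Test-Path "D:\input\data.csv") {
    # Script 2's scheduled task can use an "On an event" trigger
    # filtered on this log, source, and event ID.
    Write-EventLog -LogName Application -Source "FileCheckScript" `
        -EntryType Information -EventId 1000 -Message "Input file found"
} else {
    Write-EventLog -LogName Application -Source "FileCheckScript" `
        -EntryType Warning -EventId 1001 -Message "Input file missing"
}
```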
I suggest you have a single script running every N minutes in a single scheduled task via Task Scheduler.
This master script will analyze activities and contain all the logical conditions that determine when and which external script to run. You can also use flag files.

Batch Script not Releasing after Execution

I have a batch script which performs file version controlling after a Backup event has taken place. This batch script, which writes to a normal txt logfile, calls a PowerShell script to send this logfile as an attachment with a success-notification email. I have managed to release the write lock on the log file, to allow PS to attach and send it, but the batch script does not stop after the entire sequence has completed.
When I check the log file, I see that the shell instance has placed a 'Pause' in the script, instead of self-terminating (as it is instructed to), which results in:
Press any key to continue... with a waiting shell
an app-locked logfile, which won't allow the script to run again until the logfile is released.
This is the sequence of events:
The only PAUSE I have is in < bak_send_exec.bat >; its sole purpose is to start a PS script:
PowerShell.exe -noprofile -executionpolicy bypass
If I remove it, the PS does not start. If I leave it in, the PS starts and executes flawlessly, but the logfile stays locked in a shell instance in a Paused state, until someone kills the cmd.exe instance which locked the file.
This runs on a weekend at 01:00 am, so user intervention should not be required.
VC Script Summary:
These inter-connected batch files rename two identical files (in different locations) with timestamps. The timestamp is written to a variable for use in a notification email, which is sent using a PowerShell command. The entire process is logged to a txt log file (overwritten when the script runs again), and the log file is included with the notification email mentioned earlier.
Script Calls:
Initial Start Command: Triggers the Version Control Procedures and Logs Progress with versioncontrol_post.bat > TSLog.txt 2>&1
versioncontrol_post.bat: Performs main procedure, then ends with CALL bak_send_exec.bat
bak_send_exec.bat: The suspected cause... Coding of entire file is three lines long, but required as mentioned earlier, for policy relaxation:
@ECHO OFF
PowerShell.exe -noprofile -executionpolicy bypass -file bak_send.ps1
PAUSE
bak_send.ps1: Performs the main procedure: makes a copy of the temporary log (TSLog.txt) in its final home, releases the TSLog file so the script can work with the new duplicate, then attaches that duplicate to an email and sends it. The final line in the procedure is EXIT.
Fault finding tells me that the issue is not with the PowerShell script, but rather with the script that calls it. Taking out the PAUSE command results in the PowerShell not starting.
Does anyone have a possible solution to this "feature"?
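One hedged thing worth testing (an assumption, not a confirmed diagnosis): PowerShell started from a batch file can end up waiting on the console's stdin, and a PAUSE can mask that. Redirecting stdin from NUL when invoking PowerShell sometimes removes the need for the PAUSE entirely:

```batch
@ECHO OFF
REM Hypothetical variant of bak_send_exec.bat: the < NUL redirect gives
REM PowerShell an empty stdin so it cannot sit waiting for console input.
PowerShell.exe -noprofile -executionpolicy bypass -file bak_send.ps1 < NUL
EXIT /B
```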

VSTS build definition - prevent PowerShell exit behavior causing processes termination

I have a PowerShell task in my definition that calls another script file on its own which takes care of running several things on my build agent (starts several different processes) - emulators, node.js applications, etc.
Everything is fine up until the moment this step is done and the run continues: all of the above-mentioned stuff gets closed, with most of the underlying processes killed, so any further execution (e.g. the test run) is doomed to fail.
My assumption is that these processes are somehow dependent on the outermost (temporary) script that VSTS generates to process the step.
I tried the -NoExit switch in the argument list of my script, but to no avail. I've also read a suggestion to set this by default with a registry key for powershell.exe; still nothing.
The very same workflow was okay in Jenkins. How can I fix this?
These are the tasks I have:
The last PowerShell task calls a specified PowerShell file which calls several others on its own. They ensure some local dependencies and processes needed to start executing the tests, e.g. a running Node.js application (started in a separate console for example and running fine).
When the task is done and successful, the last task with the tests fails, because the Node.js application has been shut down, along with anything else started within the previous step. It just stops everything. That's why I'm currently running the tests within the same task until I find out how to overcome this behavior.
I am not sure how you call the dependencies and applications in your PowerShell script, but I tried the following command in a PowerShell script task to run a Node.js application:
invoke-expression 'cmd /c start powershell -Command {node main.js}'
The application keeps running after the PowerShell script task has passed and finished, which should meet your requirement. Refer to this question for details: PowerShell launch script in new instance.
But you need to remember to close the process after the test is finished.
There is also the Continue on error option (Control Options section). If it is checked, the build process will continue, but the build result will be partially succeeded.
You can also output an error or warning by using PowerShell or VSTS task commands (uncheck the Fail on Standard Error option in the Advanced section) and terminate the current PowerShell process with the exit keyword, for example:
Write-Warning "warning"
Write-Error "error"
Write-Host "##vso[task.logissue type=warning;]this is the warning"
Write-Host "##vso[task.logissue type=error;sourcepath=consoleapp/main.cs;linenumber=1;columnnumber=1;code=100;]this is an error"
For more information about VSTS task commands, refer to: Logging Commands