Returning control to the calling Perl script

I have an executable which can run perl scripts using the following command at the prompt:
blah.exe Launch.pl
Our tests are set up so that Parent.pl calls Launch.pl via "blah.exe Launch.pl" - a script within a script. However, when the command is executed with backticks or system, Parent.pl blocks until control comes back, which only happens when I close and exit the application (blah.exe). Only at that point does the code in Parent.pl continue to execute.
How do I return control to the parent .pl script once the code contained in Launch.pl has finished running?
So, Parent.pl calls "blah.exe Launch.pl"; but after running the code inside Launch.pl, the application (blah.exe) just sits there waiting to be exited so that the code in Parent.pl can continue. I need to keep the application (blah.exe) open until I am done running a bunch of scripts one after another.

Run blah.exe in the background. When Parent.pl is done with it, terminate the application with kill.
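A minimal sketch of that approach, assuming a Win32 Perl (the question implies Windows): there, system(1, LIST) spawns the command asynchronously and returns its process ID instead of waiting for it to exit.

# Parent.pl
use strict;
use warnings;

# On Win32 Perl, system(1, LIST) starts the command in the background
# and returns the new process's PID rather than blocking until it exits.
my $pid = system(1, 'blah.exe', 'Launch.pl');
die "could not start blah.exe\n" if $pid <= 0;

# ... run the rest of the test scripts here, one after another ...

# When everything has finished, shut the application down.
# On Win32 this terminates the process rather than sending a Unix signal.
kill 'KILL', $pid;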

Related

Run synchronous PowerShell script on startup

I'm aiming to run a PowerShell script at startup on my Windows 10 machine. This tutorial explains it well: https://devblogs.microsoft.com/scripting/use-powershell-to-create-job-that-runs-at-startup/
I expect the job to keep my script running, since my app listens for some events, but when I check the status of the job with the Get-Job cmdlet it shows as completed. I think it's treating my script as an asynchronous one, and eventually my app doesn't listen to anything. Any idea how I can make it synchronous, i.e. keep my script running forever? This is the content of my ps1 file:
C:\Users\m\Desktop\Scripts\Activate.ps1
python C:\Users\m\Desktop\app.py

How to increase the execution time in Perl?

I am trying to run a Perl CGI script (named script.cgi) on an apache2 server, but after executing two or three commands it stops, with these errors in the log file:
"AH01215: Fatal Error:: /usr/lib/cgi-bin/script.cgi"
"AH01215: Program halted !!: /usr/lib/cgi-bin/script.cgi"
It successfully executes the first three external commands in the script but fails at the fourth one, which takes more time to generate its complete results. I also tried printing the output of the command so that I could tell whether it was being executed, and found that it does run but gets killed after a specific period of time.
I tried the Time::Out module, but it is not working either. The external commands are run in backticks (``) because system() does not capture output and exec() never returns to the script at all.
Is there any way to modify the timeout settings in apache2? I tried that too, but I couldn't find any file or mod for this purpose. Please help me out.
In httpd.conf (on Debian-based systems, /etc/apache2/apache2.conf) there is a Timeout directive. In Apache 2.4 it defaults to 60 seconds.
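For example (the value 300 is an arbitrary choice; set it to whatever your slowest command needs):

# httpd.conf / apache2.conf
Timeout 300

Restart Apache afterwards (e.g. apachectl restart, or service apache2 restart on Debian-based systems) for the change to take effect.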

How to use debugger on perl script executed by "exec"

I am trying to figure out how a Perl script that does test status reporting works. The script executes another Perl script via exec. I am able to single-step through the code in the first script, but when it hits exec, the script executed by exec runs to completion. Is there a way I can single-step through, and look at variables in, the script executed by exec?
Add the line below to the script that is being called with exec:
#!/usr/bin/perl -d
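A minimal sketch of how that plays out, with hypothetical file names reporter.pl and child.pl: Perl picks up switches such as -d from the #! line of the script it runs, so the exec'd script starts under the debugger.

child.pl (the -d goes on its first line):
#!/usr/bin/perl -d
use strict;
use warnings;
print "the debugger stops before this line runs\n";

reporter.pl:
#!/usr/bin/perl
use strict;
use warnings;
# exec replaces this process with child.pl; control never returns here
exec 'perl', 'child.pl' or die "exec failed: $!";

Note that the debugger needs an interactive terminal, so this works when the scripts are run from a console rather than under a non-interactive harness.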

Jenkins - Close the Jenkins job as soon as one of the parallel scripts fails

I'm running two Perl scripts in parallel in Jenkins:
some shell commands
perl script 1 &
perl script 2 &
wait
some more shell commands
If one of the Perl scripts fails in the middle of execution, the job still waits until the other script finishes (as it is executed in parallel in the background).
I want the job to stop as soon as one of the scripts fails, rather than waste time completing the execution of the other script.
Please help.
Set up a signal handler for SIGCHLD, a signal that is always delivered to the parent process when a child exits. The trap itself does not tell you which child exited, but you can save the subprocess process identifiers and just kill both of them when you receive SIGCHLD:
some shell commands
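# $! expands to the PID of the most recently started background job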
perl script 1 &
pid1=$!
perl script 2 &
pid2=$!
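# when either child exits, SIGCHLD fires, the trap kills both PIDs, and wait returns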
trap "kill $pid1 $pid2" CHLD
wait
some more shell commands
The script above has the downside that it kills the other script regardless of the exit status of the subprocess. If you want, you could add a check for the exit status in the trap: each subprocess could e.g. create a temp file when it succeeds, and the trap could check whether that file exists.
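A sketch of that variant (the marker file names are made up, and kill errors are silenced because one of the PIDs has already exited by the time the trap fires):

rm -f /tmp/s1.ok /tmp/s2.ok
( perl script 1 && touch /tmp/s1.ok ) &
pid1=$!
( perl script 2 && touch /tmp/s2.ok ) &
pid2=$!
# kill the survivor only if no child has recorded a success yet,
# i.e. the child that just exited failed
trap '[ -e /tmp/s1.ok ] || [ -e /tmp/s2.ok ] || kill $pid1 $pid2 2>/dev/null' CHLD
wait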
Typically with Jenkins you would have the parallel steps running as separate jobs (or projects as they are sometimes known) rather than steps in a job. This would then allow the steps to run in parallel across different slave machines and it would keep the output for the jobs in a separate place.
You would then have a controlling job running the other parts.
I like the Multijob plugin for this sort of thing.
There are alternatives that may suit better, such as the Build Flow plugin, which uses a DSL to describe the jobs you want to run.

Error handling in sets of batch files running in Windows task scheduler

Let's say I have 5 batch files that run sequentially one after another (executed via the Windows task scheduler on a normal Windows XP PC):
Script1.bat
Script2.bat
Script3.bat
Script4.bat
Script5.bat
Suppose one of the scripts fails (an error condition is detected; the details of how this happens are not important for my question here). How do I stop the subsequent scripts from running if they all run within the task scheduler? For example, if Script1.bat fails, I don't want to run Script2-5.bat; if Script3.bat fails, I don't want to run Script4-5.bat, etc.
I thought about writing a flag value to a temporary file that each script would read from. At the beginning of each script (except the first one), it would check whether the flag is valid. The first script would clear the flag each time this set of batch files runs.
Surely there is a better way to do this, or maybe there is a standard way to handle this type of situation? Thanks!
Write a master.bat file that conditionally calls each of the scripts in sequence, then schedule the master instead of directly scheduling the 5 scripts. For this to work, each script must exit with a nonzero exit code (e.g. exit /b 1) when it detects an error.
@echo off
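rem call propagates each script's exit code; run the next script only if the previous one left errorlevel 0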
call Script1.bat
if %errorlevel%==0 call Script2.bat
if %errorlevel%==0 call Script3.bat
if %errorlevel%==0 call Script4.bat
if %errorlevel%==0 call Script5.bat