Wait for all processes started by a command or script executed with Perl's system function

With the system function I launch a bash script. The system function waits for the bash script to finish execution and returns the exit status of the script.
The bash script in question has a loop in its execution flow that executes the same script n times with different parameters. When the loop condition is no longer valid, the loop terminates and exit is invoked. This way, child processes of the script executed by Perl's system function are left behind as zombies.
The system function does not wait for these zombie processes, only for the first script it launched.
My scenario is:
perl system function ---launch---> my bash script ---launch---> bash script
                                                  ---launch---> bash script
                                                  ---launch---> bash script
                                                  ...
                                                  ---launch---> bash script
To wait until all processing is done, do I have to change the bash script, or can I resolve this directly with Perl's system function?

Change the bash script you call from Perl so it does:
bash subprocess &
bash subprocess &
bash subprocess &
...
wait
then it will wait for all of its own children to complete before it exits itself. For example,
sleep 5 &
sleep 5 &
sleep 5 &
wait
will take 5 seconds to run.
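On the Perl side nothing else is needed: system only sees the wrapper script, so once that script ends with wait, the call returns only after every subprocess has finished. A minimal sketch of the calling side (wrapper.sh is a hypothetical name for your bash script):
#!/usr/bin/perl
use strict;
use warnings;

# wrapper.sh is a hypothetical name for the bash script that
# starts its subprocesses with "&" and ends with "wait".
my $status = system('bash', 'wrapper.sh');
die "failed to run wrapper.sh: $!" if $status == -1;

# system returns the raw wait status; the script's actual
# exit code is in the high byte, so shift right by 8.
printf "wrapper.sh exited with code %d\n", $status >> 8;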

Related

How to use debugger on perl script executed by "exec"

I am trying to figure out how a Perl script, which does test status reporting, works. The script executes another Perl script via exec. I am able to single-step through the code in the first script, but when it hits exec, the script executed by exec runs to completion. Is there a way by which I would be able to single-step and look at variables in the script executed by exec?
Add the line below to the script which is being called with exec:
#!/usr/bin/perl -d
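For example, if the first script reaches something like exec('perl', 'child.pl') (the file name is hypothetical), giving child.pl this header is enough to drop you into the debugger as soon as exec replaces the process:
#!/usr/bin/perl -d
# child.pl (hypothetical name) -- the script run via exec.
# Perl reads switches from the shebang line, so the -d here
# starts this script under the debugger even when it is
# launched as "perl child.pl".
use strict;
use warnings;

print "single-step from here with the debugger\n";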

Jenkins - Close the Jenkins job as soon as one of the parallel scripts fails

I'm running two Perl scripts in parallel in Jenkins
some shell commands
perl script 1 &
perl script 2 &
wait
some more shell commands
If one of the Perl scripts fails in the middle of execution, the job still waits until the other script completes (as it is executed in parallel in the background).
I want the job to stop as soon as one of the scripts fails, rather than waste time completing the execution of the other script.
Please help.
You set up a signal handler for SIGCHLD, which is a signal that is always delivered to the parent process when a child exits. I'm not aware of a mechanism to see which child process exited, but you can save the subprocess process identifiers and just kill both of them when you receive SIGCHLD:
some shell commands
perl script 1 &
pid1=$!
perl script 2 &
pid2=$!
trap "kill $pid1 $pid2" CHLD
wait
some more shell commands
The script above has the downside that it will kill the other script regardless of the exit status of the subprocess. If you want to, you could add a check for the exit status in the trap. The subprocess could, for example, create a temp file when it succeeds, and the trap could check whether that file exists.
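If you are open to moving the orchestration into Perl itself rather than the shell step, a minimal sketch of the same idea follows (script names are hypothetical). Unlike the shell trap, waitpid returns the PID of whichever child exited, so you know exactly which script failed:
#!/usr/bin/perl
use strict;
use warnings;

# Launch both scripts in parallel; names are hypothetical.
my %pids;
for my $job ('script1.pl', 'script2.pl') {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec('perl', $job) or die "exec $job failed: $!";
    }
    $pids{$pid} = $job;
}

# waitpid(-1, 0) blocks until any child exits and returns its
# PID, so we know which script finished first.
while (%pids) {
    my $pid = waitpid(-1, 0);
    last if $pid <= 0;
    my $job = delete $pids{$pid};
    if ($? != 0) {
        warn "$job failed with exit code ", $? >> 8, "\n";
        # Stop the remaining script and fail the build step.
        kill 'TERM', keys %pids;
        waitpid($_, 0) for keys %pids;
        exit 1;
    }
}
exit 0;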
Typically with Jenkins you would have the parallel steps running as separate jobs (or projects as they are sometimes known) rather than steps in a job. This would then allow the steps to run in parallel across different slave machines and it would keep the output for the jobs in a separate place.
You would then have a controlling job running the other parts.
I like the Multijob plugin for this sort of thing.
There are alternatives which may suit you better, such as the Build Flow Plugin, which uses a DSL to describe the jobs you want to run.

Catching the error status while running scripts in parallel on Jenkins

I'm running two Perl scripts in parallel on Jenkins, and one more script which should be executed only if the first two succeed. If I get an error in script 1, script 2 still runs, and hence the exit status of the step becomes successful.
I want to run it in such a way that if either of the parallel scripts fails, the job stops with a failure status.
Currently my setup looks like
perl_script_1 &
perl_script_2 &
wait
perl_script_3
If script 1 or 2 fails in the middle, the job should be terminated with a Failure status without executing script 3.
Note: I'm using tcsh shell in Jenkins.
I have a similar setup where I run several java processes (tests) in parallel and wait for them to finish. If any fail, I fail the rest of my script.
Each test process writes its result to a file to be tested once done.
Note - the code examples below are written in bash, but it should be similar in tcsh.
To do this, I get the process id for every execution:
test1 &
test1_pid=$!
# test1 will write pass or fail to file test1_result
test2 &
test2_pid=$!
...
Now, I wait for the processes to finish by using the kill -0 PID command.
For example test1:
# Check test1
kill -0 $test1_pid
# Check if the process is done or not
if [ $? -ne 0 ]
then
    echo process test1 finished
    # check results
    grep fail test1_result
    if [ $? -eq 0 ]
    then
        echo test1 failed
        mark_whole_build_failed
    fi
fi
Do the same for the other tests (you can run a loop that checks all running processes periodically).
Later, condition the rest of the execution on whether mark_whole_build_failed was triggered.
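The same liveness probe exists in Perl, where kill with signal number 0 sends nothing and only reports whether a process can still be signalled. A sketch of the periodic polling loop (the PIDs are assumed to be passed as arguments, and the result-file check is elided):
#!/usr/bin/perl
use strict;
use warnings;

# PIDs of the test processes, e.g. passed on the command line.
my %running = map { $_ => 1 } @ARGV;

while (%running) {
    for my $pid (keys %running) {
        # kill with signal 0 sends nothing; it only reports
        # whether the process can still be signalled.
        next if kill 0, $pid;
        delete $running{$pid};
        print "process $pid finished\n";
        # Here you would grep the matching result file
        # (e.g. test1_result) and fail the build on "fail".
    }
    sleep 1;
}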
I hope this helps.

Returning the handle to the calling Perl script

I have an executable which can run perl scripts using the following command at the prompt:
blah.exe Launch.pl
The way we have our tests set up is that we call Launch.pl from Parent.pl like this: "blah.exe Launch.pl" - a script within a script. However, when executing the command with backticks or system, the parent .pl script's execution waits until I get the handle back by closing and exiting out of the application (blah.exe). Only at that point does the code in parent.pl continue to execute.
How do I return the handle back to the parent .pl script once I am done running the code that is contained in Launch.pl?
So, parent.pl calls "blah.exe Launch.pl"; but after running the code inside Launch.pl, the application (blah.exe) just sits there waiting to be exited so that the code in parent.pl can continue running. I need to keep the application (blah.exe) open until I am done running a bunch of scripts one after another.
Run blah.exe in the background. When you are done with Parent.pl, terminate the application with kill.
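On Windows Perl, one way to sketch this, using the names from the question, is the system(1, LIST) form, which spawns the command without waiting and returns its process identifier:
#!/usr/bin/perl
use strict;
use warnings;

# On Win32, system(1, LIST) spawns the command asynchronously
# and returns the new process's identifier instead of waiting.
my $pid = system(1, 'blah.exe', 'Launch.pl');

# ... run the rest of the scripts here ...

# When Parent.pl is done, terminate the application.
kill 'KILL', $pid;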

How do I exit the command shell after it invokes a Perl script?

If I run a Perl script from a command prompt (c:\windows\system32\cmd.exe), how can I exit the command prompt after the script finishes executing?
I tried system("exit 0") inside the Perl script, but that doesn't exit the cmd prompt shell from which the Perl script is running.
I also tried the exit; command in the Perl script, but that doesn't work either.
Try to run the Perl script with a command line like this:
perl script.pl & exit
The ampersand will start the second command after the first one has finished. You can also use && to execute the second command only if the first succeeded (error code is 0).
Have you tried cmd.exe /C perl yourscript.pl ?
According to cmd.exe /?, the /C switch carries out the command specified by the string and then terminates.
If you're starting the command shell just to run the perl script, the answer by Arkaitz Jimenez should work (I voted for it.)
If not, you can create a batch file like runmyscript.bat, with content:
@echo off
perl myscript.pl
exit
The exit will end the shell session (and as a side effect, end the batch script itself.)
You can start the program in a new window using the START DOS command. If you call it with /B, no additional window is created. Then you can call EXIT to close the current window.
Would that do the trick?
You can send a signal to the parent shell from Perl:
kill(9, $PARENT_PID);
Unfortunately, the getppid() function is not implemented in Perl on Windows, so you'll have to find out the parent shell's PID via some other means. Also, signal #9 might not be the best choice.
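One hedged workaround, assuming the wmic command-line tool is available, is to ask WMI for the parent of the current process:
#!/usr/bin/perl
use strict;
use warnings;

# getppid() is not implemented on Windows, so query WMI instead.
# $$ is this Perl process's PID; wmic prints its parent's PID.
my $out = `wmic process where processid=$$ get parentprocessid`;
my ($ppid) = $out =~ /(\d+)/;
die "could not determine parent PID\n" unless $ppid;

# Forcibly terminate the parent cmd.exe, as suggested above.
kill 9, $ppid;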