Exit codes when using die in Perl

I have overridden die in Perl for my logging framework, so that it can log messages and also print them to the console.
The overriding code for die:
BEGIN {
    *CORE::GLOBAL::die = sub {
        my ($package, $filename, $line, $subroutine) = caller;
        untie *STDERR;
        my $message;
        foreach my $arg (@_) {
            $message = $message . $arg;
        }
        print STDERR $message;
        tie *STDERR, __PACKAGE__, (*STDERR);
        logmessage("die", $message, $filename, $line);
        # What exit code to pass?
        # exit CODE;
    };
}
I don't know what exit code to set when exiting the process, since a normal die exits with a non-zero status.
Is there any way to find out what exit code to set when die is called?
It would also be helpful to know the list of exit codes available in Perl.

The exit code is documented in die:
exit $! if $!; # errno
exit $? >> 8 if $? >> 8; # child exit status
exit 255; # last resort
But as @amon noted, die doesn't exit, it throws an exception. Instead of overriding it, it might be clearer to wrap the whole thing into an eval { ... ; 1 } (or Try::Tiny's try) and log the exception in the or do or catch part.
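A minimal core-Perl sketch of that approach, where risky_operation() is a hypothetical stand-in for the code that may die, and logmessage() reuses the question's own logger:
eval {
    risky_operation();   # any code that may call die (hypothetical stand-in)
    1;
} or do {
    my $error = $@ || 'unknown error';
    logmessage("die", $error, __FILE__, __LINE__);   # the question's own logger (assumed signature)
    print STDERR $error;
    exit 255;            # or pick a code using the cascade quoted above
};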

die() exits with a non-zero exit code (but exactly which one is not defined, I believe):
jan@jancooltek ~ $ perl
die("test");
test at - line 1.
jan@jancooltek ~ $ echo $?
9
However, with -e:
jan@jancooltek ~ $ perl -e 'die("test")'
test at -e line 1.
jan@jancooltek ~ $ echo $?
255
exit() can use any exit code you like; there are no specific rules in Perl.
Settle on something non-zero and use that for these generic errors.
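If you do keep the override, one reasonable sketch is to finish it with the same cascade that die documents, reusing the question's own logmessage() call:
logmessage("die", $message, $filename, $line);
exit $!        if $!;          # errno, if set
exit $? >> 8   if $? >> 8;     # a child's exit status, if set
exit 255;                      # last resort, same as die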

Related

Perl script to log an external executable's output and errors while it is still running

I have a Perl script which runs an external executable. That executable runs for a while (sometimes seconds, sometimes an hour), can print text to both STDOUT and STDERR, and returns an exit code; all of these are needed. The code below demonstrates first a successful run of the external executable (a small bash script with one line, shown in the comment), then a run with a bad exit status (an example with gs, Ghostscript).
I want the external executable to hand its STDOUT to the Perl script for evaluation, filtering, formatting etc. before it gets logged to a logfile (which is used for other things as well), while the external program is still executing. It would be great to handle STDERR the same way.
As it stands, this script logs everything from STDOUT, but only after the executable has finished, and STDERR is logged only directly, without any evaluation. I cannot install any additional Perl modules.
How do I get my Perl script to receive each line (STDOUT and STDERR) from the executable while it is being produced (not just at the end), as well as its exit code for other purposes?
#!/usr/bin/perl
@array_executable_and_parameters  = "/home/username/perl/myexecutable.sh"; # ls -lh ; for i in {1..5}; do echo X; sleep 1; done
@array_executable_and_parameters2 = "gs aaa";
my $line;
chdir("/home/username/perl/");
$logFileName = "logfileforsomespecificinput.log";
open(LOGHANDLE, ">>$logFileName");
open(STDERR, '>>', $logFileName);   # Prints to logfile directly
#open (STDERR, '>>', <STDOUT>);     # Prints to own STDOUT (screen or mailfile)
print LOGHANDLE "--------------OK run\n";
open CMD, '-|', @array_executable_and_parameters or die $@;
while (defined($line = <CMD>)) {    # Logs all at once at end
    print LOGHANDLE "-----\$line=$line-----\n";
}
close CMD;
$returnCode1 = $? >> 8;
print LOGHANDLE "\$returnCode1=$returnCode1\n";
print LOGHANDLE "--------------BAD run\n";
open CMD2, '-|', @array_executable_and_parameters2 or die $@;
while (defined($line = <CMD2>)) {
    print LOGHANDLE "-----\$line=$line-----\n";
}
close CMD2;
$returnCode2 = $? >> 8;
print LOGHANDLE "\$returnCode2=$returnCode2\n";
close(LOGHANDLE);
Take 2. After the good advice in the comments I have tried IPC::Run. But something still does not work as expected. I seem to be missing how the looping from start (or pump?) to finish works, as well as how to get it to iterate when I do not know what the last output will be, which is what the examples everywhere show. So far I have the following code, but it does not work line by line. It spits out the listing of files in one go, then waits until the external loop has fully finished before printing all the X's. How do I tame it to meet the initial needs?
#! /usr/bin/perl
use IPC::Run qw( start pump finish );
@array_executable_and_parameters = ();
push(@array_executable_and_parameters, "/home/username/perl/myexecutable.sh"); # ls -lh ; for i in {1..5}; do echo X; sleep 1; done
my $h = start \@array_executable_and_parameters, \undef, \$out, \$err;
pump $h; # while ($out or $err);
print "1A. \$out: $out\n";
print "1A. \$err: $err\n";
$out = "";
$err = "";
finish $h or die "Command returned:\n\$?=$?\n\$\@=$@\nKilled by=" . ( $? & 0x7F ) . "\nExit code=" . ( $? >> 8 ) . "\n";
print "1B. \$out: $out\n";
print "1B. \$err: $err\n";
Look at the IPC modules, especially IPC::Cmd and IPC::Run, and if you are not satisfied then IPC::Run3. There are a lot of details you would otherwise have to cover yourself, and those modules will make your life a lot easier.
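For a feel of how compact this gets, here is a sketch with IPC::Run3, reusing the question's command. Note that run3 hands back the output only after the command has finished, so for true line-by-line streaming IPC::Run, as used in the question's later attempts, is the better fit:
use IPC::Run3;
my @cmd = ('/home/username/perl/myexecutable.sh');   # command from the question
my ($stdout, $stderr);
run3 \@cmd, \undef, \$stdout, \$stderr;              # blocks until the command exits
my $exit = $? >> 8;                                  # exit code, decoded as elsewhere in this thread
print "exit=$exit\nSTDOUT:\n$stdout\nSTDERR:\n$stderr\n";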
OK, I have got it to work, so far. It might still have some issues: I am not sure about environment variables (like umask or language-related ones), the system load while pump is waiting/blocking, or how to replace die with capturing all the status variables. Nevertheless, for my purpose it seems to work well. We will see how it works on a real system.
#! /usr/bin/perl
BEGIN {
    push @INC, '/home/myusername/perl5/lib/perl5';   # Where the modules from CPAN are
}
use IPC::Run qw( start pump finish );
@array_executable_and_parameters = ();
push(@array_executable_and_parameters, "/home/myusername/perl/myexecutable.sh"); # ls -lh ; for i in {1..5}; do echo X; sleep 1; done
my $h = start \@array_executable_and_parameters, \undef, \$out, \$err;
while (42) {
    pump $h; # while ($out or $err);
    if ($out eq '' and $err eq '') { last; }
    print "1A. \$out: $out\n";
    print "1A. \$err: $err\n";
    $out = "";
    $err = "";
}
finish $h or die "Command returned:\n\$?=$?\n\$\@=$@\nKilled by=" . ( $? & 0x7F ) . "\nExit code=" . ( $? >> 8 ) . "\n";
print "1B. \$out: $out\n";
print "1B. \$err: $err\n";
The key was understanding how the blocking of pump works. All the manuals and help pages kind of skip over this part. So the answer was a never-ending while loop that breaks out when pump comes back with no further output.

Capturing output, error and exit code with Perl IPC::Open3 fails in newer Perl

I have a small subroutine that uses IPC::Open3 (usually I use Capture::Tiny) because I wanted to use only core Perl modules. This subroutine is part of a larger installation script. It captures the command's output, error and exit code. It works on CentOS 6 (and 5) with Perl 5.10.1 (IPC::Open3 1.04) but fails on Ubuntu 14 LTS with Perl 5.18.1 (IPC::Open3 1.13). Could someone explain to me why it fails on newer Perl and how to fix it?
sub _capture_output {
    croak( '_capture_output() needs a $cmd and options' ) unless (@_ == 2);
    my ($cmd, $param_href) = @_;
    my $verbose = defined $param_href->{verbose} ? $param_href->{verbose} : 0;   # default is silent
    print "Report: COMMAND is: $cmd\n" if $verbose;
    local $| = 1;   # autoflush
    my ( $in, $out, $err );
    open my ($in_fh),  '<',  \$in;
    open my ($out_fh), '>>', \$out;
    open my ($err_fh), '>>', \$err;
    my $pid = open3($in_fh, $out_fh, $err_fh, $cmd);
    my $stdout = $out;
    my $stderr = $err;
    $stdout = '' if !defined $stdout;
    $stderr = '' if !defined $stderr;
    waitpid( $pid, 0 ) or die "$!\n";
    my $exit = $? >> 8;
    if ($verbose == 2) {
        print 'STDOUT is: ', "$stdout", "\n", 'STDERR is: ', "$stderr", "\n", 'EXIT is: ', "$exit\n";
    }
    return $stdout, $stderr, $exit;
}
It fails on the line with the open3 call, with this error:
Report: COMMAND is: plenv --version
Uncaught exception from user code:
open3: exec of plenv --version failed at ./Perlinstall.pm line 175.
IPC::Open3::_open3('open3', 'GLOB(0x27819a0)', 'GLOB(0x2781730)', 'GLOB(0x2781b68)', 'plenv --version') called at /usr/share/perl/5.18/IPC/Open3.pm line 250
When I try this command on the command line I get the same exit code but a different error message.
Centos6:
$ plenv --version
-bash: plenv: command not found
$ echo $?
127
Ubuntu14:
$ plenv --version
No command 'plenv' found, did you mean:
Command 'p7env' from package 'libnss3-tools' (main)
plenv: command not found
$ echo $?
127
It's not clear what you think the problem is, but open3 is behaving as documented.
It doesn't return on failure: it just raises an exception matching /^open3:/.
The program you are trying to execute doesn't exist, so open3 throws an exception. I think you are wondering why $? isn't set and why nothing was printed to the handle in $stderr, but that shouldn't be a surprise, since the program never ran.
Keep in mind that exec (used by open3) will bypass the shell if the command is a string with no shell characters except whitespace, so you'll get different results from executing
plenv --version
# Same as: open3(..., ..., ..., 'plenv', '--version')
# Exception: Can't find plenv
and
plenv '--version'
# Same as: open3(..., ..., ..., '/bin/sh', '-c', q{plenv '--version'})
# Shell exits with an error in ($? >> 8)
You can catch exceptions using eval BLOCK.
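For instance, a minimal sketch of catching it, using the question's command (the gensym handle for stderr is needed because open3 will not autovivify the error handle):
use IPC::Open3;
use Symbol 'gensym';

my ($in, $out, $err) = (undef, undef, gensym());
my $pid = eval { open3($in, $out, $err, 'plenv --version') };
if ($@) {
    # exec of the command failed before it ever ran;
    # $? and the captured stderr are not meaningful here
    warn "could not run plenv: $@";
}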

Perl background process

I am trying to run a background process in Perl. I create a child process, which is used to call another Perl script. I want to run a few lines of code in parallel with this child process, and after the child process is done, I want to print a line.
Main script
#!/usr/bin/perl
$|=1;
print "before the child process\n";
my $pid = fork();
if (defined $pid)
{
    system("perl testing.pl");
}
print "before wait command\n";
wait();
print "after 20 secs of waiting\n";
testing.pl
#!/usr/bin/perl
print "inside testing\n";
sleep(20);
Expected output
before the child process
before wait command
(should wait for 20 secs and then print)
after 20 secs of waiting
There are many problems with your script. Always:
use strict;
use warnings;
Localising special variables is good practice. Only a variable containing the special value undef returns false for defined, so every other value (even 0, which is the case here) returns true for defined. In the other script, the shebang is wrong.
#!/usr/bin/perl
use strict;
use warnings;
local $| = 1;
print "Before the child process\n";
unless (fork) {
    system("perl testing.pl");
    exit;
}
print "Before wait command\n";
wait;
print "After 20 secs of waiting\n";
The “Background Processes” section of the perlipc documentation reads
You can run a command in the background with:
system("cmd &");
The command’s STDOUT and STDERR (and possibly STDIN, depending on your shell) will be the same as the parent’s. You won't need to catch SIGCHLD because of the double-fork taking place; see below for details.
Adding an ampersand to the argument to system can vastly simplify your main program.
#! /usr/bin/env perl
print "before the child process\n";
system("perl testing.pl &") == 0
or die "$0: perl exited " . ($? >> 8);
print "before wait command\n";
wait;
die "$0: wait: $!" if $? == -1;
print "after 20 secs of waiting\n";
Handling fork's return value is a bit tricky, indeed.
A recent article by Aristotle features a nice and concise forking idiom, which, in your case, looks like this:
#!/usr/bin/env perl
use 5.010000;
use strict;
use warnings qw(all);

say 'before the child process';
given (fork) {
    when (undef) { die "couldn't fork: $!" }
    when (0) {
        exec $^X => 'testing.pl';
    }
    default {
        my $pid = $_;
        say 'before wait command';
        waitpid $pid, 0;
        say 'after 20 secs of waiting';
    }
}
Pay attention to the exec $^X => '...' line: the $^X variable holds the full path to the current Perl executable, so the "right" Perl version is guaranteed. Also, a system call is pointless when you're pre-forking; exec replaces the child process directly.

Why is Perl's $? returning the wrong value for the exit code of a forked process?

Consider this trivial example of fork()ing then waiting for a child to die in Perl:
#!/usr/bin/perl
use strict;
use warnings;
if (fork() == 0) {
exit(1);
}
waitpid(-1,0);
print $?;
Running the script on Solaris 10 I get this result:
$ perl test.pl
256
I suspect the values are being shifted upwards, because when I do exit(2) in the child, the output becomes 512.
I can't seem to find this documented in perl's waitpid. Is this a bug on my system or am I doing something wrong?
It's documented in the $? section of the perlvar man page.
That is, the real exit code is $? >> 8.
The child might not even have gotten to call exit. As such, $? packs more information than just the exit parameter.
if ( $? == -1 ) { die "Can't launch child: $!\n"; }
elsif ( $? & 0x7F ) { die "Child killed by signal ".( $? & 0x7F )."\n"; }
elsif ( $? >> 8 ) { die "Child exited with error ".( $? >> 8 )."\n"; }
else { print "Child executed successfully\n"; }
This is documented more clearly in system's documentation.

How can I terminate a system command with alarm in Perl?

I am running the code snippet below on Windows. The server starts listening continuously after reading from the client. I want to terminate this command after a time period.
If I use the alarm() function call within main.pl, it terminates the whole Perl program (here, main.pl), so I called the system command from a separate Perl file
and invoked that file (alarm.pl) from the original Perl file using system().
But this way I was unable to capture the output of the system() call, either in the original Perl file or in the called one.
Could anybody please tell me how to terminate a system() call, or how to capture its output in the arrangement I used above?
main.pl
my @output = system("alarm.pl");
print "one iperf completed\n";
open FILE, ">display.txt" or die $!;
print FILE @output_1;
close FILE;
alarm.pl
alarm 30;
my @output_1 = readpipe("adb shell cd /data/app; ./iperf -u -s -p 5001");
open FILE, ">display.txt" or die $!;
print FILE @output_1;
close FILE;
Either way, display.txt is always empty.
There are a few separate issues here.
First, to keep the alarm from killing your script, you need to handle the ALRM signal. See the alarm documentation. You shouldn't need two scripts for this.
Second, system doesn't capture output. You need one of the backtick variants or a pipe if you want to do that. There are answers for that on Stack Overflow already.
Third, if alarm.pl puts anything in display.txt, you discard it in main.pl when you re-open the file in write mode. You only need to create the file in one place. When you get rid of the extra script, you won't have this problem.
I recently had some problems with alarm and system, but switching to IPC::System::Simple fixed that.
Good luck, :)
What the hell was I thinking? You don't need a background process for this task. You just need to follow the example in perldoc -f alarm and wrap your time-sensitive code in an eval block.
my $command = "adb shell cd /data/app; ./iperf -u -s -p 5001";
my @output;
eval {
    local $SIG{ALRM} = sub { die "Timeout\n" };
    alarm 30;
    @output = `$command`;
    alarm 0;
};
if ($@) {
    warn "$command timed out.\n";
} else {
    print "$command successful. Output was:\n", @output;
}
Inside the eval block, you can capture your output the regular way (with backticks or qx() or readpipe). Though if the call times out, there won't be any output.
If you don't need the output (or don't mind hacking some interprocess communication together), an almost idiot-proof alternative is to set the alarm and run the system call in a child process.
$command = "adb shell cd /data/app; ./iperf -u -s -p 5001";
if (($pid = fork()) == 0) {
# child process
$SIG{ALRM} = sub { die "Timeout\n" }; # handling SIGALRM in child is optional
alarm 30;
my $c = system($command);
alarm 0;
exit $c >> 8; # if you want to capture the exit status
}
# parent
waitpid $pid, 0;
waitpid will return when either the child's system command is finished, or when the child's alarm goes off and kills the child. $? will hold the exit code of the system call, or something else (142 on my system) for an unhandled SIGALRM or 255 if your SIGALRM handler calls die.
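A small sketch of how the parent might branch on those cases after waitpid (the specific numbers are system-dependent, as noted above):
if    ( $? == -1 ) { warn "waitpid failed: $!\n" }
elsif ( $? & 127 ) { warn "child killed by signal " . ( $? & 127 ) . "\n" }   # e.g. an unhandled SIGALRM
elsif ( $? >> 8 )  { warn "child exited with code " . ( $? >> 8 ) . "\n" }    # 255 if the handler called die
else               { print "command finished in time\n" }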
I ran into a similar problem that required:
run a system command and get its output
time out the system command after x seconds
kill the system command process and all child processes
After much reading about Perl IPC and manual fork & exec, I came up with this solution. It is implemented as a simulated 'backtick' subroutine.
use Error qw(:try);

$SIG{ALRM} = sub {
    my $sig_name = shift;
    die "Timeout by signal [$sig_name]\n";
};

# example
my $command = "vmstat 1 1000000";
my $output  = backtick(
    command => $command,
    timeout => 60,
    verbose => 0
);

sub backtick {
    my %arg = (
        command => undef,
        timeout => 900,
        verbose => 1,
        @_,
    );
    my @output;
    defined( my $pid = open( KID, "-|" ) )
        or die "Can't fork: $!\n";
    if ($pid) {
        # parent
        # print "parent: child pid [$pid]\n" if $arg{verbose};
        try {
            alarm( $arg{timeout} );
            while (<KID>) {
                chomp;
                push @output, $_;
            }
            alarm(0);
        }
        catch Error with {
            my $err = shift;
            print $err->{-text} . "\n";
            print "Killing child process [$pid] ...\n" if $arg{verbose};
            kill -9, $pid;
            print "Killed\n" if $arg{verbose};
            alarm(0);
        }
        finally {};
    }
    else {
        # child
        # set the child process to be a group leader, so that
        # kill -9 will kill it and all its descendents
        setpgrp( 0, 0 );
        # print "child: pid [$pid]\n" if $arg{verbose};
        exec $arg{command};
        exit;
    }
    wantarray ? @output : join( "\n", @output );
}
You might use timeout <seconds> to wrap your commands if that is already common on your system.
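For instance, a sketch assuming GNU coreutils' timeout is available (not the unrelated Windows timeout command); it exits with status 124 when the time limit is hit, and the 30-second limit below is purely illustrative:
my $cmd    = 'adb shell cd /data/app; ./iperf -u -s -p 5001';   # command from the question
my @output = qx{timeout 30 sh -c '$cmd'};                       # kill the whole pipeline after 30 seconds
my $exit   = $? >> 8;
warn "command timed out\n" if $exit == 124;                     # 124 is timeout's own exit code
print @output;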