Warning within IPC::Open3 when using open3 twice - perl

I am using IPC::Open3, following the suggestion given by Hans Lub here.
My issue is that the open3 call works correctly for the first time, but subsequent invocations return the warning:
Use of uninitialized value in numeric ne (!=) at /usr/lib/perl5/5.8.8/IPC/Open3.pm line 215.
The code sample I am using looks like this:
use IPC::Open3;
my $pid;
# dup the old standard output and error
open(OLDOUT, ">&STDOUT") or die "Can't dup STDOUT: $!\n";
open(OLDERR, ">&STDERR") or die "Can't dup STDERR: $!\n";
my $transcript_file = "transcript.temp";
# reopen stdout and stderr
open (STDOUT, "|tee -i $transcript_file") or die "Can't reopen STDOUT: $!\n";
open (STDERR, ">&STDOUT") or die "Can't reopen STDERR: $!\n";
# print statements now write to log
print "Logging important info: blah!\n";
print STDERR "OOPS!\n";
#eval { $pid = open3("*STDIN", "*OLDOUT", "*OLDERR", "ls"); }; # Tried this, but it doesn't seem to help. Output does not appear on STDOUT.
eval { $pid = open3(">&STDIN", ">&OLDOUT", ">&OLDERR", "ls"); }; #This works correctly
waitpid( $pid, 0 );
eval { $pid = open3(">&STDIN", ">&OLDOUT", ">&OLDERR", "ls"); }; #First warning
waitpid( $pid, 0 );
eval { $pid = open3(">&STDIN", ">&OLDOUT", ">&OLDERR", "ls"); }; #Second warning
waitpid( $pid, 0 );
I apologize if I seem to be trying to get others to solve my problems, but I just can't seem to get around this, and looking inside Perl modules is beyond my current understanding.

It doesn't make sense to give the same STDIN to multiple parallel processes. open3 therefore assumes that the handle you tell it to use isn't used by anything else, so it closes it.
It looks like your children aren't using the STDIN you provide them, so you should provide a handle to /dev/null.
open(local *CHILD_STDIN, '<', '/dev/null') or die $!;
$pid = open3('<&CHILD_STDIN', '>&STDOUT', '>&STDERR', @cmd);

I think the problem is the way open3 uses the file handles that you pass. If you use, say, >&STDOUT then the file handle is duped, the dupe is passed to the child process, and the parent's copy is closed. That means the second time you do the same thing you are duping a closed file handle, which doesn't have the effect you want.
The only way around this that I can see is to dupe the file handles separately and pass the dupes to the child process. It won't matter that the parent's copy of the dupes is closed, because it still has the original STDOUT etc. Unfortunately it adds another three statements to each open3 call, so you would probably want to wrap the whole thing in a subroutine, like this:
my_open3('ls');
my_open3('ls');
my_open3('ls');

sub my_open3 {
    my @cmd = @_;
    my $pid;
    open IN_COPY,  '<&', \*STDIN  or die "Couldn't dup STDIN: $!";
    open OUT_COPY, '>&', \*STDOUT or die "Couldn't dup STDOUT: $!";
    open ERR_COPY, '>&', \*STDERR or die "Couldn't dup STDERR: $!";
    eval {
        $pid = open3('<&IN_COPY', '>&OUT_COPY', '>&ERR_COPY', @cmd);
    };
    waitpid $pid, 0;
}
This isn't the nicest of solutions, so if anyone can see anything better then please chime in. The only alternative I can see is to let the parent keep its own standard IO handles and use completely new ones to communicate with the child process each time. The parent would then have to mess with IO::Select to copy the child's output to its own STDOUT and STDERR; a sketch of that follows.
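This sketch is not from the original post, and it assumes a perl recent enough for open3 to autovivify the input and output handles (the error handle must be a pre-created glob, hence Symbol::gensym):

use strict;
use warnings;
use IPC::Open3;
use IO::Select;
use Symbol 'gensym';

my $child_err = gensym;                       # open3 will not autovivify a separate stderr handle
my $pid = open3(my $child_in, my $child_out, $child_err, 'ls');
close $child_in;                              # nothing to send to the child

my $sel = IO::Select->new($child_out, $child_err);
while ($sel->count) {
    for my $fh ($sel->can_read) {
        if (sysread $fh, my $buf, 4096) {
            # forward child output to the parent's own handles
            print { $fh == $child_err ? \*STDERR : \*STDOUT } $buf;
        }
        else {
            $sel->remove($fh);                # EOF on this stream
        }
    }
}
waitpid $pid, 0;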
As nwellnhof says, if the child doesn't use its STDIN (as is the case with the ls command) then you can just pass undef as the first parameter. That saves duplicating one of three standard handles.
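For instance (a hedged sketch, reusing the dupe handles from the subroutine above; passing an undefined lexical is safer than a literal undef, which open3 may refuse to assign to):

my $in;                                       # open3 autovivifies a pipe handle here
my $pid = open3($in, '>&OUT_COPY', '>&ERR_COPY', 'ls');
close $in;                                    # child sees EOF if it ever reads STDIN
waitpid $pid, 0;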

Related

Perl: open or [block of code]?

I'm running script A, which feeds ARGV containing the path to a file to perl script B. This is done with:
local @ARGV = ($file, $file2, etc.);
do 'scriptB.pl' or die "scriptB has failed";
Script B then tries to open the file:
open( my $fh_file, "<", $file )
or die "Could not open file '$file' $!";
However, if the file is missing I do not get the message quoted after "or die" in B. Instead I get the do scriptB.pl or die message in A. If I remove the "or die" from A, the script continues after B silently dies as if nothing went wrong.
I was wondering if there was any way to get B to print its die message?
Better yet, what is the best way to have B run a block of code after it fails to open the file? Said code would for example write to a separate file listing which files were missing so that the user may easily track down such errors.
#something like
open( my $fh_file, "<", $file) or {
print "the file could not be found";
die;
}
The only thing I've found searching the net for help was someone mentioning an "or do {}" construct, but it gives me strange syntax errors, so I am not sure if I'm using it right.
If you want to continue to use the open(...) or ... syntax, then you could use do.
open my $fh, '<', $file or do {
...
};
But I think it's probably clearer to switch to if
if (! open my $fh, '<', $file) {
...
}
Or even unless
unless (open my $fh, '<', $file) {
...
}
I think you'll get clearer code with fewer gotchas if you put script B into a module, and load it with use or require and call the function(s) in there directly with clear parameters.
What you're missing here is that do involves an eval behind the scenes, and that results in the exception confusion. You can more or less avoid that confusion by moving your script B code into a function in a module, and calling it.
(Also, perl 5.26 will have a slight hiccup with do wherein the current directory will be removed from the directory lookup, due to security concerns. use and require have the same hiccup, but this may be less surprising since you should put your module into a path you explicitly add to the @INC load path.)
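A minimal sketch of the module approach (the package and function names here are illustrative, not from the original posts):

# lib/ScriptB.pm
package ScriptB;
use strict;
use warnings;

sub process_file {
    my ($file) = @_;
    open my $fh, '<', $file
        or die "Could not open file '$file': $!";
    # ... work with $fh ...
    return 1;
}

1;

Script A then calls it directly, with clear parameters and a catchable exception:

use ScriptB;

for my $file (@files) {
    eval { ScriptB::process_file($file) }
        or warn "scriptB failed on '$file': $@";
}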
die doesn't print a message; die throws an exception. When you catch that exception you don't do anything with the message passed to die. Replace
local @ARGV = ($file, $file2, etc.);
do 'scriptB.pl' or die "scriptB has failed";
with
local @ARGV = ($file, $file2, etc.);
do 'scriptB.pl' or die "scriptB has failed: " . ($@ || $!);
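perldoc -f do documents a fuller idiom that distinguishes a compile failure from a read failure from the script simply returning false:

unless (my $return = do 'scriptB.pl') {
    warn "couldn't parse scriptB.pl: $@" if $@;
    warn "couldn't do scriptB.pl: $!"    unless defined $return;
    warn "scriptB.pl did not return a true value" unless $return;
}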

simply tee in Perl without fork, File::Tee, or piping to tee

Is there a simple way in Perl to send STDOUT or STDERR to multiple places without forking, using File::Tee, or opening a pipe to /usr/bin/tee?
Surely there is a way to do this in pure perl without writing 20+ lines of code, right? What am I missing? Similar questions have been asked, both here on SO and elsewhere, but none of the answers satisfy the requirements that I not have to
- fork
- use File::Tee / IO::Tee / some other module+dependencies whose code footprint is 1000x larger than my actual script
- open a pipe to the actual tee command
I can see the use of a core module as a tradeoff here, but is that really needed?
It looks like I can simply do this:
BEGIN {
    open my $log, '>>', 'error.log' or die $!;
    $SIG{__WARN__} = sub { print $log @_ and print STDERR @_ };
    $SIG{__DIE__}  = sub { warn @_ and exit 1 };
}
This simply and effectively sends most error messages both to the original STDERR and to a log file (apparently stuff trapped in an eval doesn't show up, I'm told). So there are downsides to this, mentioned in the comments. But as mentioned in the original question, the need was specific. This isn't meant for reuse. It's for a simple, small script that will never be more than 100 lines long.
If you are looking for a way to do this that isn't a "hack", the following was adapted from http://grokbase.com/t/perl/beginners/096pcz62bk/redirecting-stderr-with-io-tee
use IO::Tee;
open my $save_stderr, '>&STDERR' or die $!;
close STDERR;
open my $error_log, '>>', 'error.log' or die $!;
*STDERR = IO::Tee->new( $save_stderr, $error_log ) or die $!;
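If even the IO::Tee dependency is unwanted, a tied handle is one more pure-perl route. A minimal sketch (the Tee package is made up here; note that warn() may bypass tied handles on some perls, so the $SIG{__WARN__} approach above is still the way to catch warnings):

package Tee;
sub TIEHANDLE { my ($class, @handles) = @_; bless [@handles], $class }
sub PRINT     { my $self = shift; print {$_} @_ for @$self; 1 }
sub PRINTF    { my $self = shift; my $fmt = shift; printf {$_} $fmt, @_ for @$self; 1 }

package main;
open my $olderr, '>&', \*STDERR    or die $!;
open my $log,    '>>', 'error.log' or die $!;
tie *STDERR, 'Tee', $olderr, $log;
print STDERR "this goes to both the terminal and error.log\n";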

perl bug when STDIN, STDOUT and STDERR get closed?

When writing a daemon, I want to close STDIN, STDOUT and STDERR for "good daemon behavior". But I got surprised: subsequently opened files are expected to have the same read/write properties as the old STDIN, STDOUT and STDERR (because they are reopened on the same filenos?)
Here is warn.pl:
use warnings;
my $nrWarnings = 0;
$SIG{__WARN__} = sub {
    no warnings;
    $nrWarnings++;
    open my $log, '>>', '/tmp/log.txt'
        or die;
    printf $log "%d: %s", $nrWarnings, @_;
    close $log;
};
close STDOUT;
close STDERR;
close STDIN;
open my $o, '>', '/tmp/foobar.txt'
or die;
open my $i, '<', '/etc/passwd'
or die;
open my $i2, '<', '/etc/passwd'
or die;
open my $i3, '<', '/etc/passwd'
or die;
exit $nrWarnings;
And here I run it:
> rm -f /tmp/log.txt ; perl warn.pl; echo $? ; cat /tmp/log.txt
3
1: Filehandle STDIN reopened as $o only for output at warn.pl line 20.
2: Filehandle STDOUT reopened as $i only for input at warn.pl line 22.
3: Filehandle STDERR reopened as $i2 only for input at warn.pl line 24.
I was expecting no warnings and $? == 0. Where is the bug? In my code or in perl?
This may appear similar to How can I reinitialize Perl's STDIN/STDOUT/STDERR?, but there the accepted solution was to close STDIN, STDOUT and STDERR like I do.
Those are warnings, not errors. I suppose they exist because if your program subsequently forked and execed a different program, that program would be mightily confused that its standard input stream is opened for output and its standard output and error streams are opened for input.
It's perfectly legitimate to suppress warnings when you're sure you know what you're doing. In this case, you'd just add no warnings 'io'; prior to your opens.
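For instance, wrapping the opens from the question:

{
    no warnings 'io';   # suppress "Filehandle ... reopened as ..." warnings
    open my $o,  '>', '/tmp/foobar.txt' or die;
    open my $i,  '<', '/etc/passwd'     or die;
    open my $i2, '<', '/etc/passwd'     or die;
}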
Now, right after hitting submit, I think of looking in perldoc perldiag and the warning text is listed there. That leads me to Perl bug #23838 which basically states: "Well, don't close those handles, re-open them to '/dev/null' instead".
And the bug is marked as resolved after that.
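That suggestion amounts to something like this (a sketch, not code from the bug report):

# keep file descriptors 0-2 occupied instead of leaving them free
open STDIN,  '<',  '/dev/null' or die "can't reopen STDIN: $!";
open STDOUT, '>',  '/dev/null' or die "can't reopen STDOUT: $!";
open STDERR, '>&', \*STDOUT    or die "can't reopen STDERR: $!";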
I disagree that re-opening to '/dev/null' is the correct way (tm), but now we venture into opinion, which is off-topic for stackoverflow, so I'll mark this as answered.
Sorry for the noise.

How can I redirect standard output to a file in Perl?

I'm looking for an example of redirecting stdout to a file using Perl. I'm doing a fairly straightforward fork/exec tool, and I want to redirect the child's output to a file instead of the parent's stdout.
Is there an equivalent of dup2() I should use? I can't seem to find it.
From perldoc -f open:
open STDOUT, '>', "foo.out"
The docs are your friend...
As JS Bangs said, an easy way to redirect output is to use the select statement. For example:
print "to console\n";
open OUTPUT, '>', "foo.txt" or die "Can't create filehandle: $!";
select OUTPUT; $| = 1; # make unbuffered
print "to file\n";
print OUTPUT "also to file\n";
print STDOUT "to console\n";
# close current output file
close(OUTPUT);
# reset stdout to be the default file handle
select STDOUT;
print "to console";
The child itself can do select $filehandle to specify that all of its print calls should be directed to a specific filehandle.
The best the parent can do is use system or exec or something of the sort to do shell redirection.
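For example (illustrative file name, with the redirection handled by the shell):

system("ls -l > output.txt 2>&1") == 0
    or warn "command failed: $?";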
open my $fh, '>', $file or die "open: $!";

defined(my $pid = fork) or die "fork: $!";
if (!$pid) {
    open STDOUT, '>&', $fh;
    # do whatever you want
    ...
    exit;
}
waitpid $pid, 0;
print $? == 0 ? "ok\n" : "nok\n";
A strictly informational but impractical answer:
Though there's almost certainly a more elegant way of going about this depending on the exact details of what you're trying to do, if you absolutely must have dup2(), its Perl equivalent is present in the POSIX module. However, in this case you're dealing with actual file descriptors and not Perl filehandles, and correspondingly you're restricted to using the other provided functions in the POSIX module, all of which are analogous to what you would be using in C. To some extent, you would be writing C in very un-Perlish Perl.
http://perldoc.perl.org/POSIX.html
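A minimal sketch of what that looks like (illustrative file name):

use POSIX qw(dup2);

open my $fh, '>', 'foo.out' or die "open: $!";
dup2(fileno($fh), fileno(STDOUT)) or die "dup2: $!";   # fd 1 now refers to foo.out
print "this lands in foo.out\n";                       # PerlIO still writes via fd 1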

Perl open file problem

I am having some trouble trying to print from a file. Any ideas? Thanks
open(STDOUT,">/home/int420_101a05/shttpd/htdocs/receipt.html");
#Results of a sub-routine
&printReceipt;
close(STDOUT);
open(INF,"/home/int420_101a05/shttpd/htdocs/receipt.html"); $emailBody = <INF>;
close(INF);
print $emailBody;
ERRORS: Filehandle STDOUT reopened as INF only for input at ./test.c line 6.
print() on closed filehandle STDOUT at ./test.c line 9.
This discussion addresses the technical reason for the message. Relevant info from the thread is this:
From the open(2) manpage: "When the call is successful, the file descriptor returned will be the lowest file descriptor not currently open for the process."
But STDOUT still refers to filehandle #1. This warning could be useful, although one can argue that further uses of STDOUT as an output filehandle will trigger a warning as well...
So, to summarize: you closed STDOUT (file descriptor 1), and your file was then opened as fd 1. That's due to open()'s properties.
As others have noted, the real reason you're having this problem is that you should not use STDOUT for printing to a file unless there's some special case where it's required.
Instead, open a file for writing using a new file handle:
open(OUTFILE,">/home/int420_101a05/shttpd/htdocs/receipt.html")
|| die "Could not open: $!";
print OUTFILE "data";
close(OUTFILE);
To print to filehandle from subroutine, just pass the file handle as a parameter.
The best way of doing so is to create an IO::File object and pass that object around
my $filehandle = IO::File->new(">$filename") || die "error: $!";
mySub($filehandle);
sub mySub {
    my $fh = shift;
    print $fh "stuff" or die "could not print: $!";
}
You can also set a particular filehandle as the default for print using select, but that is a LOT more fragile and should be avoided in favor of the IO::File solution.
If you want to temporarily change the standard output, use the select builtin. Another option is to localize the typeglob first:
{
    local *STDOUT;
    open STDOUT, '>', 'outfile.txt' or die $!;
    print "Sent to file\n";
}
Don't try to open the STDOUT handle. If you want to print to STDOUT, just use print (with no filehandle argument). If you want to print to something other than STDOUT, use a different name.
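For instance, a hedged rewrite of the original snippet with a lexical handle (paths as in the question):

open my $receipt, '>', '/home/int420_101a05/shttpd/htdocs/receipt.html'
    or die "Could not open: $!";
print {$receipt} "receipt data\n";   # the sub could take $receipt as a parameter
close $receipt;
print "this still goes to the real STDOUT\n";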