I feel as though there should be a simple way to do this, but searching around gives me no good leads. I just want to open() a pipe to an application, write some data to it, and have the output of the subprocess sent to the STDOUT of the calling script.
open(my $foo, '|-', '/path/to/foo');
print $foo 'input'; # Should behave equivalently to "print 'output'"
close($foo);
Is there a simple way to do this, or have I hit upon one of the many "can't get there from here" moments in Perl?
The subprocess will inherit STDOUT automatically. This works for me:
open(my $f, "|-", "cat");
print $f "hi\n";
If you are not actually closing the pipe immediately, the problem might be on the other end: STDOUT is line-buffered by default, so you see the output of print "hello world\n" immediately. The pipe to your subprocess, however, is block-buffered by default, so you may actually be waiting for the data from your Perl script to reach the other program:
open(my $f, "|-", "cat");
print $f "hi\n";
sleep(10);
close($f); # or exit
# now output appears
Try adding select $f; $| = 1; (or the more modern way, $f->autoflush(1), which may need use IO::Handle on older perls).
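For instance (a sketch; cat stands in for your real subprocess):

```perl
use strict;
use warnings;
use IO::Handle;   # supplies the autoflush() method on older perls

# 'cat' stands in for the real subprocess; it inherits our STDOUT.
open(my $f, '|-', 'cat') or die "cannot start cat: $!";
$f->autoflush(1);         # same effect as: select $f; $| = 1; select STDOUT;
print $f "hi\n";          # reaches cat (and our STDOUT) immediately
close($f) or die "close failed: $!";
```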
I want to temporarily redirect stdout to an in memory variable. Prints are correctly redirected to my variable but not the output of a pipe (bc in my example). What is going on?
#!/usr/bin/perl
my $stdout_sink;
open(my $orig_stdout, ">&STDOUT") || die $!;
close STDOUT;
open(STDOUT, ">", \$stdout_sink) || die $!;
# produce some output in different ways
print "before bc\n"; # ok
open my $fh, "| bc";
print $fh "2+2\n"; # not ok
close $fh;
close STDOUT;
open(STDOUT, ">&", $orig_stdout) || die $!;
print "$stdout_sink";
Actual output will be:
before bc
Expected output:
before bc
4
This is ... not possible.
The standard output of piped opens and system calls is written to file descriptor 1. Normally, Perl's STDOUT file handle is associated with file descriptor 1, but that can be manipulated.
In this example, the system call writes to the STDOUT filehandle, which writes to the file foo.
close STDOUT; # closes file descriptor 1
open STDOUT, '>', 'foo'; # reopens STDOUT as file descriptor 1
system("echo bar");
close STDOUT;
print STDERR "foo: ",`cat foo`;
# result: "foo: bar"
But in this example, the system call writes to the BAZ filehandle, because BAZ now holds file descriptor 1.
close STDOUT; # closes file descriptor 1
open BAZ, '>', 'baz'; # attaches fd 1 to handle BAZ
open STDOUT, '>', 'foo'; # opens new file descriptor, maybe fd 3
system("echo bar");
close STDOUT;
print STDERR "foo: ",`cat foo`;
print STDERR "baz: ",`cat baz`;
# result: "foo: baz: bar"
An in-memory filehandle is not a real filehandle. If you call fileno on it, you will (generally; this may be OS-dependent) get a negative number.
open STDOUT, '>', \$scalar;
print STDERR fileno(STDOUT); # -1
Piped opens and system calls will not be able to write to this filehandle.
You will need a more complicated workaround, like writing the piped open output to a file, and then copying that file into the in-memory variable.
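A sketch of that workaround, with cat standing in for bc so it runs anywhere (the temp-file handling here is incidental): temporarily point the real file descriptor 1 at a temp file, run the piped open, then slurp the file into the variable.

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($tmp_fh, $tmp_name) = tempfile(UNLINK => 1);

# Save the real STDOUT, then make file descriptor 1 point at the temp file
open(my $orig_stdout, '>&', \*STDOUT) or die "dup: $!";
open(STDOUT, '>&', $tmp_fh)           or die "redirect: $!";

# The child writes to fd 1, which is now the temp file
open(my $cmd, '|-', 'cat') or die "pipe: $!";
print $cmd "2+2\n";
close $cmd;                 # waits for the child, so its output is complete

# Put STDOUT back, then copy the temp file into an in-memory variable
open(STDOUT, '>&', $orig_stdout) or die "restore: $!";
my $captured = do {
    open my $in, '<', $tmp_name or die "read: $!";
    local $/;
    <$in>;
};
print $captured;            # "2+2\n" here; would be "4\n" with bc instead of cat
```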
You have a detailed explanation in mob's answer of why you cannot redirect a child's STDOUT to a variable, which isn't really a filehandle.
Instead, you can use a module for running external programs, that can redirect standard streams to variables. Then you can combine strings with redirected output as you wish.
An example with IPC::Run3
use warnings;
use strict;
use feature 'say';
use IPC::Run3;
open my $fh, ">", \my $so_1;
my $old = select $fh; # make $fh the default for output,
say "To select-ed default"; # so prints end up in $so_1
run3 ["ls", "-l", "./"], undef, \my $so_2; # output goes to $so_2
select $old; # restore STDOUT as default for output
print $so_1, $so_2;
Here I used select to manipulate where prints go by default (without a filehandle specified).
Note that the example redirects run3 to a different variable ($so_2) than the one used for a previous redirect. If you'd rather append to the same variable specify this in %options
run3 ["ls", "-l", "./"], undef, \$so_1, { append_stdout => 1 };
and remove $so_2 from the printing statement.
The module uses temporary files for this redirection, as mob also indicated in the answer.
Some other options are Capture::Tiny, which can redirect output from nearly any code with a simple and clean interface, and the very powerful, well-rounded, and more complex IPC::Run.
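For instance, a Capture::Tiny sketch (assuming the module is installed; it redirects at the file-descriptor level, so child-process output is captured too, unlike an in-memory handle):

```perl
use strict;
use warnings;
use Capture::Tiny qw(capture_stdout);

# Perl-level prints are captured...
my $from_print = capture_stdout { print "from perl\n" };

# ...and so is output written by a child process to fd 1
my $from_child = capture_stdout { system('echo', 'from child') };

print $from_print, $from_child;
```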
Is there a simple way in Perl to send STDOUT or STDERR to multiple places without forking, using File::Tee, or opening a pipe to /usr/bin/tee?
Surely there is a way to do this in pure perl without writing 20+ lines of code, right? What am I missing? Similar questions have been asked, both here on SO and elsewhere, but none of the answers satisfy the requirements that I not have to
fork
use File::Tee / IO::Tee / some other module+dependencies
whose code footprint is 1000x larger than my actual script
open a pipe to the actual tee command
I can see the use of a Core module as a tradeoff here, but really is that needed?
It looks like I can simply do this:
BEGIN {
open my $log, '>>', 'error.log' or die $!;
$SIG{__WARN__} = sub { print $log @_ and print STDERR @_ };
$SIG{__DIE__} = sub { warn @_ and exit 1 };
}
This simply and effectively sends most error messages both to the original STDERR and to a log file (apparently stuff trapped in an eval doesn't show up, I'm told). So there are downsides to this, mentioned in the comments. But as mentioned in the original question, the need was specific. This isn't meant for reuse. It's for a simple, small script that will never be more than 100 lines long.
If you are looking for a way to do this that isn't a "hack", the following was adapted from http://grokbase.com/t/perl/beginners/096pcz62bk/redirecting-stderr-with-io-tee
use IO::Tee;
open my $save_stderr, '>&STDERR' or die $!;
close STDERR;
open my $error_log, '>>', 'error.log' or die $!;
*STDERR = IO::Tee->new( $save_stderr, $error_log ) or die $!;
When writing a daemon, I want to close STDIN, STDOUT and STDERR for "good daemon behavior". But I got surprised: files opened afterwards take on the roles of the old STDIN, STDOUT and STDERR (because their file descriptors get re-used?).
Here is warn.pl:
use warnings;
my $nrWarnings = 0;
$SIG{__WARN__} = sub {
no warnings;
$nrWarnings++;
open my $log, '>>', '/tmp/log.txt'
or die;
printf $log "%d: %s", $nrWarnings, @_;
close $log;
};
close STDOUT;
close STDERR;
close STDIN;
open my $o, '>', '/tmp/foobar.txt'
or die;
open my $i, '<', '/etc/passwd'
or die;
open my $i2, '<', '/etc/passwd'
or die;
open my $i3, '<', '/etc/passwd'
or die;
exit $nrWarnings;
And here I run it:
> rm -f /tmp/log.txt ; perl warn.pl; echo $? ; cat /tmp/log.txt
3
1: Filehandle STDIN reopened as $o only for output at warn.pl line 20.
2: Filehandle STDOUT reopened as $i only for input at warn.pl line 22.
3: Filehandle STDERR reopened as $i2 only for input at warn.pl line 24.
I was expecting no warnings and $? == 0. Where is the bug? In my code or in perl?
This may appear similar to How can I reinitialize Perl's STDIN/STDOUT/STDERR?, but there the accepted solution was to close STDIN, STDOUT and STDERR like I do.
Those are warnings, not errors. I suppose they exist because if your program subsequently forked and execed a different program, that program would be mightily confused that its standard input stream is opened for output and its standard output and error streams are opened for input.
It's perfectly legitimate to suppress warnings when you're sure you know what you're doing. In this case, you'd just add no warnings 'io'; prior to your opens.
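A sketch of that; the no warnings 'io' has to be in scope at the opens that would otherwise trigger the warning (file names taken from warn.pl above):

```perl
use strict;
use warnings;

close STDIN;
close STDOUT;
close STDERR;

my ($o, $i);
{
    no warnings 'io';   # suppress "Filehandle STDIN reopened as ..." here
    open $o, '>', '/tmp/foobar.txt' or die;
    open $i, '<', '/etc/passwd'     or die;
}
# $o and $i have quietly been handed the freed descriptors 0 and 1
```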
Now, right after hitting submit, I think of looking in perldoc perldiag and the warning text is listed there. That leads me to Perl bug #23838 which basically states: "Well, don't close those handles, re-open them to '/dev/null' instead".
And the bug is marked as resolved after that.
I disagree that re-opening to '/dev/null' is the correct way (tm), but now we venture into opinion, which is off-topic for stackoverflow, so I'll mark this as answered.
Sorry for the noise.
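For reference, a sketch of the /dev/null variant of warn.pl's prologue: file descriptors 0 through 2 stay occupied, so later opens cannot be handed a standard descriptor and the warnings never fire:

```perl
use strict;
use warnings;

# Instead of close STDIN/STDOUT/STDERR, repoint them at /dev/null:
open STDIN,  '<', '/dev/null' or die "STDIN: $!";
open STDOUT, '>', '/dev/null' or die "STDOUT: $!";
open STDERR, '>', '/dev/null' or die "STDERR: $!";

# fds 0, 1 and 2 are still in use, so this open gets fd 3 or higher
# and no "Filehandle STDIN reopened as $o" warning is issued.
open my $o, '>', '/tmp/foobar.txt' or die;
```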
I'm using Perl's backticks syntax to run some commands.
I would like the output of the command to written to a file and also printed out to stdout.
I can accomplish the first by adding a > at the end of my back-ticked string, but I do not know how to make the output be printed as soon as it is generated. If I do something like
print `command`;
the output is printed only after command finished executing.
Thanks,
Dave
You cannot do it with the backticks, as they return to the Perl program only when the execution has finished.
So,
print `command1; command2; command3`;
will wait until command3 finishes to output anything.
You should use a pipe instead of backticks to be able to get output immediately:
open (my $cmds, "-|", "command1; command2; command3");
while (<$cmds>) {
print;
}
close $cmds;
After you've done this, you then have to decide whether or not you want buffering (depending on how immediate you want the output to be): Suffering from buffering?
To print and store the output, you can open() a file to write the output to:
open (my $cmds, "-|", "command1; command2; command3");
open (my $outfile, ">", "file.txt");
while (<$cmds>) {
print;
print $outfile $_;
}
close $cmds;
close $outfile;
I'm looking for an example of redirecting stdout to a file using Perl. I'm doing a fairly straightforward fork/exec tool, and I want to redirect the child's output to a file instead of the parent's stdout.
Is there an equivalent of dup2() I should use? I can't seem to find it.
From perldoc -f open:
open STDOUT, '>', "foo.out"
The docs are your friend...
As JS Bangs said, an easy way to redirect output is to use the 'select' statement.
Many thanks to stackoverflow and their users. I hope this is helpful
for example:
print "to console\n";
open OUTPUT, '>', "foo.txt" or die "Can't create filehandle: $!";
select OUTPUT; $| = 1; # make unbuffered
print "to file\n";
print OUTPUT "also to file\n";
print STDOUT "to console\n";
# close current output file
close(OUTPUT);
# reset stdout to be the default file handle
select STDOUT;
print "to console";
The child itself can do select $filehandle to specify that all of its print calls should be directed to a specific filehandle.
The best the parent can do is use system or exec or something of the sort to do shell redirection.
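For example (a sketch; the output file name is arbitrary):

```perl
use strict;
use warnings;

# The shell sets up the child's stdout before the command runs
system('echo hello > /tmp/child_out.txt') == 0
    or die "command failed: $?";
```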
open my $fh, '>', $file;
defined(my $pid = fork) or die "fork: $!";
if (!$pid) {
open STDOUT, '>&', $fh;
# do whatever you want
...
exit;
}
waitpid $pid, 0;
print $? == 0 ? "ok\n" : "nok\n";
A strictly informational but impractical answer:
Though there's almost certainly a more elegant way of going about this depending on the exact details of what you're trying to do, if you absolutely must have dup2(), its Perl equivalent is present in the POSIX module. However, in this case you're dealing with actual file descriptors and not Perl filehandles, and correspondingly you're restricted to using the other provided functions in the POSIX module, all of which are analogous to what you would be using in C. To some extent, you would be writing C in very un-Perlish Perl.
http://perldoc.perl.org/POSIX.html
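A sketch of what the dup2() route looks like, combined with fork/exec (the log file name is arbitrary):

```perl
use strict;
use warnings;
use POSIX qw(dup2);

open my $log, '>', '/tmp/child.log' or die "open: $!";

my $pid = fork;
die "fork: $!" unless defined $pid;
if ($pid == 0) {
    # C-style: make the child's file descriptor 1 the log file
    defined dup2(fileno($log), 1) or die "dup2: $!";
    exec 'echo', 'into the log' or die "exec: $!";
}
waitpid $pid, 0;    # the child's output is in /tmp/child.log
```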