I am writing a simple script using IPC::Open3. The script produces no output to either stdout or stderr, while I would expect output to both.
The complete source code:
#!/usr/bin/env perl
use strict;
use warnings;
use utf8;
use IPC::Open3;
pipe my $input, my $output or die $!;
my $pid = open3(\*STDIN, $output, \*STDERR, 'dd', 'if=/usr/include/unistd.h') or die $!;
while(<$input>) {
print $_."\n";
}
waitpid $pid, 0;
I am fairly certain that I am using IPC::Open3 incorrectly. However, I am still confused as to what I should be doing.
It's the pipe. Without knowing why it's there I can't say more. This works fine:
my $reader;
my $pid = open3(\*STDIN, $reader, \*STDERR, 'dd', 'if=/usr/include/unistd.h') or die $!;
while(<$reader>) {
print $_."\n";
}
waitpid $pid, 0;
I realize it's probably just an example, but in case it's not... this is complete overkill for what you're doing. You can accomplish that with backticks.
print `dd if=/usr/include/unistd.h`;
IPC::Open3 is a bit overcomplicated. There are better modules such as IPC::Run and IPC::Run3.
use strict;
use warnings;
use IPC::Run3;
run3(['perl', '-e', 'warn "Warning!"; print "Normal\n";'],
\*STDIN, \*STDOUT, \*STDERR
);
Your program suffers from the following problems:
\*STDIN (open STDIN as a pipe tied to the child's STDIN) should be <&STDIN (use the parent's STDIN as the child's STDIN).
\*STDERR (open STDERR as a pipe tied to the child's STDERR) should be >&STDERR (use the parent's STDERR as the child's STDERR).
The value you place in $output is being ignored and overwritten. Fortunately, it's being overwritten with a correct value!
You use print $_."\n";, but $_ is already newline-terminated. Either chomp first, or don't add a newline.
open3 isn't a system call, so it doesn't set $!.
open3 doesn't return false on error; it throws an exception.
So we get something like:
#!/usr/bin/env perl
use strict;
use warnings;
use feature qw( say );
use IPC::Open3;
my $pid = open3('<&STDIN', my $output, '>&STDERR',
'dd', 'if=/usr/include/unistd.h');
while (<$output>) {
chomp;
say "<$_>";
}
waitpid($pid, 0);
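Since open3 throws an exception instead of returning false, error handling means wrapping the call in eval. A minimal sketch (the command name is invented so the exec fails):

```perl
use strict;
use warnings;
use IPC::Open3 qw( open3 );

# open3 croaks when the exec fails; it neither returns false nor
# sets $!, so catch the exception with eval (or Try::Tiny).
my $pid = eval {
    open3(my $in, my $out, '>&STDERR', 'no-such-command-xyzzy');
};
if (!defined $pid) {
    print "open3 failed: $@";
}
```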
Related
Given the following code, the user will see "Hello World!" printed four times, once a second.
#!/bin/perl
$| = 1;
use feature ':5.10';
use strict;
use warnings;
use constant PERL_SCRIPT => '$|=1; foreach (0..3) {say "World!"; sleep 1}';
open ( my $h, '-|', '/bin/perl', '-wE', PERL_SCRIPT() ) or die $!;
while (<$h>) {
print "Hello $_";
}
How can I achieve this same effect with IPC::Run3? Note, I don't want buffering. I want this to stream.
Why am I using IPC::Run3? I want stdin to be pointed to /dev/null, and I don't want to have to do the actual redirection with another shell exec.
First of all, you could simply use the following along with your existing code if you don't otherwise need STDIN in the parent.
open(STDIN, '<', '/dev/null') or die $!;
I don't think you can with IPC::Run3, but you can with IPC::Run.
use IPC::Run qw( run );
run [ $^X, -wE => PERL_SCRIPT ],
\undef,
sub { print "Hello $_[0]" };
If you wanted to use a pipe, you could use the following:
use IPC::Run qw( start );
use Symbol qw( gensym );
my $h =
start [ $^X, -wE => PERL_SCRIPT ],
\undef,
'>pipe', my $pipe = gensym;
print "Hello $_" while <$pipe>;
$h->finish();
(You can't use run because that waits for the child to finish.)
How can I force a Perl script to die if anything is written to STDERR?
Such action should happen instantly, when such output occurs, or even before, to prevent that output...
This doesn't seem like an especially smart idea, but a tied filehandle should work. According to the perltie manpage:
When STDERR is tied, its PRINT method will be called to issue warnings and error messages. This feature is temporarily disabled during the call, which means you can use warn() inside PRINT without starting a recursive loop.
So something like this (adapted from the manpage example) ought to work:
package FatalHandle;
use strict;
use warnings;
sub TIEHANDLE { my $i; bless \$i, shift }
sub PRINT {
my $r = shift;
die "message to STDERR: ", @_;
}
package main;
tie *STDERR, "FatalHandle";
warn "this should be fatal.";
print "Should never get here.";
And that outputs (with exit code 255):
message to STDERR: this should be fatal. at fh.pl line 17.
Here's a method that works no matter how STDERR (fd 2) is written to, even if it's a C extension that doesn't use Perl's STDERR variable to do so. It will even kill child processes that write to STDERR!
{
pipe(my $r, my $w)
or die("Can't create pipe: $!\n");
open(STDERR, '>&', $w)
or die("Can't dup pipe: $!\n");
close($r);
}
print "abc\n";
print "def\n";
print STDERR "xxx\n";
print "ghi\n";
print "jkl\n";
$ perl a.pl
abc
def
$ echo $?
141
This works because the read end of the pipe is closed, so the first write to STDERR raises SIGPIPE, whose default action kills the process (exit status 141 = 128 + signal 13). It doesn't work on Windows, and it doesn't work if you add a SIGPIPE handler.
In the following code, if there is a space between FILE and ( in the printf statement,
like
printf FILE ("Test string inline\n");
Perl will treat FILE as a filehandle; otherwise,
printf FILE("Test string inline\n");
will be treated as a subroutine call (if no subroutine named FILE is defined, Perl will throw an error: Undefined subroutine &main::FILE called at ./test.pl line xx). Isn't there a better way Perl could handle this? (Maybe this is why bareword filehandles are considered outdated?)
#!/usr/bin/perl
use warnings;
open(FILE,">test.txt");
printf FILE ("Test string inline\n");
close(FILE);
sub FILE
{
return("Test string subroutine\n");
}
Are you asking how to avoid that error accidentally? You could wrap the handle in curlies:
printf({ HANDLE } $pattern, @args);
print({ HANDLE } @args);
say({ HANDLE } @args);
Or since parens are often omitted for say, print and printf:
printf { HANDLE } $pattern, @args;
print { HANDLE } @args;
say { HANDLE } @args;
Or you could use a method call:
HANDLE->printf($pattern, @args);
HANDLE->print(@args);
HANDLE->say(@args);
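One caveat with the method-call forms: on Perls before 5.14, the print/printf/say methods only exist once IO::Handle has been loaded. A small sketch using a lexical handle (writing to an in-memory scalar just to keep it self-contained):

```perl
use strict;
use warnings;
use IO::Handle;   # required for method calls on handles before Perl 5.14

open my $fh, '>', \my $buf or die $!;
$fh->printf("Test %s\n", 'string');
$fh->print("another line\n");
close $fh or die $!;
print $buf;
```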
Try:
#!/usr/bin/env perl
use strict;
use warnings;
# --------------------------------------
use charnames qw( :full :short );
use English qw( -no_match_vars ) ; # Avoids regex performance penalty
my $test_file = 'test.txt';
open my $test_fh, '>', $test_file or die "could not open $test_file: $OS_ERROR\n";
printf {$test_fh} "Test string inline" or die "could not print $test_file: $OS_ERROR\n";
close $test_fh or die "could not close $test_file: $OS_ERROR\n";
Let me elaborate.
Say I have a Perl program
(which was shamelessly copied and edited from perlfaq8:
http://perldoc.perl.org/perlfaq8.html#How-can-I-open-a-pipe-both-to-and-from-a-command%3f
)
use IPC::Open3;
use Symbol qw(gensym);
use IO::File;
local *CATCHOUT = IO::File->new_tmpfile;
local *CATCHERR = IO::File->new_tmpfile;
my $pid = open3(gensym, ">&CATCHOUT", ">&CATCHERR", "ping -t localhost");
#waitpid($pid, 0);
seek $_, 0, 0 for \*CATCHOUT, \*CATCHERR;
while( <CATCHOUT> ) {
print $_;
}
But the problem with the above program is that it does a sort of readToEnd() of the STDOUT belonging to the program (ping.exe in this case), so the output can only be read all at once.
What I want is to be able to read the child's STDOUT as it is being written.
If I remove waitpid() then the program exits immediately, so that doesn't help either.
Is that possible? If so, can you please point me in the right direction?
Update:
Drats!!!! I missed the | symbol... which is essential for piping the output out of ping and into the perl script!!!
use IPC::Open3 qw( open3 );
open(local *CHILD_STDIN, '<', '/dev/null') or die $!;
my $pid = open3(
'<&CHILD_STDIN',
my $child_stdout,
'>&STDERR',
'ping', '-t', 'localhost',
);
while (<$child_stdout>) {
chomp;
print("Got: <<<$_>>>\n");
}
waitpid($pid, 0);
But that can be written as
open(my $ping_fh, '-|', 'ping', '-t', 'localhost') or die $!;
while (<$ping_fh>) {
chomp;
print("Got: <<<$_>>>\n");
}
close($ping_fh);
This just shows the proper usage. If these don't work, it's an unrelated problem: ping is buffering its I/O when not connected to a terminal. You can fool it using a pseudo-tty.
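As a sketch of the pseudo-tty trick, IPC::Run can allocate a pty for the child via its '>pty>' redirection (this assumes IO::Pty is installed; '-c 3' is used so ping terminates on its own — both are assumptions on my part):

```perl
use strict;
use warnings;
use IPC::Run qw( run );

# Giving the child a pseudo-tty for stdout makes ping believe it is
# talking to a terminal, so it line-buffers and the output streams
# out as it is produced instead of arriving in one block at the end.
run ['ping', '-c', '3', 'localhost'],
    '<', \undef,                        # child stdin from nothing
    '>pty>', sub { print "Got: $_[0]" };
```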
One of the strengths (or weaknesses) of perl is that there is more than one way to do things. This works:
perl -e 'open(F,"ping localhost|"); while(<F>) { s/ms/Milliseconds/; print $_; }'
I just put in the s/ms/Milliseconds/ to show that the data is being read and changed.
I'm not sure exactly what you have wrong with Open3.
While running the following Perl program, the output of the child script is printed to the terminal instead of going into $v. Please let me know how to fix it.
open (OUTPUT, '>', \$v);
select OUTPUT;
$| = 1;
open (SUB, "| sh print_user_input.sh");
print SUB "Hello World\n";
close(SUB);
close(OUTPUT);
select STDOUT;
print "Output: $v\n";
The output of the program is:
Hello World
Output:
select doesn't change STDOUT.
open '>', \$buf does not create a system file handle. (Who would read from it and place the data in $buf? Another process cannot write directly to $buf, even if it were a Perl process.)
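To illustrate: an in-memory handle works within the one Perl process, but it has no operating-system file descriptor behind it, which is why a child process can never see it. A small sketch:

```perl
use strict;
use warnings;

# Within the same process, writes through the handle do land in $buf.
open my $fh, '>', \my $buf or die $!;
print {$fh} "Hello World\n";
close $fh;
print "buf=$buf";                 # buf=Hello World

# But the handle is pure Perl: fileno() reports -1, meaning there is
# no real descriptor that a forked child could inherit and write to.
open my $fh2, '>', \my $buf2 or die $!;
print "fileno: ", fileno($fh2), "\n";
```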
One solution:
use IPC::Run3 qw( run3 );
run3 [ 'sh', 'print_user_input.sh' ],
\"Hello World\n",
\my $v;
You've got 2 problems. select does not change STDOUT, it just changes Perl's idea of which filehandle it should be printing to. And in-memory filehandles like you're trying to use only work inside a single Perl process; you can't use them in child processes.
You want to look at IPC::Open3 or a similar module.
Using IPC::Open2's open2 function:
#!/usr/bin/env perl
use strict;
use warnings;
use IPC::Open2;
my $pid = open2( \*CHLD_OUT, \*CHLD_IN, 'sh print_user_input.sh' );
print CHLD_IN "Hello World\n";
close CHLD_IN;
my $output = do { local $/; <CHLD_OUT> };
waitpid $pid, 0;
print "Output: $output";