Die if anything is written to STDERR? - perl

How can I force a Perl script to die if anything is written to STDERR?
Such action should happen instantly, when the output occurs, or even before, to prevent that output...

This doesn't seem like an especially smart idea, but a tied filehandle should work. According to the perltie manpage:
When STDERR is tied, its PRINT method will be called to issue warnings and error messages. This feature is temporarily disabled during the call, which means you can use warn() inside PRINT without starting a recursive loop.
So something like this (adapted from the manpage example) ought to work:
package FatalHandle;
use strict;
use warnings;
sub TIEHANDLE { my $i; bless \$i, shift }
sub PRINT {
    my $r = shift;
    die "message to STDERR: ", @_;
}
package main;
tie *STDERR, "FatalHandle";
warn "this should be fatal.";
print "Should never get here.";
And that outputs (with exit code 255):
message to STDERR: this should be fatal. at fh.pl line 17.

Here's a method that works no matter how STDERR (fd 2) is written to, even if it's a C extension that doesn't use Perl's STDERR variable to do so. It will even kill child processes that write to STDERR!
{
    pipe(my $r, my $w)
        or die("Can't create pipe: $!\n");
    open(STDERR, '>&', $w)
        or die("Can't dup pipe: $!\n");
    close($r);
}
print "abc\n";
print "def\n";
print STDERR "xxx\n";
print "ghi\n";
print "jkl\n";
$ perl a.pl
abc
def
$ echo $?
141
Doesn't work on Windows. Doesn't work if you add a SIGPIPE handler.
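The exit status 141 shown above is the shell's encoding of death by SIGPIPE (128 + signal number 13). A minimal sketch of the underlying mechanism, assuming a POSIX system (this is illustrative, not part of the original answer):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Create a pipe and close its read end, so any write raises SIGPIPE.
pipe(my $r, my $w) or die "pipe: $!";
close $r;

my $pid = fork() // die "fork: $!";
if ($pid == 0) {
    # Child: syswrite to a pipe with no readers delivers SIGPIPE,
    # which kills the process by default (no handler installed).
    syswrite($w, "x");
    exit 0;    # never reached
}
waitpid($pid, 0);
printf "child killed by signal %d\n", $? & 127;    # 13 == SIGPIPE
```

The trick in the answer above works the same way: once the read end of the dup'ed pipe is closed, the first write to fd 2 kills the writer.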

Related

How can I read the STDOUT of an external program in realtime?

Let me elaborate.
Say I have a Perl program (which was shamelessly copied and edited from http://perldoc.perl.org/perlfaq8.html#How-can-I-open-a-pipe-both-to-and-from-a-command%3f):
use IPC::Open3;
use Symbol qw(gensym);
use IO::File;

local *CATCHOUT = IO::File->new_tmpfile;
local *CATCHERR = IO::File->new_tmpfile;
my $pid = open3(gensym, ">&CATCHOUT", ">&CATCHERR", "ping -t localhost");
#waitpid($pid, 0);
seek $_, 0, 0 for \*CATCHOUT, \*CATCHERR;
while( <CATCHOUT> ) {
    print $_;
}
But the problem with the above program is that it does a sort of ReadToEnd() on the STDOUT belonging to the program (ping.exe in this case), so it can only be read all at once.
But what I want is to read the STDOUT as it is being written.
If I remove waitpid($pid, 0) then the program exits immediately, so that doesn't help either.
Is that possible? If so, can you please point me in the right direction.
Update:
Drats!!!! I missed the | symbol... which is essential for piping the output out of ping and into the perl script!!!
use IPC::Open3 qw( open3 );

open(local *CHILD_STDIN, '<', '/dev/null') or die $!;
my $pid = open3(
    '<&CHILD_STDIN',
    my $child_stdout,
    '>&STDERR',
    'ping', '-t', 'localhost',
);
while (<$child_stdout>) {
    chomp;
    print("Got: <<<$_>>>\n");
}
waitpid($pid, 0);
But that can be written as
open(my $ping_fh, '-|', 'ping', '-t', 'localhost') or die $!;
while (<$ping_fh>) {
    chomp;
    print("Got: <<<$_>>>\n");
}
close($ping_fh);
This just shows the proper usage. If these don't work, it's an unrelated problem: ping is buffering its I/O when not connected to a terminal. You can fool it using a pseudo-tty.
One of the strengths (or weaknesses) of perl is that there is more than one way to do things. This works:
perl -e 'open(F,"ping localhost|"); while(<F>) { s/ms/Milliseconds/; print $_; }'
I just put in the s/ms/Milliseconds/ to show that the data is being read and changed.
Not sure exactly what you have wrong with Open3.

perl background process

I am trying to run a background process in Perl. I create a child process, which is used to call another Perl script. I want to run a few lines of code in parallel with this child process, and after the child process is done, I want to print a line of text.
Main script
#!/usr/bin/perl
$| = 1;
print "before the child process\n";
my $pid = fork();
if (defined $pid)
{
    system("perl testing.pl");
}
print "before wait command\n";
wait();
print "after 20 secs of waiting\n";
testing.pl
#!/usr/bin/perl
print "inside testing\n";
sleep(20);
Expected output
before the child process
before wait command
(should wait for 20 secs and then print)
after 20 secs of waiting
There are many problems with your script. Always:
use strict;
use warnings;
Localising special variables is also a good practice. Only a variable containing the special value undef returns false for defined; every other value (even the 0 that fork returns in the child, which is the case here) returns true for defined, so both parent and child run system(). In the other script, the shebang is wrong.
#!/usr/bin/perl
use strict;
use warnings;

local $| = 1;
print "Before the child process\n";
unless (fork) {
    system("perl testing.pl");
    exit;
}
print "Before wait command\n";
wait;
print "After 20 secs of waiting\n";
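To make the defined-vs-zero point concrete, here is a small illustrative sketch (not part of the original answer) of fork's two return values:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $pid = fork() // die "fork failed: $!";
if ($pid == 0) {
    # Child: fork returned 0, which is defined but false.
    print "child: got 0 (defined, but false)\n";
    exit 0;
}
# Parent: fork returned the child's positive pid, defined and true.
print "parent: got pid $pid\n";
waitpid $pid, 0;
```

Testing `defined $pid` only tells you the fork itself succeeded; testing the value against 0 is what distinguishes child from parent.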
The “Background Processes” section of the perlipc documentation reads
You can run a command in the background with:
system("cmd &");
The command’s STDOUT and STDERR (and possibly STDIN, depending on your shell) will be the same as the parent’s. You won't need to catch SIGCHLD because of the double-fork taking place; see below for details.
Adding an ampersand to the argument to system in your program can vastly simplify your main program.
#! /usr/bin/env perl

print "before the child process\n";
system("perl testing.pl &") == 0
    or die "$0: perl exited " . ($? >> 8);
print "before wait command\n";
wait;
die "$0: wait: $!" if $? == -1;
print "after 20 secs of waiting\n";
fork return value handling is a bit tricky, indeed.
A recent article by Aristotle features a nice and concise forking idiom, which, in your case, looks like:
#!/usr/bin/env perl
use 5.010000;
use strict;
use warnings qw(all);

say 'before the child process';
given (fork) {
    when (undef) { die "couldn't fork: $!" }
    when (0) {
        exec $^X => 'testing.pl';
    }
    default {
        my $pid = $_;
        say 'before wait command';
        waitpid $pid, 0;
        say 'after 20 secs of waiting';
    }
}
Pay attention to the exec $^X => '...' line: the $^X variable holds the full path to the current Perl executable, so the "right Perl version" is guaranteed. Also, calling system is pointless when you're pre-forking; exec replaces the child process directly.
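A quick illustration of $^X (a sketch; it assumes $^X contains no shell metacharacters beyond spaces, since qx goes through the shell):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# $^X is the path of the perl binary currently running this script.
print "running under: $^X\n";

# Re-invoking $^X guarantees a child sees the same perl version,
# even when several perls are installed.
my $out = qx{"$^X" -e "print 2 + 2"};
print "child said: $out\n";    # child said: 4
```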

Perl STDOUT redirection not working in child

While running the following Perl program, the output of the child script is printed to the terminal instead of going into $v. Please let me know how to fix it.
open (OUTPUT, '>', \$v);
select OUTPUT;
$| = 1;
open (SUB, "| sh print_user_input.sh");
print SUB "Hello World\n";
close(SUB);
close(OUTPUT);
select STDOUT;
print "Output: $v\n";
The output of the program is:
Hello World
Output:
select doesn't change STDOUT.
open '>', \$buf does not create a system file handle. (Who would read from it and place the data in $buf? Another process cannot write directly to $buf, even if it were a Perl process.)
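This is easy to verify: an in-memory handle has no OS-level file descriptor, so fileno reports -1 and there is nothing for a child process to inherit (a small sketch, not part of the original answer):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# An in-memory filehandle writes into a Perl scalar, not an OS file.
open(my $mem, '>', \my $buf) or die "open: $!";
print {$mem} "hello";
printf "fileno: %d\n", fileno($mem);    # -1: no real descriptor exists
close $mem;                             # flushes into $buf
print "buffer: $buf\n";                 # buffer: hello
```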
One solution:
use IPC::Run3 qw( run3 );

run3 [ 'sh', 'print_user_input.sh' ],
    \"Hello World\n",
    \my $v;
You've got 2 problems. select does not change STDOUT, it just changes Perl's idea of which filehandle it should be printing to. And in-memory filehandles like you're trying to use only work inside a single Perl process; you can't use them in child processes.
You want to look at IPC::Open3 or a similar module.
Using IPC::Open2's open2 function:
#!/usr/bin/env perl
use strict;
use warnings;
use IPC::Open2;
my $pid = open2( \*CHLD_OUT, \*CHLD_IN, 'sh print_user_input.sh' );
print CHLD_IN "Hello World\n";    # write to the child's stdin, not its stdout
close CHLD_IN;                    # signal EOF so the child can finish
my $output = do { local $/; <CHLD_OUT> };
print "Output: $output";

How to read to and write from a pipe in Perl?

I need to modify an existing Perl program. I want to pipe a string (which can contain multiple lines) through an external program and read the output from this program. This external program is used to modify the string. Let's simply use cat as a filter program. I tried it like this, but it doesn't work. (Output of cat goes to STDOUT instead of being read by perl.)
#!/usr/bin/perl
open(MESSAGE, "| cat |") or die("cat failed\n");
print MESSAGE "Line 1\nLine 2\n";
my $message = "";
while (<MESSAGE>)
{
    $message .= $_;
}
close(MESSAGE);
print "This is the message: $message\n";
I've read that this isn't supported by Perl because it may end up in a deadlock, and I can understand it. But how do I do it then?
You can use IPC::Open3 to achieve bi-directional communication with the child.
use strict;
use IPC::Open3;

# Note: open3 raises an exception on failure rather than returning false.
my $pid = open3(\*CHLD_IN, \*CHLD_OUT, \*CHLD_ERR, 'cat')
    or die "open3() failed $!";
my $r;

for (my $i = 1; $i < 10; $i++) {
    print CHLD_IN "$i\n";
    $r = <CHLD_OUT>;
    print "Got $r from child\n";
}
This involves system programming, so it’s more than a basic question. As written, your main program doesn’t require full-duplex interaction with the external program. Dataflow travels in one direction, namely
string → external program → main program
Creating this pipeline is straightforward. Perl’s open has a useful mode explained in the “Safe pipe opens” section of the perlipc documentation.
Another interesting approach to interprocess communication is making your single program go multiprocess and communicate between—or even amongst—yourselves. The open function will accept a file argument of either "-|" or "|-" to do a very interesting thing: it forks a child connected to the filehandle you’ve opened. The child is running the same program as the parent. This is useful for safely opening a file when running under an assumed UID or GID, for example. If you open a pipe to minus, you can write to the filehandle you opened and your kid will find it in his STDIN. If you open a pipe from minus, you can read from the filehandle you opened whatever your kid writes to his STDOUT.
This is an open that involves a pipe, which gives nuance to the return value. The perlfunc documentation on open explains.
If you open a pipe on the command - (that is, specify either |- or -| with the one- or two-argument forms of open), an implicit fork is done, so open returns twice: in the parent process it returns the pid of the child process, and in the child process it returns (a defined) 0. Use defined($pid) or // to determine whether the open was successful.
To create the scaffolding, we work in right-to-left order using open to fork a new process at each step.
1. Your main program is already running.
2. Fork a process that will eventually become the external program. Inside that process:
   a. First fork the string-printing process so as to make its output arrive on our STDIN.
   b. Then exec the external program to perform its transformation.
3. Have the string-printer do its work and then exit, which kicks up to the next level.
4. Back in the main program, read the transformed result.
With all of that set up, all you have to do is implant your suggestion at the bottom, Mr. Cobb.
#! /usr/bin/env perl

use 5.10.0;  # for defined-or and given/when
use strict;
use warnings;

my @transform = qw( tr [A-Za-z] [N-ZA-Mn-za-m] );  # rot13
my @inception = (
    "V xabj, Qnq. Lbh jrer qvfnccbvagrq gung V pbhyqa'g or lbh.",
    "V jnf qvfnccbvagrq gung lbh gevrq.",
);

sub snow_fortress { print map "$_\n", @inception }

sub hotel {
    given (open(STDIN, "-|") // die "$0: fork: $!") {
        snow_fortress when 0;
        exec @transform or die "$0: exec: $!";
    }
}

given (open(my $fh, "-|") // die "$0: fork: $!") {
    hotel when 0;
    print while <$fh>;
    close $fh or warn "$0: close: $!";
}
Thanks for the opportunity to write such a fun program!
You can use the -n commandline switch to effectively wrap your existing program code in a while-loop... look at the man page for -n:
LINE:
while (<>) {
    ...  # your program goes here
}
Then you can use the operating system's pipe mechanism directly
cat file | your_perl_prog.pl
(Edit)
I'll try to explain this more carefully...
The question is not clear about what part the perl program plays: filter or final stage. This works in either case, so I will assume it is the latter.
'your_perl_prog.pl' is your existing code. I'll call your filter program 'filter'.
Modify your_perl_prog.pl so that the shebang line has an added '-n' switch: #!/usr/bin/perl -n or #!/bin/env "perl -n"
This effectively puts a while(<>){} loop around the code in your_perl_prog.pl
add a BEGIN block to print the header:
BEGIN { print "HEADER LINE\n"; }
You can read each line with '$line = <>;' and process/print
Then invoke the lot with
cat sourcefile |filter|your_perl_prog.pl
I want to expand on @Greg Bacon's answer without changing it.
I had to execute something similar, but wanted to code without the given/when constructs, and also found that explicit exit() calls were missing, because the sample code fell through and exited.
I also had to make it work on a version running ActiveState Perl, where open($fh, "-|") does not work.
See this question: How to read to and write from a pipe in perl with ActiveState Perl?
#! /usr/bin/env perl
use strict;
use warnings;
my $isActiveStatePerl = defined(&Win32::BuildNumber);

sub pipeFromFork
{
    return open($_[0], "-|") if !$isActiveStatePerl;
    # ActiveState perl cannot cope with dup'ed file handles after fork,
    # so emulate open(..., "-|") with an explicit pipe and fork:
    pipe $_[0], my $child or die "cannot create pipe";
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid) {    # parent
        close $child;
    } else {       # child
        open(STDOUT, ">&=", $child) or die "cannot clone child to STDOUT";
        close $_[0];
    }
    return $pid;
}
my @transform = qw( tr [A-Za-z] [N-ZA-Mn-za-m] );  # rot13
my @inception = (
    "V xabj, Qnq. Lbh jrer qvfnccbvagrq gung V pbhyqa'g or lbh.",
    "V jnf qvfnccbvagrq gung lbh gevrq.",
);

sub snow_fortress { print map "$_\n", @inception }
sub hotel
{
    my $fh;
    my $pid = pipeFromFork($fh);    # my $pid = open STDIN, "-|";
    defined($pid) or die "$0: fork: $!";
    if (0 == $pid) {
        snow_fortress;
        exit(0);
    }
    open(STDIN, "<&", $fh) or die "cannot clone to STDIN";
    exec @transform or die "$0: exec: $!";
}
my $fh;
my $pid = pipeFromFork($fh);    # my $pid = open my $fh, "-|";
defined($pid) or die "$0: fork: $!";
if (0 == $pid) {
    hotel;
    exit(0);
}
print while <$fh>;
close $fh or warn "$0: close: $!";
The simplest way to do what the OP needs, not involving all these cool internals, is to use a temporary file to hold the output until the external processor is done, like so:
open ToTemp, "| /usr/bin/tac > /tmp/MyTmp$$.whee" or die "open the tool: $!";
print ToTemp $TheMessageWhateverItIs;
close ToTemp;
my $Result = `cat /tmp/MyTmp$$.whee`;  # or open and read it, or use File::Slurp, etc.
unlink "/tmp/MyTmp$$.whee";
Of course, this isn't going to work for something interactive, but co-routines appear to be out of the scope of the original question.

How can I print to a variable instead of a file, in Perl?

How can I print to a variable with Perl?
I've been working on a program for a while which logs its iterative progress in a highly verbose fashion...
print $loghandle $some_message;
However, I'd like to also selectively print some of the messages to a different file. Naturally, I could sprinkle the code with...
print $loghandle $some_message;
print $otherloghandle $some_message;
Or rewrite the whole business into a function. Blah.
What I want to do is some magic when I open $loghandle, so that when I print, I'm actually just doing a sprintf-ish operation against a variable (call it $current_iteration), so that when I get down to a decision point I can do something like this...
print $real_log_file $current_iteration;
print $other_real_log_file $current_iteration if($condition);
I'm fairly sure I've seen something like this somewhere, but I have no idea where it is or where to look.
edit: File::Tee solves this problem to some extent on *nix, but I run on Windows.
You can treat a scalar variable as a filehandle by opening it:
open my $fh, '>', \$variable or die "Can't open variable: $!";
print $fh "Treat this filehandle like any other\n";
You can even map stdout or stderr to a scalar:
close STDOUT;
open STDOUT, '>', \$variable or die "Can't open STDOUT: $!";
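That redirection can be wrapped in a block with local, so the real STDOUT comes back automatically when the block ends (a small sketch, not part of the original answer):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $captured;
{
    local *STDOUT;    # restores the real STDOUT at the end of the block
    open STDOUT, '>', \$captured or die "reopen STDOUT: $!";
    print "this goes into the variable\n";
}
print "captured: $captured";    # captured: this goes into the variable
```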
If you want to split your output or set up a config file to do "interesting" things with your logging, you are better off with Log4Perl as others have suggested.
Do you mean something like IO::Scalar? It lets you write to a variable with filehandle semantics.
If you want to do selective logging where you can control which messages are logged and where they are logged, use Log::Log4perl. That will save you a bunch of time over messing with ties and other black magic.
You can use File::Tee to split a filehandle into multiple output streams.
use File::Tee;

open my $fh, '>', 'logfile.txt' or die $!;
tee( $fh, '>', 'otherlogfile.txt' ) if $condition;
print $fh $current_iteration;  # will also go to otherlogfile.txt
                               # if $condition was true
perlfaq5 recommends Tie::FileHandle::Multiplex for printing to multiple files.
The source is very simple and it should be easy to modify with a per-handle filter.
Sounds like you want to tie your filehandle.
my $x;
# printing to LOG will update the variable $x;
# when you close LOG, it will print $x to a filehandle depending
# on code in the function Print_to_variable::CLOSE
tie *LOG, "Print_to_variable", \$x, $output_fh1, $output_fh2;
print LOG "stuff";
print LOG "more stuff";
close LOG;

sub Print_to_variable::TIEHANDLE {
    my ($class, $ref, $fh1, $fh2) = @_;
    my $self = {};
    $self->{ref}        = $ref;
    $self->{output_fh1} = $fh1;
    $self->{output_fh2} = $fh2;
    bless $self, "Print_to_variable";
    $self;
}

sub Print_to_variable::PRINT {
    my ($self, @list) = @_;
    ${ $self->{ref} } .= join "", @list;
}

sub Print_to_variable::CLOSE {
    my $self = shift;
    my $text = ${ $self->{ref} };
    if ( &myCondition1($text) ) {  # ... however you decide where to print
        print { $self->{output_fh1} } $text;
    } else {
        print { $self->{output_fh2} } $text;
    }
}
This is a tremendous hack, and I think mobrule's solution or (especially) Sinan's Log4Perl suggestion are the way to go when I have time.
But, this is what I'm using, as a completion thing:
sub print_to_var($$) {
    my ($rx_var, $scalar) = @_;
    $$rx_var .= $scalar;
}

print_to_var \$logvar, $message;
# ...later on...
print $somefile $logvar;
Edit:
Since this is community wiki, it's worth noting that Perl aliases arguments to functions. That means you can just write this:
sub print_to_var($$) {
    $_[0] .= $_[1];
}
And then say:
my $foo = "OH HAI. ";
print_to_var $foo, "I ARE HELO KITTIE.";
say $foo; # OH HAI. I ARE HELO KITTIE.
This is not a particularly tremendous hack, although print_to_var is a lot more typing than . is.
And here is the HELO KITTIE: http://blogs.philadelphiaweekly.com/style/files/2008/11/hello-kitty-color.gif