I would like to make a wrapper that takes data from STDIN and passes it to another script, waits for that script's response on STDOUT, and prints it to STDOUT on the parent side.
I have the following code, but it does not seem to work:
test.pl
#!/usr/bin/perl
#
use IPC::Open2;
$pid = open2( \*RDR, \*WTR, '/usr/bin/perl test2.pl');
while (<STDIN>) {
    print WTR;
}
while (<RDR>) {
    print STDOUT;
}
and in test2.pl, I have:
#!/usr/bin/perl
#
while (<STDIN>) {
    print STDOUT;
}
It seems to write to test2.pl, but I get no output back from it.
Any hints?
Thanks,
You should close WTR when you are done reading from STDIN. Your external command will keep expecting input until you do this, and if you are suffering from buffering, your external program won't terminate and it won't output anything.
You are probably "suffering from buffering" in both your primary script and in your external command.
In your test script, you can add $|=1 to the top of the script to make its output more responsive. You might not be able to affect the output buffering of an arbitrary external command, though.
Update: IPC::Open2 already sets autoflush on the write filehandle, so the external command won't be starved for input.
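Putting that together, a corrected version of the wrapper could look like the sketch below (note that writing all of STDIN before reading any output only works while the child's output fits in the pipe buffer; otherwise the two processes can deadlock):
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Open2;
my $pid = open2( \*RDR, \*WTR, '/usr/bin/perl', 'test2.pl' );
while (<STDIN>) {
    print WTR $_;
}
# Send EOF so test2.pl's read loop can terminate.
close WTR;
while (<RDR>) {
    print STDOUT $_;
}
# Reap the child to avoid leaving a zombie process.
waitpid $pid, 0;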
Related
I would like to capture all output (both STDOUT and STDERR) of a command that also requires user interaction from the terminal window, i.e. it reads STDIN and then prints something to STDOUT.
Here is a minimal version of the script I want to capture the output from:
user.pl:
#! /usr/bin/env perl
use feature qw(say);
use strict;
use warnings;
print "Enter URL: ";
my $ans = <STDIN>;
# do something based on $ans
say "Verification code: AIwquj2VVkwlWEBwway";
say "Access Token: bskjZO8iZotv!";
I tried using Capture::Tiny:
p.pl:
#! /usr/bin/env perl
use feature qw(say);
use strict;
use warnings;
use Capture::Tiny qw(tee_merged);
my $output = tee_merged {
    #STDOUT->autoflush(1); # This does not work
    system "user.pl";
};
if ( $output =~ /Access Token: (.*)$/ ) {
    say $1;
}
but it does not work, since the prompt is not displayed until after the user has entered the input in the terminal.
Edit:
It seems to work fine if I replace user.pl with a Python script. For example:
user.py:
#! /usr/bin/env python3
ans = input( 'Enter URL: ' )
# do something based on ans
print( 'Verification code: AIwquj2VVkwlWEBwway' )
print( 'Access Token: bskjZO8iZotv!' )
TL;DR: There is a solution. It's somewhat ugly, but it works, with some minor caveats.
What's going on? The problem is actually in user.pl. The sample user.pl you provided works like this: it starts by printing the string Enter URL: to its stdout, it then flushes its stdout, and it then reads a line from its stdin. The flushing of stdout happens automatically: when you try to read from stdin with <...> (aka readline), perl flushes stdout, precisely to make programs like this behave correctly. Unfortunately, perl only implements this behavior when stdout is a tty (a terminal); if it isn't, perl does not flush stdout before reading from stdin. This is why the script works when you execute it in an interactive terminal session and misbehaves when you try to capture its output (because then its stdout is connected to a pipe).
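Aside: if user.pl itself can be modified, an explicit flush before the read sidesteps the problem entirely, as in the sketch below; the rest of this answer assumes user.pl cannot be changed.
use IO::Handle;          # for the flush method
print "Enter URL: ";
STDOUT->flush;           # or set $| = 1 at the top of the script
my $ans = <STDIN>;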
How to fix this? Since user.pl misbehaves if its stdout is not a tty, we must give it a tty. AFAIK, IPC::Run is the only perl module that can capture the output of a subprocess through a tty instead of a plain pipe. Unfortunately, when using a tty, IPC::Run does not allow us to redirect stdout only; it forces us to redirect stdin too. Because of that, we have to read from stdin in the parent process on behalf of the child process (yikes!). Here's an example implementation of p.pl using IPC::Run:
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;
use IPC::Run;
my $complete_output = '';
my $in  = '';
my $out = '';
my $h = IPC::Run::start ['./user.pl'], '<pty<', \$in, '>pty>', \$out;
while ($h->pumpable) {
    $h->pump;
    print $out;
    STDOUT->flush;
    if ($out eq 'Enter URL: ') {
        $in .= <STDIN>;
    }
    $complete_output .= $out;
    $out = '';
}
$h->finish;
# do something with $complete_output here
So this is somewhat ugly. For example, we try to detect when the subprocess is waiting for user input (by looking for the string Enter URL:), and when it is, we read the user's input in the parent process and pass it on to the child. Also notice that we have to implement the tee functionality ourselves, since IPC::Run doesn't offer it.
There are some caveats. Because we do all the reading in the parent process with a simple <STDIN>, this will not work if the subprocess uses something like the readline library to support line editing. Also, because a tty is used behind the scenes instead of a pipe, all user input is echoed back: whatever the user types at the prompt, we put into $in to send to the process, and we then get it back from the process (via the $out variable). Since the terminal also echoes what was typed, the text appears twice. One solution is to filter $out to remove the user input so that we don't print it a second time.
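A rough sketch of that filtering, as a variation of the loop above (it assumes the pty echoes each typed line back verbatim; in practice the echo may come back with \r\n line endings, so the match may need loosening):
my $echo = '';
while ($h->pumpable) {
    $h->pump;
    # Strip the echoed copy of the user's input before printing.
    $out =~ s/\Q$echo\E// and $echo = '';
    print $out;
    STDOUT->flush;
    if ($out eq 'Enter URL: ') {
        my $line = <STDIN>;
        $in   .= $line;
        $echo  = $line;    # expect the pty to echo this back
    }
    $complete_output .= $out;
    $out = '';
}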
Finally, this will not work on Windows.
Write your input prompt directly to the tty.
open TTY, '>', '/dev/tty' or die "Can't open tty: $!"; # or 'con' on Windows
print TTY "Enter URL:";
my $ans = <STDIN>;
...
I need to know how I can return values from one Perl file to another Perl file.
In my first file I call the second file with a statement similar to:
$variable = qx( perl file2.pl --param1 $p1 --param2 $p2);
I have tried using exit and return to get this data, but it does not work.
Any idea?
Processes are not subroutines.
Communication between processes (“IPC”) is mostly done via normal file handles. Such file handles can specifically be
STDIN and STDOUT,
pipes that are set up by the parent process and then shared with the child,
sockets.
Every process also has an exit code. This code is zero for success, and non-zero to indicate a failure. The code can be any integer in the range 0–255. The exit code can be set via the exit function, e.g. exit(1), and is also set by die.
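For example, the parent can run the child with system and recover the exit code from $?:
# Run the child; system returns the raw wait status.
my $status = system('perl', 'file2.pl');
my $exit_code = $? >> 8;    # the actual 0-255 exit code
die "file2.pl failed (exit code $exit_code)" if $exit_code != 0;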
Using STDIN and STDOUT is the normal mode of operation for command line programs that follow the Unix philosophy. This allows them to be chained with pipes to more complex programs, e.g.
cat a b c | grep foo | sort >out
Such a tool can be implemented in Perl by reading from the ARGV or STDIN file handle, and printing to STDOUT:
while (<>) {
    # do something with $_ to produce the output
    my $output = $_;
    print $output;
}
Another program can then feed data to that script and read from its STDOUT. We can use open to treat the output as a regular file handle:
use autodie;
open my $tool, "-|", "perl", "somescript.pl", "input-data"; # notice -| open mode
while (<$tool>) {
    ...
}
close $tool;
When you want all the output in one variable (scalar or array), you can use qx as a shortcut: my $tool_output = qx/perl somescript.pl input-data/, but this has two disadvantages. One, a shell process is executed to parse the command (shell escaping problems, inefficiency). Two, the output is only available once the command has finished. Using open, on the other hand, allows you to do parallel computation.
In file2.pl, you must print something to STDOUT. For example:
print "abc\n";
print is the solution.
Sorry for my silly question!
# capture what file2.pl prints to STDOUT
# (note: system would only return the exit status, not the output)
$variable = qx( perl file2.pl --param1 $p1 --param2 $p2 );
# $variable now has the return value of perl file2.pl ...
I basically want to reopen STDERR/STDOUT so they write to one logfile with both the stream and a timestamp included on every line. So print STDERR "Hello World" prints STDERR: 20130215123456: Hello World. I don't want to rewrite all my print statements into function calls; also, some of the output will be coming from external processes via system() calls anyway, which I won't be able to rewrite.
I also need the output to be written to the file "live", i.e. not only when the process completes.
(p.s. I'm not asking particularly for details of how to generate timestamps, just how to redirect to a file and prepend a string)
I've worked out the following code, but it's messy:
my $mode = ">>";
my $file = "outerr.txt";
open(STDOUT, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDOUT: \$\_"; }'));
open(STDERR, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDERR: \$\_"; }'));
(The above doesn't add dates, but that should be trivial to add)
I'm looking for a cleaner solution, one that doesn't require quoting perl code and passing it on the command line, or at least module that hides some of the complexity. Looking at the code for Capture::Tiny it doesn't look like it can handle writing a part of output, though I'm not sure about that. annotate-output only works on an external command sadly, I need this to work on both external commands and ordinary perl printing.
A child launched via system doesn't write to Perl's STDOUT file handle; it writes to the underlying file descriptor and has no access to the variables in your program. That means any approach that runs code when a Perl file handle is written to (e.g. tie) won't work.
Write another script that runs your script with STDOUT and STDERR replaced with pipes, read from those pipes, and print the modified output. I suggest using IPC::Run to do this, because it'll save you from using select. You can get away without it if you combine STDOUT and STDERR into one stream.
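A sketch of that approach using IPC::Run (the wrapped script name and timestamp format are placeholders; new_chunker splits the captured streams into lines before each callback is invoked):
use strict;
use warnings;
use IO::Handle;
use IPC::Run qw(run new_chunker);
use POSIX qw(strftime);
open my $log, '>>', 'outerr.txt' or die "Can't open log: $!";
$log->autoflush(1);    # write each line to the file immediately
my $stamp = sub { strftime '%Y%m%d%H%M%S', localtime };
run [ 'perl', 'yourscript.pl' ],
    '>',  new_chunker, sub { print $log 'STDOUT: ' . $stamp->() . ": $_[0]" },
    '2>', new_chunker, sub { print $log 'STDERR: ' . $stamp->() . ": $_[0]" };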
I have a Perl script that has to wrap a PHP script that produces a lot of output, and takes about half an hour to run.
At moment I'm shelling out with:
print `$command`;
This works in the sense that the PHP script is called and does its job, but no output is rendered by Perl until the PHP script finishes half an hour later.
Is there a way I could shell out so that the output from PHP is printed by Perl as soon as it is received?
The problem is that Perl won't finish reading until the PHP script terminates, and only when it has finished reading will it write anything. The backticks operator blocks until the child process exits, and there's no magic to create a read/write loop implicitly.
So you need to write one. Try a piped open:
open my $fh, '-|', $command or die "Unable to open: $!";
while (<$fh>) {
    print;
}
close $fh;
This should then read each line as the PHP script writes it, and immediately output it. If the PHP script doesn't output in convenient lines and you want to do it with individual characters, you'll need to look into using read to get data from the file handle, and disable output buffering ($| = 1) on stdout for writing it.
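For the non-line-oriented case, here is a sketch using sysread instead, which returns as soon as any data is available rather than waiting for a full line (the command and buffer size are placeholders):
use strict;
use warnings;
$| = 1;    # unbuffer STDOUT so each chunk appears immediately
my $command = 'php longjob.php';
open my $fh, '-|', $command or die "Unable to open: $!";
while ( sysread $fh, my $buf, 4096 ) {
    print $buf;
}
close $fh;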
See also http://perldoc.perl.org/perlipc.html#Using-open()-for-IPC
Are you really doing print `$command`?
If you are only running a command and not capturing any of its output, simply use system $command. It will write to stdout directly without passing through Perl.
You might want to investigate Capture::Tiny. IIRC something like this should work:
use strict;
use warnings;
use Capture::Tiny qw/tee/;
my ($stdout, $stderr, @result) = tee { system $command };
Actually, just using system might be good enough, YMMV.
The external program has an interactive mode that asks for some details. Each passed argument must be confirmed with the return key. So far I have managed to pass arguments to the external process, but the problem is that with more than one argument, Perl sends them all at once when the pipe is closed.
That is impractical for interactive modes where arguments are expected one at a time.
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Open2;
open(HANDLE, "|cmd|");
print HANDLE "time /T\n";
print HANDLE "date /T\n";
print HANDLE "dir\n";
close HANDLE;
Unfortunately you can't pass double pipes to open the way one would like, and loading IPC::Open2 doesn't change that. You have to use the open2 function exported by IPC::Open2.
use strict;
use warnings;
use IPC::Open2;
use IO::Handle; # so we can call methods on filehandles
my $command = 'cat';
# open2 returns the child's PID and throws an exception on failure
my $pid = open2( my $out, my $in, $command );
# Set both filehandles to print immediately and not wait for a newline.
# Just a good idea to prevent hanging.
$out->autoflush(1);
$in->autoflush(1);
# Send lines to the command
print $in "Something\n";
print $in "Something else\n";
# Close input to the command so it knows nothing more is coming.
# If you don't do this, you risk hanging reading the output.
# The command thinks there could be more input and will not
# send an end-of-file.
close $in;
# Read all the output
print <$out>;
# Close the output so the command process shuts down
close $out;
This pattern works if all you have to do is send a command a bunch of lines and then read the output once. If you need to be interactive, it's very, very easy for your program to hang waiting for output that is never coming. For interactive work, I would suggest IPC::Run. It's rather overpowered, but it will cover just about everything you might want to do with an external process.
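For instance, a minimal interactive exchange with IPC::Run might look like this (cat stands in for the interactive command):
use strict;
use warnings;
use IPC::Run qw(start);
my ( $in, $out ) = ( '', '' );
my $h = start [ 'cat' ], \$in, \$out;
# Send a line, then pump until the reply arrives.
$in .= "first\n";
$h->pump until $out =~ /first/;
print $out;
$out = '';
# Interact again before shutting the child down.
$in .= "second\n";
$h->pump until $out =~ /second/;
print $out;
$h->finish;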