I have a Perl script
for my $i (1..10) {
    print "How $i\n";
    sleep(1);
}
I want to run it from another Perl script and capture its output. I know that I can use qx() or backticks, but they return all the output at once, when the program exits.
What I want instead is for the second script to print the first script's output as soon as it is available, i.e. for the example code the output is printed in ten steps from the second script, not in one go.
I looked at this question and wrote my second script as
my $cmd = "perl a.pl";
open my $cmd_fh, "-|", $cmd or die "Can't run $cmd: $!";
while (<$cmd_fh>) {
    print "Received: $_";
    STDOUT->flush();
}
close $cmd_fh;
However, the output is still printed all at once. How can I get this done?
The child sends its output in chunks of 4 KiB or 8 KiB rather than a line at a time. Perl programs, like most programs, flush STDOUT on newline, but only when it is connected to a terminal; otherwise they fully buffer the output. You can add $| = 1; to the child to disable buffering of its STDOUT.
If you can't modify the child, such programs can be fooled by using pseudo-ttys instead of pipes. See IPC::Run. (Search its documentation for "pty".)
Related
I'm having problems with Perl on Windows (both ActivePerl and Strawberry) when redirecting a script's STDOUT to a pipe and using sleep(). Try this:
perl -e "for (;;) { print 'Printing line ', $i++, \"\n\"; sleep(1); }"
This works as expected. Now pipe it to Tee (or some file, same result):
perl -e "for (;;) { print 'Printing line ', $i++, \"\n\"; sleep(1); }" | tee
There's no output at all; tee captures nothing. However, the perl script is still running, and nothing appears on STDOUT until the script finishes, at which point all the output is dumped to tee at once. Except that if the STDOUT buffer fills up, the script might hang.
Now, if you remove the sleep() call, the pipe works as expected! What's going on?
I found a workaround; disabling the STDOUT buffering with $|=1 makes the pipe work when using the sleep, but... why? Can anyone explain and offer a better solution?
You are suffering from buffering. Add $| = 1; to unbuffer STDOUT.
All file handles except STDERR are buffered by default, but STDOUT uses a minimal form of buffering (flushed on newlines) when connected to a terminal. When the terminal is replaced by a pipe, normal full buffering is reinstated.
Removing the sleep call doesn't change anything except to speed things up: instead of taking minutes to fill up the buffer, it takes milliseconds. With or without it, the output is still written in 4 KiB or 8 KiB blocks (depending on your version of Perl).
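The effect is easy to observe on any platform. A small sketch, redirecting to a file (which is fully buffered, just like a pipe): mid-run the file is still empty, and adding $| = 1 changes that:

```shell
# Default (full) buffering: nothing reaches the file during the sleep.
perl -e 'print "Printing line 0\n"; sleep 2; print "Printing line 1\n"' > out.log &
sleep 1
test -s out.log && echo "flushed early" || echo "still buffered"
wait

# With $| = 1 the first line is on disk before the child exits.
perl -e '$| = 1; print "Printing line 0\n"; sleep 2' > out2.log &
sleep 1
test -s out2.log && echo "autoflush works" || echo "still buffered"
wait
```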
I need to know how to return values from one Perl script to another Perl script.
In my first file I call the second file with a statement similar to:
$variable = qx( perl file2.pl --param1 $p1 --param2 $p2);
I have tried with exit and return to get this data, but it is not possible.
Any idea?
Processes are not subroutines.
Communication between processes (“IPC”) is mostly done via normal file handles. Specifically, such file handles can be
STDIN and STDOUT,
pipes that are set up by the parent process and then inherited by the child,
sockets.
Every process also has an exit code. This code is zero for success, and non-zero to indicate a failure. The code can be any integer in the range 0–255. The exit code can be set via the exit function, e.g. exit(1), and is also set by die.
Using STDIN and STDOUT is the normal mode of operation for command line programs that follow the Unix philosophy. This allows them to be chained with pipes to more complex programs, e.g.
cat a b c | grep foo | sort >out
Such a tool can be implemented in Perl by reading from the ARGV or STDIN file handle, and printing to STDOUT:
while (<>) {
    # do something with $_, e.g. uppercase it
    my $output = uc $_;
    print $output;
}
Another program can then feed data to that script and read its output. We can use open to treat the output as a regular file handle:
use autodie;
open my $tool, "-|", "perl", "somescript.pl", "input-data"; # notice -| open mode
while (<$tool>) {
    ...
}
close $tool;
When you want all the output in one variable (scalar or array), you can use qx as a shortcut: my $tool_output = qx/perl somescript.pl input-data/. But this has two disadvantages: first, a shell is launched to parse the command (shell escaping problems, an extra process); second, the output only becomes available once the command has finished. Using open, on the other hand, lets you process the output while the child is still running.
In file2.pl, you must print something to STDOUT. For example:
print "abc\n";
print is the solution.
Sorry for my idiot question!
$variable = qx( perl file2.pl --param1 $p1 --param2 $p2 );
# $variable now contains everything file2.pl printed to STDOUT
I basically want to reopen STDERR/STDOUT so they write to one logfile with both the stream and the timestamp included on every line. So print STDERR "Hello World" prints STDERR: 20130215123456: Hello World. I don't want to rewrite all my print statements into function calls, also some of the output will be coming from external processes via system() calls anyway which I won't be able to rewrite.
I also need for the output to be placed in the file "live", i.e. not only written when the process completes.
(p.s. I'm not asking particularly for details of how to generate timestamps, just how to redirect to a file and prepend a string)
I've worked out the following code, but it's messy:
my $mode = ">>";
my $file = "outerr.txt";
open(STDOUT, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDOUT: \$\_"; }'));
open(STDERR, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDERR: \$\_"; }'));
(The above doesn't add dates, but that should be trivial to add)
I'm looking for a cleaner solution, one that doesn't require quoting Perl code and passing it on the command line, or at least a module that hides some of the complexity. Looking at the code for Capture::Tiny, it doesn't look like it can handle output incrementally, though I'm not sure about that. annotate-output only works on an external command, sadly; I need this to work both for external commands and for ordinary Perl printing.
The child launched via system doesn't write to Perl's STDOUT file handle; it writes directly to the underlying file descriptor and has no access to the variables in your program. Therefore, mechanisms that run code when a Perl file handle is written to (e.g. tie) won't work.
Write another script that runs your script with STDOUT and STDERR replaced with pipes. Read from those pipes and print out the modified output. I suggest using IPC::Run to do this, because it'll save you from using select. You can get away without it if you combine STDOUT and STDERR in one stream.
I have a Perl script that has to wrap a PHP script that produces a lot of output, and takes about half an hour to run.
At moment I'm shelling out with:
print `$command`;
This works in the sense that the PHP script is called and it does its job, but there is no output from Perl until the PHP script finishes half an hour later.
Is there a way I could shell out so that the output from PHP is printed by Perl as soon as it arrives?
The problem is that Perl won't finish reading until the PHP script terminates, and only when it finishes reading will it write. The backticks operator blocks until the child process exits, and nothing will implicitly create a read/write loop for you.
So you need to write one. Try a piped open:
open my $fh, '-|', $command or die "Unable to open: $!";
while (<$fh>) {
    print;
}
close $fh;
This should then read each line as the PHP script writes it, and immediately output it. If the PHP script doesn't output in convenient lines and you want to do it with individual characters, you'll need to look into using read to get data from the file handle, and disable output buffering ($| = 1) on stdout for writing it.
See also http://perldoc.perl.org/perlipc.html#Using-open()-for-IPC
Are you really doing print `$command`?
If you are only running a command and not capturing any of its output, simply use system $command. It will write to stdout directly without passing through Perl.
You might want to investigate Capture::Tiny. IIRC something like this should work:
use strict;
use warnings;
use Capture::Tiny qw/tee/;
my ($stdout, $stderr, @result) = tee { system $command };
Actually, just using system might be good enough, YMMV.
If I have this perl app:
print `someshellscript.sh`;
that prints a bunch of stuff and takes a long time to complete, how can I print that output while the shell script is still executing?
It looks like Perl will only print the result of someshellscript.sh when it completes; is there a way to make the output flush mid-execution?
What you probably want to do is something like this:
open(my $fh, "-|", "someshellscript.sh") or die "Can't run script: $!";
while (<$fh>) {
    print;
}
close($fh);
This runs someshellscript.sh and opens a pipe that reads its output. The while loop reads each line of output generated by the script and prints it. See the open documentation page for more information.
The problem here is that the backticks capture the entire output of your script into a string, which you then print only after the child has exited. For this reason, there is no way to "flush" partial output with print.
Using the system() command should print output continuously, but you won't be able to capture the output:
system "someshellscript.sh";