I have a Perl script that I am running on Windows. In the script I call another Perl script. I am trying to get both of those scripts to print to the cmd window and a file. This is basically how I am doing the call:
use IO::Tee;
open (my $file, '>>', "C:\\Logs\\logfile.txt") or die "couldn't open log file: $!";
my $tee = IO::Tee->new(\*STDOUT, $file);
# doing some stuff
print $tee "log about what i just did";
# do more stuff
print $tee "more logs";
print $tee `c:\\secondScript.pl arg1`;
print $tee "done with script";
The second script is basically
# do stuff
print "script 2 log about stuff";
# do more stuff
print "script 2 log about more stuff";
print "script 2 done";
This does get everything to the screen and a file. However, I don't see the "script 2 log about stuff", "script 2 log about more stuff", and "script 2 done" until after script 2 has finished. I would like to see all of that stream to the screen and the file as soon as the print is reached.
Printing to STDOUT is usually line buffered (to speed things up) when output goes to a terminal and block buffered otherwise (e.g. when redirecting output to a file).
You can think of it as if anything printed is first placed into a buffer (typically 4096 bytes large), and only when the buffer is full (i.e. 4096 characters have been printed) does it get written to the screen.
Line buffered means the output is only shown on screen after a \n is found (or the buffer is exhausted). You don't have \ns in your 2nd script, so no output is shown until either a) a \n comes, b) the buffer is full, or c) the script ends.
Block buffered means the output is shown only when the buffer is full. \ns don't influence this here (except for counting as one character each).
To avoid buffering there's a magic variable called $|. From the docs (scroll down to the $| section):
If set to nonzero, forces a flush right away and after every write or
print on the currently selected output channel.
So you could append "\n" to your print statements or – better – set $| = 1; at the top of your 2nd script (only once, not before each print). This will slow down the output of the 2nd script (in theory), but for a few lines it will make no difference.
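A minimal sketch of the second script with autoflush turned on (the log messages are the ones from the question):

```perl
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;   # flush STDOUT after every print

# do stuff
print "script 2 log about stuff";
# do more stuff
print "script 2 log about more stuff";
print "script 2 done";
```

Note that with backticks the parent still collects the child's entire output before printing it to the tee, so to see the lines as they arrive you would also need to read the child through a piped open and print each line as it comes in.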
Related
I have a Perl script
for my $i (1..10) {
print "How $i\n";
sleep(1);
}
I want to run it from another perl script and capture its output. I know that I can use qx() or backticks, but they will return all the output simultaneously when the program exits.
But what I instead want is for the second script to print the output of the first script as soon as it is available, i.e. for the example code, the output of the first script is printed in ten steps by the second script and not in one go.
I looked at this question and wrote my second script as
my $cmd = "perl a.pl";
open my $cmd_fh, "$cmd |" or die "cannot run $cmd: $!";
while(<$cmd_fh>) {
print "Received: $_\n";
STDOUT->flush();
}
close $cmd_fh;
However, the output is still printed all at once. How can I get this done?
The child sends output in chunks of 4 KiB or 8 KiB rather than a line at a time. Perl programs, like most programs, flush STDOUT on linefeed, but only when connected to a terminal; they fully buffer the output otherwise. You can add $| = 1; to the child to disable buffering of STDOUT.
If you can't modify the child, such programs can be fooled by using pseudo-ttys instead of pipes. See IPC::Run. (Search its documentation for "pty".)
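A minimal sketch using IPC::Run, assuming the child is the a.pl from the question; the '>pty>' redirection gives the child a pseudo-tty, so a program that line-buffers only when talking to a terminal flushes each line:

```perl
use strict;
use warnings;
use IO::Handle;
use IPC::Run qw(run);

# The code ref is called with each chunk of output as it arrives,
# instead of waiting for the child to exit.
run ['perl', 'a.pl'], '>pty>', sub {
    my ($chunk) = @_;
    print "Received: $chunk";
    STDOUT->flush();
};
```

This requires the IPC::Run module from CPAN, and pty support is only available on systems that provide pseudo-ttys.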
My program is trying to print to a file which, for it, is STDOUT.
That is, print "text here"; prints to a file x.log, while I am also trying to print to the same x.log using a file handle, as in print FH1 "text here";. I notice that when the file-handle print statement comes first, followed by the STDOUT print, the second print can overwrite the first. I would like to know more about why this occurs.
This makes me think of a race condition, or that the file handle is relatively slow (if it goes through a buffer?) compared to the STDOUT print statements. I am not sure whether that is the way Perl works. Perl version: 5.22.0
As far as I understand your program basically looks like this:
open(my $fh,'>','foobar.txt');
print $fh "foo\n";
print "bar\n"; # prints to STDOUT
And then you use it in a way that STDOUT is redirected in the shell to the same file which is already opened in your program:
$ perl test.pl > foobar.txt
This will open two independent file handles to the same file: one within your program and the other within the shell where you start the program. Both file handles manage their own file position for writing, start at position 0 and advance the position after each write.
Since these file handles are independent from each other they will not care if there are other file handles dealing currently with this file, no matter if these other file handles are inside or outside the program. This means that these writes will overwrite each other.
In addition to this there is also internal buffering: each print first writes into an internal buffer and may or may not immediately result in a write to the file handle. When the data are written to the file handle depends on the mode of the handle, i.e. unbuffered, line-buffered, or buffered with a specific size. This makes the result somewhat unpredictable.
If you don't want this behavior but still want to write to the same file using multiple file handles, you had better use append-mode, i.e. open with >> instead of > in both the Perl code and the shell. This makes sure that all data are appended to the end of the file instead of written at the file position maintained by each file handle, so data will not get overwritten. Additionally you might want to make the file handles unbuffered so that data end up in the file in the same order as the print statements were done:
open(my $fh,'>>','foobar.txt');
$fh->autoflush(1); # make $fh unbuffered
$|=1; # make STDOUT unbuffered
print $fh "foo\n";
print "bar\n"; # prints to STDOUT
$ perl test.pl >> foobar.txt
I have this code in Perl:
print "Processing ... ";
while ( some condition ) {
# do something over than 10 minutes
}
print "OK\n";
Now I only get the first print after the while loop has finished.
How can I print the message before the while loop starts?
Output is buffered, meaning the program decides when it actually renders what you printed. You can put
$| = 1;
near the top of your script to flush STDOUT in this single instance. For more methods (auto-flushing, file flushing, etc.) you can search around SO for questions about this.
Ordinarily, perl will buffer up to 8KB of output text before flushing it to the device, or up to the next newline if the device is a terminal. You can avoid this by adding
STDOUT->autoflush
to the top of your code, assuming that you are printing to STDOUT. This will force the data to be flushed after every print, say, or write operation.
Note that this is the same as using $| = 1 but is significantly less cryptic, and it allows you to change the properties of any given file handle.
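As a sketch of that last point, the same method call also unbuffers an individual lexical file handle, something the bare $| = 1 cannot do without select gymnastics (the file name here is illustrative):

```perl
use strict;
use warnings;
use IO::Handle;   # supplies autoflush() as a method on older perls

# Hypothetical log file; any writable path works.
open my $log, '>>', 'progress.log' or die "cannot open progress.log: $!";
$log->autoflush(1);               # unbuffer just this one handle

print {$log} "step 1 done\n";     # written to the file immediately
```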
You can see the prints by flushing the buffers immediately after.
print "Processing ... ";
STDOUT->flush;
If you are using autoflush, you should save the current configuration by duplicating the file handle.
use autodie; # dies if error on open and close.
{
STDOUT->flush; # empty its buffer
open my $saved_stdout, '>&', \*STDOUT;
STDOUT->autoflush;
# ... output with autoflush active
open STDOUT, '>&', $saved_stdout; # restore old STDOUT
}
See perldoc -f open and search for /\>\&/
I have a script called my_bash.sh and it calls a perl script, and appends the output of the perl script to a log file. Does the file only get written once the perl script has completed?
my_bash.sh
#!/bin/bash
echo "Starting!" > "my_log.log"
perl my_perl.pl >> "my_log.log" 2>&1
echo "Ending!" >> "my_log.log"
The issue is that as the perl script is running, I'd like to manipulate contents of the my_log.log file while it's running, but it appears the file is blank. Is there a proper way to do this? Please let me know if you'd like more information.
my_perl.pl
...
foreach $component (@arrayOfComponents)
{
print "Component Name: $component (MORE_INFO)\n";
# Do some work to gather more info (including other prints)
#...
# I want to replace "MORE_INFO" above with what I've calculated here
system("sed 's/MORE_INFO/$moreInfo/' my_log.log");
}
The sed isn't working correctly since the print statements haven't yet made it to the my_log.log.
Perl would buffer the output by default. To disable buffering set $| to a non-zero value. Add
$|++;
at the top of your perl script.
Quoting perldoc perlvar:
$|
If set to nonzero, forces a flush right away and after every write or
print on the currently selected output channel. Default is 0
(regardless of whether the channel is really buffered by the system or
not; $| tells you only whether you've asked Perl explicitly to flush
after each write). STDOUT will typically be line buffered if output is
to the terminal and block buffered otherwise. Setting this variable is
useful primarily when you are outputting to a pipe or socket, such as
when you are running a Perl program under rsh and want to see the
output as it's happening. This has no effect on input buffering. See
getc for that. See select on how to select the output channel. See
also IO::Handle.
Mnemonic: when you want your pipes to be piping hot.
The answer to this question depends on how your my_perl.pl is outputting data and how much data is being output.
If you're using normal (buffered) I/O to produce your output, then my_log.log will only be written to once the STDOUT buffer fills. Generally speaking, if you're not producing a lot of output, this is when the program ends and the buffer is flushed.
If you're producing enough output to fill the output buffer, you will get output in my_log.log prior to my_perl.pl completing.
Additionally, in Perl, you can make your STDOUT unbuffered with the following code:
select STDOUT; $| = 1;
In which case, your output would be written to STDOUT (and then to my_log.log via redirection) the moment it is produced in your script.
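A sketch of how the top of my_perl.pl might look with that change (the printed component name is a placeholder):

```perl
#!/usr/bin/perl
use strict;
use warnings;

select STDOUT;   # STDOUT is selected by default; shown here for clarity
$| = 1;          # flush after every print on the selected handle

# Each print now reaches my_log.log (via the shell redirection) immediately,
# so the later sed call can see it.
print "Component Name: some_component (MORE_INFO)\n";
```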
Depending on what you need to do to the log file, you might be able to read each line of output from the perl script, do something with the line, then write it to the log yourself (or not):
#!/bin/bash
echo "Starting!" > "my_log.log"
perl my_perl.pl | \
while read line; do
# do something with the line
echo "$line" >> "my_log.log"
done
echo "Ending!" >> "my_log.log"
In between
print "Component Name: $component (MORE_INFO)\n";
and
system("sed 's/MORE_INFO/$moreInfo/' my_log.log");
do you print stuff? If not, delay the first print until you've figured out $moreInfo
my $header = "Component Name: $component (MORE_INFO)";
# ...
# now I have $moreInfo
$header =~ s/MORE_INFO/$moreInfo/;
print $header, "\n";
If you do print stuff, you could always "queue" it until you have the info you need:
my @output;
foreach my $component (...) {
    @output = ("Component Name: $component (MORE_INFO)");
    # ...
    push @output, "something to print";
    # ...
    $output[0] =~ s/MORE_INFO/$moreInfo/;
    print join("\n", @output), "\n";
}
I am trying to write a Perl CGI which executes an RKHunter scan. While executing the command, I would like to show something to indicate progress instead of the actual output, which is to be redirected to another file. The code thus far is:
open(my $quik_rk, '-|', 'rkhunter', '--enable', '"known_rkts"') or print "ERROR RUNNING QUICK ROOTKIT CHECK!!";
while (<$quik_rk>) {
    print ".";
}
print "\n";
close($quik_rk);
This doesn't show any output, and I am presented with a blank screen while waiting for execution to complete. All the dots are printed to the screen together instead of one by one. Moreover, when I use the following to redirect, the command doesn't execute at all:
open(my $quik_rk, '-|', 'rkhunter', '--enable', '"known_rkts"', '>>', '/path/to/file') or print "ERROR RUNNING QUICK ROOTKIT CHECK!!";
How can I fix this in such a way that the verbose output is redirected to a file and only a .... steadily progresses on the screen?
$|=1;
At the beginning of your script.
This turns autoflush on so every print actually prints instead of waiting for a newline before flushing the buffer.
Also see: http://perldoc.perl.org/perlvar.html#Variables-related-to-filehandles
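Putting the pieces together, a sketch that writes the verbose output to the file inside the read loop while printing one dot per line (the log path from the question is kept as a placeholder; the extra quotes around known_rkts are dropped, since list-form open passes arguments verbatim):

```perl
use strict;
use warnings;

$| = 1;   # unbuffer our own STDOUT so each dot appears as it is printed

open my $quik_rk, '-|', 'rkhunter', '--enable', 'known_rkts'
    or die "ERROR RUNNING QUICK ROOTKIT CHECK: $!";
open my $log, '>>', '/path/to/file'
    or die "cannot open log file: $!";

while (<$quik_rk>) {
    print {$log} $_;   # verbose output goes to the file
    print '.';         # progress dot goes to the screen
}
print "\n";
close $quik_rk;
close $log;
```

Note that rkhunter itself may still block-buffer its own output when writing to a pipe; if the dots arrive in bursts, running the command under a pseudo-tty (e.g. with IPC::Run) is one workaround.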