I am trying to write a Perl CGI script which executes an RKHunter scan. While executing the command, I would like to show something to indicate progress instead of the actual output, which is to be redirected to another file. The code thus far is:
open(my $quik_rk, '-|', 'rkhunter', '--enable', '"known_rkts"') or print "ERROR RUNNING QUICK ROOTKIT CHECK!!";
while(<$quik_rk>)
{ print ".";
}
print "\n";
close($quik_rk);
This doesn't show any output, and I am presented with a blank screen while waiting for execution to complete; all the dots are printed to the screen together at the end instead of one by one. Moreover, when I use the following to redirect, the command doesn't execute at all:
open(my $quik_rk, '-|', 'rkhunter', '--enable', '"known_rkts"', '>>', '/path/to/file') or print "ERROR RUNNING QUICK ROOTKIT CHECK!!";
How can I fix this so that the verbose output is redirected to a file and only a row of dots steadily progresses on the screen?
$|=1;
At the beginning of your script.
This turns autoflush on so every print actually prints instead of waiting for a newline before flushing the buffer.
Also see: http://perldoc.perl.org/perlvar.html#Variables-related-to-filehandles
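Putting the pieces together, here is a minimal sketch of the dot-progress loop with autoflush enabled, with the redirection done in Perl rather than as extra arguments to the command. `ls -l` stands in for the real `rkhunter` invocation and `scan.log` is an arbitrary example path:

```perl
use strict;
use warnings;

$| = 1;  # autoflush STDOUT so each dot appears as soon as it is printed

# 'ls -l' is a stand-in for the real rkhunter command line.
open(my $pipe, '-|', 'ls', '-l') or die "Can't run command: $!";
open(my $log,  '>>', 'scan.log') or die "Can't open log file: $!";

while (my $line = <$pipe>) {
    print {$log} $line;  # the verbose output goes to the file
    print '.';           # only a progress dot reaches the screen
}
print "\n";

close($pipe);
close($log) or warn "Error closing log: $!";
```

Note that `'>>', '/path/to/file'` in the original list-form `open` is passed to `rkhunter` as two extra arguments rather than being interpreted as shell redirection, which is why that version never ran; writing to the log handle yourself, as above, avoids the problem.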
Related
I have this code in Perl:
print "Processing ... ";
while ( some condition ) {
# do something over than 10 minutes
}
print "OK\n";
Right now the first print only appears after the while loop has finished.
How can I make the message appear before the while loop starts?
Output is buffered, meaning the program decides when it actually renders what you printed. You can put
$| = 1;
to flush stdout in this single instance. For more methods (auto-flushing, file flushing etc) you can search around SO for questions about this.
Ordinarily, perl will buffer up to 8KB of output text before flushing it to the device, or up to the next newline if the device is a terminal. You can avoid this by adding
STDOUT->autoflush;
to the top of your code (on Perls before 5.14 you will also need use IO::Handle;), assuming that you are printing to STDOUT. This will force the data to be flushed after every print, say, or write operation.
Note that this is the same as using $| = 1, but it is significantly less cryptic and lets you change the properties of any given file handle.
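As a small sketch of that per-handle control, autoflush can be enabled on any handle, not just STDOUT (the file name here is just an example):

```perl
use strict;
use warnings;
use IO::Handle;  # supplies autoflush(); loaded automatically on Perl 5.14+

open(my $log, '>', 'demo.log') or die "Can't open demo.log: $!";
$log->autoflush(1);    # flush this handle after every print
STDOUT->autoflush(1);  # and likewise for STDOUT

print {$log} "written to disk immediately, not held in a buffer\n";
print "visible right away, even without a trailing newline";
```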
You can see the prints by flushing the buffers immediately after.
print "Processing ... ";
STDOUT->flush;
If you are using autoflush, you should save the current configuration by duplicating the file handle.
use autodie; # dies if error on open and close.
{
STDOUT->flush; # empty its buffer
open my $saved_stdout, '>&', \*STDOUT;
STDOUT->autoflush;
# ... output with autoflush active
open STDOUT, '>&', $saved_stdout; # restore old STDOUT
}
See perldoc -f open and search for /\>\&/
I'm working on a library with a test suite that uses Perl's open to run its tests. It looks something like this:
open (MYOUT, "$myprog $arg1 $arg2 $arg3 2>&1 |") or die "Bad stuff happened";
What I'd really like to do is to measure the runtime of $myprog. Unfortunately, just grabbing a start time and end time around the open command just grabs roughly how long it takes to start up the process.
Is there some way of either forcing the open command to finish the process (and therefore accurately measure time) or perhaps something else that would accomplish the same thing?
Key constraints are that we need to capture (potentially a lot of) STDOUT and STDERR.
Since you open a pipe, you need to time from before opening it until at least after the reading is done:
use warnings;
use strict;
use Time::HiRes qw(gettimeofday tv_interval sleep);
my $t0 = [gettimeofday];
open my $read, '-|', qw(ls -l) or die "Can't open process: $!";
while (<$read>)
{
sleep 0.1;
print;
}
print "It took ", tv_interval($t0), " seconds\n";
# close pipe and check
or, to time the whole process, after calling close on the pipe (after all reading is done)
my $t0 = [gettimeofday];
open my $read, '-|', qw(ls -l) or die "Can't open process: $!";
# ... while ($read) { ... }
close $read or
warn $! ? "Error closing pipe: $!" : "Exit status: $?";
print "It took ", tv_interval($t0), " seconds\n";
The close blocks and waits for the program to finish
Closing a pipe also waits for the process executing on the pipe to exit--in case you wish to look at the output of the pipe afterwards--and implicitly puts the exit status value of that command into $? [...]
For the status check see $? variable in perlvar and system
If the timed program forks and doesn't wait on its children in a blocking way this won't time them correctly.
In that case you need to identify resources that they use (files?) and monitor that.
I'd like to add that external commands should be put together carefully, to avoid shell injection trouble. A good module is String::ShellQuote. See for example this answer and this answer
Using a module for capturing streams would free you from the shell and perhaps open other ways to run and time this more reliably. A good one is Capture::Tiny (and there are others as well).
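As a core-modules-only sketch of the whole pattern for this question's constraint of capturing both streams, the command can be handed to the shell as a single string so that 2>&1 merges STDERR into the pipe; `ls -l` stands in for $myprog:

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

my $t0 = [gettimeofday];

# Single-string form goes through the shell, so 2>&1 merges STDERR in.
open(my $out, '-|', 'ls -l 2>&1') or die "Can't start command: $!";
my @lines = <$out>;  # drain everything the command writes
close($out) or warn $! ? "Error closing pipe: $!" : "Exit status: $?";

my $elapsed = tv_interval($t0);  # covers startup, run time, and draining
printf "%d lines captured in %.3f seconds\n", scalar(@lines), $elapsed;
```

Because this goes through the shell, the caveat above about quoting untrusted arguments (String::ShellQuote) applies here too.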
Thanks to HåkonHægland for comments. Thanks to ikegami for setting me straight, to use close (and not waitpid).
I have a Perl script that I am running on Windows. In the script I call another Perl script. I am trying to get both of those scripts to print to the cmd window and to a file. This is basically how I am doing the call,
using IO::Tee
open (my $file, '>>', "C:\\Logs\\logfile.txt") or die "couldn't open log file: $!";
my $tee = IO::Tee->new(\*STDOUT, $file);
# doing some stuff
print $tee "log about what i just did";
# do more stuff
print $tee "more logs";
print $tee `c:\\secondScript.pl arg1`;
print $tee "done with script";
The second script is basically
# do stuff
print "script 2 log about stuff";
# do more stuff
print "script 2 log about more stuff";
print "script 2 done";
This does get everything to the screen and a file. However, I don't see the "script 2 log about stuff", "script 2 log about more stuff", and "script 2 done" until after script 2 has finished. I would like to see all of that stream to the screen and the file as soon as the print is reached.
Printing to STDOUT is usually line buffered (to speed things up) when output goes to a terminal and block buffered otherwise (e.g. when redirecting output to a file).
You can think of it as if anything printed is first placed into a buffer (typically 4096 bytes large) and only when the buffer is full (i.e. 4096 characters were printed), it gets output to the screen.
line buffered means the output is only shown on screen after a \n is found (or the buffer is exhausted). You don't have \ns in your 2nd script, so no output is shown until either a) \n comes, b) buffer is full, or c) the script ends.
block buffered means the output is shown only when the buffer is full. \ns don't influence this here (except for counting as one character).
To avoid buffering there's a magic variable called $|. From the docs (scroll down to the $| section):
If set to nonzero, forces a flush right away and after every write or
print on the currently selected output channel.
So you could append "\n" to your print statements or – better – set $| = 1; at the top of your 2nd script (only once, not for each print). This will slow down the output of the 2nd script (in theory), but for a few lines it will make no difference.
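If you want to see script 2's output as it is produced, buffering is only half the problem: backticks themselves collect all output before returning. A hedged sketch of the alternative, replacing the backticks with a piped open and doing the tee by hand; the child command here is a stand-in for `secondScript.pl arg1`:

```perl
use strict;
use warnings;
use IO::Handle;

$| = 1;  # unbuffer our own STDOUT

open(my $log, '>>', 'tee_demo.log') or die "Can't open log: $!";
$log->autoflush(1);  # keep the file current, line by line

# Stand-in child that prints three lines; replace with secondScript.pl arg1.
open(my $child, '-|', $^X, '-e', 'print "tick $_\n" for 1..3')
    or die "Can't start child: $!";

while (my $line = <$child>) {
    print $line;         # to the console as soon as each line arrives
    print {$log} $line;  # and to the log file
}
close($child);
close($log);
```

For the lines to actually arrive one at a time, script 2 itself still needs $| = 1 (or newline-terminated, line-buffered output).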
I want to display an iterative progress bar during the execution of a particular command in my Perl-CGI program. I use the CGI::ProgressBar module to achieve this. For example, if I want to show the progress bar during the execution of an RKHunter scan, this is the code I wrote:
use CGI::ProgressBar qw/:standard/;
$| = 1;
print progress_bar( -from=>1, -to=>100 );
open(my $quik_rk, '-|', 'rkhunter', '--enable', '"known_rkts"') or print "ERROR RUNNING BASIC ROOTKIT CHECK!!";
# print the progress bar while the command executes
while(<$quik_rk>)
{
print update_progress_bar;
#print "<img src=\"ajax-loader.gif\"></img>";
}
close($quik_rk);
This works fine. However, when I try the same on another command (this one scans using Linux Maldet) immediately after the code above:
open(my $quik_lmd, '-|', 'maldet', '-a', '/home?/?/public_html') or print "ERROR RUNNING BASIC MALWARE CHECK!!";
my $this_ctr = 0;
while(<$quik_lmd>)
{ $this_ctr++;
print update_progress_bar;
}
close($quik_lmd);
The progress bar doesn't appear, but the command itself runs in the background.
What am I doing wrong?
Is there a better way to show a progress bar on a browser in Perl-CGI?
I am not familiar with RKHunter, but based on your results my guess is that it outputs a line of text for each test it runs, while the other command does not.
Each line of text output by RKHunter will trigger the next iteration of <$quik_rk>.
The second command, read via <$quik_lmd>, is likely silent, so it never triggers the loop. Once the command terminates, execution continues after your while.
The key bit here is "line of text". The <$filehandle> operator returns a line of text each time it sees a newline character. In order to do what you want using this construct, you would need to coerce the second command into being verbose about its activities and, most importantly, to be verbose with a lot of newlines.
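When you cannot make the command more verbose, one way to get progress out of it anyway is to switch the read operator into fixed-size record mode by setting $/ to a reference to a number, so each read returns a chunk of bytes rather than waiting for a newline. A sketch, using a stand-in child that prints 200 bytes with no newline at all:

```perl
use strict;
use warnings;

$| = 1;  # show each dot immediately

# Stand-in for a quiet scanner: 200 bytes of output, no newlines at all.
open(my $pipe, '-|', $^X, '-e', 'print "x" x 200') or die "Can't run: $!";

local $/ = \64;  # record mode: each <$pipe> returns up to 64 bytes
while (my $chunk = <$pipe>) {
    print '.';   # one dot per chunk, newlines not required
}
print "\n";
close($pipe);
```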
Alternatively, you can open a background process and use sleep to manage your loop, e.g.,
use strict;
use POSIX qw(WNOHANG);
my $pid = open(my $quik_rk, '-|', 'sleep', '5'); # replace with your command
do {
print "waiting\n"; # update_progress_bar;
sleep 1; # seconds between update
} while (waitpid($pid, WNOHANG)==0);
I'm using perl back-ticks syntax to run some commands.
I would like the output of the command to be written to a file and also printed to stdout.
I can accomplish the first by adding a > at the end of my back-ticked string, but I do not know how to make the output be printed as soon as it is generated. If I do something like
print `command`;
the output is printed only after command finished executing.
Thanks,
Dave
You cannot do it with the backticks, as they return to the Perl program only when the execution has finished.
So,
print `command1; command2; command3`;
will wait until command3 finishes to output anything.
You should use a pipe instead of backticks to be able to get output immediately:
open (my $cmds, "-|", "command1; command2; command3") or die "Can't run commands: $!";
while (<$cmds>) {
print;
}
close $cmds;
After you've done this, you then have to see if you want or not buffering (depending on how immediate you want the output to be): Suffering from buffering?
To print and store the output, you can open() a file to write the output to:
open (my $cmds, "-|", "command1; command2; command3") or die "Can't run commands: $!";
open (my $outfile, ">", "file.txt") or die "Can't open file.txt: $!";
while (<$cmds>) {
print;
print $outfile $_;
}
close $cmds;
close $outfile;