Can't stop while loop in script - perl

I've written a super simple script that rotates through my log files and tails them, endlessly.
My problem is that I can't Ctrl+C this program; I have to Ctrl+Z it and then kill it.
How can I solve this?
I've also tried with perl; with perl I could Ctrl+C it fine, but I get an "Alarm clock" message that I want to avoid:
perl -e "alarm 10; exec @ARGV" "tail -15f $line | filter"
And my code:
#!/bin/bash
#
while :
do
    while read line
    do
        charcount=$(ls $line | awk '{ print length; }')
        printf '%0.s=' $(seq 1 $charcount)
        echo -e "\n$line"
        printf '%0.s=' $(seq 1 $charcount)
        printf '\n'
        timeout 10s tail -15f $line | filter
    done < <(ls /var/log/net/*.log)
done
Thanks!

The 'Alarm clock' message you refer to appears because that is the default way of handling a SIGALRM signal, which is generated by the alarm function in perl.
The usual reason why Ctrl-C doesn't work is that a blocking call is occurring, so the SIGINT signal that is sent doesn't get caught and handled. I can't see anything obvious in your code that would be causing this, though.
To my mind, the most obvious fix is to stop mixing perl and bash, because that way lies madness.
How about using the File::Tail module in Perl? This even has an example of how to do what you're wanting:
#!/usr/bin/perl
use strict;
use warnings;
use File::Tail;

my $debug   = 0;
my $timeout = 10;   # seconds to wait in File::Tail::select

my @files;
foreach (@ARGV) {
    push( @files, File::Tail->new( name => "$_", debug => $debug ) );
}
while (1) {
    my ( $nfound, $timeleft, @pending ) =
        File::Tail::select( undef, undef, undef, $timeout, @files );
    unless ($nfound) {
        # timeout - do something else here, if you need to
    }
    else {
        foreach (@pending) {
            print $_->{"input"} . " ("
                . localtime(time) . ") "
                . $_->read;
        }
    }
}
This will probably allow you to ctrl-c whilst running it anyway, but Perl does allow you to have better control over signals via the %SIG hash - allowing you to define custom handlers for kill signals - such as SIGALRM from alarm and SIGINT from Ctrl-C.
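For example, here is a minimal sketch (the handler bodies and the 10-second alarm are only illustrative) of installing handlers via %SIG so that Ctrl-C exits cleanly and SIGALRM no longer produces the default "Alarm clock" exit:
#!/usr/bin/perl
use strict;
use warnings;

# Illustrative only: custom handlers installed via %SIG.
$SIG{INT}  = sub { print "Caught SIGINT, exiting cleanly\n"; exit 0 };
$SIG{ALRM} = sub { print "Timed out, carrying on\n" };   # replaces the default "Alarm clock" exit

alarm 10;        # request a SIGALRM in 10 seconds
while (1) {
    sleep 1;     # stand-in for the real tail/filter work
}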

Related

Real Time stamp in perl code

Here's my perl code snippet:
if ($line =~ m/^Warning: (.*)$/) {
    $subStepValues = {
        Warning           => $1,
        Warning_timeStamp => `date`,
    };
    push @{$subsSteps->{'subStepValues'}}, $subStepValues;
}
I am piping the output of tail -f on a file into my perl code and I'm really interested in getting the actual, on-the-go timestamp; currently executing date somehow isn't working.
Any other or better suggestion?
How about a nice ISO timestamp?
use POSIX qw(strftime);
if ($line =~ m/^Warning: (.*)$/)
{
    $subStepValues = {
        Warning           => $1,
        Warning_timeStamp => strftime("%Y-%m-%dT%H:%M:%S", localtime),
    };
    push @{$subsSteps->{'subStepValues'}}, $subStepValues;
}
Here is a simple proof of concept from the command line: create an empty file, run tail -f on it, then go to another terminal and append a few lines to it in the manner of echo something >> log:
schumack#daddyo2 12-18T1:57:23 338> touch log
schumack#daddyo2 12-18T1:57:26 339> tail -f log | perl -lne 'BEGIN{use POSIX qw(strftime);}chomp; printf "%s -- %s\n", strftime("%Y-%m-%dT%H:%M:%S", localtime), $_;'
2015-12-18T01:57:40 -- hello
2015-12-18T01:57:46 -- line 2
2015-12-18T01:57:50 -- line 3

How to make perl keep performing an action until a match is found

I am new to Perl and I am trying to write code that keeps executing an action until a match is found, and otherwise gives an error.
I am trying to execute the command ps -ef and check whether there is any process running with the name "box". If no process named "box" is found, I want to repeat the ps -ef command until it finds the "box" process, and then proceed to the next action.
#!/usr/bin/perl -w
open (FH, "ps -ef |") or die "Cannot run the command:$!\n";
$line = "box";
while (<FH>) {
if (/$line/i) { next; }
else {
print ("ps -ef |") or die "Cannot run the command:$!\n");
}
}
close (FH);
You need to use an infinite loop and an exit-condition. Your condition is that the ps -ef command contains the word box. There is no need to open a pipe to that command explicitly, you can just run it as a system call with the qx operator (same as backticks).
use strict;
use warnings;
my $ps;
PS: while (1) {
    $ps = qx/ps -ef/;
    last PS if $ps =~ m/box/i;
    print '.'; # do something in every run
}
print $ps;
As this has come up in the comments as well as in AdrianHHH's answer: it might make sense to sleep after every run to make sure you don't hog the CPU. Depending on the nature of the process you are looking for, either the sleep builtin or usleep from Time::HiRes might be appropriate. The latter lets your program rest for milliseconds, while the builtin only works with full seconds, which might be too long if the target box process is very short-lived.
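A minimal sketch of that idea, reusing the loop above (the 200 ms interval is an arbitrary choice):
use strict;
use warnings;
use Time::HiRes qw(usleep);

my $ps;
while (1) {
    $ps = qx/ps -ef/;
    last if $ps =~ m/box/i;
    usleep(200_000);   # pause 200 milliseconds between polls instead of spinning
}
print $ps;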
Explanation of your code:
Note that you have some issues in your implementation. I'll explain what your code does. This is taken from the question, comments are mine.
#!/usr/bin/perl -w
# open a filehandle to the ps command
open (FH, "ps -ef |") or die "Cannot run the command:$!\n";
$line = "box";
# read the output of one run line by line, for each line execute
# the block
while (<FH>) {
# if there is 'box' case-insensitive, skip the line
if (/$line/i) { next; }
else {
# else output (not run!) the command
print ("ps -ef |") or die "Cannot run the command:$!\n");
}
}
close (FH);
After it has gone through all the lines of the output of your command once, it will stop.
I would recommend using pgrep(1) instead of ps because it lets you do a more granular search. With ps -ef, you potentially have to deal with cases like:
boxford 6254 6211 0 08:23 pts/1 00:00:00 /home/boxford/box --bounding-box=123
It's hard to tell if you're matching a process being run by a user with box in their username, a process that has box somewhere in its path, a process named box, or a process with box somewhere in its argument list.
pgrep, on the other hand, lets you match against just the process name or the full path, a specific user or users, and more. The following prints a message when a process named box appears (this looks for an exact match, so it will not match processes named dropbox, for example):
use strict;
use warnings;
use 5.010;
use String::ShellQuote qw(shell_quote);
sub is_running {
    my ($proc) = @_;
    my $cmd = 'pgrep -x ' . shell_quote($proc) . ' >/dev/null 2>&1';
    system($cmd);

    if ($? == -1) {
        die "failed to execute pgrep: $!";
    }
    elsif ($? & 127) {
        die "pgrep died with signal ", $? & 127;
    }
    else {
        my $status = $? >> 8;
        die "pgrep exited with error: exit status $status" if $status > 1;
        return $status == 0;
    }
}

my $proc = 'box';
until ( is_running($proc) ) {
    sleep 1;
}
say "Process '$proc' is running";
Note that pgrep doesn't have a case-insensitive flag, probably because process names in *nix are almost always lowercase. If you really need to do a case-insensitive match, you can pass [Bb][Oo][Xx] to the is_running function.
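For example, a hypothetical call (relying on pgrep treating its argument as an extended regular expression, as it does):
# Matches box, Box, BOX, ... but -x still requires the whole name to match.
until ( is_running('[Bb][Oo][Xx]') ) {
    sleep 1;
}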
The ps command outputs the current list of processes, then it completes. The code in the question reads that output. Suppose that the first ps command that is executed does not contain the wanted line, then there is nothing in the code in the question to run the ps command again.
The next statement in the question makes the script move on to the next line in the output from ps, not rerun the command. The else print ... after the next will probably be executed for the first line of the output from ps. The outcome is that the print is run for each line in the ps output that does not have the wanted text, and that the next statement has no significant effect. In the code print ... or die "...", the or die "..." part is not very useful: the print is unlikely to fail, and even if it did, the die message would be wrong.
Perhaps you should write some code in the following style. Here the ps is run repeatedly until the wanted text is found. Note the sleep call, without that the script will keep running without pause, possibly preventing real work or at least slowing it down.
# This code is not tested.
use strict;
use warnings;

my $found_wanted_line = 0; # Boolean, set to false
my $line = "box";

while ( ! $found_wanted_line ) {
    open (my $FH, "ps -ef |") or die "Cannot run the command:$!\n";
    while (<$FH>) {
        if (/$line/i) {
            $found_wanted_line = 1; # Boolean, set to true
            last;
        }
    }
    close ($FH);
    if ( ! $found_wanted_line ) {
        sleep 2; # Pause for 2 seconds, to prevent this script hogging the CPU.
    }
}

Capture the output of Perl's 'system()'

I need to run a shell command with system() in Perl. For example,
system('ls')
The system call will print to STDOUT, but I want to capture the output into a variable so that I can do future processing with my Perl code.
That's what backticks are for. From perldoc perlfaq8:
Why can't I get the output of a command with system()?
You're confusing the purpose of system() and backticks (``). system()
runs a command and returns exit status information (as a 16 bit value:
the low 7 bits are the signal the process died from, if any, and the
high 8 bits are the actual exit value). Backticks (``) run a command
and return what it sent to STDOUT.
my $exit_status = system("mail-users");
my $output_string = `ls`;
See perldoc perlop for more details.
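A minimal sketch combining the two (the ls command is only an example):
use strict;
use warnings;

# Backticks capture STDOUT; $? still holds the exit status afterwards.
my $output      = `ls -l /tmp`;
my $exit_status = $? >> 8;        # the high 8 bits are the real exit code
die "command failed with status $exit_status" if $exit_status != 0;
print $output;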
IPC::Run is my favourite module for this kind of task. Very powerful and flexible, and also trivially simple for small cases.
use IPC::Run 'run';
run [ "command", "arguments", "here" ], ">", \my $stdout;
# Now $stdout contains output
Simply use backticks, similar to the Bash example:
$variable=`some_command some args`;
That's all. Note that you will not see anything printed to STDOUT, because the output is redirected into the variable.
This example is unusable for a command that interacts with the user, unless you have prepared answers. For that, you can use something like this, using a pipeline of shell commands:
$variable=`cat answers.txt|some_command some args`;
Inside the answers.txt file you should prepare all the answers for some_command to work properly.
I know this isn't the best programming practice :) But it is the simplest way to achieve the goal, especially for Bash programmers.
Of course, if the output is bigger (ls on a directory tree, say), you shouldn't read all of it at once. Read from the command the same way you read from a regular file:
open CMD, '-|', 'your_command some args' or die $!;
my $line;
while (defined($line = <CMD>)) {
    print $line; # or push @table, $line, or whatever line-by-line processing you want
}
close CMD;
An additional, extended solution for processing long command output without an extra Bash call:
my @CommandCall = qw(find / -type d);   # some example single command
my $commandSTDOUT;                      # file handle
my $pid = open($commandSTDOUT, '-|');   # there will be an implicit fork!
if ($pid) {
    # parent side
    my $singleLine;
    while (defined($singleLine = <$commandSTDOUT>)) {
        chomp $singleLine;              # typically we don't need the EOL
        do_some_processing_with($singleLine);
    }
    close $commandSTDOUT;               # at this point $? is set, ready to capture
    my $exitcode = $? >> 8;
    do_something_with_exit_code($exitcode);
} else {
    # child side, where the command is really called
    open STDERR, '>&', \*STDOUT;        # redirect stderr to stdout if needed; it only affects the child - remember the fork
    exec(@CommandCall);                 # at this point the child is replaced by the external command with its parameters
    die "Cannot call @CommandCall";     # error handling, reached only if the exec fails
}
If you use a construct like that, you capture all of the command's output and can process it line by line. Good luck :)
I wanted to run system() instead of backticks because I wanted to see the output of rsync --progress. However, I also wanted to capture the output in case something goes wrong depending on the return value. (This is for a backup script). This is what I am using now:
use File::Temp qw(tempfile);
use Term::ANSIColor qw(colored colorstrip);

sub mysystem {
    my $cmd = shift; # "rsync -avz --progress -h $fullfile $copyfile";
    my ($fh, $filename) = tempfile();
    # http://stackoverflow.com/a/6872163/2923406
    # I want to have rsync progress output on the terminal AND capture it in case of error.
    # Need to use pipefail because 'tee' would be the last cmd otherwise and hence $? would be wrong.
    my @cmd = ("bash", "-c", "set -o pipefail && $cmd 2>&1 | tee $filename");
    my $ret = system(@cmd);
    my $outerr = join('', <$fh>);
    if ($ret != 0) {
        logit(colored("ERROR: Could not execute command: $cmd", "red"));
        logit(colored("ERROR: stdout+stderr = $outerr", "red"));
        logit(colored("ERROR: \$? = $?, \$! = $!", "red"));
    }
    close $fh;
    unlink($filename);
    return $ret;
}

# And logit() is something like:
sub logit {
    my $s = shift;
    my ($logsec, $logmin, $loghour, $logmday, $logmon, $logyear, $logwday, $logyday, $logisdst) = localtime(time);
    $logyear += 1900;
    my $logtimestamp = sprintf("%4d-%02d-%02d %02d:%02d:%02d", $logyear, $logmon+1, $logmday, $loghour, $logmin, $logsec);
    my $msg = "$logtimestamp $s\n";
    print $msg;
    open LOG, ">>$LOGFILE";
    print LOG colorstrip($msg);
    close LOG;
}

how to source a shell script [environment variables] in perl script without forking a subshell?

I want to call "env.sh " from "my_perl.pl" without forking a subshell. I tried with backtics and system like this --> system (. env.sh) [dot space env.sh] , however wont work.
Child environments cannot change parent environments. Your best bet is to parse env.sh from inside the Perl code and set the variables in %ENV:
#!/usr/bin/perl
use strict;
use warnings;

sub source {
    my $name = shift;
    open my $fh, "<", $name
        or die "could not open $name: $!";
    while (<$fh>) {
        chomp;
        my ($k, $v) = split /=/, $_, 2;
        $v =~ s/^(['"])(.*)\1/$2/;              #' fix highlighter
        $v =~ s/\$([a-zA-Z]\w*)/$ENV{$1}/g;
        $v =~ s/`(.*?)`/`$1`/ge;                # dangerous
        $ENV{$k} = $v;
    }
}

source "env.sh";

for my $k (qw/foo bar baz quux/) {
    print "$k => $ENV{$k}\n";
}
Given
foo=5
bar=10
baz="$foo$bar"
quux=`date +%Y%m%d`
it prints
foo => 5
bar => 10
baz => 510
quux => 20110726
The code can only handle simple files (for instance, it doesn't handle if statements or foo=$(date)). If you need something more complex, then writing a wrapper for your Perl script that sources env.sh first is the right way to go (it is also probably the right way to go in the first place).
Another reason to source env.sh before executing the Perl script is that setting the environment variables in Perl may happen too late for modules that are expecting to see them.
In the file foo:
#!/bin/bash
source env.sh
exec foo.real
where foo.real is your Perl script.
You can use arbitrarily complex shell scripts by executing them with the relevant shell, dumping their environment to standard output in the same process, and parsing that in perl. Feeding the output into something other than %ENV or filtering for specific values of interest is prudent so you don't change things like PATH that may have interesting side effects elsewhere. I've discarded standard output and error from the spawned shell script although they could be redirected to temporary files and used for diagnostic output in the perl script.
foo.pl:
#!/usr/bin/perl
open SOURCE, "bash -c '. foo.sh >& /dev/null; env'|"
    or die "Can't fork: $!";
while (<SOURCE>) {
    if (/^(BAR|BAZ)=(.*)/) {
        $ENV{$1} = $2;
    }
}
close SOURCE;
print $ENV{'BAR'} . "\n";
foo.sh:
export BAR=baz
Try this (unix code sample). In /tmp, create a shell script s:
#!/bin/bash
export blah=test
and a perl script t:
#!/usr/bin/perl
if ($ARGV[0]) {
    print "ENV second call is : $ENV{blah}\n";
} else {
    print "ENV first call is : $ENV{blah}\n";
    exec(". /tmp/s; /tmp/t 1");
}
Then make them executable and run t:
chmod 777 s t
./t
ENV first call is :
ENV second call is : test
The trick is using exec to source your bash script first and then call your perl script again with an argument, so you know that you are being called the second time.

In Perl, how do I process input as soon as it arrives, instead of waiting for newline?

I'd like to run a subcommand from Perl (or pipe it into a Perl script) and have the script process the command's output immediately, rather than waiting for a timeout, a newline, or a certain number of blocks. For example, let's say I want to surround each chunk of input with square brackets. When I run the script like this:
$ ( echo -n foo ; sleep 5 ; echo -n bar ; sleep 5; echo baz) | my_script.pl
I'd like the output to be this, with each line appearing five seconds after the previous one:
[foo]
[bar]
[baz]
How do I do that?
This works, but is really ugly:
#!/usr/bin/perl -w
use strict;
use Fcntl;

my $flags = '';
fcntl(STDIN, F_GETFL, $flags);
$flags |= O_NONBLOCK;
fcntl(STDIN, F_SETFL, $flags);

my $rin = '';
vec($rin, fileno(STDIN), 1) = 1;
my $rout;
while (1) {
    select($rout = $rin, undef, undef, undef);
    last if eof();
    my $buffer = '';
    while (my $c = getc()) {
        $buffer .= $c;
    }
    print "[$buffer]\n";
}
Is there a more elegant way to do it?
From perlfaq5: "How can I read a single character from a file? From the keyboard?". You probably also want to read "How can I tell whether there's a character waiting on a filehandle?". Poll the filehandle: if there is a character there, read it and reset a timer; if there is no character there, try again. If you've retried and a certain amount of time has passed, process the input.
After you read the characters, it's up to you to decide what to do with them. With all the flexibility of reading single characters comes the extra work of handling them.
Term::ReadKey can do this for you, in particular by setting the ReadKey() mode to do the polling for you.
use Term::ReadKey;
$| = 1;
while( my $key = ReadKey(10) ) {
print $key;
}
If there's time in between the characters, you might be able to detect the pauses.
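A rough sketch of that idea with Term::ReadKey, treating a half-second gap as the end of a chunk (the 0.5 s threshold is an arbitrary assumption, and end-of-input handling is left out):
use strict;
use warnings;
use Term::ReadKey;

$| = 1;
my $buffer = '';
while (1) {
    # ReadKey(0.5) returns undef if no character arrives within half a second.
    my $key = ReadKey(0.5);
    if (defined $key) {
        $buffer .= $key;
    }
    elsif (length $buffer) {
        $buffer =~ s/\n\z//;      # drop a trailing newline, if any
        print "[$buffer]\n";      # a pause ends the current chunk
        $buffer = '';
    }
}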
Perl also does line input - if you don't use getc you should be able to add newlines to the end of foo, bar, etc and perl will give you each line.
If you can't add newlines, and you can't depend on a pause, then what exactly do you expect the system to do to tell perl that it's started a new command? As far as perl is concerned, there's a stdin pipe, it's eating data from it, and there's nothing in the stdin pipe to tell you when you are executing a new command.
You might consider the following instead:
$ echo "( echo -n foo ; sleep 5 ; echo -n bar ; sleep 5; echo baz)" | my_script.pl
or
$ my_script.pl "echo -n foo ; sleep 5 ; echo -n bar ; sleep 5; echo baz"
And modify your perl program to parse the input "command line" and execute each task, eating the stdout as needed.
See How to change Open2 input buffering. (Basically, you have to make the other program think it's talking to a tty.)
You didn't mention how you are reading input in your Perl script, but you might want to look at the getc function:
$|++; # set autoflush on output
while ($c = getc(STDIN)) {
    print $c;
}