I run a simple file test in Perl with the code below:
my $f1 = "$pre_file";
unless (-e $f1) {
    print "\n Pre_check file does not exists! \n";
    die;
}
It prints the following output:
Pre_check file does not exists!
Died at ./huawei-postcheck line 81.
However, I do not want the last line "Died at ./huawei-postcheck line 81.".
I want to "die" with no error message.
Is it possible?
See the documentation for die.
If the last element of LIST does not end in a newline, the current
script line number and input line number (if any) are also printed,
and a newline is supplied.
So you can get die to exit without printing the location by ending your message with a newline, for example die "\n". But given that you already have an error message, I can't see why you wouldn't use it:
unless (-e $f1) {
    die "\n Pre_check file does not exist!\n";
}
Of course, the difference is that the message will now go to STDERR rather than STDOUT. But that's probably the right place for it to go.
Use exit instead of die.
You could just say
die "\n";
to suppress the message.
You probably want to exit 1 instead of dying then.
my $f1 = "$pre_file";
unless (-e $1) {
print "\n Pre_check file does not exists! \n";
exit 1;
}
Related
I want to write perl scripts that can read the STDIN that is given at invocation of the script, finish reading it, and then interactively prompt the user for a one-line STDIN. This one-line STDIN will tell the script how to proceed.
In a practical application, I would like the script to create a temporary file, report on the size of temporary file, and then ask the user if they really want to print the entire temporary file to STDOUT, or do they want to give a filename that will be clobbered by the temporary file's contents.
The following script behaves as desired if I give STDIN as a filename but does not work if I pipe STDIN to the script.
#! /usr/bin/perl
use strict; use warnings;
my $count = 0;
while(<>)
{
    $count++;
}
print "you counted $count lines. Now do you want to proceed?";
my $answer = <STDIN>;
chomp $answer;
print STDERR "answer=$answer\n";
if ( $answer eq "yes" )
{
print STDERR "you said $answer so we do something affirmative\n";
}
else
{
print STDERR "you said $answer which is not \"yes\" so we do NOT proceed\n";
}
For instance:
> wc junk
193 1042 11312 junk
> junk.pl junk
you counted 193 lines. Now do you want to proceed?yes
answer=yes
you said yes so we do something affirmative
> junk.pl junk
you counted 193 lines. Now do you want to proceed?no
answer=no
you said no which is not "yes" so we do NOT proceed
> cat junk | junk.pl
Use of uninitialized value $answer in scalar chomp at /Users/BNW/u/kh/bin/junk.pl line 10.
Use of uninitialized value $answer in concatenation (.) or string at /Users/BNW/u/kh/bin/junk.pl line 11.
answer=
Use of uninitialized value $answer in string eq at /Users/BNW/u/kh/bin/junk.pl line 12.
Use of uninitialized value $answer in concatenation (.) or string at /Users/BNW/u/kh/bin/junk.pl line 18.
you said which is not "yes" so we do NOT proceed
you counted 193 lines. Now do you want to proceed?>
Sort of. Maybe.
First off, in your first example, it's not true that you "gave STDIN as a filename". STDIN is the terminal throughout. <> is reading from the ARGV handle, not STDIN, so STDIN is available later when you need it.
The problem with the second example is that the pipe from cat is STDIN. Closing it and reopening it to what it was initially doesn't do anything for you, because it will still be an exhausted pipe.
Many systems, though, have a special device /dev/tty which points to the controlling terminal of whichever process asks for it. On such a system, you could reopen STDIN from /dev/tty after it gives EOF, and you would get the console that the user invoked your program from, instead of whatever file or pipe they initially gave you as STDIN.
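A minimal sketch of that approach (assuming /dev/tty exists, as it does on most Unix-like systems):
#!/usr/bin/perl
use strict; use warnings;

# Read everything handed to the script (file arguments or a pipe) via <>
my $count = 0;
$count++ while <>;

# STDIN may now be an exhausted pipe, so prompt on the controlling terminal instead
open my $tty, '<', '/dev/tty' or die "cannot open /dev/tty: $!";
print "you counted $count lines. Now do you want to proceed? ";
chomp(my $answer = <$tty>);
close $tty;
print STDERR "answer=$answer\n";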
Thanks to @hobbs. Note that this works either way: piping the file junk into the script or passing junk as ARGV.
> printf "line 1 \nline 222 \n" > junk
> perl -e 'use strict; use warnings; while(<>) { print; } my $stuff = "/dev/tty"; my $h; open $h, "<", $stuff or die "waah $stuff"; print "give answer:"; my $answer=<$h>; print "answer=$answer\n";' junk
line 1
line 222
give answer:This is an answer!
answer=This is an answer!
> cat junk | perl -e 'use strict; use warnings; while(<>) { print; } my $stuff = "/dev/tty"; my $h; open $h, "<", $stuff or die "waah $stuff"; print "give answer:"; my $answer=<$h>; print "answer=$answer\n";'
line 1
line 222
give answer:So, what was it she was saying?? ??
answer=So, what was it she was saying?? ??
>
I tried the below code snippet and the seek function doesn't seem to work.
funct("ls -ltr /scratch/dummy/dum*");
sub funct {
print "\nRecording\n";
open(SENSOR_LIST1, "$_[0] |") || die "Failed to read sensor list: $!\n";
for $sensor_line1 (<SENSOR_LIST1>) {
print "$sensor_line1";
}
my $pos = tell SENSOR_LIST1;
print "\nposition is $pos"; #Here the position is 613
print "\nRecording again";
seek (SENSOR_LIST1, SEEK_SET, 0);
$pos = tell SENSOR_LIST1; # Here again the position is 613, even after a seek
print "\nposition now is $pos";
for $sensor_line1 (<SENSOR_LIST1>) {
print "$sensor_line1";
}
close SENSOR_LIST1;
}
Note: none of the variants of seek works.
Output:
The position does not change even after the seek; it remains at 613.
Can you check and let me know what the issue is here? Thanks.
You cannot seek on a pipe.
Either use a temporary file or store the data in memory.
Your choice as to the best solution.
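For example, the in-memory route could look like this (a sketch that keeps the question's ls command and simply reuses the captured lines instead of seeking):
# Read the pipe once, then iterate over the saved lines as often as needed
open my $pipe, '-|', 'ls -ltr /scratch/dummy/dum*'
    or die "Failed to read sensor list: $!\n";
my @sensor_lines = <$pipe>;
close $pipe;

print @sensor_lines;           # first pass
print "\nRecording again\n";
print @sensor_lines;           # "rewinding" is just another pass over the array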
Try writing the output of your ls command to a file and opening that file instead of reading the command's output directly. You can't seek on a transient data stream (such as a command's output), only on data which still exists after being read (such as a file).
I have a file with many lines. I want to discard the first line and read the file from the second line to the end, but I am not finding enough help on Google.
Please help me out in this case.
Below is the code in which I am trying to extract the 4th and 5th columns of a CSV file; however, it also includes the first line, the header, which I do not want.
My code should get me only the values starting from the second line, not the headers.
foreach my $inputfile (glob("$previous_path/*Analysis*.txt")) {
    open(INFILE, $inputfile) or die("Could not open file.");
    foreach my $line (<INFILE>) {
        my @values = split(',', $line);    # parse the file
        my $previous_result = $values[5];
        my $previous_time = $values[4];
        print $previous_result, "\n";
        print $previous_time, "\n";
        push(@previous_result, $previous_result);
        push(@previous_time, $previous_time);
    }
    close(INFILE);
}
Just skip the first line, then read the rest.
<>;              # read and discard a line
while (<>) {     # loop over the other lines
    print $_;
}
UPDATE: after you've edited the question, it turns out you want something completely different, to
read a CSV file in Perl
That is a completely different question, and what you should have asked in the first place. The answer is to use an established library, such as CSV::Slurp.
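For instance, with the widely used Text::CSV module (shown here in place of CSV::Slurp; the variable names follow the question, and $previous_path is assumed to be set elsewhere), the header row can be read once and thrown away:
use Text::CSV;

my (@previous_result, @previous_time);
my $csv = Text::CSV->new({ binary => 1, auto_diag => 1 });

foreach my $inputfile (glob("$previous_path/*Analysis*.txt")) {
    open my $in, '<', $inputfile or die "Could not open $inputfile: $!";
    $csv->getline($in);                     # read and discard the header row
    while (my $row = $csv->getline($in)) {
        push @previous_time,   $row->[4];   # 5th column
        push @previous_result, $row->[5];   # 6th column
    }
    close $in;
}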
Just skip line number ($.) 1, perhaps using next, like this:
while (<>) {
    next if ($. == 1);
    print $_;
}
You can skip the first line while reading the file itself, for example:
open(IN,"cat filename|tail -n +2|") || die "can not open file :$!";
while(<IN>){
//process further
}
close(IN);
As per the solution provided in perldoc, I am trying to emulate tail -f, but it's not working as expected. The code below prints all the lines the first time, but not the newly appended ones. Could you please point out if I am missing anything here?
#!/usr/bin/perl
open (LOGFILE, "aa") or die "could not open file reason $! \n";
for (;;)
{
    seek(LOGFILE, 0, 1);    ### clear EOF condition
    for ($curpos = tell(LOGFILE); <LOGFILE>; $curpos = tell(LOGFILE))
    {
        print "$_ \n";
    }
    sleep 1;
    seek(LOGFILE, $curpos, 0);    ### seek back to where we stopped reading
}
It works fine for me. How are you updating "aa"?
You won't see the data immediately if the write to "aa" is buffered.
Can you try the following in a different terminal and check whether you see any update:
while ( 1 )
echo "test" >> aa
end
If you are using Perl to update aa, check this section on buffering and how to disable it.
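If the writer is a Perl script, something like this keeps the handle unbuffered (a sketch; the file name aa follows the question):
use IO::Handle;

# Unbuffer the handle so each line hits "aa" immediately
open my $log, '>>', 'aa' or die "could not open aa: $!";
$log->autoflush(1);
while (1) {
    print {$log} "test " . localtime() . "\n";
    sleep 1;
}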
I have written a script which reads some data from a log file, transforms it into a simpler form, and writes it back to another file. The reading is done line by line with a delay of 5 seconds, i.e. sleep(5).
Meanwhile, if a user enters 'suspend' on the command line (through STDIN), the program should sleep until 'resume' is entered, and then read the next line.
So with every iteration of the loop I check STDIN to see whether the user has entered 'suspend'; if not, I read the next line from the file. But when my program runs I have to hit at least the ENTER key, otherwise it does not pick up the next line from the input log file, even though I put an if statement to check whether STDIN is defined or not.
I am not a Perl expert and this is the first time I am writing Perl code; in fact I have never done this kind of file parsing before.
My code implementation is like this:
#!/usr/local/bin/perl
my $line_no = 0; my $cancel = 0; my $line = "";
my $curpos = 0; my $whence = 0;
my $log_file = "/var/log/tmp_nagios.log";
#open(LOGFILE, "+< $log_file") or die "Failed to open $log_file, $!";
my $inp_file = "/var/log/sec_input";
my $logbuffer = "";
my $in;
while (1) {
    print "in While (1) Pos: $curpos and Whence:$whence\n";
    open(LOGFILE, "+< $log_file") or die "Failed to open $log_file, $!";
    seek(LOGFILE, $curpos, $whence);
    next if (eof(LOGFILE));
    print "Beginning\n";
    while (<LOGFILE>) {
        #chomp($in = <STDIN>);
        #if (defined($in) && $in =~ /^suspend$/i) {
        ### Problem here ###
        if (defined(<STDIN>) && <STDIN> =~ /^suspend\n$/i) {    ## checking if 'suspend' is entered
            print "Suspend Mode";
            do {
                sleep(5);
            } while (!(<STDIN> =~ /^resume\n$/i));
            print "Resume now\n";
            close(LOGFILE);
            last;
        }
        else {
            $line = $_;
            if ($line =~ m/^\[(\d+)\]\sCURRENT\sSERVICE\sSTATE:\s(\w+);(\w+|\_|\d+)+;(CRITICAL|OK);.+$/) {
                $logbuffer = "$1,$2-$3,$4\n";
                print $logbuffer;
                open(INPFILE, ">> $inp_file") or die "Failed to open $inp_file, $!";
                #print INPFILE $logbuffer;
                close(INPFILE);
                sleep(5);
                $curpos = tell(LOGFILE);
                $whence = 1;
            }
        }
    }
    print "\nRe-opening the file from Pos=$curpos and Whence=$whence\n";
}
close(LOGFILE);
Here is the sample log file (/var/log/tmp_nagios.log) data:
[1284336000] CURRENT SERVICE STATE: host1;event1;CRITICAL; s
[1284336000] CURRENT SERVICE STATE: host2;event1;CRITICAL; f
[1284336000] CURRENT SERVICE STATE: host3;event3;CRITICAL; g
[1284336000] CURRENT SERVICE STATE: host4;event4;CRITICAL; j
[1284336000] CURRENT SERVICE STATE: host5;event1;CRITICAL; s
[1284336000] CURRENT SERVICE STATE: host6;event1;CRITICAL; f
[1284336000] CURRENT SERVICE STATE: host7;event7;CRITICAL; s
Sorry guys, a typo: in the beginning I said 'my script is reading data from the log file with a delay of 5 seconds, i.e. sleep(5)', but I forgot to include that in my code; so uncomment the line #sleep(3); and make it sleep(5);. Thanks.
If I have understood your question correctly: check out the Term::ReadKey CPAN Module.
You can use it to do non-blocking buffer reads. (If there is nothing in the buffer, your script does not pause for user input.)
https://metacpan.org/pod/Term::ReadKey
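A rough sketch of that polling idea, reusing names from the question (LOGFILE is assumed to already be open on the log file, and the rest of the processing stays as before):
use Term::ReadKey;

while (my $line = <LOGFILE>) {
    # Non-blocking poll: ReadLine(-1) returns undef when no line is waiting
    my $cmd = ReadLine(-1);
    if (defined $cmd) {
        chomp $cmd;
        if (lc $cmd eq 'suspend') {
            print "Suspend mode; type 'resume' to continue\n";
            do { chomp($cmd = ReadLine(0)); } until lc $cmd eq 'resume';    # blocking read
        }
    }
    # ... transform $line and append it to the output file as before ...
    sleep 5;
}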
You may also like to approach this problem slightly differently, using signals: http://perldoc.perl.org/perlipc.html. You can have your program run normally but capture interrupts (e.g. Ctrl-C).
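For instance, a SIGINT handler could toggle a pause flag instead of killing the script (a sketch, again assuming LOGFILE is the open log handle):
# Ctrl-C toggles a pause flag instead of terminating the script
my $paused = 0;
$SIG{INT} = sub { $paused = !$paused };

while (my $line = <LOGFILE>) {
    sleep 1 while $paused;    # wait here while suspended
    # ... transform $line and write it out as before ...
    sleep 5;
}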
Alternatively, you could just use CTRL-Z and fg to make your script sleep and wake.
Another option is POE::Wheel::FollowTail for the log and POE::Wheel::ReadLine or Term::Visual for user input, though this might be a little overkill.