Piping Perl Commands? - perl

I'm trying to have my program take user input, usually a text file, and then call an external script that counts the words. The script I'm working on is essentially a "middle man", and I'm trying to get more familiar with piping to external scripts/commands. It's currently not executing the word-counter script correctly: I keep getting an error for ./word_counter.pl saying "no such file or directory at glue.pl" (glue.pl is the script shown here). Here's the code:
#!usr/bin/perl
use warnings;
use strict;
use IO::Handle qw();
open (PIPE_TO, "|-", "./word_counter.pl");
While(<>)
{
$PIPE_TO -> autoflush(1);
print PIPE_TO $_;
}

Suffering from buffering?
use IO::Handle qw( );
PIPE_TO->autoflush(1);

The reason it doesn't work is probably that you have syntax errors.
Otherwise: other than introducing line-buffered semantics, you are really doing nothing here. You just pipe what you read to another program, which in this case is equivalent to just running that program directly.
Modulo the buffering (which you don't seem to explicitly need), an equivalent script would be:
#!/usr/bin/perl
exec ("./word_counter.pl");

Is this what you are trying to do?
#!/usr/bin/perl
use warnings;
use strict;
open(my $PIPE_TO, "|-", "./word_counter.pl") or die $!;
while (<>) {
    print $PIPE_TO $_;
}
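Once the loop ends, it is also worth closing the pipe and checking how the child exited; a small sketch of that check (the error text is just illustrative):
# close() on a pipe returns false if the child could not be run or
# exited non-zero; in the latter case $! is 0 and $? holds the wait status.
close $PIPE_TO
    or die $! ? "Error closing pipe to word_counter.pl: $!\n"
              : "word_counter.pl exited with status " . ($? >> 8) . "\n";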

Related

Perl wrongly complaining about Name "main::FILE" used only once

I simplified my program to the following trivial snippet and I'm still getting the message
Name "main::FILE" used only once: possible typo...
#!/usr/bin/perl -w
use strict;
use autodie qw(open close);
foreach my $f (@ARGV) {
    local $/;
    open FILE, "<", $f;
    local $_ = <FILE>; # <--- HERE
    close FILE;
    print $_;
}
which obviously isn't true as it gets used three times. For whatever reason, only the marked occurrence counts.
I am aware of nicer ways to open a file (using a lexical $filehandle), but that hardly seems worth it for a short script, does it? So how can I get rid of the spurious warning?
According to the documentation for autodie:
BUGS
"Used only once" warnings can be generated when autodie or Fatal is used with package filehandles (eg, FILE ). Scalar filehandles are strongly recommended instead.
I get the warning on Perl 5.10.1, but not 5.16.3, so there may be something else going on as well.
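Following that advice, a version of the snippet with a lexical filehandle (a sketch of the same loop, not a change in behavior) avoids the warning:
#!/usr/bin/perl -w
use strict;
use autodie qw(open close);

foreach my $f (@ARGV) {
    local $/;                  # slurp the whole file at once
    open my $fh, "<", $f;      # lexical filehandle instead of the package handle FILE
    local $_ = <$fh>;
    close $fh;
    print $_;
}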

Perl script to parse a text file and match a string

I'm editing my question to add more details
The script executes the command and redirects the output to a text file.
The script then parses the text file to match the following string " Standard 1.1.1.1"
The output in the text file is :
Host Configuration
------------------
Profile Hostname
-------- ---------
standard 1.1.1.1
standard 1.1.1.2
The code works if I search for either 1.1.1.1 or standard alone. When I search for standard 1.1.1.1 together, the script below fails.
This is the error that I get: "Unable to find string: standard 172.25.44.241 at testtest.pl".
#!/usr/bin/perl
use Net::SSH::Expect;
use strict;
use warnings;
use autodie;
open (HOSTRULES, ">hostrules.txt") || die "could not open output file";
my $hos = $ssh->exec(" I typed the command here ");
print HOSTRULES ($hos);
close(HOSTRULES);
sub find_string
{
    my ($file, $string) = @_;
    open my $fh, '<', $file;
    while (<$fh>) {
        return 1 if /\Q$string/;
    }
    die "Unable to find string: $string";
}

find_string('hostrules.txt', 'standard 1.1.1.1');
Perhaps write a function:
use strict;
use warnings;
use autodie;
sub find_string {
    my ($file, $string) = @_;
    open my $fh, '<', $file;
    while (<$fh>) {
        return 1 if /\Q$string/;
    }
    die "Unable to find string: $string";
}
find_string('output.txt', 'object-cache enabled');
Or just slurp the entire file:
use strict;
use warnings;
use autodie;
my $data = do {
    open my $fh, '<', 'output.txt';
    local $/;
    <$fh>;
};
die "Unable to find string" if $data !~ /object-cache enabled/;
You're scanning a file for a particular string. If that string is not found in that file, you want an error thrown. Sounds like a job for grep.
use strict;
use warnings;
use feature qw(say);
use autodie;

use constant {
    OUTPUT_FILE   => 'output.txt',
    NEEDED_STRING => "object-cache enabled",
};

open my $out_fh, "<", OUTPUT_FILE;
my @output_lines = <$out_fh>;
close $out_fh;

chomp @output_lines;
grep { /@{[NEEDED_STRING]}/ } @output_lines
    or die qq(ERROR! ERROR! ERROR!);   # Or whatever you want
The die command will end the program and exit with a non-zero exit code. The error will be printed on STDERR.
I don't know why, but using qr(object-cache enabled), and then grep { NEEDED_STRING } didn't seem to work. Using @{[...]} allows you to interpolate constants.
Instead of constants, you might want to be able to pass in the error string and the name of the file using GetOptions.
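A sketch of that variant using Getopt::Long (the option names here are made up for illustration):
use strict;
use warnings;
use autodie;
use Getopt::Long;

# Hypothetical option names; adjust to taste.
GetOptions(
    'file=s'   => \my $file,
    'string=s' => \my $string,
) or die "Usage: $0 --file output.txt --string 'object-cache enabled'\n";

open my $fh, '<', $file;
my $found;
while (<$fh>) {
    if (/\Q$string/) { $found = 1; last }
}
close $fh;
die "Unable to find string: $string\n" unless $found;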
I used the old-fashioned <...> file handling instead of IO::File, but that's because I'm an old fogy who learned Perl back in the 20th century, before it was cool. You can use IO::File, which is probably better and more modern.
ADDENDUM
Any reason for slurping the entire file in memory? - Leonardo Herrera
As long as the file is reasonably sized (say 100,000 lines or so), reading the entire file into memory shouldn't be that bad. However, you could use a loop:
use strict;
use warnings;
use feature qw(say);
use autodie;

use constant {
    OUTPUT_FILE   => 'output.txt',
    NEEDED_STRING => qr(object-cache enabled),
};

open my $out_fh, "<", OUTPUT_FILE;
my $output_string_found;    # Flag to see if output string is found
while ( my $line = <$out_fh> ) {
    if ( $line =~ NEEDED_STRING ) {
        $output_string_found = "Yup!";
        last;               # We found the string. No more looping.
    }
}
die qq(ERROR, ERROR, ERROR) unless $output_string_found;
This will work with the constant NEEDED_STRING defined as a quoted regexp.
perl -ne '/object-cache enabled/ and $found++; END{ print "Object cache disabled\n" unless $found}' < input_file
This just reads the file a line at a time; if we find the key phrase, we increment $found. At the end, after we've read the whole file, we print the message unless we found the phrase.
If the message is insufficient, you can exit 1 unless $found instead.
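For example, a variant that signals the result purely through the exit status (same scan, just swapping the print for an exit):
perl -ne '/object-cache enabled/ and $found++; END { exit 1 unless $found }' < input_file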
I suggest this because there are two things to learn from this:
Perl provides good tools for doing basic filtering and data munging right at the command line.
Sometimes a simpler approach gets a solution out better and faster.
This absolutely isn't the perfect solution for every possible data extraction problem, but for this particular one, it's just what you need.
The -ne option flags tell Perl to set up a while loop that reads standard input a line at a time, and to drop the code that follows into the middle of that loop, resulting in a "run this pattern match on each line of the file" program in a single command line.
END blocks can occur anywhere and are always run at the end of the program only, so defining it inside the while loop generated by -n is perfectly fine. When the program runs out of lines, we fall out the bottom of the while loop and run out of program, so Perl ends the program, triggering the execution of the END block to print (or not) the warning.
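In other words, the one-liner above behaves roughly like this expanded script (a sketch of the loop that -n wraps around the code, per perlrun):
#!/usr/bin/perl
# Roughly what `perl -ne '...'` expands to (one-liners run without strict):
LINE:
while (<>) {
    /object-cache enabled/ and $found++;
}
END { print "Object cache disabled\n" unless $found }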
If the file you are searching contained a string that indicated the cache was disabled (the condition you want to catch), you could go even shorter:
perl -ne '/object-cache disabled/ and die "Object cache disabled\n"' < input_file
The program would scan the file only until it saw the indication that the cache was disabled, and would exit abnormally at that point.
First, why are you using Net::SSH::Expect? Are you executing a remote command? If not, all you need to execute a program and wait for its completion is system.
system("cmd > file.txt") or die "Couldn't execute: $!";
Second, it appears that what fails is your regular expression. You are searching for the literal expression standard 1.1.1.1 but in your sample text it appears that the wanted string contains either tabs or several spaces instead of a single space. Try changing your call to your find_string function:
find_string('hostrules.txt', 'standard\s+1.1.1.1'); # note '\s+' here
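One caveat, though: find_string applies \Q to its argument, so the \s+ above would be matched literally. A sketch of a variant that accepts a precompiled pattern instead (the name find_pattern is just for illustration):
sub find_pattern {
    my ($file, $pattern) = @_;
    open my $fh, '<', $file or die "Can't open $file: $!";
    while (<$fh>) {
        return 1 if /$pattern/;
    }
    die "Unable to find pattern: $pattern";
}

find_pattern('hostrules.txt', qr/standard\s+1\.1\.1\.1/i);  # /i also covers "Standard"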

perl - string compare failing while fetching a line from a file.

My code:
#!/usr/bin/perl -w
use strict;
use warnings;
my $codes=" ";
my $count=0;
my $str1="code1";
open (FILE, '/home/vpnuser/testFile.txt') or die("Could not open the file.");
while ($codes = <FILE>)
{
    print($codes);
    if ($codes eq $str1)
    {
        $count++;
    }
}
print "$count";
The comparison always fails. My testFile.txt contains one simple line: code1.
When I wrote a separate Perl script with the two strings declared in the script itself rather than read from a file, the eq operator worked fine; but when the string comes from a file, there is a problem. Please help!
Thanks in advance!
Don't forget to chomp your file input if you don't want it to end in a return character.
while(my $codes = <FILE>)
{
chomp $codes;
That is likely the reason why your string comparison is failing.
As an additional aside, kudos for including use strict; and use warnings; at the top of your script, as one always should.
I'd also recommend including use autodie; at the top when doing file processing. It automatically gives you a detailed error message for many kinds of operations, such as opening a file, so you won't have to remember to include the error code $! or the filename in your die statement.
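Putting the chomp and autodie suggestions together with a lexical filehandle, a sketch of the corrected loop might look like this:
#!/usr/bin/perl
use strict;
use warnings;
use autodie;          # open/close now die with a useful message on failure

my $count = 0;
my $str1  = "code1";

open my $fh, '<', '/home/vpnuser/testFile.txt';
while (my $codes = <$fh>) {
    chomp $codes;                 # strip the trailing newline before comparing
    $count++ if $codes eq $str1;
}
close $fh;
print "$count\n";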

How to send STDIN (multiple arguments) to an external process and work within interactive mode

The external program has an interactive mode asking for some details, and each passed argument must be confirmed with the return key. So far I've managed to pass an argument to the external process; the problem is that when more than one argument is passed, Perl executes them all only when you close the pipe.
That's impractical in an interactive mode where arguments are supposed to be passed one by one.
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Open2;
open(HANDLE, "|cmd|");
print HANDLE "time /T\n";
print HANDLE "date /T\n";
print HANDLE "dir\n";
close HANDLE;
Unfortunately you can't pass double pipes into open as one would like, and loading IPC::Open2 doesn't fix that. You have to use the open2 function exported by IPC::Open2.
use strict;
use warnings;
use IPC::Open2;
use IO::Handle; # so we can call methods on filehandles
my $command = 'cat';
open2( my $out, my $in, $command ) or die "Can't open $command: $!";
# Set both filehandles to print immediately and not wait for a newline.
# Just a good idea to prevent hanging.
$out->autoflush(1);
$in->autoflush(1);
# Send lines to the command
print $in "Something\n";
print $in "Something else\n";
# Close input to the command so it knows nothing more is coming.
# If you don't do this, you risk hanging reading the output.
# The command thinks there could be more input and will not
# send an end-of-file.
close $in;
# Read all the output
print <$out>;
# Close the output so the command process shuts down
close $out;
This pattern works if all you have to do is send a command a bunch of lines and then read the output once. If you need to be interactive, it's very very easy for your program to hang waiting for output that is never coming. For interactive work, I would suggest IPC::Run. It's rather overpowered, but it will cover just about everything you might want to do with an external process.
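For completeness, a minimal IPC::Run sketch of that interactive send/expect pattern (using cat as a stand-in for the real interactive program):
use strict;
use warnings;
use IPC::Run qw(start pump finish);

my @cmd = ('cat');                 # stand-in for the interactive program
my ($in, $out) = ('', '');

# start() launches the process with scalar refs acting as its stdin/stdout.
my $h = start \@cmd, \$in, \$out;

# Send one line, then pump until the process has answered.
$in .= "first argument\n";
pump $h until $out =~ /\n/;
print "got: $out";
$out = '';

# Only now send the next line, exactly as in an interactive session.
$in .= "second argument\n";
pump $h until $out =~ /\n/;
print "got: $out";

finish $h or die "command exited with status $?";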

Is there a module that searches for superfluous code?

Is there a module which can find code that isn't needed?
As an example a script with code not needed to run the script:
#!/usr/bin/env perl
use warnings;
use 5.12.0;
use utf8;
binmode STDOUT, ':utf8';
use DateTime;
use WWW::Mechanize;
sub my_print {
    my ( $string, $tab, $color ) = @_;
    say $string;
}

sub check {
    my $string = shift;
    return if length $string > 10;
    return $string;
}

my_print( 'Hello World' );
Not categorically. Perl is notoriously difficult to analyze without actually executing it, to the point that compiling a Perl program to be run later actually requires including a copy of the perl interpreter! As a result there are very few static code analysis tools for Perl. What you can do is use a profiler, but this is a bit overkill (and, as I mentioned, requires actually executing the program). I like Devel::NYTProf. This will spit out some HTML files showing how many times each line or sub was executed, as well as how much time was spent there, but this only works for that specific execution of the program. It will allow you to see that WWW::Mechanize is loaded but never called, but it will not be able to tell you whether warnings or binmode had any effect on execution.
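For what it's worth, a typical Devel::NYTProf run looks something like this (yourscript.pl is a placeholder):
perl -d:NYTProf yourscript.pl   # run the program under the profiler, writing nytprof.out
nytprofhtml                     # turn nytprof.out into an HTML report in ./nytprof/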
Devel::Cover provides code coverage metrics that may be of some use here.