Can't call method "_compile" error when using File::Find::Rule - perl

Can't call method "_compile" without a package or object reference at /homes/sauravb/perl5/lib/perl5/File/Find/Rule.pm line 292
#!/usr/bin/perl
use strict;
use warnings;
#use File Perl Module
#use Cwd sub perl module
use File::Find::Rule;
use Cwd;
use Cwd 'chdir';
chdir "/volume/regressions/results/JUNOS/HEAD/EABUPDTREGRESSIONS/15.1/15.1F5";
my $cwd = getcwd();
my $fh;
sub fileindex {
    open($fh, ">", "/homes/sauravb/line_with_X-TESTCASE.txt") || die $!;
    my $rule = File::Find::Rule->new;
    my @files = $rule->any(File::Find::Rule->name('*.log'),
                           File::Find::Rule->in($cwd)
                );
    #print $fh map { "$_\n" } @files;
}
fileindex();

any allows you to specify alternate sets of criteria. For example, the following returns all files whose name ends with .log or .txt.
my @files =
    File::Find::Rule
    ->any(
        File::Find::Rule->name('*.log'),
        File::Find::Rule->name('*.txt'),
    )
    ->in($dir_qfn);
Instead, you passed one criterion plus the names of all the files in a directory to any. That is incorrect.
Did you simply want the list of log files in the directory? If so, you need only
my @files =
    File::Find::Rule
    ->name('*.log')
    ->in($dir_qfn);
Or if you want to make sure they're plain files (and not directories),
my @files =
    File::Find::Rule
    ->name('*.log')
    ->file
    ->in($dir_qfn);
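Applied to the script in the question, a minimal corrected sketch might look like this (same paths as in the question; writing one path per line is an assumption about the desired output):

#!/usr/bin/perl
use strict;
use warnings;
use File::Find::Rule;
use Cwd;

chdir "/volume/regressions/results/JUNOS/HEAD/EABUPDTREGRESSIONS/15.1/15.1F5"
    or die "chdir failed: $!";
my $cwd = getcwd();

sub fileindex {
    open(my $fh, ">", "/homes/sauravb/line_with_X-TESTCASE.txt") or die $!;

    # name() supplies the criterion; in() runs the search and returns paths
    my @files = File::Find::Rule
        ->file
        ->name('*.log')
        ->in($cwd);

    print $fh map { "$_\n" } @files;
    close $fh;
}

fileindex();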

Related

Traversing Through Directories

I am trying to traverse through directories to change certain file extensions in those directories.
I made it to where I can go through a directory that is given on the command line, but I cannot make it traverse that directory's subdirectories.
For example: If I want to change the file extensions in the directory Test then if Test has a subdirectory I want to be able to go through that directory and change the file extensions of those files too.
I came up with this, which works for one directory: it correctly changes the file extensions of the files in that specific directory.
#!/usr/local/bin/perl
use strict;
use warnings;
my @argv;
my $dir = $ARGV[0];
my @files = glob "${dir}/*pl";
foreach (@files) {
    next if -d;
    (my $txt = $_) =~ s/pl$/txt/;
    rename($_, $txt);
}
I then heard of File::Find::Rule, so I tried to use that to traverse through the directories.
I came up with this:
#!/usr/local/bin/perl
use strict;
use warnings;
use File::Find;
use File::Find::Rule;
my @argv;
my $dir = $ARGV[0];
my @subdirs = File::Find::Rule->directory->in( $dir );
sub fileRecurs {
    my @files = glob "${dir}/*pl";
    foreach (@files) {
        next if -d;
        (my $txt = $_) =~ s/pl$/txt/;
        rename($_, $txt);
    }
}
This does not work, and will not work, because I am not familiar enough with File::Find::Rule.
Is there a better way to traverse through the directories to change the file extensions?
#!/usr/local/bin/perl
use strict;
use warnings;
use File::Find;
my @argv;
my $dir = $ARGV[0];
find(\&dirRecurs, $dir);
sub dirRecurs {
    # find() calls this for every entry; it chdirs into each directory,
    # so $_ holds just the file name and rename works on basenames
    if (-f) {
        (my $txt = $_) =~ s/pl$/txt/;
        rename($_, $txt);
    }
}
I figured it out with the help of the tutorial @David sent me! Thank you!
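For reference, the same recursive rename can also be written with File::Find::Rule, the module attempted above; this is a sketch, not code from the thread:

#!/usr/local/bin/perl
use strict;
use warnings;
use File::Find::Rule;

my $dir = $ARGV[0];

# Collect plain files ending in "pl" anywhere under $dir (full paths are returned)
my @files = File::Find::Rule->file->name('*pl')->in($dir);

foreach my $old (@files) {
    (my $new = $old) =~ s/pl$/txt/;
    rename($old, $new) or warn "Could not rename $old: $!";
}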

Printing cwd without full path in perl?

I have a bunch of data stored in sub-directories labeled by date. I have used the Cwd module to get the current working directory so that I can print it to the file I am writing with the data recovered from the sub-directories. I am using the cwd as a prefix to the data strings. Is there a way to print only the current directory name and not the full path?
example:
Instead of printing:
/d2/aschwa/archive_data/METAR_data/20120302KDUX 121255Z.........
Is there a way to print only:
20120302KDUX 121255Z.........
Here's the code I'm using:
use strict;
use warnings;
use File::Find;
use Cwd;
my @folder = ("/d2/aschwa/archive_project/METAR_data/");
open( OUT, '>', 'KDUX_METARS.txt') or die "Could not open $!";
print OUT "Station, Day/Time, Obs Type, Wind/Gust, Vis, Sky, T/Td, Alt, Rmk\n";
print STDOUT "Finding METAR files\n";
my $criteria = sub {
    if (-e && /^2012/) {
        open(my $file, '<', $_) or die "Could not open $_ $!\n";
        my $dir = getcwd;
        while (<$file>) {
            print OUT $dir, $_ if /KDUX ....55Z|KDUX ....05Z/;
        }
    }
};
find($criteria, @folder);
close OUT;
In Perl, you can use the functions basename or fileparse to extract the last component of a path (the file or directory name).
They are included in the core module File::Basename.
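For example, applied to a date-named data directory like the ones in the question (a minimal sketch; the exact directory layout is an assumption):

use strict;
use warnings;
use File::Basename;
use Cwd;

my $dir = getcwd();            # e.g. /d2/aschwa/archive_data/METAR_data/20120302
print basename($dir), "\n";    # prints only the last component, e.g. 20120302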
Simply split, then pop.
Shamelessly stolen from perlmonks:
$ perl -e 'print pop @{[split m|/|, "/home/bin/scripts/test.pl"]};'
test.pl
Reference link: http://www.perlmonks.org/?node_id=241089
You can combine the Perl module File::Basename with Cwd to get the current directory name without the path:
perl -MCwd -MFile::Basename -e 'my $dir = cwd; print basename($dir)'
Why don't you just get the content after the last slash with a regexp like below:
$path = '/d2/aschwa/archive_data/METAR_data/20120302KDUX 121255Z.........';
$path = $1 if $path =~ m~/([^/]*)/?$~;
This is in my opinion the best way to do it. The above code is just an example, but the regexp there will do the job you want.

Get names of all the directories with similar naming pattern in Perl

I have a directory "logs" which contains sub-directories as "A1", "A2", "A3", "B1", "B2", "B3".
I want to write Perl code that searches for all the sub-directories whose names match the pattern "A", i.e. all directory names starting with the character A.
Please help me.
Use the Perl core module File::Find:
use strict;
use warnings;
use File::Find;
# Find in the 'logs' directory; assumes the script is executed from the folder that contains it
find(\&wanted, 'logs');
sub wanted {
    # Subroutine called for every file / folder found ($_ holds the name of the current entry)
    if (-d and /^A/) {
        print $_, "\n";
    }
}
Update:
If you want to parametrize the prefix, you can do this:
use strict;
use warnings;
use File::Find;
my $prefix = 'B';
find(\&wanted, 'logs');
sub wanted {
    if (-d and /^$prefix/) {
        print $_, "\n";
    }
}
File::Find is overkill for simply searching a directory. opendir/readdir still has a purpose!
This program does a chdir to the specified directory so that there is no need to build the full path from the names generated by readdir.
The directory to search and the required prefix can be passed as command-line parameters and will default to logs and A if they are not supplied.
use strict;
use warnings;
use autodie;
my ($dir, $prefix) = @ARGV ? @ARGV : qw/ logs A /;
chdir $dir;
my @wanted = do {
    opendir(my $dh, '.');
    grep { -d and /^\Q$prefix/ } readdir $dh;
};
print "$_\n" for @wanted;

File::Find to search a directory for a list of files

I'm writing a Perl script and I'm new to Perl. I have a file that contains a list of files. For each item on the list I want to search a given directory and its sub-directories to find the file and return the full path. I've been unsuccessful so far trying to use File::Find. Here's what I've got:
use strict;
use warnings;
use File::Find;
my $directory = '/home/directory/';
my $input_file = '/home/directory/file_list';
my @file_list;
find(\&wanted, $directory);
sub wanted {
    open (FILE, $input_file);
    foreach my $file (<FILE>) {
        chomp($file);
        push ( @file_list, $file );
    }
    close (FILE);
    return @file_list;
}
I find File::Find::Rule a tad easier and more elegant to use.
use File::Find::Rule;
my $path = '/some/path';
# Find all directories under $path
my #paths = File::Find::Rule->directory->in( $path );
# Find all files in $path
my #files = File::Find::Rule->file->in( $path );
The arrays contain full paths to the objects File::Find::Rule finds.
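To solve the original problem (locating each name listed in file_list somewhere under the directory), one approach is to pass the whole list of names to name(), which accepts a list of globs or literal names; a sketch, assuming one file name per line in the list file:

use strict;
use warnings;
use File::Find::Rule;

my $directory  = '/home/directory/';
my $input_file = '/home/directory/file_list';

# Read the list of file names to look for
open my $fh, '<', $input_file or die "Could not open $input_file: $!";
chomp(my @file_list = <$fh>);
close $fh;

# name() takes the whole list; in() returns the full paths of matches
my @found = File::Find::Rule->file->name(@file_list)->in($directory);

print "$_\n" for @found;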
File::Find is used to traverse a directory structure in the filesystem. Instead of doing what you're trying to do, namely, have the wanted subroutine read in the file, you should read in the file as follows:
use strict;
use warnings;
use vars qw/@file_list/;
my $directory = '/home/directory/';
my $input_file = '/home/directory/file_list';
open FILE, "$input_file" or die "$!\n";
foreach my $file (<FILE>) {
    chomp($file);
    push ( @file_list, $file );
}
# do what you need to here with the @file_list array
Okay, I re-read the doc; I had misunderstood the wanted subroutine. wanted is a subroutine that is called for every file and directory that is found. So here's my code taking that into account:
use strict;
use warnings;
use File::Find;
my $directory = '/home/directory/';
my $input_file = '/home/directory/file_list';
my @file_list;
open (FILE, $input_file);
foreach my $file (<FILE>) {
    chomp($file);
    push ( @file_list, $file );
}
close (FILE);
find(\&wanted, $directory);
sub wanted {
    if ( $_ ~~ @file_list ) {
        print "$File::Find::name\n";
    }
    return;
}
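Note that the ~~ smartmatch operator has been experimental since Perl 5.18 and has been removed from recent releases, so a plain hash lookup is a more portable membership test. A sketch of the same idea without smartmatch:

use strict;
use warnings;
use File::Find;

my $directory  = '/home/directory/';
my $input_file = '/home/directory/file_list';

open my $fh, '<', $input_file or die "Could not open $input_file: $!";
chomp(my @file_list = <$fh>);
close $fh;

# Hash lookup instead of smartmatch
my %wanted_file = map { $_ => 1 } @file_list;

find(\&wanted, $directory);

sub wanted {
    print "$File::Find::name\n" if $wanted_file{$_};
    return;
}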

change the directory and grab the xml file to parse certain data in perl

I am trying to parse a specific XML file which is located in the sub-directories of one directory. For some reason I am getting an error saying the file does not exist. If the file does not exist, the script should move on to the next sub-directory.
Here is my code:
use strict;
use warnings;
use Data::Dumper;
use XML::Simple;
my @xmlsearch = map { chomp; $_ } `ls`;
foreach my $directory (@xmlsearch) {
    print "$directory \n";
    chdir($directory) or die "Couldn't change to [$directory]: $!";
    my @findResults = `find -name education.xml`;
    foreach my $educationresults (@findResults) {
        print $educationresults;
        my $parser = new XML::Simple;
        my $data = $parser->XMLin($educationresults);
        print Dumper($data);
        chdir('..');
    }
}
ERROR
music/gitar/education.xml
File does not exist: ./music/gitar/education.xml
Using chdir the way you did makes the code IMO less readable. You can use File::Find for that:
use strict;
use warnings;
use feature 'say';
use autodie;
use File::Find;
use XML::Simple;
use Data::Dumper;
sub findxml {
    my @found;
    opendir(DIR, '.');
    my @where = grep { -d && m#^[^.]+$# } readdir(DIR);
    closedir(DIR);
    File::Find::find({wanted => sub {
        push @found, $File::Find::name if m#^education\.xml$#s && -f _;
    } }, @where);
    return @found;
}
foreach my $xml (findxml()) {
    say $xml;
    print Dumper XMLin($xml);
}
Whenever you find yourself relying on backticks to execute shell commands, you should consider whether there is a proper perl way to do it. In this case, there is.
ls can be replaced with <*>, which is a simple glob. The line:
my @array = map { chomp; $_ } `ls`;
is just a roundabout way of saying
chomp(my @array = `ls`); # chomp takes list arguments as well
But of course the proper way is
my @array = <*>; # no chomp required
Now, the simple solution to all of this is to do
for my $xml (<*/education.xml>) { # find the xml files in the immediate sub-directories
Which will cover one level of directories, with no recursion. For full recursion, use File::Find:
use strict;
use warnings;
use File::Find;
my @list;
find( sub { push @list, $File::Find::name if /^education\.xml$/i; }, "." );
for (@list) {
    # do stuff
    # @list contains full path names of education.xml files found in subdirs
    # e.g. ./music/gitar/education.xml
}
You should note that changing directories is not required, and in my experience, not worth the trouble. Instead of doing:
chdir($somedir);
my $data = XMLin($somefile);
chdir("..");
Simply do:
my $data = XMLin("$somedir/$somefile");
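Putting it together for the education.xml case, a minimal sketch (XML::Simple and Data::Dumper used as in the question):

use strict;
use warnings;
use File::Find;
use XML::Simple;
use Data::Dumper;

my @list;
find( sub { push @list, $File::Find::name if /^education\.xml$/i }, "." );

foreach my $xml (@list) {
    # Parse each file by its full relative path; no chdir needed
    my $data = XMLin($xml);
    print Dumper($data);
}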