Are both of the examples below OK, or is the second one bad style?
Case 1: Stay in top directory and use catdir to access subdirectories
#!/usr/bin/env perl
use warnings; use strict;
my $dir = 'my_dir_with_subdir';
my ( $count, $dh );
use File::Spec::Functions;
$count = 0;
opendir $dh, $dir or die $!;
while ( defined( my $file = readdir $dh ) ) {
    next if $file =~ /^\.{1,2}$/;
    my $sub_dir = catdir $dir, $file;
    if ( -d $sub_dir ) {
        opendir my $dh, $sub_dir or die $!;
        while ( defined( my $file = readdir $dh ) ) {
            next if $file =~ /^\.{1,2}$/;
            $count++;
        }
        closedir $dh or die $!;
    }
    else {
        $count++;
    }
}
closedir $dh or die $!;
print "$count\n";
Case 2: Change to subdirectories and restore top directory before exit
use Cwd;
my $old = cwd;
$count = 0;
opendir $dh, $dir or die $!;
chdir $dir or die $!;
while ( defined( my $file = readdir $dh ) ) {
    next if $file =~ /^\.{1,2}$/;
    if ( -d $file ) {
        opendir my $dh, $file or die $!;
        chdir $file or die $!;
        while ( defined( my $file = readdir $dh ) ) {
            next if $file =~ /^\.{1,2}$/;
            $count++;
        }
        closedir $dh or die $!;
        chdir '..' or die $!;    # back to $dir; a relative "chdir $dir" would fail from inside the subdirectory
    }
    else {
        $count++;
    }
}
closedir $dh or die $!;
chdir $old or die $!;
print "$count\n";
Your question is whether you should change to the directories you are going through or stay in the top level directory.
The answer is: It depends.
For example, consider File::Find. The default behavior is to indeed change directories. However, the module also provides a no_chdir option in case that is not desirable.
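For illustration, a minimal sketch of the two modes (assuming a top directory named my_dir_with_subdir, as in your example); with no_chdir the wanted callback sees full paths in $_ instead of bare names:
use File::Find;

# default: File::Find chdirs into each directory, so $_ holds the bare file name
find( sub { print "$File::Find::name\n" if -f $_ }, 'my_dir_with_subdir' );

# no_chdir: stay in the starting directory; $_ is the same as $File::Find::name (the full path)
find( { wanted => sub { print "$_\n" if -f $_ }, no_chdir => 1 }, 'my_dir_with_subdir' );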
In the case of your examples, File::Find is probably not appropriate because you do not want to recurse through all subdirectories, only one level deep. Here is a File::Slurp::read_dir-based variation on your script.
#!/usr/bin/perl
use strict; use warnings;
use File::Slurp;
use File::Spec::Functions qw( catfile );

my ($dir) = @ARGV;
my $contents = read_dir $dir;
my $count = 0;
for my $entry ( @$contents ) {
    my $path = catfile $dir, $entry;
    -f $path and ++ $count and next;            # plain file: count it and move on
    -d _ and $count += () = read_dir $path;     # subdirectory: add its number of entries (list assignment in scalar context)
}
print "$count\n";
For your example, it's best to change to subdirectories, and there is no need to change back to the original directory at the end. That's because each process has its own "current directory", so the fact that your Perl script changes its own current directory does not mean that the shell's current directory is changed; that stays unaltered.
If this were part of a larger script it would be different; my general preference then would be not to change directory, just to reduce confusion over what the current directory is at any point in the script.
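A quick way to convince yourself of the current-directory point (a minimal sketch; /tmp is just an arbitrary target):
#!/usr/bin/env perl
use strict; use warnings;
use Cwd;

print "before: ", cwd(), "\n";
chdir '/tmp' or die $!;          # only this process's current directory changes
print "after:  ", cwd(), "\n";
# Once the script exits, `pwd` in the invoking shell still prints the original
# directory; a child process cannot change its parent's current directory.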
Use File::Find, as you already proposed :)
It's almost always better to use a module for solved problems like this than to roll your own, unless you really want to learn about walking dirs...
Please correct my code; I cannot seem to open my file to parse.
The error comes from this line: open(my $fh, $file) or die "Cannot open file, $!";
Cannot open file, No such file or directory at ./sample.pl line 28.
use strict;

my $dir = $ARGV[0];
my $dp_dpd = $ENV{'DP_DPD'};
my $log_dir = $ENV{'DP_LOG'};
my $xmlFlag = 0;
my @fileList = "";

my @not_proc_dir = `find $dp_dpd -type d -name "NotProcessed"`;
#print "@not_proc_dir\n";

foreach my $dir (@not_proc_dir) {
    chomp ($dir);
    #print "$dir\n";
    opendir (DIR, $dir) or die "Couldn't open directory, $!";
    while ( my $file = readdir DIR) {
        next if $file =~ /^\.\.?$/;
        next if (-d $file);
        # print "$file\n";
        next if $file eq "." or $file eq "..";
            if ($file =~ /.xml$/ig) {
                $xmlFlag = 1;
                print "$file\n";
                open(my $fh, $file) or die "Cannot open file, $!";
                @fileList = <$fh>;
                close $file;
            }
    }
    closedir DIR;
}
Quoting readdir's documentation:
If you're planning to filetest the return values out of a readdir, you'd better prepend the directory in question. Otherwise, because we didn't chdir there, it would have been testing the wrong file.
Your open(my $fh, $file) should therefore be open my $fh, '<', "$dir/$file" (note that I also added '<': you should always use 3-argument open).
Your next if (-d $file); is also wrong and should be next if -d "$dir/$file";
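Putting both fixes together, the relevant part of the loop would look roughly like this (a sketch only, keeping your variable names; note that the matching close should be close $fh, not close $file):
while ( my $file = readdir DIR) {
    next if $file =~ /^\.\.?$/;
    next if -d "$dir/$file";                 # prepend the directory before the filetest
    next if $file eq "." or $file eq "..";
    if ($file =~ /.xml$/ig) {
        $xmlFlag = 1;
        print "$file\n";
        open(my $fh, '<', "$dir/$file") or die "Cannot open file, $!";   # 3-argument open with the full path
        @fileList = <$fh>;
        close $fh;
    }
}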
Some additional remarks on your code:
always add use warnings to your script (in addition to use strict, which you already have)
use lexical file/directory handles rather than global ones. That is, do opendir my $DH, $dir, rather than opendir DH, $dir.
properly indent your code (if ($file =~ /.xml$/ig) { is one level too deep; it makes it harder to read your code)
next if $file =~ /^\.\.?$/; and next if $file eq "." or $file eq ".."; are redundant (even though not technically equivalent); I'd suggest using only the latter.
the variable $dir defined in my $dir = $ARGV[0]; is never used.
I'm writing a test for a function which gets a list of all *.pm files in the current directory.
Here is the function:
sub get_inspected_modules_list {
    my ( $dir ) = @_;
    opendir(my $dh, $dir) or die $!;

    my @files;
    while (my $file = readdir($dh)) {
        next unless (-f "$dir/$file");   # skip nested dirs
        next unless ($file =~ m/\.pm$/); # push only *.pm
        push @files, $file;
    }
    closedir($dh);
    return \@files;
}
I tried to use Test::MockFile::DirHandle for the test, but it prints a No such file or directory error:
subtest "get_inspected_modules_list" => sub {
    my $handle = Test::MockFile::DirHandle->new(
        "/fake/path",
        [qw/Foo.pm Bar.pm Baz.pm test.txt 1.pl/]
    );

    warn Dumper get_inspected_modules_list( '/fake/path' ); # error
};
How can I mock opendir/readdir calls?
The right usage is:
my $mocked_dir = Test::MockFile->dir("/fake/path", [ 'Foo.pm', 'bar.pl' ] );
opendir(my $dh, "/fake/path") or die $!;
while (my $file = readdir($dh)) {
    print "$file "; # will print '. .. Foo.pm bar.pl'
}
undef $mocked_dir;
So, instead of Test::MockFile::DirHandle, you should use Test::MockFile->dir.
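Applied to your subtest, that would look something like this (a sketch only, reusing the call shown above; depending on the Test::MockFile version, the -f check inside get_inspected_modules_list may also need the individual files mocked with Test::MockFile->file):
use Test::More;
use Test::MockFile;
use Data::Dumper;

subtest "get_inspected_modules_list" => sub {
    # keep the mock object alive for the duration of the test
    my $mocked_dir = Test::MockFile->dir(
        "/fake/path",
        [qw/Foo.pm Bar.pm Baz.pm test.txt 1.pl/],
    );

    warn Dumper get_inspected_modules_list( '/fake/path' );
};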
This is just a small script I am running in a continuous loop to check a directory and move every file that is there. The code works and I am running it as a background process, but for some reason I am getting the following error: '/home/srvc_ibdcoe_pcdev/Niall_Test/new_dir/..' and '/home/srvc_ibdcoe_pcdev/Niall_Test/perl_files/..' are identical (not copied) at move2.pl line 27
Any idea why it is telling me they are identical even though the paths are different?
Many thanks
The script is below.
#!/usr/bin/perl
use diagnostics;
use strict;
use warnings;
use File::Copy;

my $poll_cycle = 10;
my $dest_dir = "/home/srvc_ibdcoe_pcdev/Niall_Test/perl_files";

while (1) {
    sleep $poll_cycle;

    my $dirname = '/home/srvc_ibdcoe_pcdev/Niall_Test/new_dir';

    opendir my $dh, $dirname
        or die "Can't open directory '$dirname' for reading: $!";

    my @files = readdir $dh;
    closedir $dh;

    if ( grep( !/^[.][.]?$/, @files ) > 0 ) {
        print "Dir is not empty\n";

        foreach my $target (@files) {
            # Move file
            move("$dirname/$target", "$dest_dir/$target");
        }
    }
}
You need to filter out the special .. and . entries from @files.
#!/usr/bin/perl
use diagnostics;
use strict;
use warnings;
use File::Copy;

my $poll_cycle = 10;
my $dest_dir = "/home/srvc_ibdcoe_pcdev/Niall_Test/perl_files";

while (1) {
    sleep $poll_cycle;

    my $dirname = '/home/srvc_ibdcoe_pcdev/Niall_Test/new_dir';

    opendir my $dh, $dirname
        or die "Can't open directory '$dirname' for reading: $!";

    my @files = grep !/^[.][.]?$/, readdir $dh;
    closedir $dh;

    if (@files) {
        print "Dir is not empty\n";

        foreach my $target (@files) {
            # Move file
            move("$dirname/$target", "$dest_dir/$target");
        }
    }
}
The message you see is correct. Both paths resolve to the same directory because of the ..; both resolve to /home/srvc_ibdcoe_pcdev/Niall_Test
.. refers to the directory's parent directory.
Currently in a perl script I am using the glob function to get a list of files with specific extensions.
my @filearray = glob("$DIR/*.abc $DIR/*.llc");
Is there any alternative to glob for getting the list of files with specific extensions from a folder? If so, please provide an example. Thank you.
Yes, there are much more complicated ways, like opendir, readdir and a regex filter. They will also give you the hidden files (or dotfiles):
opendir DIR, $DIR or die $!;
my @filearray = grep { /\.(abc|llc)$/ } readdir DIR;
closedir DIR;
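One difference from the glob version worth noting: readdir returns bare file names, while your glob pattern yields paths already prefixed with $DIR. If you need full paths, add a map, for example:
opendir my $dh, $DIR or die $!;
my @filearray = map { "$DIR/$_" } grep { /\.(abc|llc)$/ } readdir $dh;
closedir $dh;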
# Using:
opendir(DIR, $dir) || die "$!";
my @files = grep(/\.(abc|lic)$/, readdir(DIR));
closedir(DIR);
# Reference: CPAN
use Path::Class; # Exports dir() by default
my $dir = dir('foo', 'bar'); # Path::Class::Dir object
my $dir = Path::Class::Dir->new('foo', 'bar'); # Same thing
my $file = $dir->file('file.txt'); # A file in this directory
my $handle = $dir->open;
while (my $file = $handle->read)
{
    $file = $dir->file($file); # Turn into Path::Class::File object
    ...
}
# Reference: http://accad.osu.edu/~mlewis/Class/Perl/perl.html#cd
#!/usr/local/bin/perl
# search for a file in all subdirectories
use Cwd;    # for getcwd()

if ($#ARGV != 0) {
    print "usage: findfile filename\n";
    exit;
}

$filename = $ARGV[0];

# look in current directory
$dir = getcwd();    # getcwd() returns no trailing newline, so no chop() is needed

&searchDirectory($dir);

sub searchDirectory
{
    local($dir);
    local(@lines);
    local($line);
    local($file);
    local($subdir);

    $dir = $_[0];

    # check for permission
    if(-x $dir)
    {
        # search this directory
        @lines = `cd $dir; ls -l | grep $filename`;
        foreach $line (@lines)
        {
            $line =~ /\s+(\S+)$/;
            $file = $1;
            print "Found $file in $dir\n";
        }

        # search any sub directories
        @lines = `cd $dir; ls -l`;
        foreach $line (@lines)
        {
            if($line =~ /^d/)
            {
                $line =~ /\s+(\S+)$/;
                $subdir = $dir."/".$1;
                &searchDirectory($subdir);
            }
        }
    }
}
Please try another one:
use Cwd;
use File::Find;

my $dir = getcwd();
my @abclicfiles;
find(\&wanted, $dir);

sub wanted
{
    push(@abclicfiles, $File::Find::name) if($File::Find::name=~m/\.(abc|lic)$/i);
}

print join "\n", @abclicfiles;
This version gets the directory from the user:
print "Please enter the directory: ";
my $dir = <STDIN>;
chomp($dir);
opendir(DIR, $dir) || die "Couldn't able to read dir: $!";
my @files = grep(/\.(txt|lic)$/, readdir(DIR));
closedir(DIR);
print join "\n", #files;
I'm trying to get the names of all directories in the specified path.
I tried the following, but that gives me every level down, not just the directories at the path I specified:
find(\&dir_names, "C:\\mydata\\");
sub dir_names {
    print "$File::Find::dir\n" if(-f $File::Find::dir,'/');
}
my @dirs = grep { -d } glob 'C:\mydata\*';
Use opendir instead
opendir DIR, $dirname or die "Couldn't open dir '$dirname': $!";
my @files = readdir(DIR);
closedir DIR;
#next processing...
EDIT:
"This will give all the files, not just the directories. You'd still have to grep."
Yes, and in that case you can just use a file test operator to see whether each entry is a directory or not.
In Windows:
$dirname="C:\\";
opendir(DIR, $dirname);
@files = readdir(DIR);
closedir DIR;

foreach $key (@files)
{
    if(-d "$dirname\\$key")
    {
        print "$key\n";
    }
}
See chapter 2 Filesystems from Automating System Administration with Perl. That provides us with this:
sub ScanDirectory{
    my ($workdir) = shift;

    chdir($workdir) or die "Unable to enter dir $workdir:$!\n";
    opendir(DIR, ".") or die "Unable to open $workdir:$!\n";
    my @names = readdir(DIR) or die "Unable to read $workdir:$!\n";
    closedir(DIR);

    foreach my $name (@names){
        next if ($name eq ".");
        next if ($name eq "..");

        if (-d $name){ # is this a directory?
            # Whatever you want to do goes here.
        }
    }
}
glob or readdir would probably be my choice too. Another way to do it is to use the Windows dir command to do the job:
my @dirs = qx(dir /AD /B);
chomp @dirs;