I'm trying to determine which of the content of a folder is a directory and which is a file, I wrote the following but the result is not what I would expect:
opendir DH, $dir or die "Cannot open Dir: $!";
my @dirs = grep !/^\.\.?$/, readdir DH;
foreach my $files (@dirs) {
    print $files."<br>";
    if ( -d $files )
    {
        print $files." is a directory<br>";
    }
}
closedir DH;
The result is something as the example below:
.file1
file.log
file3.zip
file4
file5.zip
dir1.name1.suffix1.yyyy.MM.dd.hh.mm.ss
file5.zip
file6.tar
dir2
dir3.name1.suffix1.yyyy.MM.dd.hh.mm.ss
where the items starting with dir are actual directories. So my question is: why does the if fail to identify them as such?
What am I doing wrong?
$dir is missing...
if ( -d "$dir/$files" )
{
print $files." is a directory<br>";
}
It's easiest to chdir to $dir so that you don't have to prefix the node names with the path. You can also use autodie if you are running Perl v5.10.1 or better. Finally, if you use $_ as your loop control variable (the file/directory names), you can omit it from the parameters of print, -d, and regex matches.
Like this
use strict;
use warnings;
use v5.10.1;
use autodie;
my ($dir) = @ARGV;

opendir my $dh, $dir;
chdir $dh;

while ( readdir $dh ) {
    next if /\A\.\.?\z/;
    print;
    print " is a directory" if -d;
    print "<br/>\n";
}
Update
In view of ikegami's (deleted) comment about returning back to the original working directory, here's an example of using the File::chdir module to do this tidily. It exports a tied variable $CWD which will change your working directory if you assign to it. You can also localise it, so just wrapping the above code in braces and adding a new local value for $CWD keeps things neat. Note that File::chdir is not a core module so you will likely need to install it
Note however that there is still a very small possibility that the process may be started with a present working directory that it cannot chdir to. This module will not solve that problem
use strict;
use warnings;
use v5.10.1;
use autodie;
use File::chdir;
my ($dir) = @ARGV;

{
    opendir my $dh, $dir;
    local $CWD = $dir;

    while ( readdir $dh ) {
        next if /\A\.\.?\z/;
        print;
        print " is a directory" if -d;
        print "<br/>\n";
    }
}   # local $CWD expires here; the working directory returns to its original value
Related: Why can't I open files returned by Perl's readdir? (this question was closed as a duplicate of that one)
I have a problem with a Perl script, as follows.
I must open and analyze all the *.txt files in a directory, but I cannot.
I can read the file names, which are saved in the @files array and printed, but I cannot open those files for reading.
This is my code:
my $dir = "../Scrivania/programmi";

opendir my ($dh), $dir;
my @files = grep { -f and /\.txt/i } readdir $dir;
closedir $dh;

for my $file ( @files ) {
    $file = catfile($dir, $file);
    print qq{Opening "$file"\n};
    open my $fh, '<', $file;
    # Do stuff with the data from $fh
    print "sono nel foreach\n";
    print " in : " . "$fh\n";
    #open(CANALI,$fh);
    #@righe = <CANALI>;
    #close(CANALI);
    #print "canali:" . "@righe\n";
    #foreach $canali (@righe)
    #{
    #    $canali =~ /\d\d:\d\d (-) (.*)/;
    #    $ora = $1;
    #
    #    if ($hhSplit[0] == $ora)
    #    {
    #        push(@output, "$canali");
    #    }
    #}
}
The main problem you have is that the file names returned by readdir have no path, so you're trying to open, say, x.txt when you should be opening ../Sc/direct/x.txt. The file doesn't exist in the current working directory so your open call fails
You also have a strange mixture of stuff in glob("$dir/(.*).txt/") which looks a little like a regex pattern, which glob doesn't understand. The value of $dir is a directory handle left open from the opendir on the first line. What you should be using is glob '../Sc/direct/*.txt', but then there's no need for the readdir
There are two ways to find the contents of a directory. You can use opendir and readdir to read everything in the directory, or you can use glob.
The first method returns only the bare name of each entry, which means you must concatenate each name with the path to the containing directory, preferably using catfile from File::Spec::Functions. It also includes the pseudo-directories . and .. so you must filter those out before you can use the list of names
glob has neither of these disadvantages. All the strings it returns are real directory entries, and they will include a path if you provided one in the pattern you passed as a parameter
You seem to have become rather muddled over the two, so I have written this program which differentiates between the two approaches. I hope it makes things clearer
use strict;
use warnings;
use v5.10.1;
use autodie;

use File::Spec::Functions qw/ catfile /;

my $dir = '../Sc/direct';

### Using glob

for my $file ( glob catfile($dir, '*.txt') ) {
    print qq{Opening "$file"\n};
    open my $fh, '<', $file;
    # Do stuff with the data from $fh
}

### Using opendir / readdir

opendir my ($dh), $dir;
# -f must be given the path, as readdir returns bare names
my @files = grep { /\.txt$/i and -f catfile($dir, $_) } readdir $dh;
closedir $dh;

for my $file ( @files ) {
    $file = catfile($dir, $file);
    print qq{Opening "$file"\n};
    open my $fh, '<', $file;
    # Do stuff with the data from $fh
}
Using $dir in the glob is incorrect: $dir is a GLOB (a handle), not a string value. Rather, you should be looping over the @files array and looking for names that match what you want. Maybe something like this:
foreach my $fp (@files) {
    if ($fp =~ /\.txt$/) {
        print "$fp is a .txt\n";
        open my $in, '<', $fp or die "Cannot open $fp: $!";
        while (<$in>) {
            # process each line
        }
        close $in;
    }
}
I'm just a beginner in Perl. I try to rename a file or directory using the following script, but it is not renaming the file. Please help me in identifying the problem.
I'm using Perl version 5.8.4
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;
my $dir="/home/hari/perl-s/abc/";
opendir (DIR, $dir);
my @fileList = readdir DIR;

foreach (@fileList) {
    next if -d;
    my $oldname = $_;
    print "Newfile after assigning: $_ \n";
    s/(^[0-9])(.)//;
    print "Newfile: $_ \n";
    print "oldname: $oldname \n";
    rename ($oldname, $_);
}
The return values of readdir are just filenames; they do not include the path that was provided to opendir. You generally have to include that manually.
opendir (DIR, $dir);
my @fileList = readdir DIR;
foreach (@fileList) {
    # $_ is just "filename"
    $_ = "$dir/$_";   # now $_ is "/home/hari/perl-s/abc/filename"
    next if -d;
    ...
}
There's more than one way to do things in Perl. Another way to get the set of files in a directory is with the glob function. One of the advantages of glob is that you can use it in such a way that it returns filenames with their full paths, and so sometimes glob is preferable to the opendir/readdir/closedir idiom:
my @filelist = glob("$dir/*");
foreach (@filelist) {
    # $_ is "/home/hari/perl-s/abc/filename"
    ...
}
I'm trying to run this Perl script, but it is not working as required. It is supposed to store the names of folders that are in a date format (for example: 11-03-23).
I have some folders placed at this location in my account:
/hqfs/datastore/files
11-02-23 11-02-17 11-04-21
I'm storing these in "processed_dirs.dat" file.
But in the output I got "pst12345678" in processed_dirs.dat, and when I printed $dh, I got something like GLOB(0x12345).
Please help me in getting the right output.
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
# This script to be run 1 time only. Sets up 'processed' directories hash.
# After this script is run, ready to run the daily script.
my $dir = '/hqfs/datastore/files'; # or what ever directory the date-directories are stored in
opendir my $dh, $dir or die "Opening failed for directory $dir $!";
my @dir = grep {-d && /^\d\d-\d\d-\d\d$/ && $_ le '11-07-25'} readdir $dh;
closedir $dh or die "Unable to close $dir $!";
my %processed = map {$_ => 1} @dir;
store \%processed, 'processed_dirs.dat';
You are missing an argument for -d. Try -d "$dir/$_" && .... (Unless the current directory is always going to be the directory you are reading.)
There is almost no reason to ever use store instead of Storable::nstore.
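Putting both fixes together, a runnable sketch; the temporary directory layout below is invented purely for illustration:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use Storable qw(nstore retrieve);

# Throwaway stand-in for '/hqfs/datastore/files'
my $dir = tempdir( CLEANUP => 1 );
mkdir "$dir/11-02-23" or die $!;
mkdir "$dir/11-08-01" or die $!;              # past the cut-off date
open my $fh, '>', "$dir/11-03-03" or die $!;  # a plain file, not a directory
close $fh;

opendir my $dh, $dir or die "Opening failed for directory $dir $!";
# -d "$dir/$_" tests the entry inside $dir rather than the current directory
my @dir = grep { -d "$dir/$_" && /^\d\d-\d\d-\d\d$/ && $_ le '11-07-25' }
          readdir $dh;
closedir $dh or die "Unable to close $dir $!";

my %processed = map { $_ => 1 } @dir;
nstore \%processed, 'processed_dirs.dat';     # network byte order, portable
```

Only 11-02-23 survives the grep: 11-08-01 fails the date cut-off, and 11-03-03 fails the -d test.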
Why were you trying to print dh?
$dh is a directory handle object. There's nothing useful you can get by printing it.
The output of Storable::store is not intended to be human-readable. If you're expecting something readable in processed_dirs.dat, don't: you will need to use Storable::retrieve to fetch it back out through Perl, or Data::Dumper to print the variable in a readable format.
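A minimal round-trip sketch (the hash contents here are made up for illustration):

```perl
use strict;
use warnings;
use Storable qw(nstore retrieve);
use Data::Dumper;

# Hypothetical stand-in for the real 'processed' directories hash
my %processed = ( '11-02-23' => 1, '11-02-17' => 1 );
nstore \%processed, 'processed_dirs.dat';   # the file on disk is binary

# Fetch it back through Perl...
my $href = retrieve('processed_dirs.dat');

# ...and print it in a human-readable form
print Dumper($href);
```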
This implementation works and gives you accurate information.
#!/usr/bin/perl
use strict;
use warnings;

my $dir = '/Volumes/Data/Alex/';

opendir my $dh, $dir
    or die "Cannot open dir: $!";

my @result = ();
foreach ( readdir $dh )
{
    if ( ! /^\d{2}-\d{2}-\d{2}$/ ) { next; } else { push @result, $_; }
}
I usually use something like
my $dir="/path/to/dir";
opendir(DIR, $dir) or die "can't open $dir: $!";
my #files = readdir DIR;
closedir DIR;
or sometimes I use glob, but anyway, I always need to add a line or two to filter out . and .. which is quite annoying.
How do you usually go about this common task?
my @files = grep {!/^\./} readdir DIR;
This will exclude all the dotfiles as well, but that's usually What You Want.
I often use File::Slurp. Benefits include: (1) it dies automatically if the directory does not exist; (2) it excludes . and .. by default. Its behavior is like readdir in that it does not return the full paths.
use File::Slurp qw(read_dir);
my $dir = '/path/to/dir';
my #contents = read_dir($dir);
Another useful module is File::Util, which provides many options when reading a directory. For example:
use File::Util;
my $dir = '/path/to/dir';
my $fu = File::Util->new;
my #contents = $fu->list_dir( $dir, '--with-paths', '--no-fsdots' );
I will normally use the glob method:
for my $file (glob "$dir/*") {
    # do stuff with $file
}
This works fine unless the directory has lots of files in it. In those cases you have to switch back to readdir in a while loop (putting readdir in list context is just as bad as the glob):
opendir my $dh, $dir
    or die "could not open $dir: $!";

while (my $file = readdir $dh) {
    next if $file =~ /^[.]/;
    # do stuff with $file
}
Often though, if I am reading a bunch of files in a directory, I want to read them in a recursive manner. In those cases I use File::Find:
use File::Find;
find sub {
return if /^[.]/;
#do stuff with $_ or $File::Find::name
}, $dir;
If some of the dotfiles are important,
my @files = grep !/^\.\.?$/, readdir DIR;
will only exclude . and ..
When I just want the files (as opposed to directories), I use grep with a -f test:
my @files = grep { -f } readdir $dir;
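Note that a bare -f only works when the current working directory is the one being read; for any other directory, prefix each name with the path. A runnable sketch using a throwaway temp directory (the layout is invented for illustration):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# One plain file and one subdirectory, so the grep has something to filter
my $dir = tempdir( CLEANUP => 1 );
mkdir "$dir/subdir" or die $!;
open my $out, '>', "$dir/plain.txt" or die $!;
close $out;

opendir my $dh, $dir or die "can't open $dir: $!";
# -f alone would test against the cwd; prefix the path instead
my @files = grep { -f "$dir/$_" } readdir $dh;
closedir $dh;
# @files now holds only 'plain.txt'
```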
Thanks Chris and Ether for your recommendations. I used the following to read a listing of all files (excluding directories) from a directory handle referencing a directory other than my current directory into an array. The array was always missing one file when I did not use the absolute path in the grep statement.
use File::Slurp;

print "\nWhich folder do you want to replace text? ";
chomp (my $input = <>);
if ($input eq "") {
    print "\nNo folder entered exiting program!!!\n";
    exit 0;
}
opendir(my $dh, $input) or die "\nUnable to access directory $input!!!\n";
my @dir = grep { -f "$input\\$_" } readdir $dh;
I am trying to prefix a string (reference_) to the names of all the *.bmp files in all the directories as well as sub-directories. The first time we run the Silk script, it creates directories and subdirectories, and under each subdirectory it stores each mobile application's screenshot with a .bmp extension.
When I run the automated Silk script a second time, it will again create *.bmp files in all the subdirectories. Before running the script the second time, I want to prefix every *.bmp file name with the string reference_.
For example first_screen.bmp to reference_first_screen.bmp,
I have the directory structure as below:
C:\Image_Repository\BG_Images\second
...
C:\Image_Repository\BG_Images\sixth
having first_screen.bmp and first_screen.bmp files etc...
Could any one help me out?
How can I prefix all the image file names with reference_ string?
When I run the script the second time, the Perl script in Silk will take both images from the sub-directory and compare them pixel by pixel. I am trying with the code below.
Could you please guide me on how to complete this task?
#!/usr/bin/perl -w

&one;
&two;

sub one {
    use Cwd;
    my $dir = "C:\\Image_Repository";
    #print "$dir\n";
    opendir(DIR, "+<$dir") or "die $!\n";
    my @dir = readdir DIR;
    #$lines = @dir;
    delete $dir[-1];
    print "$lines\n";
    foreach my $item (@dir)
    {
        print "$item\n";
    }
    closedir DIR;
}

sub two {
    use Cwd;
    my $dir1 = "C:\\Image_Repository\\BG_Images";
    #print "$dir1\n";
    opendir(D, "+<$dir1") or "die $!\n";
    my @dire = readdir D;
    #$lines = @dire;
    delete $dire[-1];
    #print "$lines\n";
    foreach my $item (@dire)
    {
        #print "$item\n";
        $dir2 = "C:\\Image_Repository\\BG_Images\\$item";
        print $dir2;
        opendir(D1, "+<$dir2") or die " $!\n";
        my @files = readdir D1;
        #print "@files\n";
        foreach $one (@files)
        {
            $one = "reference_" . $one;
            print "$one\n";
            #rename $one, Reference_ . $one;
        }
    }
    closedir DIR;
}
I tried the open call with '+<' mode, but I am getting a compilation error for the read and write mode.
When I run this code, it shows the files in the BG_Images folder with the prefixed string, but it doesn't actually update the file names in the sub-directories.
You don't open a directory for writing. Just use opendir without the mode parts of the string:
opendir my($dir), $dirname or die "Could not open $dirname: $!";
However, you don't need that. You can use File::Find to make the list of files you need.
#!/usr/bin/perl

use strict;
use warnings;

use File::Basename;
use File::Find;
use File::Find::Closures qw(find_regular_files);
use File::Spec::Functions qw(catfile);

my( $wanted, $reporter ) = find_regular_files;
find( $wanted, $ARGV[0] );

my $prefix = 'reference_';

foreach my $file ( $reporter->() )
{
    my $basename = basename( $file );

    if( index( $basename, $prefix ) == 0 )
    {
        print STDERR "$file already has '$prefix'! Skipping.\n";
        next;
    }

    my $new_path = catfile(
        dirname( $file ),
        "$prefix$basename"
    );

    unless( rename $file, $new_path )
    {
        print STDERR "Could not rename $file: $!\n";
        next;
    }

    print $file, "\n";
}
You should probably check out the File::Find module for this - it will make recursing up and down the directory tree simpler.
You should probably be scanning the file names and modifying those that don't start with reference_ so that they do. That may require splitting the file name up into a directory name and a file name and then prefixing the file name part with reference_. That's done with the File::Basename module.
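A minimal sketch of that split-and-prefix step with File::Basename; the path used here is hypothetical, and the rename is left commented out since the real paths come from the directory scan:

```perl
use strict;
use warnings;
use File::Basename qw(fileparse);
use File::Spec::Functions qw(catfile);

# Hypothetical path for illustration
my $old = 'C:/Image_Repository/BG_Images/second/first_screen.bmp';

# Split into the file name part and the directory part
my ( $name, $dir ) = fileparse($old);

my $new = index( $name, 'reference_' ) == 0
        ? $old                                   # already prefixed; leave it alone
        : catfile( $dir, "reference_$name" );

# rename $old, $new or warn "Could not rename $old: $!";  # enable on real paths
print "$new\n";
```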
At some point, you need to decide what happens when you run the script the third time. Do the files that already start with reference_ get overwritten, or do the unprefixed files get overwritten, or what?
The reason the files are not being renamed is that the rename operation is commented out. Remember to add use strict; at the top of your script (as well as the -w option which you did use).
If you get a list of files in an array @files (and the names are base names, so you don't have to fiddle with File::Basename), then the loop might look like:
foreach my $one (@files)
{
    my $new = "reference_$one";
    print "$one --> $new\n";
    rename $one, $new or die "failed to rename $one to $new ($!)";
}
With the aid of find utility from coreutils for Windows:
$ find -iname "*.bmp" | perl -wlne"chomp; ($prefix, $basename) = split(m~\/([^/]+)$~, $_); rename($_, join(q(/), ($prefix, q(reference_).$basename))) or warn qq(failed to rename '$_': $!)"