I'm trying to enumerate a directory's contents and check the sizes of the files there (no recursion). So I opendir/readdir through the directory, skip certain kinds of entries (directories and such), and use something like my $size = -s "$file_path"; to get the size of the current file.
However, I'm running into a weird situation: in one directory (containing only .exe files) I can't get the size of any file. The same program runs fine on another directory (.txt, .pl and similar files).
If I copy some .exe file from the first directory to the other one, its size is properly determined.
If I run the program on the first directory again, the size of that one copied .exe is properly determined, all others still fail. So it seems like some weird caching problem.
Any idea why this is happening?
Edit: According to the -f check, the .exe files whose size check fails are not plain files. However, they "become" plain files if I just copy them from that directory somewhere else; then the size check works.
The part of the code used for enumerating files:
my $dir_handle;
my $dir_entry;

my $retVal = opendir($dir_handle, "$path");
if (!$retVal)
{
    print "Unable to open directory. \n$!";
    exit(0);
}

while ($dir_entry = readdir($dir_handle))
{
    print "Current file: $dir_entry \n";
    next if (! -f $dir_entry);

    my $size_bytes = -s "$dir_entry";
    if ($size_bytes)
    {
        print "Size: $size_bytes \n";
    }
}

closedir($dir_handle);
readdir() returns the file name only, and doesn't include the path information - so if the directory is not the same as the current working dir, this will fail.
You want to include the path:
while ($dir_entry = readdir($dir_handle))
{
    print "Current file: $dir_entry \n";
    next if (! -f "$path/$dir_entry");

    my $size_bytes = -s "$path/$dir_entry";
    if ($size_bytes)
    {
        print "Size: $size_bytes \n";
    }
}
(and yes, using Unix-style path separators works fine here; feel free to use \ instead if you like escaping things)
readdir only returns the name of each directory entry. It doesn't include the path to the directory being read. For example, if you're reading /var/tmp and that directory contains a file named foo, then readdir is going to return foo, not /var/tmp/foo.
To check whether a directory entry is a file or to get its size, you have to provide a full pathname to the file, including the directory part. Unless you're specifically calling readdir on the current directory, you will need to convert each filename to a pathname:
while ($dir_entry = readdir($dir_handle))
{
    my $pn = $path . '/' . $dir_entry;
    print "Current file: $pn \n";
    next if (! -f $pn);
    my $size_bytes = -s $pn;
    ...
}
As has already been stated, your bug was in not including the path information during your file operations.
I would recommend using Path::Class, both to make your file and directory operations cross-platform compatible and to automatically handle issues like this:
use strict;
use warnings;
use Path::Class;
my $path = '.';
my $dir = dir($path);
for my $child ( $dir->children ) {
    printf "Current file: %s\n", $child->basename;
    next if !-f $child;

    if ( my $size = -s $child ) {
        print "Size: $size\n";
    }
}
Related
I want to create a directory with a certain name before the beginning of a method. At each iteration the directory should be replaced with new entries instead of getting appended.
I have used this code:
sub makingDirectoryForClient {
    $nameOfTheClientDirectory = $_[0];
    my $directory = "D:\\ATEF\\clientfolder\\$nameOfTheClientDirectory";
    my $outputvar = mkdir $directory;
}
but still the folder is not getting replaced. Any ideas?
If mkdir appears to be doing nothing then you should code a warning statement to find out why. The reason for failure will be in built-in variable $!, so I suggest you write your subroutine like this
sub make_client_dir {
    my $client_dir = shift;
    my $path = "D:\\ATEF\\clientfolder\\$client_dir";

    ( my $success = mkdir $path ) or
        warn qq{Unable to create directory "$path": $!};

    return $success;
}
Note that I've also modified your code to be more idiomatic.
I was creating a directory with the same name as one that already existed, since I thought that would replace the folder with a new empty one. But Perl does not work that way: mkdir fails if a folder of the same name already exists.
So I first deleted the directory using rmtree from File::Path and then created a new directory of the same name.
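For reference, a minimal sketch of that delete-then-recreate approach (the path and helper name here are illustrative only, based on the code above):

use strict;
use warnings;
use File::Path qw(rmtree);

# Illustrative helper: remove any existing client directory, then recreate it empty.
sub remake_client_dir {
    my $client_dir = shift;
    my $path = "D:/ATEF/clientfolder/$client_dir";    # forward slashes work on Windows too

    rmtree($path) if -d $path;                         # delete the old tree and its contents
    mkdir $path
        or warn qq{Unable to create directory "$path": $!};

    return -d $path;
}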
I have Perl code to delete the files inside a directory and then the directory itself.
find( sub {
    my $file = $File::Find::name;
    if ( -f $file ) {
        push(@file_list, $file);
    }
}, @find_dirs );

for my $file (@file_list) {
    my @stats = stat($file);
    if ($now - $stats[9] > $AGE) {
        unlink $file;
    }
}
But the above code deletes only the contents of the directories and subdirectories, leaving behind all the empty folders.
Could anyone please help me change the code above so that it deletes the files and also the directories?
unlink does not delete directories, only files.
Note: unlink will not attempt to delete directories unless you are
superuser and the -U flag is supplied to Perl. Even if these
conditions are met, be warned that unlinking a directory can inflict
damage on your filesystem. Finally, using unlink on directories is not
supported on many operating systems. Use rmdir instead.
You want rmdir, and you probably want to check with -d which one to use, unless you don't care about warnings.
I am only putting the code together, so you may upvote @simbabque, who answered first. Try:
finddepth( sub {
    my $file = $File::Find::name;
    my @stats = stat($file);

    if ( -f $file && $now - $stats[9] > $AGE ) {
        unlink $file;
    }
    elsif ( -d $file ) {
        rmdir $file;
    }
}, @find_dirs );
A few comments:
File::Find will find both files and directories.
-f checks for a file; -d for a directory.
rmdir will only remove a directory if the directory is empty. That is why files must be deleted first. finddepth takes care of this.
-f and -d are simple to use, but stat may also be used for such a check (see the mode field, $stats[2]; a short sketch follows these comments).
I have not tested the code; I cannot easily recreate your conditions.
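For example, a stat-based type check might look like this (just a sketch illustrating the comment above, not part of the answer's code):

use Fcntl ':mode';    # imports S_ISDIR, S_ISREG, etc.

my @stats = stat($file);
if ( S_ISDIR( $stats[2] ) ) {
    # it's a directory
}
elsif ( S_ISREG( $stats[2] ) ) {
    # it's a plain file
}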
EDIT: Now it uses finddepth instead of find because:
finddepth() works just like find() except that it invokes the &wanted function for a directory after invoking it for the directory's contents. It does a postorder traversal instead of a preorder traversal, working from the bottom of the directory tree up where find() works from the top of the tree down.
This should take care of removing the directories in order, deepest first. Some directories may still not be removed if files remain in them that do not match the delete condition. If you want them removed when empty regardless of their timestamp, then remove the if -d condition. The non-empty ones will remain. Directories that cannot be removed may issue a warning...
I've modified some script that I've written to now only copy .jpg files.
The script seems to work: it copies all of the .jpg files from one folder to another, and it is meant to loop continually every X seconds.
If I add a new .jpg file to the source folder after I have already started the script, it does not copy the newly added file. If I stop and restart the script, the new .jpg file is copied, but I want the script to pick up items as they are put into the folder without having to stop and restart it.
Before I added the glob call to copy only .jpg files, the script would copy anything in the folder, even files moved into the folder while the script was still running.
Why is this happening? Any help would be awesome.
Here is my code:
use File::Copy;
use File::Find;

my @source = glob ("C:/source/*.jpg");
my $target = q{C:/target};

while (1)
{
    sleep (10);
    find(
        sub {
            if (-f) {
                print "$File::Find::name -> $target";
                copy($File::Find::name, $target)
                    or die(q{copy failed:} . $!);
            }
        },
        @source
    );
}
Your @source array contains a list of file names. It should contain a list of folders to start your search in. So simply change it to:
my $source = "C:/source";
I changed it to a scalar, because it only holds one value. If you want to add more directories at a later point, an array can be used instead. Also, of course, why mix a glob and File::Find? It makes little sense, as File::Find is recursive.
The file checking is then done in the wanted subroutine:
if (-f && /\.jpg$/i)
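Putting those two changes together, the loop might look something like this (an untested sketch along the lines of the question's code):

use strict;
use warnings;
use File::Copy;
use File::Find;

my $source = 'C:/source';
my $target = 'C:/target';

while (1) {
    sleep 10;
    find(
        sub {
            # only plain files whose name ends in .jpg (case-insensitive)
            return unless -f && /\.jpg$/i;
            print "$File::Find::name -> $target\n";
            copy( $File::Find::name, $target )
                or die "copy failed: $!";
        },
        $source
    );
}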
It won't refresh its list of files if you only glob the list once.
I prefer to use File::Find::Rule, and would use that for each iteration on the directory instead to update the list.
use File::Copy;
use File::Find::Rule;

my $source_dir = 'C:/source';
my $target_dir = 'C:/target';

while (1) {
    sleep 10;

    my @files = File::Find::Rule->file()
                                ->name( '*.jpg' )
                                ->in( $source_dir );

    for my $file (@files) {
        copy $file, $target_dir
            or die "Copy failed on $file: $!";
    }
}
I want to locate the latest subdirectory on a network path and copy its entire contents into another folder on the network.
We have a lot of subfolders under the folder \\10.184.132.202\projectdump. I need to sort the subfolders to find the latest one and copy its entire contents into another folder on \\10.184.132.203\baseline.
With the script below I am able to find the most recently modified folder under the directory, but I don't know how to copy its contents.
use File::stat;
use File::Copy qw(copy);

$dirname = '\\\\10.184.132.202\\projectdump\\Testing\\';
$destination = '\\\\10.184.132.203\\baseline\\Testing\\';
$timediff = 0;

opendir DIR, "$dirname";
while (defined ($sub_dir = readdir(DIR)))
{
    if ($sub_dir ne "." && $sub_dir ne "..")
    {
        $diff = time() - stat("$dirname/$sub_dir")->mtime;
        if ($timediff == 0)
        {
            $timediff = $diff;
            $newest = $sub_dir;
        }
        if ($diff < $timediff)
        {
            $timediff = $diff;
            $newest = $sub_dir;
        }
    }
}
print $newest, "\n";

open my $in, '<', $newest or die $!;
while (<$in>) {
    copy *, $destination;   # --------> Here I want to copy the entire contents of $newest to $destination.
}
Use File::Copy::Recursive, which allows you to copy entire directory trees. Unfortunately, it is not a standard Perl module, but you can install it via the cpan command.
If installing modules is a problem (sometimes it is), you can use File::Find to go through the directory tree and copy files one at a time.
By the way, you can use forward slashes in Perl for Windows file names, so you don't have to double up on backslashes.
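If File::Copy::Recursive is available, the copy step can be as simple as this (a sketch that assumes the $dirname, $newest and $destination variables from the question's code):

use File::Copy::Recursive qw(dircopy);

# Copy the whole tree of the newest subdirectory into the destination.
# Forward slashes are fine on Windows, so the backslash escaping is unnecessary.
my $src = "$dirname/$newest";
my $dst = "$destination/$newest";

dircopy( $src, $dst )
    or die "Cannot copy $src to $dst: $!";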
Why not call a simple shell command to find the latest dir?
I think this will be much simpler in shell...
my $newestdir=`ls -1rt $dirname|tail -n 1`;
in shell:
LATESTDIR=`ls -1rt $dirname|tail -n 1`
cp -r ${LATESTDIR}/* $destination/
Oops, I just realized that you might be using Windows...
Get all dirs and their times into a hash, then sort that hash in reverse order to find the newest one:
my ($newest) = sort { $hash{$b} <=> $hash{$a} } keys %hash;
then
opendir NDIR, "$dirname/$newest";
while (defined(my $dir = readdir NDIR)) {
    next if $dir eq '.' or $dir eq '..';
    copy "$dirname/$newest/$dir", $destination;
}
closedir NDIR;
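In case it helps, here is one way the "dirs and their times into a hash" step might look (an untested sketch built on the question's $dirname):

my %hash;

opendir my $dh, $dirname or die "Cannot open $dirname: $!";
while ( defined( my $entry = readdir $dh ) ) {
    next if $entry eq '.' or $entry eq '..';
    my $full = "$dirname/$entry";
    next unless -d $full;
    $hash{$entry} = ( stat $full )[9];    # mtime of each subdirectory
}
closedir $dh;

my ($newest) = sort { $hash{$b} <=> $hash{$a} } keys %hash;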
I am struggling with a way to walk a directory tree and check for the existence of a file in multiple directories. I am using Perl, and I can only use File::Find, as I am unable to install any other modules for this.
Here's the layout of the file system I want to traverse:
Cars/Honda/Civic/Setup/config.txt
Cars/Honda/Pathfinder/Setup/config.txt
Cars/Toyota/Corolla/Setup/config.txt
Cars/Toyota/Avalon/Setup/
Note that the last Setup folder is missing a config.txt file.
Edit: also, in each of the Setup folders there are a number of other files as well that vary from Setup folder to Setup folder. There really isn't any single file to search against to get into the Setup folder itself.
So you can see that the file path stays the same except for the make and model folders. I want to find all of the Setup folders and then check to see if there is a config.txt file in that folder.
At first I was using the following code with File::Find
my $dir = '/test/Cars/';
find(\&find_config, $dir);

sub find_config {
    # find all Setup folders from the given top level dir
    if ($File::Find::dir =~ m/Setup/) {
        # create the file path of config.txt whether it exists or not; we'll check in the next line
        $config_filepath = $File::Find::dir . "/config.txt";
        # check existence of file; further processing
        ...
    }
}
You can obviously see the flaw in trying to use $File::Find::dir =~ m/Setup/ since it will return a hit for every single file in the Setup folder. Is there any way to use a -d or some sort of directory check rather than a file check? The config.txt is not always in the folder (I will need to create it if it doesn't exist) so I can't really use something like return unless ($_ =~ m/config\.txt/) since I don't know if it's there or not.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
Maybe File::Find is not the right method for something like this but I've been searching around for a while now without any good leads on working with directory names rather than file names.
File::Find finds directory names, too. You want to check for when $_ eq 'Setup' (note: eq, not your regular expression, which would also match XXXSetupXXX), and then see if there's a config.txt file in that directory ( -f "$File::Find::name/config.txt" ). If you want to avoid false matches on plain files that happen to be named Setup, check that the found 'Setup' is a directory with -d.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
use File::Spec::Functions qw( catfile );

my $dir = '/test/Cars/';
find(\&find_config, $dir);

sub find_config {
    return unless $_ eq 'Setup' and -d $File::Find::name;
    my $config_filepath = catfile $File::Find::name => 'config.txt';
    # check for existence etc
}