I want to create a directory with a given name before the start of a method. On each iteration the directory should be replaced with a fresh one instead of having new entries appended.
I have used this code:
sub makingDirectoryForClient {
    my $nameOfTheClientDirectory = $_[0];
    my $directory = "D:\\ATEF\\clientfolder\\$nameOfTheClientDirectory";
    my $outputvar = mkdir $directory;
}
but still the folder is not getting replaced. Any ideas?
If mkdir appears to be doing nothing, then you should add a warning to find out why. The reason for the failure will be in the built-in variable $!, so I suggest you write your subroutine like this:
sub make_client_dir {
    my $client_dir = shift;
    my $path = "D:\\ATEF\\clientfolder\\$client_dir";

    ( my $success = mkdir $path )
        or warn qq{Unable to create directory "$path": $!};

    return $success;
}
Note that I've also modified your code to be more idiomatic.
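For example, a call site might then look like this (the client name here is made up):

make_client_dir('client42')
    or die "Could not create the client directory\n";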
I was creating a directory with the same name as one that already existed, thinking that would replace the folder with a new, empty one. But Perl does not work that way: mkdir fails if a folder of the same name already exists.
So I first deleted the directory using rmtree from File::Path and then created a new directory of the same name.
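A minimal sketch of that delete-then-recreate approach, assuming the same path layout as above:

use File::Path qw(rmtree);

sub remake_client_dir {
    my $client_dir = shift;
    my $path = "D:\\ATEF\\clientfolder\\$client_dir";

    rmtree($path) if -d $path;   # throw away the old directory and all its contents
    mkdir $path
        or warn qq{Unable to create directory "$path": $!};
    return -d $path;
}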
Related
I have Perl code to delete the files inside a directory and, afterwards, the directory itself.
find( sub {
    my $file = $File::Find::name;
    if ( -f $file ) {
        push( @file_list, $file );
    }
}, @find_dirs );
for my $file (@file_list) {
    my @stats = stat($file);
    if ( $now - $stats[9] > $AGE ) {
        unlink $file;
    }
}
But the above code deletes only the contents of the directories and subdirectories, leaving behind all the empty folders.
Could anyone please help me change the code above so that it deletes the files and also the directories?
unlink does not delete directories, only files.
Note: unlink will not attempt to delete directories unless you are superuser and the -U flag is supplied to Perl. Even if these conditions are met, be warned that unlinking a directory can inflict damage on your filesystem. Finally, using unlink on directories is not supported on many operating systems. Use rmdir instead.
You want rmdir, and you probably want to check with -d which one to use, unless you don't care about warnings.
I am only putting the code together, so you may upvote @simbabque, who answered first. Try:
finddepth( sub {
    my $file  = $File::Find::name;
    my @stats = stat($file);
    if ( -f $file && $now - $stats[9] > $AGE ) {
        unlink $file;
    }
    elsif ( -d $file ) {
        rmdir $file;
    }
}, @find_dirs );
A few comments:
File::Find will find both files and directories.
-f checks for a file; -d for a directory.
rmdir will only remove a directory if the directory is empty. That is why files must be deleted first. finddepth takes care of this.
-f and -d are simple to use, but stat may also be used for such a check (see the second field, mode, and the sketch after this list).
I have not tested the code; I cannot easily recreate your conditions.
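For instance, a stat-based directory check might look like this (a sketch; S_ISDIR is imported from Fcntl):

use Fcntl ':mode';

my $mode = ( stat $file )[2];                     # field 2 of stat is the mode
print "$file is a directory\n" if S_ISDIR($mode);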
EDIT: Now it uses finddepth instead of find because:
finddepth() works just like find() except that it invokes the &wanted function for a directory after invoking it for the directory's contents. It does a postorder traversal instead of a preorder traversal, working from the bottom of the directory tree up where find() works from the top of the tree down.
This should take care of removing the directories in order, deepest first. Some directories may still not be removed if files remain in them that do not match the delete condition. If you want them removed when empty regardless of their timestamp, then remove the if -d condition. The non-empty ones will remain. Directories that cannot be removed may issue a warning...
I'm trying to enumerate directory content and check the sizes of the files there (no recursion). So I opendir/readdir through the directory, skip certain types of entries (directories and such), and get the size of the current file with something like my $size = -s "$file_path".
However, I'm running into a weird situation: in one directory (containing all .exe files) I can't get the size of any file. The same program runs fine on another directory (.txt, .pl and similar files).
If I copy some .exe file from the first directory to the other one, its size is properly determined.
If I run the program on the first directory again, the size of that one copied .exe is properly determined, all others still fail. So it seems like some weird caching problem.
Any idea why this is happening?
Edit: according to the -f check, the .exe files whose size check fails are not plain files. However, they "become" plain files if I just copy them from that directory somewhere else; then the size check works.
The part of the code used for enumerating files:
my $dir_handle;
my $dir_entry;

my $retVal = opendir($dir_handle, "$path");
if (!$retVal)
{
    print "Unable to open directory. \n$!";
    exit(0);
}

while ($dir_entry = readdir($dir_handle))
{
    print "Current file: $dir_entry \n";
    next if (! -f $dir_entry);
    my $size_bytes = -s "$dir_entry";
    if ($size_bytes)
    {
        print "Size: $size_bytes \n";
    }
}
closedir($dir_handle);
readdir() returns the file name only, and doesn't include the path information - so if the directory is not the same as the current working dir, this will fail.
You want to include the path:
while ($dir_entry = readdir($dir_handle))
{
    print "Current file: $dir_entry \n";
    next if (! -f "$path/$dir_entry");
    my $size_bytes = -s "$path/$dir_entry";
    if ($size_bytes)
    {
        print "Size: $size_bytes \n";
    }
}
(and yes, using Unix-style path separators works fine here; feel free to use \ instead if you like escaping things)
readdir only returns the name of each directory entry. It doesn't include the path to the directory being read. For example, if you're reading /var/tmp and that directory contains a file named foo, then readdir is going to return foo, not /var/tmp/foo.
To check whether a directory entry is a file or to get its size, you have to provide a full pathname to the file, including the directory part. Unless you're specifically calling readdir on the current directory, you will need to convert each filename to a pathname:
while ($dir_entry = readdir($dir_handle))
{
    my $pn = $path . '/' . $dir_entry;
    print "Current file: $pn \n";
    next if (! -f $pn);
    my $size_bytes = -s $pn;
    ...
}
As has already been stated, your bug was in not including the path information during your file operations.
I would recommend using Path::Class to make your file and directory operations cross platform compatible, but also to automatically handle issues like this:
use strict;
use warnings;
use Path::Class;

my $path = '.';
my $dir  = dir($path);

for my $child ( $dir->children ) {
    printf "Current file: %s\n", $child->basename;
    next if !-f $child;
    if ( my $size = -s $child ) {
        print "Size: $size\n";
    }
}
When I use fcopy to copy files from one UNC path to another, it doesn't work if the target directory doesn't exist. But it works perfectly on a local path (where it creates that directory).
use File::Copy::Recursive qw(fcopy rcopy dircopy fmove rmove dirmove);
use autodie qw'fcopy rmove';

# works: the folder is created and the file is copied
fcopy("/path/to/file", "/path/to/non/existing/path");

# works too: no need to create a folder
fcopy("//path/to/UNC/Share/File", "//path/to/existing/Share");

# doesn't work
fcopy("//path/to/UNC/Share/File", "//path/to/existing/Share/with/non/existing/folder");
It dies with an error. For example, the following code:
my $file1 = "//server/existing/file";

if (! -f $file1) {
    print "$file1 does not exist";
    exit 2;
}

fcopy($file1, "//server/targetDirectory/newFolder");
dies with:
can't fcopy('//server/existing/file', '//server/targetDirectory/newFolder'): No such file or directory at test.pl line 20
Is it not possible to create directories with rcopy on Samba shares using a UNC path, or is this a bug?
This is a bug: https://rt.cpan.org/Public/Bug/Display.html?id=43328. It does work if you use a drive letter to map the remote share, but that is not always convenient. The bug was reported in 2009 and someone posted a proposed solution in 2010, but no new version including a fix has been released yet. You can try the proposed solution by adjusting your local copy of File::Copy::Recursive, changing the beginning of sub pathmk to the following:
sub pathmk {
    my ( $volume, @parts ) = File::Spec->splitpath( shift() );
    my $nofatal = shift;

    @parts = File::Spec->splitdir( File::Spec->catdir(@parts) );

    my $pth = File::Spec->catdir( $volume, $parts[0] );
    my $zer = 0;
[EDIT] I've sent the maintainer of the package an email asking to release a new version with this fix included. I checked that the fix does not break any of the tests associated with the software package.
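Until a fixed release is out, one workaround is to create the missing target directories yourself before calling fcopy, for example with make_path from File::Path (a sketch with made-up paths; it assumes make_path handles UNC paths correctly on your system):

use File::Copy::Recursive qw(fcopy);
use File::Path qw(make_path);
use File::Basename qw(dirname);

my $src = '//server/existing/file';
my $dst = '//server/targetDirectory/newFolder/file';

make_path( dirname($dst) );   # create the directories that fcopy fails to make
fcopy( $src, $dst ) or die "fcopy failed: $!";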
I've modified a script I wrote so that it now copies only .jpg files.
The script seems to work. It copies all of the .jpg files from one folder to another, but the script is meant to loop continually every X seconds.
If I add a new .jpg file to the source folder after the script has started, the script does not copy the newly added file. If I stop and restart the script, it picks up the new .jpg file, but I want the script to copy items as they are put into the folder, without having to stop and restart it.
Before I added the glob call to copy only .jpg files, the script would copy anything in the folder, even files that were moved in while the script was still running.
Why is this happening? Any help would be awesome.
Here is my code:
use File::Copy;
use File::Find;

my @source = glob("C:/sorce/*.jpg");
my $target = q{C:/target};

while (1) {
    sleep(10);
    find(
        sub {
            if (-f) {
                print "$File::Find::name -> $target";
                copy($File::Find::name, $target)
                    or die(q{copy failed:} . $!);
            }
        },
        @source
    );
}
Your @source array contains a list of file names. It should contain a list of folders to start your search in. So simply change it to:
my $source = "C:/source";
I changed it to a scalar, because it only holds one value. If you want to add more directories at a later point, an array can be used instead. Also, of course, why mix a glob and File::Find? It makes little sense, as File::Find is recursive.
The file checking is then done in the wanted subroutine:
if (-f && /\.jpg$/i)
It won't refresh its list of files if you only glob the list once.
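Putting those pieces together, the corrected script might look like this (a sketch using the question's paths):

use strict;
use warnings;
use File::Copy;
use File::Find;

my $source = 'C:/source';
my $target = 'C:/target';

while (1) {
    sleep 10;
    # find() rescans $source on every pass, so files added
    # while the script is running are picked up as well
    find(
        sub {
            if ( -f && /\.jpg$/i ) {
                print "$File::Find::name -> $target\n";
                copy( $File::Find::name, $target )
                    or die "copy failed: $!";
            }
        },
        $source
    );
}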
I prefer File::Find::Rule, and would rerun it on the directory at each iteration to refresh the list.
use File::Copy;
use File::Find::Rule;

my $source_dir = 'C:/source';
my $target_dir = 'C:/target';

while (1) {
    sleep 10;
    my @files = File::Find::Rule->file()
                                ->name('*.jpg')
                                ->in($source_dir);
    for my $file (@files) {
        copy $file, $target_dir
            or die "Copy failed on $file: $!";
    }
}
I want to locate the latest subdirectory on a network path and copy its entire contents into another folder on the network path.
We have a lot of subfolders under the folder \\10.184.132.202\projectdump. I need to sort the subfolders to find the latest one and copy its entire contents into another folder on \\10.184.132.203\baseline.
With the script below I am able to list the most recently modified folder under the directory, but I do not know how to copy its contents.
use File::stat;
use File::Copy qw(copy);

$dirname     = '\\\\10.184.132.202\\projectdump\\Testing\\';
$destination = '\\\\10.184.132.203\\baseline\\Testing\\';
$timediff = 0;

opendir DIR, "$dirname";
while (defined ($sub_dir = readdir(DIR)))
{
    if ($sub_dir ne "." && $sub_dir ne "..")
    {
        $diff = time() - stat("$dirname/$sub_dir")->mtime;
        if ($timediff == 0)
        {
            $timediff = $diff;
            $newest   = $sub_dir;
        }
        if ($diff < $timediff)
        {
            $timediff = $diff;
            $newest   = $sub_dir;
        }
    }
}
print $newest, "\n";

open my $in, '<', $newest or die $!;
while (<$in>) {
    copy *, $destination;   # <-- here I want to copy the entire contents of $newest to $destination
}
Use File::Copy::Recursive, which lets you copy entire directory trees. Unfortunately, it is not a standard Perl module, but you can install it via the cpan command.
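For example, a minimal sketch using its dircopy function, reusing the $dirname, $newest, and $destination variables from the question:

use File::Copy::Recursive qw(dircopy);

# copy the newest subdirectory, and everything in it, to the destination
dircopy( "$dirname/$newest", "$destination/$newest" )
    or die "dircopy failed: $!";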
If installing modules is a problem (sometimes it is), you can use File::Find to walk the directory tree and copy files one at a time, for example:
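A rough sketch of that File::Find fallback, again reusing the question's variables (untested against a real share):

use File::Copy qw(copy);
use File::Find;
use File::Path qw(make_path);

my $src = "$dirname/$newest";

find(
    sub {
        ( my $rel = $File::Find::name ) =~ s/^\Q$src\E//;   # path relative to $src
        if ( -d $File::Find::name ) {
            make_path("$destination/$rel");                 # recreate each subdirectory
        }
        else {
            copy( $File::Find::name, "$destination/$rel" )
                or warn "copy of $File::Find::name failed: $!";
        }
    },
    $src
);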
By the way, you can use forward slashes in Perl for Windows file names, so you don't have to double up on backslashes.
Why not call a simple shell command to find the latest dir? I think this will be much simpler in shell...
chomp( my $newestdir = `ls -1rt $dirname | tail -n 1` );   # chomp removes the trailing newline
in shell:
LATESTDIR=`ls -1rt $dirname|tail -n 1`
cp -r ${LATESTDIR}/* $destination/
Oops, I just realized that you might be using Windows...
Get all dirs and their times into a hash, then sort that hash in reverse order to find the newest one.
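Building that hash might look like this (a sketch; $dirname is the question's source directory, and the values are mtimes):

opendir my $dh, $dirname or die $!;
my %hash;
for my $entry ( readdir $dh ) {
    next if $entry eq '.' or $entry eq '..';
    my $full = "$dirname/$entry";
    $hash{$full} = ( stat $full )[9] if -d $full;   # record directories only
}
closedir $dh;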
my ($newest) = sort { $hash{$b} <=> $hash{$a} } keys %hash;

then:

opendir NDIR, $newest or die $!;
while ( my $dir = readdir NDIR ) {
    next if $dir eq '.' or $dir eq '..';
    copy "$newest/$dir", $destination;
}
closedir NDIR;