Perl File::Copy::Recursive fcopy doesn't create directories on UNC Paths - perl

When I use fcopy to copy files from one UNC path to another, it doesn't work if the target directory doesn't exist. But it works perfectly on a local path (creating that directory):
use File::Copy::Recursive qw(fcopy rcopy dircopy fmove rmove dirmove);
use autodie qw'fcopy rmove';
#works. Folder is created, File is copied.
fcopy ("/path/to/file", "/path/to/non/existing/path");
#works too. No need to create a folder.
fcopy ("//path/to/UNC/Share/File", "//path/to/existing/Share");
#doesn't work.
fcopy ("//path/to/UNC/Share/File", "//path/to/existing/Share/with/non/existing/folder");
The following example
my $file1 = "//server/existing/file";
if (! -f $file1 ) {
    print "$file1 does not exist";
    exit 2;
}
fcopy($file1, "//server/targetDirectory/newFolder");
dies with
can't fcopy('//server/existing/file', '//server/targetDirectory/newFolder'): No such file or directory at test.pl line 20
Is it not possible to create directories with rcopy on Samba shares using a UNC path, or is this a bug?

This is a bug: https://rt.cpan.org/Public/Bug/Display.html?id=43328. It does work if you use a drive letter to map the remote share, but that is not always convenient. The bug was reported in 2009 and someone posted a proposed solution in 2010, but no new version including a fix has been released yet. You can try the proposed solution by adjusting your local copy of File::Copy::Recursive, changing the beginning of sub pathmk to the following:
sub pathmk {
    my ( $volume, @parts ) = File::Spec->splitpath( shift() );
    my $nofatal = shift;
    @parts = File::Spec->splitdir( File::Spec->catdir(@parts) );
    my $pth = File::Spec->catdir( $volume, $parts[0] );
    my $zer = 0;
[EDIT] I've sent the maintainer of the package an email asking to release a new version with this fix included. I checked that the fix does not break any of the tests associated with the software package.
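In the meantime, a workaround that avoids the buggy pathmk entirely is to create the missing target directories yourself with File::Path::make_path before calling fcopy. A minimal sketch (demonstrated in a temp directory, since this machine has no UNC share; in the question's case $dst would be something like '//server/targetDirectory/newFolder/file'):

```perl
use strict;
use warnings;
use File::Basename qw(dirname);
use File::Copy::Recursive qw(fcopy);
use File::Path qw(make_path);
use File::Temp qw(tempdir);

# Temp-dir stand-in for the UNC source and target paths
my $tmp = tempdir(CLEANUP => 1);
my $src = "$tmp/source.txt";
open my $fh, '>', $src or die $!;
print $fh "data\n";
close $fh;

my $dst = "$tmp/new/deep/folder/source.txt";
make_path(dirname($dst));                    # create missing directories ourselves
fcopy($src, $dst) or die "fcopy failed: $!"; # fcopy now only has to copy the file
print -f $dst ? "copied\n" : "missing\n";
```

Since make_path handles UNC paths correctly, fcopy never needs to create directories itself.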

Related

Perl 5.18 | Archive::ZIP 1.68 | Recursive files not included in ZIP

I have to fix a Perl script which is having issues adding files recursively. As I am not a Perl programmer this is kinda hard for me, hope that somebody will have a clue for me.
What the script has to do is ZIP a directory from a mounted (NAS) share. I am running into the issue that the ZIP is created as expected, is a valid ZIP, and does contain the directory structure of the original directory. For example, when I try to ZIP a directory like "/mounted_dir/to_zip/" which contains a sub-directory "sub_dir" holding a file "filetozip.txt", I only find the directory structure in the ZIP file. No errors are shown.
I tested it by creating two pretty simple scripts:
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );

my $zip = Archive::Zip->new();
my $dir_member = $zip->addDirectory( '/mounted_dir/to_zip/' );
unless ( $zip->writeToFileNamed('/tmp/someZip.zip') == AZ_OK ) {
    die 'write error';
}
and with the addTree function:
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );

my $zip = Archive::Zip->new();
$zip->addTree( '/mounted_dir/to_zip/' );
unless ( $zip->writeToFileNamed('/tmp/someZip.zip') == AZ_OK ) {
    die 'write error';
}
Both with same results.
If I move the "to_zip" directory to the local filesystem, everything is working as expected. ZIP file is created with directories and files.
This has been working in the past on an older Fedora/Perl version. I had to update Fedora to a newer version because of the lack of SMB3 support in my version. With that upgrade my Perl moved to version 5.18. Archive::Zip is at the latest 1.68 version.
When I access the directories and files on the command line I can do whatever I want with them: create, rename, edit, remove, etc. I also used the default Fedora ZIP and TAR tools, all working. So I think this is not a directory or file permission issue. It feels like I am missing some option in Perl / Archive::Zip, but as said I am not a Perl programmer and lack essential basic knowledge.
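One way to narrow this down: as far as I can tell, Archive::Zip's addTree walks the tree with File::Find internally, so if plain File::Find cannot enumerate the files on the mount, the problem is in traversal over SMB rather than in Archive::Zip itself. A minimal check (shown on a throwaway temp tree mimicking the question's layout; point $root at '/mounted_dir/to_zip' to test the share directly):

```perl
use strict;
use warnings;
use File::Find;
use File::Path qw(make_path);
use File::Temp qw(tempdir);

# Temp tree standing in for the mounted share
my $root = tempdir(CLEANUP => 1);
make_path("$root/sub_dir");
open my $fh, '>', "$root/sub_dir/filetozip.txt" or die $!;
close $fh;

# If this prints 0 on the real mount, File::Find (and hence addTree)
# cannot see the files there at all
my @files;
find(sub { push @files, $File::Find::name if -f }, $root);
print scalar(@files), " file(s) found\n";
```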

making new directory in perl

I want to create a directory with a certain name before the beginning of a method. At each iteration the directory should be replaced with new entries instead of getting appended.
I have used this code:
sub makingDirectoryForClient {
$nameOfTheClientDirectory = $_[0];
my $directory = "D:\\ATEF\\clientfolder\\$nameOfTheClientDirectory";
my $outputvar = mkdir $directory;
}
but still the folder is not getting replaced. Any ideas?
If mkdir appears to be doing nothing, then you should add a warning statement to find out why. The reason for the failure will be in the built-in variable $!, so I suggest you write your subroutine like this:
sub make_client_dir {
    my $client_dir = shift;
    my $path = "D:\\ATEF\\clientfolder\\$client_dir";
    ( my $success = mkdir $path ) or
        warn qq{Unable to create directory "$path": $!};
    return $success;
}
Note that I've also modified your code to be more idiomatic.
I was creating a directory with a name that already existed, since I thought that would replace the folder with a new empty one. But Perl does not work that way: mkdir fails if a folder of the same name already exists.
So I first deleted the directory using rmtree from File::Path and then created a new directory of the same name.
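The delete-then-recreate step can be sketched with the modern File::Path interface (remove_tree/make_path, core since Perl 5.10). Shown with a temp path; substitute the D:\ATEF\clientfolder\... path on Windows:

```perl
use strict;
use warnings;
use File::Path qw(make_path remove_tree);
use File::Temp qw(tempdir);

# Temp path standing in for D:\ATEF\clientfolder\<client>
my $base = tempdir(CLEANUP => 1);
my $dir  = "$base/client";

make_path($dir);
open my $fh, '>', "$dir/stale.txt" or die $!;
close $fh;                                   # simulate leftover entries

remove_tree($dir);                           # delete the tree if present
make_path($dir);                             # recreate it empty
my @entries = glob "$dir/*";
print @entries ? "not empty\n" : "empty\n";
```

remove_tree is a no-op if the directory doesn't exist, so this is safe on the first iteration too.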

How can I unlink a windows folder with spaces?

I am trying to remove files and folders that do not match our naming standard. I can loop through an array of collected files and folders that do not match and delete them. The problem I am running into is that I cannot remove files that have spaces in their names. I am running on Windows with Strawberry Perl.
Here is a sample of the array
Picture.jpg
Sample Document.doc
New Folder
The only thing I can delete successfully is Picture.jpg in this example.
Here is the function
foreach my $folder (@$returned_unmatches) {
    print "$folder\n";
    remove_root_junk($office, $folder);
}
Here is the subroutine.
sub remove_root_junk {
    my $office = shift;
    my $folder = shift;
    my $order_docs_path = $office_names{$office};
    unlink "$order_docs_path\\$folder";
}
$order_docs_path is just the full path up to where I am working, in this case C:\Data\Locations\Canton\Order_Documents, if this helps at all.
Under normal circumstances, directly unlinking a directory is not supported. The documentation for unlink says:
Note: unlink will not attempt to delete directories unless you are superuser and the -U flag is supplied to Perl. Even if these conditions are met, be warned that unlinking a directory can inflict damage on your filesystem. Finally, using unlink on directories is not supported on many operating systems. Use rmdir instead.
If your subdirectory is not empty, the rmdir documentation suggests using rmtree in File::Path.
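A sketch of that advice (remove_entry is a hypothetical helper, not part of any module): dispatch on -d, sending files to unlink and directories, empty or not, to remove_tree. Spaces in names are harmless when the path is passed as a single string; no extra quoting is needed.

```perl
use strict;
use warnings;
use File::Path qw(make_path remove_tree);
use File::Spec::Functions qw(catfile);
use File::Temp qw(tempdir);

# Files go to unlink, directories to remove_tree
sub remove_entry {
    my ($path) = @_;
    if (-d $path) { remove_tree($path) }
    else { unlink $path or warn "unlink $path: $!" }
}

# Demo mirroring the question's sample array, in a temp dir
my $tmp  = tempdir(CLEANUP => 1);
my $file = catfile($tmp, 'Sample Document.doc');
open my $fh, '>', $file or die $!;
close $fh;
my $dir = catfile($tmp, 'New Folder');
make_path(catfile($dir, 'nested'));          # a non-empty directory

remove_entry($_) for $file, $dir;
print( (!-e $file && !-e $dir) ? "gone\n" : "still there\n" );
```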

Checking existence of a file given a directory format in perl

I am struggling with a method of walking a directory tree to check existence of a file in multiple directories. I am using Perl and I only have the ability to use File::Find as I am unable to install any other modules for this.
Here's the layout of the file system I want to traverse:
Cars/Honda/Civic/Setup/config.txt
Cars/Honda/Pathfinder/Setup/config.txt
Cars/Toyota/Corolla/Setup/config.txt
Cars/Toyota/Avalon/Setup/
Note that the last Setup folder is missing a config.txt file.
Edit: also, in each of the Setup folders there are a number of other files as well that vary from Setup folder to Setup folder. There really isn't any single file to search against to get into the Setup folder itself.
So you can see that the file path stays the same except for the make and model folders. I want to find all of the Setup folders and then check to see if there is a config.txt file in that folder.
At first I was using the following code with File::Find:
use File::Find;

my $dir = '/test/Cars/';
find(\&find_config, $dir);

sub find_config {
    # find all Setup folders under the given top-level dir
    if ($File::Find::dir =~ m/Setup/) {
        # build the path of config.txt whether it exists or not; we'll check next
        my $config_filepath = $File::Find::dir . "/config.txt";
        # check existence of file; further processing
        ...
    }
}
You can obviously see the flaw in trying to use $File::Find::dir =~ m/Setup/ since it will return a hit for every single file in the Setup folder. Is there any way to use a -d or some sort of directory check rather than a file check? The config.txt is not always in the folder (I will need to create it if it doesn't exist) so I can't really use something like return unless ($_ =~ m/config\.txt/) since I don't know if it's there or not.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
Maybe File::Find is not the right method for something like this but I've been searching around for a while now without any good leads on working with directory names rather than file names.
File::Find finds directory names, too. You want to check for when $_ eq 'Setup' (note: eq, not your regular expression, which would also match XXXSetupXXX), and then see if there's a config.txt file in the directory ( -f "$File::Find::name/config.txt" ). If you want to avoid complaining about files named Setup, check that the found 'Setup' is a directory with -d.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
use File::Spec::Functions qw( catfile );

my $dir = '/test/Cars/';
find(\&find_config, $dir);

sub find_config {
    return unless $_ eq 'Setup' and -d $File::Find::name;
    my $config_filepath = catfile $File::Find::name => 'config.txt';
    # check for existence etc
}

How can I sync two directories with Perl?

I have a folder called "Lib" in my drive that contains many files, and this "Lib" folder exists in many other places on the drive. My Perl script has to copy the latest-updated contents from the "Lib" folders and paste them into the folder "d:\perl\Latest_copy_of_Lib".
For example, I have Lib folders in d:\functions, d:\abc, and many other places. I want to find the latest copy of each file in those directories. So, if the file d:\functions\foo.txt was last modified on 2009-10-12 and d:\abc\foo.txt was last modified on 2009-10-13, then I want the version in d:\abc to be copied to the target directory.
I have used File::Find, but it searches the whole directory and copies contents that are not the latest copy.
I think you just described rsync. Unless you have some sort of weird requirements here, I don't think you need to write any code to do this. I certainly wouldn't reach for Perl to do the job you described.
You need to use File::Find to create a hash of files to move. Only put the path to a file in the hash if the file is newer than the one already stored in the hash. Here is a simple implementation. Note that there may be problems on the Windows platform; I am not used to using File::Spec to work with files and paths in a cross-platform manner.
#!/usr/bin/perl

use warnings;
use strict;

use File::Find;
use File::Spec;

my %copy;
my @sources = qw{
    /Users/cowens/foo/Lib
    /Users/cowens/bar/Lib
    /Users/cowens/baz/Lib
};

find sub {
    my ($volume, $dir, $file) = File::Spec->splitpath($File::Find::name);
    my @dirs = File::Spec->splitdir($dir);
    my @base = ($volume); #the base directory of the file
    for my $dir (@dirs) {
        last if $dir eq 'Lib';
        push @base, $dir;
    }
    #the part that is common among the various bases
    my @rest = @dirs[$#base .. $#dirs];
    my $base = File::Spec->catdir(@base);
    my $rest = File::Spec->catfile(@rest, $file);
    #if we don't have this file yet, or if the file is newer than the one
    #we have (stat before the test so the mtime is always fresh)
    my $mtime = (stat $File::Find::name)[9];
    if (not exists $copy{$rest} or $mtime > $copy{$rest}{mtime}) {
        $copy{$rest} = {
            mtime => $mtime,
            base  => $base,
        };
    }
}, @sources;

print "copy\n";
for my $rest (sort keys %copy) {
    print "\t$rest from $copy{$rest}{base}\n";
}
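The loop above only reports what would be copied. A sketch of the actual copy step (sync_latest is a hypothetical helper; it assumes %copy is shaped as built by the find above, with the relative path as key and the newest base directory as value):

```perl
use strict;
use warnings;
use File::Basename qw(dirname);
use File::Copy qw(copy);
use File::Path qw(make_path);
use File::Spec;
use File::Temp qw(tempdir);

# Copy the newest version of each file into the target tree
sub sync_latest {
    my ($target, %copy) = @_;
    for my $rest (sort keys %copy) {
        my $from = File::Spec->catfile($copy{$rest}{base}, $rest);
        my $to   = File::Spec->catfile($target, $rest);
        make_path(dirname($to));             # ensure target subdirs exist
        copy($from, $to) or warn "copy $from -> $to failed: $!";
    }
}

# Tiny self-contained demo; the real %copy comes from the scan above
my $tmp = tempdir(CLEANUP => 1);
make_path("$tmp/src");
open my $fh, '>', "$tmp/src/foo.txt" or die $!;
print $fh "newest\n";
close $fh;

my %copy = ('foo.txt' => { base => "$tmp/src", mtime => time });
sync_latest("$tmp/dest", %copy);
print -f "$tmp/dest/foo.txt" ? "synced\n" : "failed\n";
```

In the question's setup the target would be 'd:\perl\Latest_copy_of_Lib'.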
If you can standardize on a single location for your libraries, then use one of the following: set the PERL5LIB environment variable, add
use lib 'C:\Lib';
to your script, or run
perl -I C:\Lib myscript
Any of these will give you a single copy of your lib directory that any of your scripts will be able to access.