Perl 5.18 | Archive::Zip 1.68 | Recursive files not included in ZIP

I have to fix a Perl script that is having issues adding files recursively. As I am not a Perl programmer this is kind of hard for me; I hope somebody will have a clue for me.
What the script has to do is ZIP a directory from a mounted (NAS) share. The issue I am running into is that the ZIP is created, is a valid ZIP, and does contain the directory structure of the original directory, but not the files. For example, when I ZIP a directory like "/mounted_dir/to_zip/" which contains a sub-directory called "sub_dir" which in turn contains a file like "filetozip.txt", I only find the directory structure in the ZIP file. No errors are shown.
I tested it by creating two pretty simple scripts:
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );

my $zip = Archive::Zip->new();
# addDirectory only adds the directory entry itself, not its contents
my $dir_member = $zip->addDirectory( '/mounted_dir/to_zip/' );
unless ( $zip->writeToFileNamed('/tmp/someZip.zip') == AZ_OK ) {
    die 'write error';
}
and with the addTree function:
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );

my $zip = Archive::Zip->new();
# addTree should recurse into the directory and add files and sub-directories
$zip->addTree( '/mounted_dir/to_zip/' );
unless ( $zip->writeToFileNamed('/tmp/someZip.zip') == AZ_OK ) {
    die 'write error';
}
Both give the same result.
If I move the "to_zip" directory to the local filesystem, everything works as expected: the ZIP file is created with directories and files.
This worked in the past on an older Fedora/Perl version. I had to update Fedora to a newer version because of the lack of SMB3 support in the old one. With that upgrade my Perl moved to version 5.18. Archive::Zip is at the latest version, 1.68.
When I access the directories and files on the command line I can do whatever I want with them: create, rename, edit, remove, etc. I also used the default Fedora zip and tar tools, all working. So I don't think this is a directory or file permission issue. It feels like I am missing some option in Perl / Archive::Zip, but as said I am not a Perl programmer and miss essential basic knowledge.
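For what it's worth, a variant of the second test that checks addTree's return status and installs an error handler may at least reveal whether Archive::Zip is silently skipping something. This is only a debugging sketch, not a confirmed fix; the 'to_zip' destination name and the commented-out File::Find line are assumptions.
use strict;
use warnings;
use File::Find ();
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );

# Report any internal Archive::Zip errors instead of losing them
Archive::Zip::setErrorHandler( sub { warn @_ } );

# Some network mounts report link counts that confuse File::Find's
# directory-scanning optimisation; this is sometimes suggested as a workaround:
# $File::Find::dont_use_nlink = 1;

my $zip    = Archive::Zip->new();
my $status = $zip->addTree( '/mounted_dir/to_zip/', 'to_zip' );
die "addTree failed with status $status" unless $status == AZ_OK;

unless ( $zip->writeToFileNamed('/tmp/someZip.zip') == AZ_OK ) {
    die 'write error';
}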

Related

PERL - renaming a file member in zip64 archive

I am changing Perl code that does compression so that it can handle the zip64 extension.
The old code uses the Archive::Zip module, which can be used as follows.
# Create a Zip file
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );
my $zip = Archive::Zip->new();
# Add a file from disk
my $file_member = $zip->addFile( 'xyz.pl', 'AnotherName.pl' );
Archive::Zip doesn't support the zip64 extension, and because of that I am using the IO::Compress::Zip module instead.
I am looking for a way to mimic the addFile functionality one way or another: renaming while zipping, or maybe editing the archive after zipping.
I can't find any Perl module that can help me do this.
Is there any way in Perl to do that?
If there is not a direct way, can I change something in the header of the archive file to rename its members?
Thank you
I assume this is the same question you asked over on PerlMonks Renaming a file member in zip64 archive?
If so, here is the same reply.
Try this: it will automatically create the output file as a Zip64-compliant Zip archive if required (i.e. if the size exceeds 4 GB or you have more than 64k members in the zip archive). Otherwise it creates a standard Zip archive.
use Archive::Zip::SimpleZip qw($SimpleZipError);

my $z = new Archive::Zip::SimpleZip "my1.zip"
    or die "Cannot create zip file: $SimpleZipError\n";

$z->add( 'xyz.pl', Name => 'AnotherName.pl' );
$z->close();
If you want to force the creation of a Zip64 archive (even when the archive is small enough not to need it), add the Zip64 option when creating the Archive::Zip::SimpleZip object, like this:
my $z = new Archive::Zip::SimpleZip "my1.zip", Zip64 => 1
    or die "Cannot create zip file: $SimpleZipError\n";
I am not sure if there is a way to do it with IO::Compress::Zip, but why don't you rename the files before adding them to the archive?
If you need to preserve the original files with their names, copy the files to some temp folder, rename them, add them to the archive, and delete them from the temp folder afterwards.
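For reference, here is a minimal sketch of doing the rename at zip time with IO::Compress::Zip's one-shot interface, which accepts a Name option for the stored member name and a Zip64 flag (untested here; the file names are taken from the question):
use IO::Compress::Zip qw(zip $ZipError);

# Store xyz.pl in the archive under a different member name, as a zip64 archive
zip( 'xyz.pl' => 'my1.zip', Name => 'AnotherName.pl', Zip64 => 1 )
    or die "zip failed: $ZipError\n";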

Perl File::Copy::Recursive fcopy doesn't create directories on UNC Paths

When I use fcopy to copy files from one UNC path to another, it doesn't work if the target directory doesn't exist. It works perfectly on a local path, though (creating that directory).
use File::Copy::Recursive qw(fcopy rcopy dircopy fmove rmove dirmove);
use autodie qw'fcopy rmove';
# works. Folder is created, file is copied.
fcopy( "/path/to/file", "/path/to/non/existing/path" );

# works too. No need to create a folder.
fcopy( "//path/to/UNC/Share/File", "//path/to/existing/Share" );

# doesn't work.
fcopy( "//path/to/UNC/Share/File", "//path/to/existing/Share/with/non/existing/folder" );
It dies with an error. For example, the following code
my $file1 = "//server/existing/file";

if ( ! -f $file1 ) {
    print "$file1 does not exist";
    exit 2;
}

fcopy( $file1, "//server/targetDirectory/newFolder" );
dies with
can't fcopy('//server/existing/file', '//server/targetDirectory/newFolder'): No such file or directory at test.pl line 20
Is it not possible to create directories with rcopy on Samba shares using a UNC path, or is this a bug?
This is a bug: https://rt.cpan.org/Public/Bug/Display.html?id=43328. It does work if you use a drive letter to map the remote share, but that is not always convenient. The bug was reported in 2009 and someone posted a proposed solution in 2010, but no new version including a fix has been released yet. You can try the proposed solution by adjusting your local copy of File::Copy::Recursive, changing the beginning of sub pathmk to the following:
sub pathmk {
    my ( $volume, @parts ) = File::Spec->splitpath( shift() );
    my $nofatal = shift;

    @parts  = File::Spec->splitdir( File::Spec->catdir(@parts) );
    my $pth = File::Spec->catdir( $volume, $parts[0] );
    my $zer = 0;
[EDIT] I've sent the maintainer of the package an email asking to release a new version with this fix included. I checked that the fix does not break any of the tests associated with the software package.
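If patching your local copy of the module is not an option, one possible workaround (only a sketch, and it assumes File::Path's make_path copes with the UNC prefix on your setup) is to create the missing target directories yourself before calling fcopy; the paths below are just the example ones from the question:
use File::Path qw(make_path);
use File::Basename qw(dirname);
use File::Copy::Recursive qw(fcopy);

my $source = '//server/existing/file';
my $target = '//server/targetDirectory/newFolder/file';

# Create any missing directories in the target path first
make_path( dirname($target) );

fcopy( $source, $target ) or die "fcopy failed: $!\n";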

How can I unlink a windows folder with spaces?

I am trying to remove files and folders that do not match our naming standard. I can loop through an array of collected files and folders that do not match and delete them. The problem I am running into is that I cannot remove items that have spaces in their names. I am running Strawberry Perl on Windows.
Here is a sample of the array
Picture.jpg
Sample Document.doc
New Folder
The only thing I can delete successfully is Picture.jpg in this example.
Here is the function
foreach my $folder (@$returned_unmatches) {
    print "$folder\n";
    remove_root_junk( $office, $folder );
}
Here is the subroutine.
sub remove_root_junk {
    my $office = shift;
    my $folder = shift;

    my $order_docs_path = $office_names{$office};

    unlink "$order_docs_path\\$folder";
}
$order_docs_path is just the full path up to where I am working, in this case C:\Data\Locations\Canton\Order_Documents, if that helps at all.
Under normal circumstances, directly unlinking a directory is not supported. The documentation for unlink says:
Note: unlink will not attempt to delete directories unless you are superuser and the -U flag is supplied to Perl. Even if these conditions are met, be warned that unlinking a directory can inflict damage on your filesystem. Finally, using unlink on directories is not supported on many operating systems. Use rmdir instead.
If your subdirectory is not empty, the rmdir documentation suggests using rmtree in File::Path.
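Putting that together, a sketch of the subroutine that handles both cases (assuming File::Path is available; remove_tree also removes non-empty folders) and reports failures instead of silently ignoring them:
use File::Path qw(remove_tree);

sub remove_root_junk {
    my ( $office, $item ) = @_;
    my $path = "$office_names{$office}\\$item";

    if ( -d $path ) {
        # Directories such as "New Folder" cannot be unlinked; remove the tree instead
        remove_tree($path) or warn "Could not remove directory $path\n";
    }
    else {
        unlink $path or warn "Could not unlink $path: $!\n";
    }
}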

Checking existence of a file given a directory format in perl

I am struggling with a method of walking a directory tree to check the existence of a file in multiple directories. I am using Perl, and I can only use File::Find since I am unable to install any other modules for this.
Here's the layout of the file system I want to traverse:
Cars/Honda/Civic/Setup/config.txt
Cars/Honda/Pathfinder/Setup/config.txt
Cars/Toyota/Corolla/Setup/config.txt
Cars/Toyota/Avalon/Setup/
Note that the last Setup folder is missing a config.txt file.
Edit: also, in each of the Setup folders there are a number of other files as well that vary from Setup folder to Setup folder. There really isn't any single file to search against to get into the Setup folder itself.
So you can see that the file path stays the same except for the make and model folders. I want to find all of the Setup folders and then check to see if there is a config.txt file in that folder.
At first I was using the following code with File::Find
my $dir = '/test/Cars/';
find( \&find_config, $dir );

sub find_config {
    # find all Setup folders below the given top-level dir
    if ( $File::Find::dir =~ m/Setup/ ) {
        # build the path of config.txt whether it exists or not; we'll check it next
        $config_filepath = $File::Find::dir . "/config.txt";
        # check existence of file; further processing
        ...
    }
}
You can obviously see the flaw in trying to use $File::Find::dir =~ m/Setup/ since it will return a hit for every single file in the Setup folder. Is there any way to use a -d or some sort of directory check rather than a file check? The config.txt is not always in the folder (I will need to create it if it doesn't exist) so I can't really use something like return unless ($_ =~ m/config\.txt/) since I don't know if it's there or not.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
Maybe File::Find is not the right method for something like this but I've been searching around for a while now without any good leads on working with directory names rather than file names.
File::Find finds directory names, too. You want to check for when $_ eq 'Setup' (note: eq, not your regular expression, which would also match XXXSetupXXX), and then see if there's a config.txt file in the directory ( -f "$File::Find::name/config.txt" ). If you want to avoid complaining about files named Setup, check that the found 'Setup' is a directory with -d.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
use File::Find;
use File::Spec::Functions qw( catfile );

my $dir = '/test/Cars/';
find( \&find_config, $dir );

sub find_config {
    return unless $_ eq 'Setup' and -d $File::Find::name;
    my $config_filepath = catfile $File::Find::name => 'config.txt';
    # check for existence etc
}
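Since the question mentions creating config.txt when it is missing, the existence check inside find_config could look something like this (a sketch only; it simply creates an empty file):
unless ( -f $config_filepath ) {
    # no config.txt in this Setup folder yet, so create an empty one
    open my $fh, '>', $config_filepath
        or die "Cannot create $config_filepath: $!";
    close $fh;
}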

How can I sync two directories with Perl?

I have a folder called "Lib" on my drive that contains many files, and this "Lib" folder exists in many other places on the drive. My Perl script has to copy the latest updated contents from the "Lib" folders and paste them into the folder "d:\perl\Latest_copy_of_Lib".
For example, I have Lib folders in d:\functions, d:\abc, and many other places. I want to find the latest copy of each file in those directories. So, if the file d:\functions\foo.txt was last modified on 2009-10-12 and d:\abc\foo.txt was last modified on 2009-10-13, then I want the version in d:\abc to be copied to the target directory.
I have used File::Find, but it searches the whole directory tree and copies contents that are not the latest copy.
I think you just described rsync. Unless you have some sort of weird requirements here, I don't think you need to write any code to do this. I certainly wouldn't reach for Perl to do the job you described.
You need to use File::Find to create a hash of files to move. Only put the path to a file in the hash if the file is newer than the one already stored in the hash. Here is a simple implementation. Note, there may be problems on the Windows platform; I am not used to using File::Spec to work with files and paths in a cross-platform manner.
#!/usr/bin/perl

use warnings;
use strict;

use File::Find;
use File::Spec;

my %copy;
my @sources = qw{
    /Users/cowens/foo/Lib
    /Users/cowens/bar/Lib
    /Users/cowens/baz/Lib
};

find sub {
    my ($volume, $dir, $file) = File::Spec->splitpath($File::Find::name);
    my @dirs = File::Spec->splitdir($dir);

    my @base = ($volume); # the base directory of the file
    for my $dir (@dirs) {
        last if $dir eq 'Lib';
        push @base, $dir;
    }

    # the part that is common among the various bases
    my @rest = @dirs[$#base .. $#dirs];

    my $base = File::Spec->catdir(@base);
    my $rest = File::Spec->catfile(@rest, $file);

    # if we don't have this file yet, or if the file is newer than the one
    # we have
    if (not exists $copy{$rest} or (stat $File::Find::name)[9] > $copy{$rest}{mtime}) {
        $copy{$rest} = {
            mtime => (stat _)[9],
            base  => $base
        };
    }
}, @sources;

print "copy\n";
for my $rest (sort keys %copy) {
    print "\t$rest from $copy{$rest}{base}\n";
}
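The snippet above only reports what would be copied. A rough follow-up step (a sketch, using the target directory from the question and assuming File::Copy and File::Path are acceptable additions) could perform the actual copy:
use File::Copy     qw(copy);
use File::Path     qw(make_path);
use File::Basename qw(dirname);

my $target = 'd:/perl/Latest_copy_of_Lib';

for my $rest (sort keys %copy) {
    my $src = File::Spec->catfile( $copy{$rest}{base}, $rest );
    my $dst = File::Spec->catfile( $target, $rest );

    make_path( dirname($dst) );   # create any missing sub-directories
    copy( $src, $dst ) or warn "copy $src -> $dst failed: $!\n";
}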
If you can standardize on a single location for your libraries, then use one of the following:
set the PERL5LIB environment variable, or add
use lib 'C:\Lib';
to your scripts, or pass the path on the command line:
perl -I C:\Lib myscript
Any of these will give you a single copy of your lib directory that any of your scripts will be able to access.