I am a newbie to Perl and am working on a script that finds the files matching a pattern, picks the one with the latest modified timestamp, and copies it to another location.
My code works on a Mac, but I am having issues on a Windows server.
use strict;
use warnings;
use File::stat;
my $UNC ="/Users/documents/";
my $FileNamePrefix = "abc*.csv";
my @files = sort {stat($a)->mtime <=> stat($b)->mtime} glob($UNC.$FileNamePrefix);
my $Recon = $files[-1];
print "Latest = $Recon\n";
Can someone please help me with this code? Thank you.
Source location:
abc_20181.csv (yesterday's date)
abc_20182.csv (today's date)
Target location:
abc_20182.csv
You could use File::Spec to build the paths; it will produce the correct syntax for the OS it is running on. For example:
use strict;
use warnings;
use File::stat;
use File::Spec;
my $UNC ="/Users/documents/";
my $FileNamePrefix = "abc*.csv";
# platform specific path:
my $pattern = File::Spec->catpath('', $UNC, $FileNamePrefix);
my @files = sort {stat($a)->mtime <=> stat($b)->mtime} glob($pattern);
my $Recon = $files[-1];
print "Latest = $Recon\n";
This will make the pattern "/Users/documents/abc*.csv" on Mac and "\Users\documents\abc*.csv" on Windows.
See "perldoc File::Spec" for more examples.
For this problem I found one solution, but I am looking for alternative solutions.
Assume I am in directory
/home/user/testA/testB
and my home directory is /home/user, how can I obtain the path of the current directory relative to my home directory? That is: testA/testB.
I tried the following:
use Cwd qw(getcwd);
use Env qw(HOME);
my $cdir=getcwd;
my $p=$cdir=~s{^\Q$HOME\E/}{}r;
and it seems to work. My question is whether there is a CPAN module for doing this. The closest I came was File::Spec::Functions qw(abs2rel), but it just gives me ../..; maybe I missed something?
This seems to work fine for me with File::Spec:
#!/usr/bin/perl
use strict;
use warnings;
use File::Spec;

my $base = '/home/user';
my $path = '/home/user/testA/testB';
my $rel_path = File::Spec->abs2rel( $path, $base );
print($rel_path);
Output:
testA/testB
Using Path::Tiny:
#!/usr/bin/perl
use warnings;
use strict;
use Path::Tiny;
my $base = '/home/user';
my $path = '/home/user/testA/testB';
my $relative = path($path)->relative($base);
print "$relative\n";
I want to download files with a .gz extension using Perl. I have written the following code:
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple;
my $url = 'http://www.ebi.ac.uk/thornton-srv/databases/pdbsum/2kri/igrow.out.gz';
my $file = 'prot-prot.txt';
getstore($url, $file);
But I have realized that this code only works with text files and not compressed files. Any idea how I should change this code in order to download .gz files?
Thanks.
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple;
my $url = 'http://www.ebi.ac.uk/thornton-srv/databases/pdbsum/2kri/igrow.out.gz';
my $file = 'igrow.out.gz';
getstore($url, $file);
If you want the Perl script to unzip the file, you can either use system() to run gunzip or search CPAN for a suitable Perl module (see the sketch further below).
If you don't like typing 'igrow.out.gz' twice (with the risk of forgetting to change one of the filenames), replace $file = ... with something like
(my $file = $url) =~ s!^.*/!!;
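If you'd rather stay in Perl for the decompression step, here is a minimal sketch using the core IO::Uncompress::Gunzip module (the output file name is just an example):
use IO::Uncompress::Gunzip qw(gunzip $GunzipError);
# uncompress the downloaded archive into a plain file
gunzip 'igrow.out.gz' => 'igrow.out'
    or die "gunzip failed: $GunzipError\n";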
Use File::Fetch.
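A minimal sketch of that approach; by default File::Fetch saves the file into the current working directory:
use File::Fetch;
my $url = 'http://www.ebi.ac.uk/thornton-srv/databases/pdbsum/2kri/igrow.out.gz';
# fetch the .gz file and report where it was stored
my $ff    = File::Fetch->new(uri => $url);
my $where = $ff->fetch() or die $ff->error;
print "Saved to $where\n";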
I need to search in a directory for all the files that end in .123.
How do I (using Perl) get a list of those files?
Simply:
my @files = glob "$dirname/*.123";
One way is to use glob:
use warnings;
use strict;
my @files = grep { -f } glob '*.123';
glob should do the job.
If you want to search recursively, you can use File::Find.
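For example, a short sketch that collects matching files recursively, assuming the search should start from the current directory:
use strict;
use warnings;
use File::Find;
# walk the tree and keep every plain file whose name ends in .123
my @files;
find(sub { push @files, $File::Find::name if -f && /\.123$/ }, '.');
print "$_\n" for @files;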
I am trying to copy all files from one location to a different location using the copy command from the File::Copy module. The issue I am facing is that one file's name contains a special character whose ASCII value is ý, but in the Unix file system it is stored as ?. So my question is: will copy or move handle these files with special characters when copying or moving them to another location?
If not, what would be a possible workaround for this?
Note: I cannot create a file with special characters on Unix because the special characters are replaced with ?, and I cannot do so on Windows either, because there the special characters are replaced with their encoded value, as in my case of ý.
my $folderpath = 'the_path';
open my $IN, '<', 'path/to/infile' or die "Cannot open input file: $!";
my $total;
while (<$IN>) {
chomp;
my $size = -s "$folderpath/$_";
print "$_ => $size\n";
$total += $size;
}
print "Total => $total\n";
Courtesy: RickF Answer
Any suggestion would be highly appreciated.
Reference Question : Perl File Handling Question
As a workaround, I can suggest converting all unsupported characters to supported ones. This can be done in many ways. For example, you can use URI::Escape:
use URI::Escape;
my $new_file_name = uri_escape($weird_file_name);
Update:
Here is how I was able to copy a file by its UTF-8 name. I'm on Windows. I used Win32::GetANSIPathName to get the short file name; then the copy worked nicely:
use File::Copy;
use URI::Escape;
use Win32;
use utf8; ## tell perl that the source code is in utf-8
use strict;
use warnings;
my $test_file = "IBMýSoftware.txt";
my $from_file = Win32::GetANSIPathName($test_file); ## get "short" name of file
my $to_file = uri_escape($test_file); ## name with special characters escaped
printf("copy [%s] -> [%s]\n", $from_file, $to_file);
copy($from_file, $to_file);
After copying all the files to new names on Windows, you'll be able to work with them on Linux without problems.
Here are some hints about utf-8 file opening:
How do I create a Unicode directory on Windows using Perl?
With a utf8-encoded Perl script, can it open a filename encoded as GB2312?
Character 253 is ý. I guess that on your Unix system the locale is not set, or only the most primitive fallback locale is in effect, and that is why you see a replacement character. If I am guessing correctly, the solution is simply to set the locale to something, preferably a UTF-8 locale since that can handle all characters, and Perl shouldn't even enter into the problem.
> cat 3761218.pl
use utf8;
use strict;
use warnings FATAL => 'all';
use autodie qw(:all);
my $file_name = '63551_106640_63551 IBMýSoftware Delivery&Fulfillment(Div-61) Data IPS 08-20-2010 v3.xlsm';
open my $h, '>', $file_name;
> perl 3761218.pl
> ls 6*
63551_106640_63551 IBMýSoftware Delivery&Fulfillment(Div-61) Data IPS 08-20-2010 v3.xlsm
> LANG=C ls 6* # temporarily cripple locale so that the problem in the question is exhibited
63551_106640_63551 IBM??Software Delivery&Fulfillment(Div-61) Data IPS 08-20-2010 v3.xlsm
> locale | head -1 # show which locale I have set
LANG=de_DE.UTF-8
The following script works as expected for me:
#!/usr/bin/perl
use strict; use warnings;
use autodie;
use File::Copy qw( copy );
use File::Spec::Functions qw( catfile );
my $fname = chr 0xfd;
open my $out, '>', catfile($ENV{TEMP}, $fname);
close $out;
copy catfile($ENV{TEMP}, $fname) => catfile($ENV{HOME}, $fname);
I am definitely new to Perl, so please forgive me if this seems like a stupid question to you.
I am trying to unzip a bunch of .cab file with jzip in Perl (ActivePerl, jzip, Windows XP):
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use IO::File;
use v5.10;
my $prefix = 'myfileprefix';
my $dir = '.';
File::Find::find(
sub {
my $file = $_;
return if -d $file;
return if $file !~ /^$prefix(.*)\.cab$/;
my $cmd = 'jzip -eo '.$file;
system($cmd);
}, $dir
);
The code decompresses the first .cab file in the folder and then hangs (without any errors). It stays hung until I press Ctrl+C to stop it. Does anyone know what the problem is?
EDIT: I used processxp to inspect the processes, and I found that the correct number of jzip processes are fired up (one per .cab file in the source folder). However, only one of them runs under cmd.exe => perl, and none of these processes shuts down after being fired. It seems to me I need to shut down each process and execute them one by one, which I have no clue how to do in Perl. Any pointers?
EDIT: I also tried replacing jzip with notepad; it turns out that notepad is opened with one file at a time (in sequential order), and only when I manually close Notepad is another instance fired. Is this common behavior in ActivePerl?
EDIT: I finally solved it, and I am still not entirely sure why. What I did was remove the XML library from the script, which should not be relevant. Sorry, I removed "use XML::DOM" on purpose in the beginning because I thought it was completely irrelevant to this problem.
OLD:
use strict;
use warnings;
use File::Find;
use IO::File;
use File::Copy;
use XML::DOM;
use DBI;
use v5.10;
NEW:
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use IO::File;
use File::Copy;
use DBI;
use v5.10;
my $prefix = 'myfileprefix';
my $dir = '.';
# retrieve xml file within given folder
File::Find::find(
sub {
my $file = $_;
return if -d $file;
return if $file !~ /^$prefix(.*)\.cab$/;
say $file;
#say $file or die $!;
my $cmd = 'jzip -eo '.$file;
say $cmd;
system($cmd);
}, $dir
);
This, however, introduces another problem: when the extracted file already exists, the script hangs again. I highly suspect this is a problem with jzip, and an alternative way to solve it is simply to replace jzip with extract, as @ghostdog74 pointed out below.
First off, if you are running commands via system() calls, you should always redirect their output/error to a log, or at least process it within your program.
In this particular case, if you do that, you'd have a log of what every single command is doing and will see if/when any of them are stuck.
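For instance, a minimal sketch of sending each command's output and errors to a log file ($file is the name from the File::Find callback in the question; the log name is just an example, and this assumes shell redirection is acceptable):
# append jzip's output and errors to a log so hung runs can be diagnosed
my $cmd = "jzip -eo $file";
system("$cmd >> jzip.log 2>&1") == 0
    or warn "jzip failed for $file: exit status " . ($? >> 8) . "\n";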
Second, just a general tip: it's a good idea to always prefer native Perl libraries. In this case that may be impossible, of course (I'm not that experienced with Perl on Windows, so I have no idea whether there is a jzip module on CPAN, but it's worth searching).
UPDATE: I didn't find a native Perl CAB extractor, but I found a jzip replacement that might work better and is worth a try: http://www.cabextract.org.uk/ - there's a DOS version which will hopefully work on Windows.
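If you go the cabextract route, here is a hedged sketch of invoking it from Perl; this assumes the standard cabextract command line, where -d names the output directory:
# extract a .cab into a destination directory via cabextract
my $dest = 'c:\test\temp';    # hypothetical destination
system('cabextract', '-d', $dest, $file) == 0
    or warn "cabextract failed for $file\n";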
Based on your edit, this is what I suggest:
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use IO::File;
use v5.10;
my $prefix = 'myfileprefix';
my $dir = '.';
my @commands;
File::Find::find(
sub {
my $file = $_;
return if -d $file;
return if $file !~ /^$prefix(.*)\.cab$/;
my $cmd = "jzip -eo $File::Find::name";
push @commands, $cmd;
}, $dir
);
#asynchronously kick off jzips
my $fresult;
for (@commands)
{
$fresult = fork();
if(! defined($fresult))
{
die("Fork failed");
}
elsif($fresult == 0) #child
{
`$_`;
exit; #make sure the child does not keep looping and forking
}
else
{
#parent: no-op, just keep moving
}
}
edit: added asynch. edit2: fixed scope issue.
What happens when you run the jzip command from the DOS window? Does it work correctly? What happens if you add an end-of-line character (\n) to the command in the script? Does that prevent the hang?
Here's an alternative, using extract.exe, which you can download here or here:
use strict;
use warnings;
use File::Find;
use IO::File;
use v5.10;

my $prefix = 'myfileprefix';
my $dir = '.';
File::Find::find({wanted => \&wanted}, $dir);
exit;

sub wanted {
    my $destination = q(c:\test\temp);
    if ( -f $_ && $_ =~ /^$prefix(.*)\.cab$/ ) {
        my $filename = $_;                  # bare file name (File::Find chdirs into each directory)
        my $path     = $File::Find::dir;    # directory the file lives in
        my $cmd      = "extract /Y $path\\$filename /E /L $destination";
        print $cmd . "\n";
        system($cmd);
    }
}
Although no one has mentioned it explicitly, system blocks until the process finishes. The real problem, as people have noted, is figuring out why the process doesn't exit. Forking or any other parallelization won't help because you'll be left with a lot of hung processes.
Until you can figure out the issue, start small. Make the smallest Perl script that demonstrates the problem:
#!perl
system( '/path/to/jzip', '-eo', 'literal_file_name' ); # full path, list syntax!
print "I finished!\n";
Now the trick is to figure out why it hangs, and sometimes that means different solutions for different external programs. Sometimes you need to close STDIN before you run the external process or it sits there waiting for it to close, sometimes you do some other thing.
Instead of system, you might also try things such as IPC::System::Simple, which handles a lot of platform-specific details for you, or modules like IPC::Run or IPC::Open3.
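For example, a minimal sketch of the IPC::System::Simple route (systemx() bypasses the shell and throws an exception with a useful message when the command fails; jzip is assumed to be on the PATH):
use IPC::System::Simple qw(systemx);
# run jzip without a shell; dies loudly if the exit status is non-zero
systemx('jzip', '-eo', 'literal_file_name');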
Sometimes it just sucks, and this situation is one of those times.