location of file in perl - perl

As suggested by Chris, a user on this site: in the 1st Perl script, the values are stored in a hash. The first script is fine; it runs only one time, stores the values, and it is working.
In the 2nd script:
my $processed = retrieve('processed_dirs.dat'); # $processed is a hashref
Here it is reading processed_dirs.dat, which is created by the first script. So I am just wondering: how does the second script know the location of processed_dirs.dat here?
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
# This script to be run 1 time only. Sets up 'processed' directories hash.
# After this script is run, ready to run the daily script.
my $dir = '.'; # or whatever directory the date-directories are stored in
opendir my $dh, $dir or die "Opening failed for directory $dir $!";
my @dir = grep {-d && /^\d\d-\d\d-\d\d$/ && $_ le '11-04-21'} readdir $dh;
closedir $dh or die "Unable to close $dir $!";
my %processed = map {$_ => 1} @dir;
store \%processed, 'processed_dirs.dat';
2nd Script:
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;
use Storable;
my $dir = shift or die "Provide path on command line. $!";
my $processed = retrieve('processed_dirs.dat'); # $processed is a hashref
opendir my $dh, $dir or die "Opening failed for directory $dir $!";
my @dir = grep {-d && /^\d\d-\d\d-\d\d$/ && !$processed->{$_} } readdir $dh;
closedir $dh or die "Unable to close $dir $!";
@dir or die "Found no unprocessed date directories";
my $fdir = '/some/example/path';
for my $date (@dir) {
    my $dday = "$dir/$date";
    my @gzfiles = glob("$dday/*tar.gz");
    foreach my $zf (@gzfiles) {
        next if $zf =~ /BMP/ || $zf =~ /LG/ || $zf =~ /MAP/ || $zf =~ /STR/;
        print "$zf\n";
        copy($zf, $fdir) or die "Unable to copy $zf to $fdir. $!";
    }
    $processed->{ $date } = 1;
}
store $processed, 'processed_dirs.dat';

Unless I'm missing something, the answer is: Both scripts use a file called "processed_dirs.dat", in whatever directory they are run from. So as long as both scripts are run from the same directory, they will both use the same file.
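If you want the scripts to work no matter which directory they are started from, one option (a sketch, not part of the original answer) is to build an absolute path to the data file, for example relative to the script's own location using FindBin:
#!/usr/bin/perl
use strict;
use warnings;
use FindBin qw($Bin); # $Bin is the directory holding the running script
use Storable;
# Hypothetical: keep processed_dirs.dat next to the script itself, so both
# scripts agree on its location regardless of the current working directory.
my $data_file = "$Bin/processed_dirs.dat";
my $processed = -e $data_file ? retrieve($data_file) : {};
# ... process directories, marking $processed->{$date} = 1 ...
store $processed, $data_file;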

Related

To parse multiple files in Perl

Please correct my code; I cannot seem to open my file to parse.
The error comes from this line: open(my $fh, $file) or die "Cannot open file, $!";
Cannot open file, No such file or directory at ./sample.pl line 28.
use strict;
my $dir = $ARGV[0];
my $dp_dpd = $ENV{'DP_DPD'};
my $log_dir = $ENV{'DP_LOG'};
my $xmlFlag = 0;
my @fileList = "";
my @not_proc_dir = `find $dp_dpd -type d -name "NotProcessed"`;
#print "@not_proc_dir\n";
foreach my $dir (@not_proc_dir) {
    chomp ($dir);
    #print "$dir\n";
    opendir (DIR, $dir) or die "Couldn't open directory, $!";
    while ( my $file = readdir DIR) {
        next if $file =~ /^\.\.?$/;
        next if (-d $file);
        # print "$file\n";
        next if $file eq "." or $file eq "..";
            if ($file =~ /.xml$/ig) {
                $xmlFlag = 1;
                print "$file\n";
                open(my $fh, $file) or die "Cannot open file, $!";
                @fileList = <$fh>;
                close $file;
            }
    }
    closedir DIR;
}
Quoting readdir's documentation:
If you're planning to filetest the return values out of a readdir, you'd better prepend the directory in question. Otherwise, because we didn't chdir there, it would have been testing the wrong file.
Your open(my $fh, $file) should therefore be open my $fh, '<', "$dir/$file" (note that I also added '<': you should always use 3-argument open).
Your next if (-d $file); is also wrong and should be next if -d "$dir/$file";
Some additional remarks on your code:
always add use warnings to your script (in addition to use strict, which you already have)
use lexical file/directory handles rather than global ones. That is, do opendir my $DH, $dir, rather than opendir DH, $dir.
properly indent your code (if ($file =~ /.xml$/ig) { is one level too deep; it makes your code harder to read)
next if $file =~ /^\.\.?$/; and next if $file eq "." or $file eq ".."; are redundant (even though not technically equivalent); I'd suggest using only the latter.
the variable $dir defined in my $dir = $ARGV[0]; is never used.
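Putting the main fix and these remarks together, one possible corrected version of the loop could look like this (a sketch that assumes DP_DPD is set in the environment, as in the question; it is not the answerer's exact code):
use strict;
use warnings;
my @not_proc_dir = `find $ENV{DP_DPD} -type d -name "NotProcessed"`;
my @fileList;
foreach my $dir (@not_proc_dir) {
    chomp $dir;
    opendir my $dh, $dir or die "Couldn't open directory $dir: $!";
    while (my $file = readdir $dh) {
        next if $file eq '.' or $file eq '..';
        next if -d "$dir/$file";          # prepend the directory before filetests
        next unless $file =~ /\.xml$/i;   # anchor the dot, drop the needless /g
        open my $fh, '<', "$dir/$file" or die "Cannot open $dir/$file: $!";
        push @fileList, <$fh>;            # accumulate lines instead of overwriting
        close $fh;
    }
    closedir $dh;
}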

Perl: Copy file from one location to another

This is just a small script I am running in a continuous loop to check a directory and move every file that is there. This code works and I am running it as a background process. But for some reason I am getting the following error: '/home/srvc_ibdcoe_pcdev/Niall_Test/new_dir/..' and '/home/srvc_ibdcoe_pcdev/Niall_Test/perl_files/..' are identical (not copied) at move2.pl line 27
Any idea why it is telling me they are identical even though the paths are different?
Many thanks
script below
#!/usr/bin/perl
use diagnostics;
use strict;
use warnings;
use File::Copy;
my $poll_cycle = 10;
my $dest_dir = "/home/srvc_ibdcoe_pcdev/Niall_Test/perl_files";
while (1) {
    sleep $poll_cycle;
    my $dirname = '/home/srvc_ibdcoe_pcdev/Niall_Test/new_dir';
    opendir my $dh, $dirname
        or die "Can't open directory '$dirname' for reading: $!";
    my @files = readdir $dh;
    closedir $dh;
    if ( grep( !/^[.][.]?$/, @files ) > 0 ) {
        print "Dir is not empty\n";
        foreach my $target (@files) {
            # Move file
            move("$dirname/$target", "$dest_dir/$target");
        }
    }
}
You need to filter out the special .. and . entries from #files.
#!/usr/bin/perl
use diagnostics;
use strict;
use warnings;
use File::Copy;
my $poll_cycle = 10;
my $dest_dir = "/home/srvc_ibdcoe_pcdev/Niall_Test/perl_files";
while (1) {
    sleep $poll_cycle;
    my $dirname = '/home/srvc_ibdcoe_pcdev/Niall_Test/new_dir';
    opendir my $dh, $dirname
        or die "Can't open directory '$dirname' for reading: $!";
    my @files = grep !/^[.][.]?$/, readdir $dh;
    closedir $dh;
    if (@files) {
        print "Dir is not empty\n";
        foreach my $target (@files) {
            # Move file
            move("$dirname/$target", "$dest_dir/$target");
        }
    }
}
The message you see is correct. Both paths resolve to the same directory because of the ..; both resolve to /home/srvc_ibdcoe_pcdev/Niall_Test.
.. refers to the directory's parent directory.
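You can confirm this by normalising both paths; a quick check with Cwd::abs_path (run on a machine where those directories actually exist, since abs_path resolves real paths, and not part of the original answer) prints the same parent directory twice:
use strict;
use warnings;
use Cwd qw(abs_path);
# Both resolve to /home/srvc_ibdcoe_pcdev/Niall_Test, because the
# trailing .. steps up to the parent directory.
print abs_path('/home/srvc_ibdcoe_pcdev/Niall_Test/new_dir/..'), "\n";
print abs_path('/home/srvc_ibdcoe_pcdev/Niall_Test/perl_files/..'), "\n";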

How to get file names with a specific extension from a folder in Perl

Currently, in a Perl script, I am using the glob function to get a list of files with specific extensions.
my @filearray = glob("$DIR/*.abc $DIR/*.llc");
Is there an alternative to glob for getting the list of files with a specific extension from a folder? If so, please provide an example. Thank you.
Yes, there are much more complicated ways, like opendir, readdir and a regex filter. They will also give you the hidden files (or dotfiles):
opendir DIR, $DIR or die $!;
my @filearray = grep { /\.(abc|llc)$/ } readdir DIR;
closedir DIR;
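If you want the same results as the glob call (full paths and no dotfiles), a small variation on the above could be used (a sketch, not part of the original answer):
opendir my $dh, $DIR or die "Can't open $DIR: $!";
my @filearray = map  { "$DIR/$_" }                 # prepend the directory, as glob does
                grep { !/^\./ && /\.(abc|llc)$/ }  # skip dotfiles, keep wanted extensions
                readdir $dh;
closedir $dh;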
#Using:
opendir(DIR, $dir) || die "$!";
my @files = grep(/\.(abc|llc)$/, readdir(DIR));
closedir(DIR);
#Reference: CPAN
use Path::Class; # Exports dir() by default
my $dir = dir('foo', 'bar'); # Path::Class::Dir object
my $dir = Path::Class::Dir->new('foo', 'bar'); # Same thing
my $file = $dir->file('file.txt'); # A file in this directory
my $handle = $dir->open;
while (my $file = $handle->read)
{
    $file = $dir->file($file); # Turn into Path::Class::File object
    ...
}
#Reference: Referred from http://accad.osu.edu/~mlewis/Class/Perl/perl.html#cd
#!/usr/local/bin/perl
# search for a file in all subdirectories
use Cwd;
if ($#ARGV != 0) {
    print "usage: findfile filename\n";
    exit;
}
$filename = $ARGV[0];
# look in current directory
$dir = getcwd();   # getcwd() returns the path with no trailing newline
&searchDirectory($dir);
sub searchDirectory
{
    local($dir);
    local(@lines);
    local($line);
    local($file);
    local($subdir);
    $dir = $_[0];
    # check for permission
    if (-x $dir)
    {
        # search this directory
        @lines = `cd $dir; ls -l | grep $filename`;
        foreach $line (@lines)
        {
            $line =~ /\s+(\S+)$/;
            $file = $1;
            print "Found $file in $dir\n";
        }
        # search any sub directories
        @lines = `cd $dir; ls -l`;
        foreach $line (@lines)
        {
            if ($line =~ /^d/)
            {
                $line =~ /\s+(\S+)$/;
                $subdir = $dir."/".$1;
                &searchDirectory($subdir);
            }
        }
    }
}
Please try another one:
use Cwd;
use File::Find;
my $dir = getcwd();
my @abclicfiles;
find(\&wanted, $dir);
sub wanted
{
    push(@abclicfiles, $File::Find::name) if ($File::Find::name =~ m/\.(abc|lic)$/i);
}
print join "\n", @abclicfiles;
This one gets the directory from the user:
print "Please enter the directory: ";
my $dir = <STDIN>;
chomp($dir);
opendir(DIR, $dir) || die "Couldn't able to read dir: $!";
my @files = grep(/\.(txt|lic)$/, readdir(DIR));
closedir(DIR);
print join "\n", @files;

perl - loop through directory to find file.mdb and execute code if file.ldb not found

I am a beginner Perl programmer and I have come across a snag that I can't get past. I have been reading and re-reading web posts and Simon Cozens' book at perl.org all day, but can't seem to solve the problem.
My intention with the code below is to loop through the files in a directory and, when a file has a certain string in its name, verify that the same file name doesn't exist with a different extension; if it doesn't, print the file name (later I will implement deleting the file, but for now I want to ensure it will work). Specifically, I am finding .mdb files and, after checking that there are no associated .ldb files, deleting the .mdb file.
Right now my code returns this:
RRED_Database_KHOVIS.ldb
RRED_Database_KHOVIS.mdb
I will kill RRED_Database_KHOVIS.mdb
RRED_Database_mkuttler.mdb
I will kill RRED_Database_mkuttler.mdb
RRED_Database_SBreslow.ldb
RRED_Database_SBreslow.mdb
I will kill RRED_Database_SBreslow.mdb
I want it to print the "I will kill..." line only for a .mdb file with no associated .ldb file.
My current code is below. I appreciate any help offered...
use strict;
use warnings;
use File::Find;
use diagnostics;
my $dir = "//vfg1msfs01ab/vfgcfs01\$/Regulatory Reporting/Access Database/";
my $filename = "RRED_Database";
my $fullname, my $ext;
opendir DH, $dir or die "Couldn't open the directory: $!";
while ($_ = readdir(DH)) {
    my $ext = ".mdb";
    if ((/$filename/) && ($_ ne $filename . $ext)) {
        print "$_ \n";
        unless (-e $dir . s/.mdb/.ldb/) {
            s/.ldb/.mdb/;
            print "I will kill $_ \n\n" ;
            #unlink $_ or print "oops, couldn't delete $_: $!\n";
        }
        s/.ldb/.mdb/;
    }
}
When looping through files, I like to use 'next' statements repeatedly to assure that I'm only looking at exactly what I want. Try this:
use strict;
use warnings;
use File::Find;
use diagnostics;
my $dir = "//vfg1msfs01ab/vfgcfs01\$/Regulatory Reporting/Access Database/";
my $filename = "RRED_Database";
my $fullname, my $ext;
opendir DH, $dir or die "Couldn't open the directory: $!";
while ($_ = readdir(DH)) {
    my $ext = ".mdb";
    # Jump to next while() iteration unless the file begins
    # with $filename and ends with $ext,
    # and capture the basename in $1
    next unless $_ =~ m|($filename.*)$ext|;
    # Jump to next while() iteration if the file basename.ldb is found
    # (prepend $dir, since readdir returns bare file names)
    next if -f $dir . $1 . ".ldb";
    # At this point, we have an mdb file with no matching ldb file
    print "$_ \n";
    print "I will kill $_ \n\n" ;
    #unlink $_ or print "oops, couldn't delete $_: $!\n";
}
While Stuart's answer made it more lean, I was able to also get it to work with the code below (I changed .mdb to .accdb because I am now dealing with a different file type).
use strict;
use warnings;
use File::Spec;
use diagnostics;
my $dir = "//vfg1msfs01ab/vfgcfs01\$/Regulatory Reporting/Access Database/";
my $filename = "RRED_Database";
my $ext;
opendir DH, $dir or die "Couldn't open the directory: $!";
while ($_ = readdir(DH)) {
    my $ext = ".accdb";
    if ((/$filename/) && ($_ ne $filename . $ext) && ($_ !~ /.laccdb/)) {
        # if file contains database name, is not the main database and is not a locked version of db
        s/$ext/.laccdb/;
        unless (-e File::Spec->join($dir,$_)) {
            s/.laccdb/$ext/;
            #print "I will kill $_ \n\n";
            unlink $_ or print "oops, couldn't delete $_: $!\n";
        }
        s/.laccdb/$ext/;
    }
}

Adding more features in perl script

In the Perl script below, I check my folder name (which is in a date format like 11-08-31) against the current date. If it matches, I process the folder. It also checks the previous day's folder if there is no folder for today's date. I already asked this type of question here, but I need to make some changes and add new features as well:
The script checks for the previous date if today's is not found. But I need to check whether the previous date has already been processed, so that I do not process it again. So, do I need to create a list for it?
This script checks only for the one previous date. What if I have to check for the 2 previous days? Thanks for your help; I hope you understand my doubts.
Updated: This Perl script runs automatically and checks the current date against the folder name. The folder is a tar folder which is loaded from another server.
So, basically, I need to run the script if the folder name matches the current date.
Problem: Sometimes I get the folder the next day, and my Perl script checks only for the current date. The folder I get is named with the previous date (not the current date), so I have to process it manually. I need to automate this in my Perl script.
#!/usr/bin/perl
use strict;
use warnings;
use Cwd;
use DateTime;
use File::Copy;
# set to your desired time zone
my $today = DateTime->now( time_zone => "America/New_York" );
my $td = $today->strftime("%y-%m-%d");
# strongly recommended to do date math in the 'floating'/UTC zone
my $yesterday = $today->set_time_zone('floating')->subtract( days => 1);
my $yd = $yesterday->set_time_zone('America/New_York')->strftime("%y-%m-%d");
my $dir = shift or die "Provide path on command line. $!";
if ($dir eq '.') {
    $dir = cwd;
}
elsif ($dir !~ /^\//) {
    $dir = cwd() . "/$dir";
}
opendir my $dh, $dir or die $!;
my @dir = sort grep {-d and /$td/ || /$yd/} readdir $dh;
closedir $dh or die $!;
@dir or die "Found no date directories. $!";
my $dday = "$dir/$dir[-1]"; # is today unless today not found, then yesterday
my $fdir = '/some/example/path/';
my @gzfiles = glob("$dday/*tar.gz");
foreach my $zf (@gzfiles) {
    next if (($zf =~ /BMP/) || ($zf =~ /LG/) || ($zf =~ /MAP/) || ($zf =~ /STR/));
    print "$zf\n";
    copy($zf, $fdir) or die "Unable to copy. $!";
}
Well, another way to do it, as suggested by mugen kenichi, is to use Storable. This way stores a hash with all processed directories in it. Then when you run your program, it can check the hash to see if they have been processed.
You would need a one-time script to set up the hash of processed directories.
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
# This script to be run 1 time only. Sets up 'processed' directories hash.
# After this script is run, ready to run the daily script.
my $dir = '.'; # or whatever directory the date-directories are stored in
opendir my $dh, $dir or die "Opening failed for directory $dir $!";
my @dir = grep {-d && /^\d\d-\d\d-\d\d$/ && $_ le '11-04-21'} readdir $dh;
closedir $dh or die "Unable to close $dir $!";
my %processed = map {$_ => 1} @dir;
store \%processed, 'processed_dirs.dat';
Then, a script to be run periodically to find and process your date directories.
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;
use Storable;
my $dir = shift or die "Provide path on command line. $!";
my $processed = retrieve('processed_dirs.dat'); # $processed is a hashref
opendir my $dh, $dir or die "Opening failed for directory $dir $!";
my @dir = grep {-d && /^\d\d-\d\d-\d\d$/ && !$processed->{$_} } readdir $dh;
closedir $dh or die "Unable to close $dir $!";
@dir or die "Found no unprocessed date directories";
my $fdir = '/some/example/path';
for my $date (@dir) {
    my $dday = "$dir/$date";
    my @gzfiles = glob("$dday/*tar.gz");
    foreach my $zf (@gzfiles) {
        next if $zf =~ /BMP/ || $zf =~ /LG/ || $zf =~ /MAP/ || $zf =~ /STR/;
        print "$zf\n";
        copy($zf, $fdir) or die "Unable to copy $zf to $fdir. $!";
    }
    $processed->{ $date } = 1;
}
store $processed, 'processed_dirs.dat';
If you want to persist the status of whether these directories were processed beyond a single run of your app, you could create a .processed file in each directory and check for the existence of this file before you process the directory.
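For example, a minimal sketch of that marker-file idea (the directory loop and paths are illustrative, not from the original answer) could look like:
use strict;
use warnings;
# Hypothetical: walk the date directories and skip any that already
# contain a .processed marker file from an earlier run.
for my $date_dir (grep { -d } glob '*') {
    next if -e "$date_dir/.processed";   # already handled on a previous run
    # ... process the directory here ...
    # Leave a marker so the next run skips this directory.
    open my $marker, '>', "$date_dir/.processed"
        or die "Cannot create marker in $date_dir: $!";
    close $marker;
}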
If you just need to store the status of these directories (processed or unprocessed) during the execution of your script, you could use a hash keyed with the directory name:
my %PROCESSED = ();
if ($processing_done) {
    $PROCESSED{$dirname} = 1;
} else {
    $PROCESSED{$dirname} = 0;
}
You can check to see if each directory has been processed by reading the key value from the hash:
if ($PROCESSED{$dirname} == 0) {
    ... do some processing
} else {
    ... this one is already done
}
This solution finds all directories yet to be processed that are newer than the most recent directory-date processed. You have to record it manually the first time (before the script is run). The script will update it from that point on.
The file could be named as in my $last = 'dir_last.dat'; I just created the file at the command line like this:
C:\Old_Data\perlp>echo 11-07-14 > dir_last.dat
C:\Old_Data\perlp>type dir_last.dat
11-07-14
C:\Old_Data\perlp>
This assumes the newest directory processed so far was 11-07-14. You must find this out yourself before running the script.
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;
my $dir = shift or die "Provide path on command line. $!";
my $last = 'dir_last.dat';
open my $fh, "<", $last or die "Unable to open $last $!";
chomp(my $last_proc = <$fh>);
close $fh or die "Unable to close $last $!";
opendir my $dh, $dir or die "Opening failed for directory $dir $!";
my @dir = sort grep {-d && /^\d\d-\d\d-\d\d$/ && $_ gt $last_proc} readdir $dh;
closedir $dh or die "Unable to close $dir $!";
@dir or die "Found no date directories after last update: $last_proc";
my $fdir = '/some/example/path';
for my $date (@dir) {
    my $dday = "$dir/$date";
    my @gzfiles = glob("$dday/*tar.gz");
    foreach my $zf (@gzfiles) {
        next if $zf =~ /BMP/ || $zf =~ /LG/ || $zf =~ /MAP/ || $zf =~ /STR/;
        print "$zf\n";
        copy($zf, $fdir) or die "Unable to copy $zf to $fdir. $!";
    }
}
open $fh, ">", $last or die "Unable to open $last $!";
print $fh "$dir[-1]\n"; # record the newest date-directory as processed
close $fh or die "Unable to close $last $!";
Notice that I didn't rely on cwd like the first script. It really wasn't needed there and isn't needed here. opendir, glob and copy all can handle the dot (cwd) directory and relative paths.
The header includes the lines use strict; and use warnings;. Their purpose is to alert you to errors in your code (almost all Perl scripts should use them, unless an expert decides to exclude them - for what reason I don't know). The first line tells Unix where to find the interpreter (perl).