Delete wildcard in perl - perl

I am new to Perl, but I thought the following should work. I have the following snippet of a larger Perl script:
@mylist = ("${my_dir}AA_???_???.DAT", "${my_dir}AA???.DAT");
foreach my $list (@mylist) {
    if (-e $list) {
        system ("cp ${list} ${my_other_dir}");
    }
}
The above snippet is not able to find the files matching the wildcard AA_???_???.DAT, but it is able to find the files matching AA???.DAT.
I have also tried deleting the AA_???_???.DAT files with
unlink(glob(${my_dir}AA_???_???.DAT"))
but the script just hangs. It is, however, able to delete files matching AA???.DAT using:
unlink(glob("${my_dir}AA???.DAT))
What could be the reason?

-e $list checks for the existence of a file with that literal name, so it returns false for both AA_???_???.DAT and AA???.DAT (unless you actually have a file named exactly that); -e does not expand wildcards. It's not true that one works and the other doesn't.
It's also not true that unlink(glob(${my_dir}AA_???_???.DAT")) hangs. For starters, it doesn't even compile: the quotes are unbalanced.
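As a hedged sketch of what was probably intended: let glob expand the patterns first, then copy whatever it finds (File::Copy is a core module; the two directory variables are placeholders standing in for the question's paths):

```perl
use strict;
use warnings;
use File::Copy;

my $my_dir       = '/path/to/source/';   # placeholder for the question's directory
my $my_other_dir = '/path/to/dest/';     # placeholder for the destination

# glob expands the shell-style wildcards into actual file names;
# -e and unlink never do that expansion themselves
my @matches = glob("${my_dir}AA_???_???.DAT ${my_dir}AA???.DAT");

for my $file (@matches) {
    copy($file, $my_other_dir) or warn "Could not copy $file: $!";
}
```

Note that glob accepts several whitespace-separated patterns in one string, so both wildcards are expanded in a single call.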

I would use the opendir and readdir built-in functions (modified from the documentation example):
opendir(my $dh, $some_dir) || die "can't opendir $some_dir: $!";
my @mylist = grep { /^(AA_..._...\.DAT|AA...\.DAT)$/ && -f "$some_dir/$_" } readdir($dh);
closedir $dh;
Then you can plug in your original code:
foreach my $list (@mylist) {
    if (-e "$some_dir/$list") {
        system ("cp $some_dir/${list} ${my_other_dir}/");
    }
}
For recursive operations on directory trees I really like to use the File::Find module (a core module, so there is nothing to install).
It traverses subdirectories, passing each file to a specified subroutine
to process that file. As an example:
#! /usr/bin/perl
use strict;
use warnings;
use File::Find;

my @dirs = ('/path/to/dir');
my $my_other_dir = '/path/to/otherdir';

find(\&process_files, @dirs);

sub process_files {
    my ($file) = $_;
    my ($fullpath) = $File::Find::name;
    return if $file !~ /^AA_..._...\.DAT$/
          and $file !~ /^AA...\.DAT$/;
    system("cp $fullpath $my_other_dir/");
}

Related

Perl, how to choose a directory

I'm trying to determine which of the contents of a folder are directories and which are files. I wrote the following, but the result is not what I would expect:
opendir DH, $dir or die "Cannot open Dir: $!";
my @dirs = grep !/^\.\.?$/, readdir DH;
foreach my $files (@dirs) {
    print $files."<br>";
    if ( -d $files )
    {
        print $files." is a directory<br>";
    }
}
closedir DH;
The result is something as the example below:
.file1
file.log
file3.zip
file4
file5.zip
dir1.name1.suffix1.yyyy.MM.dd.hh.mm.ss
file5.zip
file6.tar
dir2
dir3.name1.suffix1.yyyy.MM.dd.hh.mm.ss
where the items starting with dir are actual directories. So my question is: why is the if failing to identify them as such?
What am I doing wrong?
$dir is missing...
if ( -d "$dir/$files" )
{
print $files." is a directory<br>";
}
It's easiest to chdir to $dir so that you don't have to prefix the node names with the path. You can also use autodie if you are running Perl v5.10.1 or better. Finally, if you use $_ as your loop control variable (the file/directory names) you can omit it from the parameters of print, -d and regex matches
Like this
use strict;
use warnings;
use v5.10.1;
use autodie;

my ($dir) = @ARGV;

opendir my $dh, $dir;
chdir $dh;

while ( readdir $dh ) {
    next if /\A\.\.?\z/;
    print;
    print " is a directory" if -d;
    print "<br/>\n";
}
Update
In view of ikegami's (deleted) comment about returning back to the original working directory, here's an example of using the File::chdir module to do this tidily. It exports a tied variable $CWD which will change your working directory if you assign to it. You can also localise it, so just wrapping the above code in braces and adding a new local value for $CWD keeps things neat. Note that File::chdir is not a core module so you will likely need to install it
Note however that there is still a very small possibility that the process may be started with a present working directory that it cannot chdir to. This module will not solve that problem
use strict;
use warnings;
use v5.10.1;
use autodie;
use File::chdir;

my ($dir) = @ARGV;

{
    opendir my $dh, $dir;
    local $CWD = $dir;

    while ( readdir $dh ) {
        next if /\A\.\.?\z/;
        print;
        print " is a directory" if -d;
        print "<br/>\n";
    }
}
... # local expires here: the working directory returns to its original value

I'm unable to rename a file in linux system using perl

I'm just a beginner in Perl. I'm trying to rename a file or directory using the following script, but it is not renaming the file. Please help me identify the problem.
I'm using Perl version 5.8.4
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;

my $dir="/home/hari/perl-s/abc/";
opendir (DIR, $dir);
my @fileList = readdir DIR;
foreach (@fileList){
    next if -d;
    my $oldname = $_;
    print "Newfile after assigning: $_ \n";
    s/(^[0-9])(.)//;
    print "Newfile: $_ \n";
    print "oldname: $oldname \n";
    rename ($oldname,$_);
}
The return values of readdir are just filenames; they do not include the path that was provided to opendir. You generally have to include that manually.
opendir (DIR, $dir);
my @fileList = readdir DIR;
foreach (@fileList){
    # $_ is just "filename"
    $_ = "$dir/$_"; # now $_ is "/home/hari/perl-s/abc/$filename"
    next if -d;
    ...
}
There's more than one way to do things in Perl. Another way to get the set of files in a directory is with the glob function. One of the advantages of glob is that you can use it in such a way so that it returns filenames with their full paths, and so sometimes glob is preferable to the opendir/readdir/closedir idioms:
my @filelist = glob("$dir/*");
foreach (@filelist) {
    # $_ is "/home/hari/perl-s/abc/filename"
    ...
}
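Putting the pieces together, here is a runnable sketch of the rename loop built on glob, with the question's substitution applied to the basename only (the path below is the question's directory, used as a placeholder, and rename_stripped is a hypothetical helper name, not part of the original script):

```perl
use strict;
use warnings;
use File::Basename;

sub rename_stripped {
    my ($dir) = @_;
    for my $old ( glob("$dir/*") ) {
        next if -d $old;                        # skip directories
        my $name = basename($old);
        ( my $newname = $name ) =~ s/^[0-9].//; # the question's substitution
        next if $newname eq $name;              # nothing to strip
        rename $old, "$dir/$newname"
            or warn "Could not rename $old: $!";
    }
}

rename_stripped("/home/hari/perl-s/abc");   # placeholder path from the question
```

Working on the basename avoids the trap where `^` in the substitution would anchor at the start of the full path rather than the file name.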

PERL - issues extracting a file from directory/subdirectories/..?

Quick note: I've been stuck on this problem for quite a few days, and I'm not necessarily hoping to find an answer, but any kind of help that might "enlighten" me. I would also like to mention that I am a beginner in Perl, so my knowledge is not very vast, and in this case recursion is not my forte. Here goes:
What I would like my Perl script to do is the following:
take a directory as an argument
go into the directory that was passed and its subdirectories to find an *.xml file
store the full path of the found *.xml file into an array.
Below is the code that I have so far, but I haven't managed to make it work:
#! /usr/bin/perl -W
my $path;
process_files ($path);

sub process_files
{
    opendir (DIR, $path) or die "Unable to open $path: $!";
    my @files =
        # Third: Prepend the full path
        map { $path . '/' . $_ }
        # Second: take out '.' and '..'
        grep { !/^\.{1,2}$/ }
        # First: get all files
        readdir (DIR);
    closedir (DIR);
    for (@files)
    {
        if (-d $_)
        {
            push @files, process_files ($_);
        }
        else
        {
            #analyse document
        }
    }
    return @files;
}
Anybody have any clues to point me in the right direction? Or an easier way to do it?
Thank you,
sSmacKk :D
Sounds like you should be using File::Find. Its find subroutine will traverse a directory recursively.
use strict;
use warnings;
use File::Find;

my @files;
my $path = shift;

find(
    sub {
        (-f && /\.xml$/i) or return;
        push @files, $File::Find::name;
    },
    $path
);
The subroutine will perform whatever code it contains on the files it finds. This one simply pushes the XML file names (with full path) onto the @files array. Read more in the documentation for the File::Find module, which is a core module in perl 5.

Recursive Perl detail need help

I think this is a simple problem, but I've been stuck on it for some time now! I need a fresh pair of eyes on this.
The thing is, I have this code in Perl:
#!c:/Perl/bin/perl
use CGI qw/param/;
use URI::Escape;

print "Content-type: text/html\n\n";

my $directory = param ('directory');
$directory = uri_unescape ($directory);
my @contents;

readDir($directory);

foreach (@contents) {
    print "$_\n";
}

#------------------------------------------------------------------------
sub readDir(){
    my $dir = shift;
    opendir(DIR, $dir) or die $!;
    while (my $file = readdir(DIR)) {
        next if ($file =~ m/^\./);
        if(-d $dir.$file)
        {
            #print $dir.$file. " ----- DIR\n";
            readDir($dir.$file);
        }
        push @contents, ($dir . $file);
    }
    closedir(DIR);
}
I've tried to make it recursive. I need to have all the files of all the directories and subdirectories, with the full path, so that I can open the files in the future.
But my output only returns the files in the current directory and the files in the first directory that it finds. If I have 3 folders inside the directory, it only shows the first one.
Ex. of cmd call:
"perl readDir.pl directory=C:/PerlTest/"
Thanks
Avoid wheel reinvention, use CPAN.
use Path::Class::Iterator;

my $it = Path::Class::Iterator->new(
    root          => $dir,
    breadth_first => 0
);

until ($it->done) {
    my $f = $it->next;
    push @contents, $f;
}
Make sure that you don't let people set $dir to something that will let them look somewhere you don't want them to look.
Your problem is the scope of the directory handle DIR. DIR has global scope, so each recursive call to readDir is using the same DIR; when the inner call does closedir(DIR) and returns to the caller, the caller does a readdir on a closed directory handle and everything stops. The solution is to use a lexical directory handle:
sub readDir {
    my ($dir) = @_;
    opendir(my $dh, $dir) or die $!;
    while(my $file = readdir($dh)) {
        next if($file eq '.' || $file eq '..');
        my $path = $dir . '/' . $file;
        if(-d $path) {
            readDir($path);
        }
        push(@contents, $path);
    }
    closedir($dh);
}
Also notice that you would be missing a directory separator if (a) it wasn't at the end of $directory or (b) on every recursive call. AFAIK, slashes will be internally converted to backslashes on Windows but you might want to use a path mangling module from CPAN anyway (I only care about Unix systems so I don't have any recommendations).
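For the path-mangling concern, the core File::Spec module is one option; it joins path components with the right separator for the platform (a minimal sketch, with made-up example paths):

```perl
use strict;
use warnings;
use File::Spec;

# catfile joins directory components and a file name portably,
# so you never have to worry about trailing or doubled separators
my $path = File::Spec->catfile('C:/PerlTest', 'subdir', 'file.txt');
print "$path\n";
```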
I'd also recommend that you pass a reference to @contents into readDir rather than leaving it as a global variable; fewer errors and less confusion that way. And don't use parentheses on sub definitions unless you know exactly what they do and what they're for. Some sanity checking and scrubbing of $directory would be a good idea as well.
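A sketch of the same recursion with @contents passed in as a reference rather than touched as a global (read_dir and the reference parameter are my renaming, not the questioner's code):

```perl
use strict;
use warnings;

sub read_dir {
    my ($dir, $contents) = @_;    # $contents is an array reference
    opendir(my $dh, $dir) or die "Cannot open $dir: $!";
    while (my $file = readdir($dh)) {
        next if $file eq '.' || $file eq '..';
        my $path = "$dir/$file";
        push @$contents, $path;
        read_dir($path, $contents) if -d $path;   # recurse into subdirectories
    }
    closedir($dh);
}

my @contents;
read_dir('.', \@contents);
print "$_\n" for @contents;
```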
There are many modules that are available for recursively listing files in a directory.
My favourite is File::Find::Rule
use strict;
use Data::Dumper;
use File::Find::Rule;

my $dir = shift; # get directory from command line
my @files = File::Find::Rule->in( $dir );
print Dumper( \@files );
This sends the list of files into an array (which is what your program was doing).
$VAR1 = [
'testdir',
'testdir/file1.txt',
'testdir/file2.txt',
'testdir/subdir',
'testdir/subdir/file3.txt'
];
There are loads of other options, like only listing files with particular names. Or you can set it up as an iterator, which is described in "How can I use File::Find in Perl?"
If you want to stick to modules that come with Perl core, have a look at File::Find.

How do I search for .exe files

Do you guys have an idea how to search for or list the .exe files on the server I am currently using (or maybe place them in an array)?
I will use this command in my Perl program. Assuming that my program is also located on the said server.
My OS is Linux - Ubuntu if that even matters, just in case. Working in CLI here. =)
As mentioned, it is not clear whether you want '*.exe' files or executable files.
You can use File::Find::Rule to find all executable files.
my @exe = File::Find::Rule->executable->in( '/' );     # all executable files
my @exe = File::Find::Rule->name( '*.exe' )->in( '/' ); # all .exe files
If you are looking for executable files, you (the user running the script) need to be able to execute the file, so you probably need to run the script as root.
It might take a long time to run, too.
If you are looking for .exe files, chances are that your disk is already indexed by locate. So this would be much faster:
my @exe = `locate .exe | grep '\\.exe\$'`;
Perl to find every file under a specified directory that has a .exe suffix:
#!/usr/bin/perl

use strict;
use File::Spec;
use IO::Handle;

die "Usage: $0 startdir\n"
    unless scalar @ARGV == 1;

my $startdir = shift @ARGV;
my @stack;

sub process_file($) {
    my $file = shift;
    print "$file\n"
        if $file =~ /\.exe$/io;
}

sub process_dir($) {
    my $dir = shift;
    my $dh = new IO::Handle;
    opendir $dh, $dir or
        die "Cannot open $dir: $!\n";
    while(defined(my $cont = readdir($dh))) {
        next
            if $cont eq '.' || $cont eq '..';
        my $fullpath = File::Spec->catfile($dir, $cont);
        if(-d $fullpath) {
            push @stack, $fullpath
                if -r $fullpath;
        } elsif(-f $fullpath) {
            process_file($fullpath);
        }
    }
    closedir($dh);
}

if(-f $startdir) {
    process_file($startdir);
} elsif(-d $startdir) {
    @stack = ($startdir);
    while(scalar(@stack)) {
        process_dir(shift(@stack));
    }
} else {
    die "$startdir is not a file or directory\n";
}
Have a look at File::Find.
Alternatively, if you can come up with a command line for the *nix find command, you can use find2perl to convert that command line to a Perl snippet.
I'll probably be shot down for suggesting this, but you don't have to use modules for a simple task. For example:
#!/usr/bin/perl -w
my @array = `find ~ -name '*.exe' -print`;
foreach (@array) {
    print;
}
Of course, it will need to have some tweaking for your particular choice of starting directory (here, I used ~ for the home directory)
EDIT: Maybe I should have said "until you get the modules installed".
To get files recursively, use File::Find:
use File::Find;

## call the function with your search dir and the type of file
my @exe_files = get_files("define root directory", ".exe");
## now @exe_files will have all the .exe files

sub get_files {
    my ($location, $type) = @_;
    my @file_list;
    if (defined $type) {
        find(sub {
            my $str = $File::Find::name;
            if ($str =~ m/\Q$type\E$/) {
                push @file_list, $File::Find::name;
            }
        }, $location);
    } else {
        find(sub { push @file_list, $File::Find::name }, $location);
    }
    return @file_list;
}