I'm unable to rename a file on a Linux system using Perl

I'm just a beginner in Perl. I'm trying to rename a file or directory using the following script, but it is not renaming the file. Please help me identify the problem.
I'm using Perl version 5.8.4
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;
my $dir="/home/hari/perl-s/abc/";
opendir (DIR, $dir);
my @fileList = readdir DIR;
foreach (@fileList){
next if -d;
my $oldname = $_;
print "Newfile after assigning: $_ \n";
s/(^[0-9])(.)//;
print "Newfile: $_ \n";
print "oldname: $oldname \n";
rename ($oldname,$_);
}

The return values of readdir are just filenames; they do not include the path that was provided to opendir. You generally have to include that manually.
opendir (DIR, $dir);
my @fileList = readdir DIR;
foreach (@fileList){
# $_ is just "filename"
$_ = "$dir/$_"; # now $_ is "/home/hari/perl-s/abc/$filename"
next if -d;
...
}
There's more than one way to do things in Perl. Another way to get the set of files in a directory is with the glob function. One of the advantages of glob is that you can use it in such a way that it returns filenames with their full paths, so sometimes glob is preferable to the opendir/readdir/closedir idiom:
my @filelist = glob("$dir/*");
foreach (@filelist) {
# $_ is "/home/hari/perl-s/abc/filename"
...
}
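Putting the two fixes together, here is a minimal sketch of the whole rename loop using glob, assuming the same directory and the same leading-digit substitution as in the question:
#!/usr/bin/perl
use strict;
use warnings;
use File::Basename qw(basename);

my $dir = "/home/hari/perl-s/abc";

foreach my $oldpath ( glob "$dir/*" ) {
    next if -d $oldpath;                      # skip directories
    my $name = basename($oldpath);            # strip the path before substituting
    ( my $newname = $name ) =~ s/^[0-9].//;   # same effect as the question's s/(^[0-9])(.)//
    next if $newname eq $name;                # nothing to rename
    rename $oldpath, "$dir/$newname"
        or warn "Cannot rename $oldpath: $!\n";
}
Checking the return value of rename is worth the extra line, because the original script silently ignores failures.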

Related

Unable to open files returned by readdir in Perl [duplicate]

This question already has answers here:
Why can't I open files returned by Perl's readdir?
I have a problem with a Perl script, as follows.
I must open and analyze all the *.txt files in a directory, but I cannot.
I can read the file names, which are saved in the @files array and printed, but I cannot open those files for reading.
This is my code:
my $dir= "../Scrivania/programmi" ;
opendir my ($dh), $dir;
my @files = grep { -f and /\.txt/i } readdir $dir;
closedir $dh;
for my $file ( @files ) {
$file = catfile($dir, $file);
print qq{Opening "$file"\n};
open my $fh, '<', $file;
# Do stuff with the data from $fh
print "sono nel foreach\n";
print " in : "."$fh\n";
#open(CANALI,$fh);
#@righe=<CANALI>;
#close(CANALI);
#print "canali:"."@righe\n";
#foreach $canali (@righe)
#{
# $canali =~ /\d\d:\d\d (-) (.*)/;
# $ora= $1;
#
# if($hhSplit[0] == $ora)
# {
# push(@output, "$canali");
#
# }
#}
}
The main problem you have is that the file names returned by readdir have no path, so you're trying to open, say, x.txt when you should be opening ../Sc/direct/x.txt. The file doesn't exist in the current working directory, so your open call fails.
You also have a strange mixture of stuff in glob("$dir/(.*).txt/"), which looks a little like a regex pattern, which glob doesn't understand. The value of $dir is a directory handle left open from the opendir on the first line. What you should be using is glob '../Sc/direct/*.txt', but then there's no need for the readdir.
There are two ways to find the contents of a directory. You can use opendir and readdir to read everything in the directory, or you can use glob.
The first method returns only the bare name of each entry, which means you must concatenate each name with the path to the containing directory, preferably using catfile from File::Spec::Functions. It also includes the pseudo-directories . and .., so you must filter those out before you can use the list of names.
glob has neither of these disadvantages. All the strings it returns are real directory entries, and they will include a path if you provided one in the pattern you passed as a parameter.
You seem to have become rather muddled over the two, so I have written this program, which differentiates between the two approaches. I hope it makes things clearer.
use strict;
use warnings;
use v5.10.1;
use autodie;
use File::Spec::Functions qw/ catfile /;
my $dir = '../Sc/direct';
### Using glob
for my $file ( glob catfile($dir, '*.txt') ) {
print qq{Opening "$file"\n};
open my $fh, '<', $file;
# Do stuff with the data from $fh
}
### Using opendir / readdir
opendir my ($dh), $dir;
my @files = grep { /\.txt$/i and -f catfile($dir, $_) } readdir $dh;
closedir $dh;
for my $file ( @files ) {
$file = catfile($dir, $file);
print qq{Opening "$file"\n};
open my $fh, '<', $file;
# Do stuff with the data from $fh
}
Using $dir in the glob is incorrect. $dir is a GLOB type, not a string value. Rather, you should be looping over the @files array and looking for names that match what you want. Maybe something like this:
foreach my $fp (@files) {
if ($fp =~ /(.*)\.txt$/) {
print "$fp is a .txt\n";
open (my $in, "<", $fp) or die "Cannot open $fp: $!";
while (<$in>) {
# process each line of the file here
}
close $in;
}
}

Perl, how to choose a directory

I'm trying to determine which of the contents of a folder are directories and which are files. I wrote the following, but the result is not what I would expect:
opendir DH, $dir or die "Cannot open Dir: $!";
my @dirs = grep !/^\.\.?$/, readdir DH;
foreach my $files (@dirs) {
print $files."<br>";
if ( -d $files )
{
print $files." is a directory<br>";
}
}
closedir DH;
The result is something as the example below:
.file1
file.log
file3.zip
file4
file5.zip
dir1.name1.suffix1.yyyy.MM.dd.hh.mm.ss
file5.zip
file6.tar
dir2
dir3.name1.suffix1.yyyy.MM.dd.hh.mm.ss
where the items starting with dir are actual directories, so my question is: why is the if failing to discover them as such?
What am I doing wrong?
$dir is missing...
if ( -d "$dir/$files" )
{
print $files." is a directory<br>";
}
It's easiest to chdir to $dir so that you don't have to prefix the node names with the path. You can also use autodie if you are running Perl v5.10.1 or better. Finally, if you use $_ as your loop control variable (the file/directory names) you can omit it from the parameters of print, -d, and regex matches.
Like this
use strict;
use warnings;
use v5.10.1;
use autodie;
my ($dir) = @ARGV;
opendir my $dh, $dir;
chdir $dh;
while ( defined( $_ = readdir $dh ) ) {
next if /\A\.\.?\z/;
print;
print " is a directory" if -d;
print "<br/>\n";
}
... # local expires. working directory returns to its original value
Update
In view of ikegami's (deleted) comment about returning back to the original working directory, here's an example of using the File::chdir module to do this tidily. It exports a tied variable $CWD which will change your working directory if you assign to it. You can also localise it, so just wrapping the above code in braces and adding a new local value for $CWD keeps things neat. Note that File::chdir is not a core module, so you will likely need to install it.
Note, however, that there is still a very small possibility that the process may be started with a present working directory that it cannot chdir to. This module will not solve that problem.
use strict;
use warnings;
use v5.10.1;
use autodie;
use File::chdir;
my ($dir) = @ARGV;
{
opendir my $dh, $dir;
local $CWD = $dir;
while ( defined( $_ = readdir $dh ) ) {
next if /\A\.\.?\z/;
print;
print " is a directory" if -d;
print "<br/>\n";
}
}

Recursive Perl detail need help

I think this is a simple problem, but I've been stuck on it for some time now! I need a fresh pair of eyes on this.
The thing is, I have this code in Perl:
#!c:/Perl/bin/perl
use CGI qw/param/;
use URI::Escape;
print "Content-type: text/html\n\n";
my $directory = param ('directory');
$directory = uri_unescape ($directory);
my @contents;
readDir($directory);
foreach (@contents) {
print "$_\n";
}
#------------------------------------------------------------------------
sub readDir(){
my $dir = shift;
opendir(DIR, $dir) or die $!;
while (my $file = readdir(DIR)) {
next if ($file =~ m/^\./);
if(-d $dir.$file)
{
#print $dir.$file. " ----- DIR\n";
readDir($dir.$file);
}
push #contents, ($dir . $file);
}
closedir(DIR);
}
I've tried to make it recursive. I need to have all the files of all of the directories and subdirectories, with the full path, so that I can open the files in the future.
But my output only returns the files in the current directory and the files in the first directory that it finds. If I have 3 folders inside the directory, it only shows the first one.
Ex. of cmd call:
"perl readDir.pl directory=C:/PerlTest/"
Thanks
Avoid wheel reinvention, use CPAN.
use Path::Class::Iterator;
my $it = Path::Class::Iterator->new(
root => $dir,
breadth_first => 0
);
until ($it->done) {
my $f = $it->next;
push @contents, $f;
}
Make sure that you don't let people set $dir to something that will let them look somewhere you don't want them to look.
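One rough way to enforce that, assuming you only want to allow directories beneath a fixed base such as C:/PerlTest (the base path here is just an illustration):
use File::Spec;

my $base      = 'C:/PerlTest';                      # assumed allowed root
my $requested = File::Spec->canonpath($directory);

# refuse anything that tries to climb out of, or sit outside, the base
die "Forbidden directory\n"
    if $requested =~ /\.\./ or $requested !~ /^\Q$base\E/i;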
Your problem is the scope of the directory handle DIR. DIR has global scope, so each recursive call to readDir is using the same DIR; when you closedir(DIR) and return to the caller, the caller does a readdir on a closed directory handle and everything stops. The solution is to use a local directory handle:
sub readDir {
my ($dir) = @_;
opendir(my $dh, $dir) or die $!;
while(my $file = readdir($dh)) {
next if($file eq '.' || $file eq '..');
my $path = $dir . '/' . $file;
if(-d $path) {
readDir($path);
}
push(@contents, $path);
}
closedir($dh);
}
Also notice that you would be missing a directory separator if (a) it wasn't at the end of $directory or (b) on every recursive call. AFAIK, slashes will be internally converted to backslashes on Windows but you might want to use a path mangling module from CPAN anyway (I only care about Unix systems so I don't have any recommendations).
I'd also recommend that you pass a reference to @contents to readDir rather than leaving it as a global variable; fewer errors and less confusion that way. And don't use parentheses on sub definitions unless you know exactly what they do and what they're for. Some sanity checking and scrubbing on $directory would be a good idea as well.
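A possible sketch combining those suggestions: catfile for the separator, a reference to @contents as a parameter, and no prototype parentheses on the sub:
use strict;
use warnings;
use File::Spec::Functions qw(catfile);

sub read_dir {
    my ($dir, $contents) = @_;                 # $contents is an array reference
    opendir my $dh, $dir or die "Cannot open $dir: $!";
    while ( my $file = readdir $dh ) {
        next if $file eq '.' || $file eq '..';
        my $path = catfile($dir, $file);
        read_dir($path, $contents) if -d $path;
        push @$contents, $path;
    }
    closedir $dh;
}

my @contents;
read_dir($directory, \@contents);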
There are many modules that are available for recursively listing files in a directory.
My favourite is File::Find::Rule
use strict ;
use Data::Dumper ;
use File::Find::Rule ;
my $dir = shift ; # get directory from command line
my @files = File::Find::Rule->in( $dir );
print Dumper( \@files );
This sends a list of files into an array (which is what your program was doing).
$VAR1 = [
'testdir',
'testdir/file1.txt',
'testdir/file2.txt',
'testdir/subdir',
'testdir/subdir/file3.txt'
];
There are loads of other options, like only listing files with particular names. Or you can set it up as an iterator, as described in How can I use File::Find.
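For example, a rough sketch of both ideas together, restricting the names and using the start/match iterator interface from the File::Find::Rule documentation:
use File::Find::Rule;

my $rule = File::Find::Rule->file->name('*.txt')->start( $dir );
while ( defined( my $file = $rule->match ) ) {
    print "$file\n";      # one matching path per iteration
}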
How can I use File::Find in Perl?
If you want to stick to modules that come with Perl Core, have a look at File::Find.
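A minimal sketch with the core module, assuming $dir holds the top-level directory:
use File::Find;

my @files;
find( sub { push @files, $File::Find::name if -f }, $dir );   # collect full paths of regular files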

Filter filenames by pattern

I need to search for files in a directory that begin with a particular pattern, say "abc". I also need to eliminate all the files in the result that end with ".xh". I am not sure how to go about doing it in Perl.
I have something like this:
opendir(MYDIR, $newpath);
my @files = grep(/abc\*.*/,readdir(MYDIR)); # DOES NOT WORK
I also need to eliminate all files from result that end with ".xh"
Thanks, Bi
try
@files = grep {!/\.xh$/} <$MYDIR/abc*>;
where $MYDIR is a string containing the path of your directory.
opendir(MYDIR, $newpath); my @files = grep(/abc*.*/,readdir(MYDIR)); #DOES NOT WORK
You are confusing a regex pattern with a glob pattern.
#!/usr/bin/perl
use strict;
use warnings;
opendir my $dir_h, '.'
or die "Cannot open directory: $!";
my @files = grep { /abc/ and not /\.xh$/ } readdir $dir_h;
closedir $dir_h;
print "$_\n" for @files;
opendir(MYDIR, $newpath) or die "$!";
my @files = grep{ !/\.xh$/ && /abc/ } readdir(MYDIR);
closedir MYDIR;
foreach (@files) {
# do something with each matching name in $_
}
The point that kevinadc and Sinan Unur are using but not mentioning is that readdir() returns a list of all the entries in the directory when called in list context. You can then use any list operator on that. That's why you can use:
my @files = grep { /abc/ && !/\.xh$/ } readdir MYDIR;
So:
readdir MYDIR
returns a list of all the files in MYDIR.
And:
grep { /abc/ && !/\.xh$/ }
returns all the elements returned by readdir MYDIR that match the criteria there.
foreach my $file (@files)
{
my ($fileN) = $file =~ /([^\/]+)$/;
if ($fileN =~ /\.xh$/)
{
unlink $file;
next;
}
if ($fileN =~ /^abc/)
{
open(FILE, "<$file");
while(<FILE>)
{
# read through file.
}
}
}
Also, all the files in a directory can be accessed by doing:
$DIR = "/somedir/somepath";
foreach $file (<$DIR/*>)
{
# apply file checks here like above.
}
Alternatively, you can use the Perl module File::Find.
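A rough sketch of that alternative, assuming you want the same abc / .xh filtering applied recursively under $newpath:
use File::Find;

my @files;
find( sub {
    # $_ is the bare file name; $File::Find::name is the full path
    push @files, $File::Find::name
        if -f && /abc/ && !/\.xh$/;
}, $newpath );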
Instead of using opendir and filtering readdir (don't forget to closedir!), you could instead use glob:
use File::Spec::Functions qw(catfile splitpath);
my @files =
grep !/\.xh$/, # filter out names ending in ".xh"
map +(splitpath $_)[-1], # filename only
glob # perform shell-like glob expansion
catfile $newpath, 'abc*'; # "$newpath/abc*" (or \ or :, depending on OS)
If you don't care about eliminating the $newpath prefixed to the results of glob, get rid of the map+splitpath.

How can I add a prefix to all filenames under a directory?

I am trying to prefix a string (reference_) to the names of all the *.bmp files in all the directories as well as sub-directories. The first time we run the silk script, it will create directories as well as subdirectories, and under each subdirectory it will store each mobile application's screenshot with a .bmp extension.
When I run the automated silk script for the second time, it will again create the *.bmp files in all the subdirectories. Before running the script for the second time, I want to prefix all the *.bmp files with the string reference_.
For example first_screen.bmp to reference_first_screen.bmp,
I have the directory structure as below:
C:\Image_Repository\BG_Images\second
...
C:\Image_Repository\BG_Images\sixth
having first_screen.bmp and first_screen.bmp files etc...
Could any one help me out?
How can I prefix all the image file names with reference_ string?
When I run the script for the second time, the Perl script in Silk will take both images from the sub-directory and compare them pixel by pixel. I am trying with the code below.
Could you please guide me on how I can proceed to complete this task?
#!/usr/bin/perl -w
&one;
&two;
sub one {
use Cwd;
my $dir ="C:\\Image_Repository";
#print "$dir\n";
opendir(DIR,"+<$dir") or "die $!\n";
my @dir = readdir DIR;
#$lines=@dir;
delete $dir[-1];
print "$lines\n";
foreach my $item (@dir)
{
print "$item\n";
}
closedir DIR;
}
sub two {
use Cwd;
my $dir1 ="C:\\Image_Repository\\BG_Images";
#print "$dir1\n";
opendir(D,"+<$dir1") or "die $!\n";
my @dire = readdir D;
#$lines=@dire;
delete $dire[-1];
#print "$lines\n";
foreach my $item (@dire)
{
#print "$item\n";
$dir2="C:\\Image_Repository\\BG_Images\\$item";
print $dir2;
opendir(D1,"+<$dir2") or die " $!\n";
my @files=readdir D1;
#print "@files\n";
foreach $one (@files)
{
$one="reference_".$one;
print "$one\n";
#rename $one,Reference_.$one;
}
}
closedir DIR;
}
I tried the opendir call with the '+<' mode, but I am getting a compilation error for the read-and-write mode.
When I run this code, it shows the files in the BG_Images folder with the prefixed string, but it does not actually update the files in the sub-directories.
You don't open a directory for writing. Just use opendir without the mode parts of the string:
opendir my($dir), $dirname or die "Could not open $dirname: $!";
However, you don't need that. You can use File::Find to make the list of files you need.
#!/usr/bin/perl
use strict;
use warnings;
use File::Basename;
use File::Find;
use File::Find::Closures qw(find_regular_files);
use File::Spec::Functions qw(catfile);
my( $wanted, $reporter ) = find_regular_files;
find( $wanted, $ARGV[0] );
my $prefix = 'recursive_';
foreach my $file ( $reporter->() )
{
my $basename = basename( $file );
if( index( $basename, $prefix ) == 0 )
{
print STDERR "$file already has '$prefix'! Skipping.\n";
next;
}
my $new_path = catfile(
dirname( $file ),
"recursive_$basename"
);
unless( rename $file, $new_path )
{
print STDERR "Could not rename $file: $!\n";
next;
}
print $file, "\n";
}
You should probably check out the File::Find module for this - it will make recursing up and down the directory tree simpler.
You should probably be scanning the file names and modifying those that don't start with reference_ so that they do. That may require splitting the file name up into a directory name and a file name and then prefixing the file name part with reference_. That's done with the File::Basename module.
At some point, you need to decide what happens when you run the script the third time. Do the files that already start with reference_ get overwritten, or do the unprefixed files get overwritten, or what?
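A minimal sketch of that approach (the directory path is taken from the question; it relies on File::Find chdir-ing into each directory, so $_ holds the bare file name):
use strict;
use warnings;
use File::Find;

my $prefix = 'reference_';

find( sub {
    return unless -f && /\.bmp$/i;            # regular .bmp files only
    return if /^\Q$prefix\E/;                 # already prefixed, so skip it
    rename $_, $prefix . $_
        or warn "Could not rename $File::Find::name: $!\n";
}, 'C:\\Image_Repository\\BG_Images' );
Skipping names that already carry the prefix answers the third-run question by never overwriting an existing reference_ file.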
The reason the files are not being renamed is that the rename operation is commented out. Remember to add use strict; at the top of your script (as well as the -w option which you did use).
If you get a list of files in an array #files (and the names are base names, so you don't have to fiddle with File::Basename), then the loop might look like:
foreach my $one (@files)
{
my $new = "reference_$one";
print "$one --> $new\n";
rename $one, $new or die "failed to rename $one to $new ($!)";
}
With the aid of the find utility from coreutils for Windows:
$ find -iname "*.bmp" | perl -wlne"chomp; ($prefix, $basename) = split(m~\/([^/]+)$~, $_); rename($_, join(q(/), ($prefix, q(reference_).$basename))) or warn qq(failed to rename '$_': $!)"