How can I get all files in a directory, but not in subdirectories, in Perl?

How can I find all the files that match a certain criteria (-M, modification age in days) in a list of directories, but not in their subdirectories?
I wanted to use File::Find, but looks like it always goes to the subdirectories too.

@files = grep { -f && (-M) < 5 } <$_/*> for @folders;

Use readdir or File::Slurp::read_dir in conjunction with grep.
#!/usr/bin/perl
use strict;
use warnings;
use File::Slurp;
use File::Spec::Functions qw( canonpath catfile );

my @dirs = ( @ENV{qw(HOME TEMP)} );

for my $dir ( @dirs ) {
    print "'$dir'\n";
    my @files = grep { 2 > -M and -f }
                map { canonpath(catfile $dir, $_) } read_dir $dir;
    print "$_\n" for @files;
}

You can set $File::Find::prune within the wanted function to skip directory trees. Add something like $File::Find::prune = 1 if -d && $File::Find::name ne ".";
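For instance, a minimal runnable sketch of this pruning approach (the helper name and the directory to scan are assumptions for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Return the plain files found directly in $dir, pruning every
# subdirectory so find() never descends into it.
sub files_in_dir {
    my ($dir) = @_;
    my @found;
    find(sub {
        # Prune any directory other than the one we started in.
        if (-d && $File::Find::name ne $dir) {
            $File::Find::prune = 1;
            return;
        }
        push @found, $File::Find::name if -f;
    }, $dir);
    return @found;
}

print "$_\n" for files_in_dir('.');
```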

Moving files with incremented files name padded by zeros to directories in Perl

I have files 0001_test.txt to 0100_test.txt. I make directories Dir1 and Dir2. I want to move files 0001_test.txt to 0010_test.txt to directory Dir1.
my current script:
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;

my @files = <*.txt>;
my $files;
my $count;
for $files (@files) {
    ++$count;
    mkdir -p "Dir1";
    mkdir -p "Dir2";
    if ($count >= 1 && $count <= 10) {
        my $basename = printf "%04d" $count;
        mv "($basename)_test.txt" "Dir1";
    }
}
This obviously fails, so how would one correct this?
Your question is very odd. Dir2 doesn't appear to have any bearing on the problem, and I don't see how you expect mkdir -p "Dir1" to do anything useful in a Perl program. However, this should solve your problem:
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy 'move';
use File::Path 'make_path';

my @dirs = ( 'Dir1', 'Dir2' );
make_path $_ for @dirs;

my @files = glob '*.txt';
my $n;
for my $file ( sort @files ) {
    next unless $file =~ /\A\d{4}_test\.txt\z/;
    my $new_file = sprintf '%s/%04d_test.txt', $dirs[0], ++$n;
    move $file, $new_file;
    last if $n == 10;
}
Easy solution:
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy 'move';

my @files = <*.txt>;
my $count;
mkdir "Dir1";
mkdir "Dir2";
my @dir = <Dir*>;
for my $files (@files) {
    ++$count;
    if ($count >= 1 && $count <= 10) {
        my $move_files = sprintf '%s/%04d_test.txt', $dir[0], $count;
        move($files, $move_files);
    }
}

How to rename multiple files with random names together

I have files with random names and I want to rename them all together, like Trace1, Trace2, and so on. Any ideas?
Or in Perl:
#!/usr/bin/perl
use strict;
use warnings;
# use dirname() to keep the renamed files in the same directory
use File::Basename qw( dirname );

my $i = 1;
for my $file (@ARGV) {
    rename $file, dirname($file) . "/Trace$i";
    print "$file -> Trace$i\n";
} continue { $i++ }
If you are new to Linux, you need to also remember to make the script executable (assuming the script was saved in the file named random-renamer):
chmod 755 random-renamer
And then to run it (rename all the files in the random-files directory):
./random-renamer random-files/*
You can just use a shell command:
i=1
for f in *
do
    mv "$f" "Trace$i"
    i=$((i+1))
done
This checks if there are any existing files named Trace# and avoids clobbering them.
use Path::Class qw( dir );
use List::Util qw( max );

my $dir = dir(...);

my @files =
    map $_->basename(),
    grep !$_->is_dir(),
    $dir->children();

my $last =
    max 0,
    map /^Trace([0-9]+)\z/,
    @files;

my $errors;
for (@files) {
    my $old = $dir->file($_);
    my $new = $dir->file("Trace" . ++$last);
    if (!rename($old, $new)) {
        warn("Can't rename \"$old\" to \"$new\": $!\n");
        ++$errors;
    }
}
exit($errors ? 1 : 0);

relative absolute path perl

Following directory setup:
/dira/dirb
/dira/dirb/myprog.pl
/dira/dirb/testa/myfilesdir contains the following files:
/dira/dirb/testa/myfilesdir/file1.txt
/dira/dirb/testa/myfilesdir/file2.txt
Current dir:
/dira/dirb
./myprog.pl -p testa/myfilesdir
Cycle through files
while (my $file_to_proc = readdir(DIR)) {
...
$file_to_proc = file1.txt
$file_to_proc = file2.txt
what I want is
$myfile = /dira/dirb/testa/myfilesdir/file1.txt
$myfile = /dira/dirb/testa/myfilesdir/file2.txt
Tried a few different Perl modules (Cwd, rel2abs), but they use the current directory. I cannot rely on the current directory because the input could be a relative or an absolute path.
Use the module File::Spec. Here is an example:
use warnings;
use strict;
use File::Spec;

for ( @ARGV ) {
    chomp;
    if ( -f $_ ) {
        printf qq[%s\n], File::Spec->rel2abs( $_ );
    }
}
Run it like:
perl script.pl mydir/*
And it will print absolute paths of files.
UPDATED with a more efficient program. Thanks to TLP's suggestions.
use warnings;
use strict;
use File::Spec;

for ( @ARGV ) {
    if ( -f ) {
        print File::Spec->rel2abs( $_ ), "\n";
    }
}

How do I use chdir to traverse subdirectories and parse XML files?

I want to write a script that traverses a directory and its subdirectories, grabs all the XML files and parses them. I am having trouble with chdir. This works fine:
my $search = "/home/user/books";
chdir($search) or die "cant change dir to $search $!";
system("ls");
But I want the user to decide the path where he want to search it so I am using Getopt::Long:
use strict;
use warnings;
use Data::Dumper;
use XML::Simple;
use Getopt::Long;
my $outputFile = '';
my $searchPath = "";
my $debug = 0;
GetOptions('outputFile=s' => \$outputFile, 'searchPath=s' => \$searchPath);
if ($outputFile eq '' or $searchPath = '') {
die("parameter --outpulFile=s is required.");
}
$searchPath =~ s/\/*$/\//;
my @founddirs = `cd $searchPath`;
foreach my $foundfiles (@founddirs) {
print $foundfiles;
chdir($foundfiles) or die "cant change dir to $searchPath $!";
chdir('..');
}
Command to run:
perl sample.pl --outputFile=books.txt --searchPath=/home/user/june18
I want to grab all the recursive.xml files from the subdirectories and parse them. Does anyone know how this can be done?
A couple of issues here:
$searchPath = '' is setting the search path to an empty string during the input validation. Use the string comparison eq instead (= is assignment, and == compares numbers).
@founddirs will contain nothing, since the backtick operator will return nothing. This is because
my @founddirs = `cd $searchPath`;
produces no output: cd prints nothing, so there are no newline-separated directory names to capture. Perhaps you're after ls $searchPath
On a side note, why not use File::Find instead?
use strict;
use warnings;
use File::Find;
use Getopt::Long;
my $outputFile;
my $searchPath;
GetOptions(
'outputFile=s' => \$outputFile,
'searchPath=s' => \$searchPath,
);
die "Usage : perl sample.pl -outputFile -searchPath\n"
unless $outputFile && $searchPath;
die "No such directory found: $searchPath\n" unless -d $searchPath;
find( sub { print "$File::Find::name\n" if /$outputFile/ }, $searchPath );
#!/usr/bin/perl --
use strict; use warnings;
use Data::Dump qw/ dd /;
use File::Find::Rule qw/ find /;
my @files = find(
    file =>
    name => '*.xml',
    in   => \@ARGV,
);
dd \@files;
__END__
$ perl ffrule
[]
$ perl ffrule ../soap
[
"../soap/ex1.xml",
"../soap/ex2.xml",
"../soap/ex3.xml",
]

Filter filenames by pattern

I need to search for files in a directory that begin with a particular pattern, say "abc". I also need to eliminate all the files in the result that end with ".xh". I am not sure how to go about doing it in Perl.
I have something like this:
opendir(MYDIR, $newpath);
my @files = grep(/abc\*.*/, readdir(MYDIR)); # DOES NOT WORK
I also need to eliminate all files from result that end with ".xh"
Thanks, Bi
try
@files = grep { !/\.xh$/ } <$MYDIR/abc*>;
where $MYDIR is a string containing the path of your directory.
opendir(MYDIR, $newpath); my @files = grep(/abc*.*/, readdir(MYDIR)); # DOES NOT WORK
You are confusing a regex pattern with a glob pattern.
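To make the distinction concrete, here is a small sketch contrasting the two (the file names involved are hypothetical):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# As a glob pattern, 'abc*' means "abc followed by anything":
my @by_glob = glob 'abc*';

# As a regex, /abc*/ means "ab followed by zero or more c's",
# so it would also match "ab" -- not what the question intends.
# The regex equivalent of the glob is an anchored /^abc/:
opendir my $dh, '.' or die "Cannot open directory: $!";
my @by_regex = grep { /^abc/ } readdir $dh;
closedir $dh;
```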
#!/usr/bin/perl
use strict;
use warnings;

opendir my $dir_h, '.'
    or die "Cannot open directory: $!";

my @files = grep { /abc/ and not /\.xh$/ } readdir $dir_h;

closedir $dir_h;

print "$_\n" for @files;
opendir(MYDIR, $newpath) or die "$!";
my @files = grep { !/\.xh$/ && /abc/ } readdir(MYDIR);
closedir MYDIR;
foreach (@files) {
    # do something
}
The point that kevinadc and Sinan Unur are using but not mentioning is that readdir() returns a list of all the entries in the directory when called in list context. You can then apply any list operator to that. That's why you can use:
my @files = grep { /abc/ && !/\.xh$/ } readdir MYDIR;
So:
readdir MYDIR
returns a list of all the files in MYDIR.
And:
grep { /abc/ && !/\.xh$/ }
returns all the elements returned by readdir MYDIR that match the criteria there.
foreach my $file (@files)
{
    my ($fileN) = $file =~ /([^\/]+)$/;
    if ($fileN =~ /\.xh$/)
    {
        unlink $file;
        next;
    }
    if ($fileN =~ /^abc/)
    {
        open(FILE, '<', $file) or die "Cannot open $file: $!";
        while (<FILE>)
        {
            # read through file.
        }
        close FILE;
    }
}
Also, all files in a directory can be accessed by doing:
my $DIR = "/somedir/somepath";
foreach my $file (<$DIR/*>)
{
    # apply file checks here like above.
}
Alternatively you can use the Perl module File::Find.
Instead of using opendir and filtering readdir (don't forget to closedir!), you could instead use glob:
use File::Spec::Functions qw(catfile splitpath);

my @files =
    grep !/\.xh$/,            # filter out names ending in ".xh"
    map +(splitpath $_)[-1],  # filename only
    glob                      # perform shell-like glob expansion
    catfile $newpath, 'abc*'; # "$newpath/abc*" (or \ or :, depending on OS)
If you don't mind the $newpath prefix on the results of glob, get rid of the map+splitpath.
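In that case, a shorter sketch along the same lines (the helper name and the directory argument are assumptions for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Spec::Functions qw(catfile);

# Return full paths under $dir whose names start with "abc"
# and do not end in ".xh".
sub matching_files {
    my ($dir) = @_;
    return grep !/\.xh\z/, glob catfile($dir, 'abc*');
}

print "$_\n" for matching_files('.');
```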