I have files with random names and I want to rename them all together, like Trace1, Trace2, and so on. Any ideas?
Or in Perl:
#!/usr/bin/perl
use strict;
use warnings;
# use dirname() to keep the renamed files in the same directory
use File::Basename qw( dirname );
my $i = 1;
for my $file (@ARGV) {
rename $file, dirname($file) . "/Trace$i";
print "$file -> Trace$i\n";
} continue { $i++ }
If you are new to Linux, also remember to make the script executable (assuming the script was saved in a file named random-renamer):
chmod 755 random-renamer
And then to run it (rename all the files in the random-files directory):
./random-renamer random-files/*
You can just use a shell command:
i=1;
for f in *
do
mv "$f" "Trace$i"
i=$(($i+1))
done
This checks if there are any existing files named Trace# and avoids clobbering them.
use Path::Class qw( dir );
use List::Util qw( max );
my $dir = dir(...);
my @files =
map $_->basename(),
grep !$_->is_dir(),
$dir->children();
my $last =
max 0,
map /^Trace([0-9]+)\z/,
@files;
my $errors;
for (@files) {
my $old = $dir->file($_);
my $new = $dir->file("Trace" . ++$last);
if (!rename($old, $new)) {
warn("Can't rename \"$old\" to \"$new\": $!\n");
++$errors;
}
}
exit($errors ? 1 : 0);
Related
I have a segment of code that is working that finds all of the .txt files in a given directory, but I can't get it to look in the subdirectories.
I need my script to do two things:
scan through a folder and all of its subdirectories for a text file
print out just the last segments of its path
For example, I have a directory structure:
C:\abc\def\ghi\jkl\mnop.txt
My script points to the path C:\abc\def\. It then goes through each of the subfolders and finds mnop.txt and any other text files in that folder.
It then prints out ghi\jkl\mnop.txt
I am using this, but it only prints out the file name, and only when the file is directly in that directory.
opendir(Dir, $location) or die "Failure Will Robertson!";
@reports = grep(/\.txt$/, readdir(Dir));
foreach $reports (@reports)
{
my $files = "$location/$reports";
open (res,$files) or die "could not open $files";
print "$files\n";
}
I believe this solution is simpler and easier to read. I hope it is helpful!
#!/usr/bin/perl
use File::Find::Rule;
my @files = File::Find::Rule->file()
->name( '*.txt' )
->in( '/path/to/my/folder/' );
for my $file (@files) {
print "file: $file\n";
}
What about using File::Find?
#!/usr/bin/env perl
use warnings;
use strict;
use File::Find;
# for example let location be tmp
my $location="tmp";
sub find_txt {
my $F = $File::Find::name;
if ($F =~ /txt$/ ) {
print "$F\n";
}
}
find({ wanted => \&find_txt, no_chdir=>1}, $location);
Much easier if you just use File::Find core module:
#!/usr/bin/perl
use strict;
use warnings FATAL => qw(all);
use File::Find;
my $Target = shift;
find(\&survey, @ARGV);
sub survey {
print "Found $File::Find::name\n" if ($_ eq $Target)
}
First argument: pathless name of file to search for. All subsequent arguments are directories to check. File::Find searches recursively, so you only need to name the top of a tree, all subdirectories will automatically be searched as well.
$File::Find::name is the full pathname of the file, so you could subtract your $location from that if you want a relative path.
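For instance, a minimal sketch of that subtraction using File::Spec->abs2rel (the "tmp" search root is just the example location from above, not anything mandated):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use File::Find;
use File::Spec;

my $location = "tmp";   # example search root, as above

find({
    no_chdir => 1,
    wanted   => sub {
        return unless -f && /\.txt$/;
        # abs2rel() strips the $location prefix, leaving a relative path
        print File::Spec->abs2rel( $File::Find::name, $location ), "\n";
    },
}, $location);
```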
I'm writing a Perl script and I'm new to Perl. I have a file that contains a list of files. For each item on the list, I want to search a given directory and its sub-directories to find the file and return the full path. I've been unsuccessful so far trying to use File::Find. Here's what I've got:
use strict;
use warnings;
use File::Find;
my $directory = '/home/directory/';
my $input_file = '/home/directory/file_list';
my @file_list;
find(\&wanted, $directory);
sub wanted {
open (FILE, $input_file);
foreach my $file (<FILE>) {
chomp($file);
push ( @file_list, $file );
}
close (FILE);
return @file_list;
}
I find File::Find::Rule a tad easier and more elegant to use.
use File::Find::Rule;
my $path = '/some/path';
# Find all directories under $path
my #paths = File::Find::Rule->directory->in( $path );
# Find all files in $path
my #files = File::Find::Rule->file->in( $path );
The arrays contain full paths to the objects File::Find::Rule finds.
File::Find is used to traverse a directory structure in the filesystem. Instead of doing what you're trying to do, namely, have the wanted subroutine read in the file, you should read in the file as follows:
use strict;
use warnings;
use vars qw/@file_list/;
my $directory = '/home/directory/';
my $input_file = '/home/directory/file_list';
open FILE, "$input_file" or die "$!\n";
foreach my $file (<FILE>) {
chomp($file);
push ( @file_list, $file );
}
# do what you need to here with the @file_list array
Okay, I re-read the doc and realized I had misunderstood the wanted subroutine. wanted is a subroutine that is called on every file and directory that is found. Here's my code, updated to take that into account:
use strict;
use warnings;
use File::Find;
my $directory = '/home/directory/';
my $input_file = '/home/directory/file_list';
my @file_list;
open (FILE, $input_file);
foreach my $file (<FILE>) {
chomp($file);
push ( @file_list, $file );
}
close (FILE);
find(\&wanted, $directory);
sub wanted {
    my $name = $_;
    # smartmatch (~~) is deprecated; a plain grep does the membership test
    if ( grep { $_ eq $name } @file_list ) {
        print "$File::Find::name\n";
    }
    return;
}
Given the following directory setup:
/dira/dirb
/dira/dirb/myprog.pl
/dira/dirb/testa/myfilesdir contains the following files:
/dira/dirb/testa/myfilesdir/file1.txt
/dira/dirb/testa/myfilesdir/file2.txt
Current dir:
/dira/dirb
./myprog.pl -p testa/myfilesdir
Cycle through files
while (my $file_to_proc = readdir(DIR)) {
...
$file_to_proc = file1.txt
$file_to_proc = file2.txt
what I want is
$myfile = /dira/dirb/testa/myfilesdir/file1.txt
$myfile = /dira/dirb/testa/myfilesdir/file2.txt
I tried a few different Perl modules (Cwd, File::Spec's rel2abs), but they use the current directory. I cannot rely on the current directory because the input could be a relative or an absolute path.
Use the File::Spec module. Here's an example:
use warnings;
use strict;
use File::Spec;
for ( @ARGV ) {
chomp;
if ( -f $_ ) {
printf qq[%s\n], File::Spec->rel2abs( $_ );
}
}
Run it like:
perl script.pl mydir/*
And it will print absolute paths of files.
UPDATED with a more efficient program. Thanks to TLP's suggestions.
use warnings;
use strict;
use File::Spec;
for ( @ARGV ) {
if ( -f ) {
print File::Spec->rel2abs( $_ ), "\n";
}
}
Right now, I am using something like
copy catfile($PATH1, "instructions.txt"), catfile($ENV{'DIRWORK'});
individually for each of the .txt files that I want to copy. This code is portable as it does not use any OS specific commands.
How can I copy all the text files from $PATH1 to DIRWORK, when I do not know the individual names of all the files, while keeping the code portable?
You can use the core File::Copy module like this:
use File::Copy;
my @files = glob("$PATH1/*.txt");
for my $file (@files) {
copy($file, $ENV{DIRWORK}) or die "Copy failed: $!";
}
Using core File::Find and File::Copy and assuming you want all .txt files in $PATH1 copied to $ENV{DIRWORK}, and also assuming you want it to recurse...
use strict;
use warnings;
use File::Find;
use File::Copy;
die "ENV variable DIRWORK isn't set\n"
unless defined $ENV{DIRWORK} and length $ENV{DIRWORK};
die "DIRWORK $ENV{DIRWORK} is not a directory\n"
unless -d $ENV{DIRWORK};
my $PATH1 = q{/path/to/wherever};
die "PATH1 is not a directory" unless -d $PATH1;
find( sub{
# $_ is just the filename, "test.txt"
# $File::Find::name is the full "/path/to/the/file/test.txt".
return if $_ !~ /\.txt$/i;
my $dest = "$ENV{DIRWORK}/$_";
copy( $File::Find::name, $dest ) or do {
warn "Could not copy $File::Find::name, skipping\n";
return;
};
}, $PATH1 );
Give it a go ;)
Alternatively, why don't you use bash?
$ ( find $PATH1 -type f -name '*.txt' | xargs -I{} cp {} $DIRWORK );
If you are guaranteed to work on a Unix system (e.g. don't care about portability), I'll go against my own usual inclinations and accepted best practices and recommend considering using "cp" :)
system("cp $PATH1/*.txt $ENV{'DIRWORK'}");
# Add error checking and STDERR redirection!
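If you do shell out, the list form of system sidesteps shell quoting problems with spaces in filenames, at the cost of expanding the glob in Perl first. A sketch, with $PATH1 a placeholder value as in the other answers:

```perl
use strict;
use warnings;

my $PATH1 = '/path/to/wherever';   # placeholder, as elsewhere in this thread
my @txt   = glob "$PATH1/*.txt";

if (@txt) {
    # List form of system(): no shell is involved, so spaces in
    # filenames are harmless and nothing gets re-interpreted
    system( 'cp', @txt, $ENV{'DIRWORK'} ) == 0
        or warn "cp failed with exit status $?\n";
}
```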
For a Perl-native solution, combine a globbed file list (or File::Find) with File::Spec's ability to pull out the actual file name:
my @files = glob("$PATH1/*.txt");
foreach my $file (@files) {
    my ($volume, $directories, $filename) = File::Spec->splitpath( $file );
    copy( $file, File::Spec->catfile( $ENV{'DIRWORK'}, $filename ) ) or die "$!";
}
File::Find and File::Copy are portable:
use File::Find;
use File::Copy;
find(
sub {
return unless ( -f $_ );
$_ =~ /\.txt$/ && copy( $File::Find::name, $ENV{'DIRWORK'} );
},
$PATH1
);
How can I find all the files that match a certain criteria (-M, modification age in days) in a list of directories, but not in their subdirectories?
I wanted to use File::Find, but it looks like it always descends into the subdirectories too.
my @files = grep { -f && ( -M ) < 5 } map { glob "$_/*" } @folders;
Use readdir or File::Slurp::read_dir in conjunction with grep.
#!/usr/bin/perl
use strict;
use warnings;
use File::Slurp;
use File::Spec::Functions qw( canonpath catfile );
my @dirs = ( @ENV{qw( HOME TEMP )} );
for my $dir ( #dirs ) {
print "'$dir'\n";
my @files = grep { 2 > -M and -f }
map { canonpath(catfile $dir, $_) } read_dir $dir;
print "$_\n" for @files;
}
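The same idea with only core modules: readdir plus grep, no File::Slurp needed. The 2-day -M cutoff mirrors the example above, and the $ENV{HOME} directory is just an illustration:

```perl
use strict;
use warnings;
use File::Spec::Functions qw( catfile );

my $dir = $ENV{HOME};   # example directory

opendir my $dh, $dir or die "Can't open $dir: $!\n";
# readdir returns bare names, so prepend the directory before testing
my @files = grep { -f and ( -M ) < 2 }
            map  { catfile( $dir, $_ ) }
            readdir $dh;
closedir $dh;

print "$_\n" for @files;
```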
You can set $File::Find::prune within the wanted function to skip directory trees. Add something like $File::Find::prune = 1 if -d && $File::Find::name ne ".";
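Note that comparing against "." only works when "." is the search root; comparing against $File::Find::topdir handles arbitrary start directories. A sketch that collects files directly under the named directories without recursing (the 5-day -M cutoff is from the question, and the directory names are placeholders):

```perl
use strict;
use warnings;
use File::Find;

my @folders = ( '/some/dir', '/another/dir' );   # example start directories
my @files;

find( sub {
    # Prune every directory except the start directory itself,
    # so find() never descends a level below @folders
    if ( -d and $File::Find::name ne $File::Find::topdir ) {
        $File::Find::prune = 1;
        return;
    }
    push @files, $File::Find::name if -f and ( -M ) < 5;
}, @folders );

print "$_\n" for @files;
```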