Okay, so I have a program that looks into a passed-in directory: if any file names match a pattern, I make a directory and move that file, plus any others that match it (regardless of extension), into that directory. If they don't match, I move them into the PassedInDir/misc/ directory.
In both cases I have a condition to avoid processing any directory (as my program isn't ready to deal with those yet), something like if( ! -d $fp).
Everything works fine when I run it the first time on a directory. However, when I run it again on the same directory (which should now contain only directories) I get the error Could not move file assignmentZ to destination DataB/misc at projectSorter.pl line 16. assignmentZ is a directory, yet it somehow gets past the ! -d test in the second case.
#!/usr/bin/perl -w
use File::Copy;

if( @ARGV < 1 ){
    print "\nUsage: proj6.pl <directory>\n\n";
    exit;
}

die("\nDirectory $ARGV[0] does not exist\n\n") if( ! -e $ARGV[0]);
opendir( DIR, $ARGV[0]) or die("\nCould not open directory $ARGV[0]\n\n");

while(($fp = readdir(DIR))){
    if($fp =~ m/proj(.*)\./){
        (! -d "$ARGV[0]/assignment$1") && (mkdir "$ARGV[0]/assignment$1");
        move("$ARGV[0]/$fp" , "$ARGV[0]/assignment$1") or die("Could not move file $fp to destination $ARGV[0]/assignment$1");
    }
    elsif(! -d $fp){ #gets past here!!!
        (! -d "$ARGV[0]/misc") && (mkdir "$ARGV[0]/misc");
        move("$ARGV[0]/$fp" , "$ARGV[0]/misc") or die("Could not move file $fp to destination $ARGV[0]/misc");
    }
}
It is the only directory that does this out of the ones previously created by running my program once. I am curious why this is happening.
$fp, as set by readdir, is relative to the scanned directory. Either chdir to the scanned directory, or prepend the scanned directory name for the -d test.
Note that you already use "$ARGV[0]/$fp" as the argument to the move function.
perldoc -f readdir
readdir DIRHANDLE
Returns the next directory entry for a directory opened by
"opendir". […]
If you're planning to filetest the return values out of a
"readdir", you'd better prepend the directory in question.
Otherwise, because we didn't "chdir" there, it would have been
testing the wrong file.
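A minimal sketch of the fix, assuming the same @ARGV handling as the question (only the file test changes):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# readdir returns bare names; prepend the scanned directory
# before any file test, just as the move() call already does.
opendir my $dh, $ARGV[0] or die "Could not open directory $ARGV[0]: $!";
while ( my $fp = readdir $dh ) {
    next if -d "$ARGV[0]/$fp";    # now tests the right path
    print "$fp is not a directory\n";
}
closedir $dh;
```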
Some suggestions.
‣ Don't use the -w flag with Perl. Some modules turn warnings off to do their work but the -w flag is global. With it, they will report warnings that should be ignored.
‣ Always have these two lines at the top of every script.
use strict;
use warnings;
These will catch a lot of errors in your code. See perldoc strict and perldoc warnings for more details.
‣ Use glob() or File::Find instead of opendir/readdir/closedir.
‣ Use make_path() from File::Path instead of mkdir.
‣ Use an if statement for conditional execution instead of &&.
‣ Place blank lines in your code to make reading it easier.
File::Find and File::Path are standard modules that come installed with Perl. For a list of the standard modules, see perldoc perlmodlib.
#!/usr/bin/perl
# --------------------------------------
# pragmas
use strict;
use warnings;

# --------------------------------------
# modules
use File::Copy;
use File::Path qw( make_path );

# --------------------------------------
# main

# make sure there is something to work on
if( @ARGV < 1 ){
    print "\nUsage: proj6.pl <directory>\n\n";
    exit;
}

# arguments should be directories
for my $src_dir ( @ARGV ){

    # validate the source directory
    die("\n$src_dir does not exist\n\n")     if( ! -e $src_dir);
    die("\n$src_dir is not a directory\n\n") if( ! -d $src_dir);

    # move proj* files
    for my $proj ( glob( "$src_dir/proj*" )){

        # get the proj number
        ( my $number ) = $proj =~ m/proj(.*)\./;

        # get the destination directory
        my $dst_dir = "$src_dir/assignment$number";

        # create the directory where it goes
        if( ! -d $dst_dir ){
            make_path( $dst_dir ) or die "could not make path $dst_dir";
        }

        # move the file
        move( $proj, $dst_dir ) or die( "could not move file $proj to destination $dst_dir" );

    } # end of $proj files

    # move other files
    for my $file ( grep { ! -d } glob( "$src_dir/*" )){

        # get the destination directory
        my $dst_dir = "$src_dir/misc";

        # create the directory where it goes
        if( ! -d $dst_dir ){
            make_path( $dst_dir ) or die "could not make path $dst_dir";
        }

        # move the file
        move( $file, $dst_dir ) or die( "could not move file $file to destination $dst_dir" );

    } # end other files

} # end of src_dir
I'm trying the following code to search for .txt files and rename them while copying them to a different directory.
#!/bin/perl
use File::Basename;
@txtfiles = <*/*.txt>;
foreach my $file (@textfiles){
    $dir = dirname($file);
    $file = basename($file);
    $file =~ s/(\d+)/$dir/; //renaming number with $dir
    `mkdir -p summary` unless -d summary;
    `cp $file summary`;
}
The above code gives an error saying there is no such file to copy, although a print statement at each line shows the renamed file names correctly.
NOOOOO!
Don't use system commands -- especially since there are Perl commands that you can use.
`mkdir -p summary` unless -d summary;
`cp $file summary`;
Use the Perl commands!
use File::Copy; # Standard Perl Module. USE IT!
...
mkdir 'summary' unless -d 'summary'; # No need for `-p`
copy $file, 'summary';
Here's a revised script:
#! /usr/bin/env perl
use strict;
use warnings;
use feature qw(say);
# use File::Basename; -- No need for this in this version
use File::Copy;
# What files do you want to find?
# Do you want to find the immediate files or the
# files in the subdirectories?
my #text_files = glob('*.txt'); # Immediate files only
mkdir 'summary' if not -d 'summary'; # Make this dir before the loop
for my $file ( #text_files ) {
next unless -f $file; # Make sure this is a file you want to copy!
#
# No need for 'dirname' and 'basename' with '*.txt' glob
#
if ( not copy $file, 'summary' ) { # Check the outcome of this command.
warn qq(Could not copy file "$file" to "summary".);
}
}
Let us know if you need to copy files in subdirectories rather than just in the immediate directory. Maybe you can use make_path found in File::Path, or the Perl version of find from File::Find. These are standard Perl modules that all Perl installations have.
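For the subdirectory case, here is a sketch using find from File::Find (the summary destination and the *.txt pattern are carried over from the question; no_chdir keeps $File::Find::name usable as a path, and note that files sharing a basename would overwrite each other):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use File::Copy;
use File::Find;

mkdir 'summary' unless -d 'summary';

find( { no_chdir => 1, wanted => sub {
    # Don't descend into the destination directory itself.
    if ( -d $File::Find::name && $File::Find::name =~ m{/summary$} ) {
        $File::Find::prune = 1;
        return;
    }
    return unless -f $File::Find::name && $File::Find::name =~ /\.txt$/;
    copy( $File::Find::name, 'summary' )
        or warn qq(Could not copy "$File::Find::name".);
} }, '.' );
```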
Addendum
I don't want the current directory. The files reside inside one directory, i.e. foo/text_v0001.txt, foo/text_v0002.txt, foo_au/text_v0003.txt, foo_au/text_v0004.txt, and so on. I want to replace the numbers with the directory name, e.g. foo/text_v0001.txt should be renamed to text_foo.txt and foo/text_v0002.txt should also be renamed to text_foo.txt (since we can't have files with the same name in the same folder, we can add part2 at the end of the second file, i.e. text_fooPart2.txt).
That last part is a doozy, and it's a new requirement too. I need to verify that a file doesn't already exist with the same name, and if it does, I need to make sure that I find the next available name.
If I find that a file already exists, I'll loop, incrementing a duplicate file counter, until I find a file name that doesn't already exist.
#! /usr/bin/env perl
use strict;
use warnings;
use feature qw(say);

use File::Basename;
use File::Copy;
use File::Glob;     # Improved glob matching.

use constant {
    DIRECTORY => 'summary',
};

# What files do you want to find?
# Do you want to find the immediate files or the
# files in the subdirectories?
#
# Let's do a more sophisticated pattern making sure we're catching
# the files we want.
#
my @text_files = glob('*/*.txt');    # Subdirectories only

mkdir DIRECTORY if not -d DIRECTORY; # Make this dir before the loop

for my $file ( @text_files ) {
    my $dir_name  = dirname $file;
    my $file_name = basename $file;
    say "DEBUG: On '$file'.";
    #
    # Let's make sure that the file name matches the expected
    # pattern. If the substitution doesn't succeed, we assume
    # this file shouldn't be copied, and skip it.
    #
    # I'm searching for a file that has the suffix '_vxxxx.txt' where
    # 'xxxx' is some number. I remove the number and the letter 'v',
    # and add in the directory name.
    #
    if ( not $file_name =~ s/_v(\d+)\.txt$/_$dir_name.txt/ ) {
        warn qq("$dir_name/$file_name" has not been copied.);
        next;
    }
    #
    # If the name matches, make sure it's a file
    #
    if ( not -f $file ) {
        warn qq("$file" is not a file and wasn't copied.);
        next;
    }
    #
    # Now make sure the file name is unique
    #
    if ( -f DIRECTORY . "/$file_name" ) {   # File name already exists
        say qq(DEBUG: Duplicate file '$file_name' detected!);
        my $dup_file_counter = 2;
        ( my $file_no_suffix = $file_name ) =~ s/\.txt$//;
        #
        # Find a non-matching name
        #
        for (;;) {
            my $new_file_name = $file_no_suffix . "_part_$dup_file_counter.txt";
            say "DEBUG: New file name '$new_file_name'";
            say qq(DEBUG: if ( not -e @{[DIRECTORY]} . "/$new_file_name" ) { );
            if ( not -e DIRECTORY . "/$new_file_name" ) {
                $file_name = $new_file_name;
                last;
            }
            else {
                $dup_file_counter += 1;
            }
        }
    }
    if ( not copy $file, DIRECTORY . "/$file_name" ) {  # Check the outcome of this command.
        warn qq(Could not copy file "$file" to directory ") . DIRECTORY . qq(".);
    }
}
In the loop you are using @textfiles instead of @txtfiles. Use strict.
#!/usr/local/bin/perl
use File::Basename;
use strict;
use warnings;

my @txtfiles = glob("*.txt");
foreach my $file (@txtfiles){
    my $dir = dirname($file);
    $file = basename($file);
    $file =~ s/(\d+)/$dir/; # renaming number with $dir
    `mkdir -p summary` unless -d "summary";
    `cp $file summary`;
}
I have a Perl script which takes a few arguments. It is executed like this:
exec myscript.pl --file=/path/to/input/file --logfile=/path/to/logfile/logfile.log
I have the following line in the script:
open LOGFILE, ">>$logFilePath" or die "Can't open '$logFilePath': $!\n";
Where $logFilePath is taken from the command line.
If there is a path, /path/to/logfile/, but no logfile.log, it just creates it (which is the desired action). However, it fails to start if there is no such path. How can I make the script create the path for the logfile, if it does not exist prior to running the script?
Suppose you have the path to the logfile (which may or may not include the filename: logfile.log) in the variable $full_path. Then, you can create the respective directory tree if needed:
use File::Basename qw( fileparse );
use File::Path     qw( make_path );
use File::Spec;

my ( $logfile, $directories ) = fileparse $full_path;
if ( !$logfile ) {
    $logfile   = 'logfile.log';
    $full_path = File::Spec->catfile( $full_path, $logfile );
}
if ( !-d $directories ) {
    make_path $directories or die "Failed to create path: $directories";
}
Now, $full_path will contain the full path to the logfile.log file. The directory tree in the path will have also been created.
Use Perl's mkdir command. Example:

# Get the path portion only, without the filename.
if ($logFilePath =~ /^(.*)\/[^\/]+\.log$/)
{
    mkdir $1 or die "Error creating directory: $1";
}
else
{
    die "Invalid path name: $logFilePath";
}

Using Perl's own function is preferable to running a unix command.
Update: as Dave Cross pointed out, mkdir only creates a single directory. So, this won't work if you want to create multiple levels at once.
Edit: of course, you should also check if the directory exists first. Use -e to check if something exists. Adding this to the above code:
# Get the path portion only, without the filename.
if ($logFilePath =~ /^(.*)\/[^\/]+\.log$/)
{
    if (-e $1)
    {
        print "Directory exists.\n";
    }
    else
    {
        mkdir $1 or die "Error creating directory: $1";
    }
}
else
{
    die "Invalid path name: $logFilePath";
}
The program I'm writing is supposed to open two directories and read through the files in them to compare their contents. The functions that have changed in the files should then be printed to a file. This program will mainly be checking .cpp and .h files.
Currently I am trying to go through the directory and open each file I reach so I can print the functions that have changed. However, I keep getting an error stating that the file is not a file and can't be opened.
Here is part of my current code that I am using
use strict;
use warnings;
use diagnostics -verbose;
use File::Compare;
use Text::Diff;
my $newDir = 'C:\Users\kkahla\Documents\Perl\TestFiles2';
my $oldDir = 'C:\Users\kkahla\Documents\Perl\TestFiles';
chomp $newDir;
$newDir =~ s#/$##;
chomp $oldDir;
$oldDir =~ s#/$##;
# Checks to make sure they are directories
unless(-d $newDir or -d $oldDir) {
    print STDERR "Invalid directory path for one of the directories";
    exit(0);
}

# Makes a directory for the outputs to go to unless one already exists
mkdir "Outputs", 0777 unless -d "Outputs";

# opens output file
open (OUTPUTFILE, ">Outputs\\diffDirectoriesOutput.txt");
print OUTPUTFILE "Output statistics for comparing two directories\n\n";

# opens both directories
opendir newDir, $newDir;
my @allNewFiles = grep { $_ ne '.' and $_ ne '..'} readdir newDir;
closedir newDir;

opendir oldDir, $oldDir;
my @allOldFiles = grep { $_ ne '.' and $_ ne '..'} readdir oldDir;
closedir oldDir;
Here is where I want to open the files to read through them:
elsif((File::Compare::compare("$newDir/$_", "$oldDir/$_") == 1)) {
    print OUTPUTFILE "File: $_ has been updated. Please check marked functions for differences\n\n";
    diff "$newDir/$_", "$oldDir/$_", { STYLE => "Table" , OUTPUT => \*OUTPUTFILE};

    # Here is where I want to open the file but when I try it throws an error.
    # Here are the two opens I have tried:
    open (FILE, "<$newDir/$_") or die "Can't open file"; # first attempt
    open (FILE, "<$_") or die "Can't open file";         # second attempt to see if it worked
}
I tried adding the flags

my @allNewFiles = grep { $_ ne '.' and $_ ne '..' && -e $_} readdir newDir;
my @allNewFiles = grep { $_ ne '.' and $_ ne '..' && -f $_} readdir newDir;

But that would simply remove all files that didn't have the .pl extension. I tested it on some simple directories containing two copies each of .txt, .cpp, .h, .c, .py, and .pl files, and it would only report the .pl files as files.
I am new to perl and any help would be appreciated.
-f is returning undef with $! set to "No such file or directory" because you are passing a file name to -f instead of a path to the file.
Change
-f $_
to
-f "$newDir/$_"
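Applied to the grep from the question, that fix looks like this (a lexical directory handle is used here in place of the question's bareword):

```perl
opendir my $dh, $newDir or die "Cannot open $newDir: $!";
# File-test the full path, not the bare name readdir returns:
my @allNewFiles = grep { $_ ne '.' and $_ ne '..' and -f "$newDir/$_" } readdir $dh;
closedir $dh;
```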
I would like to remove a broken symlink in a directory using Perl.
In my mind, I just had to list the files of a directory, test whether each is a symlink (-l) and, if it returns false, just unlink it.
But it appears that when using readdir to list all files, my broken symlinks are not recognized as files. Because my link points to nothing, I understand why.
All the file in $myDir are symlinks, either valid or broken.
When I display #files I only get a list of valid symlink.
opendir DIR, $myDir;
my #files = grep(/$regexp/,readdir(DIR));
closedir DIR;
print "filenames : #files\n";
There are two main relevant system calls, stat() and lstat(). The lstat() call will tell you that it is a symlink (but on other files, behaves the same as stat()). This allows you to determine that the name is a symlink. The stat() system call follows a symlink to its end, and tells you about the file (or directory) at the end of the link. If the stat() call fails on the symlink, then the symlink is broken or you're trying to access a directory or file where you have no permission.
The Perl file test operators include -l to detect whether a name is a symlink. You can use the Perl functions stat and lstat explicitly. Between these, you should be able to sort out whether a symlink is broken or not - but you should probably plan to write a function to do the job.
You probably don't need to use the readlink Perl function. Beware the underlying system readlink() call; it does not return a null-terminated string!
It is interesting that neither Perl nor its POSIX module supports the realpath() function. However, the PathTools module does support it. If realpath fails on a symlink, the symlink is non-functional (aka broken).
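A sketch of that realpath approach (Cwd comes with the PathTools distribution; I'm assuming realpath returns undef when the link chain cannot be resolved):

```perl
use strict;
use warnings;
use Cwd qw(realpath);

# realpath() resolves a symlink chain; an undef result means the
# chain ends nowhere, which flags the link as broken.
for my $entry ( glob('*') ) {
    next unless -l $entry;
    print "broken: $entry\n" unless defined realpath($entry);
}
```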
Here's some code I've used to remove broken links:
chdir $dir or die;
opendir(DIR, '.') or die;
foreach my $link (readdir DIR) {
    next unless -l $link and not -e readlink($link);
    print "Removing broken link $link\n";
    unlink $link;
}
closedir DIR;
Note that it's important that the directory containing the links is the current directory. readdir returns only filenames, and the links might be relative.
Combining lstat with stat:
say "dangling link at $fn" if (lstat $fn and not stat $fn);
Update: it works for me...

salva@topo:~/t/dl$ perl -E 'opendir $dh, "."; say $_ for grep { !stat $_ and lstat $_ } readdir $dh'
foo
salva@topo:~/t/dl$ ls -l
total 0
-rw-rw-r-- 1 salva salva  0 2011-07-05 12:34 f
lrwxrwxrwx 1 salva salva 11 2011-07-05 12:00 fii -> /etc/shadow
lrwxrwxrwx 1 salva salva 12 2011-07-05 11:59 foo -> /etc/hjdkshf
Check for broken symlinks (checking only the top level if there are symlinks to symlinks):
use strict;
use warnings;
use autodie;

opendir my $dirh, '.';
while (my $file = readdir $dirh) {
    if ( -l $file ) {
        my $target = readlink $file;
        if ( ! -e $target && ! -l $target ) {
            print "$file -> $target broken\n";
        }
    }
}
Use readlink() and stat() the result.
Using the built-in Perl glob function?
For example:

@files = <*>;
foreach $file (@files) {
    print $file . "\n";
}

For a specific $dir:

@files = <$dir*>;
foreach $file (@files) {
    print $file . "\n";
}
A broken symlink is a link (-l) whose target does not exist (! -e):
perl -e 'print "broken: $_\n" for grep { -l and ! -e } glob("*");'
I have a folder called Client which contains many subfolders. I want to create a Perl script to look at each of those subfolders and check for a folder there. If it is there, I want to skip it and move on, if it is not there, I want to create it and do some processing.
How do I go about looping through all of the subfolders and checking for the directory I want? I have found a lot of information on how to get all the files in a folder and/or subfolders, but nothing on checking for a directory within each subfolder.
Augh! Too much complexity in the other answers. The original question doesn't appear to be asking for a recursive traversal. As far as I can see, this is a perfectly sensible solution, and vastly more readable to boot:
foreach my $dir (glob "Client/*") {
    next if ! -d $dir;             # skip if it's not a directory
    next if -d "$dir/subfolder";   # skip if subfolder already exists
    mkdir "$dir/subfolder" or die; # create it
    do_some_processing();          # do some processing
}
Seriously folks: opendir/readdir? Really?
It's pretty easy once you break it into steps. Get a list of the subdirectories with glob then see which ones don't have the second-level directory. If you are using a File::Find-like module, you are probably doing too much work:
#!perl
use strict;
use warnings;
use File::Spec::Functions;

my $start  = 'Clients';
my $subdir = 'already_there';

# @queue is the list of directories you need to process
my @queue =
    grep { ! -d catfile( $_, $subdir ) } # filter for the second level
    grep { -d }                          # filter for directories
    glob catfile( $start, '*' );         # everything below $start
#!/usr/bin/perl
use strict;
use Fcntl qw( :DEFAULT :flock :seek );
use File::Spec;
use IO::Handle;

my $startdir = shift @ARGV || '.';
die "$startdir is not a directory\n"
    unless -d $startdir;

my $verify_dir_name = 'MyDir';
my $dh = new IO::Handle;

opendir $dh, $startdir or
    die "Cannot open $startdir: $!\n";
while(defined(my $cont = readdir($dh))) {
    next
        if $cont eq '.' || $cont eq '..';
    my $fullpath = File::Spec->catfile($startdir, $cont);
    next
        unless -d $fullpath && -r $fullpath && -w $fullpath;
    my $verify_path = File::Spec->catfile($fullpath, $verify_dir_name);
    next
        if -d $verify_path;
    mkdir($verify_path, 0755);
    # do whatever other operations you want to $verify_path
}
closedir($dh);
The short answer is: use File::Find.
The long answer is: first write a subroutine that checks whether the folder exists and, if it does not, creates it and does the processing needed. Then invoke the find function of the File::Find module with a reference to that subroutine and the starting folder, to process all the subfolders.
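That long answer might be sketched as follows (the Client and subfolder names and the processing body are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

sub ensure_and_process {
    return unless -d $File::Find::name;       # only look at directories
    return if $File::Find::name eq 'Client';  # skip the top level itself
    $File::Find::prune = 1;                   # don't recurse deeper
    my $wanted = "$File::Find::name/subfolder";
    return if -d $wanted;                     # already there: skip it
    mkdir $wanted or die "Cannot create $wanted: $!";
    # ... do the processing needed for the new folder ...
}

find( { wanted => \&ensure_and_process, no_chdir => 1 }, 'Client' );
```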