How can I unlink a Windows folder with spaces? - perl

I am trying to remove files and folders that do not match our naming standard. I can loop through an array of collected files and folders that do not match and delete them. The problem I am running into is that I cannot remove entries that have spaces in their names. I am running on Windows with Strawberry Perl.
Here is a sample of the array:
Picture.jpg
Sample Document.doc
New Folder
The only thing I can delete successfully is Picture.jpg in this example.
Here is the loop:
foreach my $folder (@$returned_unmatches) {
    print "$folder\n";
    remove_root_junk($office, $folder);
}
Here is the subroutine.
sub remove_root_junk {
    my $office = shift;
    my $folder = shift;
    my $order_docs_path = $office_names{$office};
    unlink "$order_docs_path\\$folder";
}
$order_docs_path is just the full path up to where I am working; in this case, C:\Data\Locations\Canton\Order_Documents, if this helps at all.

Under normal circumstances, directly unlinking a directory is not supported. The documentation for unlink says:
Note: unlink will not attempt to delete directories unless you are superuser and the -U flag is supplied to Perl. Even if these conditions are met, be warned that unlinking a directory can inflict damage on your filesystem. Finally, using unlink on directories is not supported on many operating systems. Use rmdir instead.
If your subdirectory is not empty, the rmdir documentation suggests using rmtree in File::Path.
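Putting the two together, the subroutine from the question could dispatch on -d and use remove_tree (the newer interface to rmtree in File::Path) for directories. This is a sketch; remove_entry and its arguments are illustrative stand-ins for the original remove_root_junk:

```perl
use strict;
use warnings;
use File::Path qw(remove_tree);
use File::Spec;

# Remove either a plain file or a whole directory under $root.
# Directories (like "New Folder") need remove_tree or rmdir;
# unlink only works on plain files. Spaces in names are fine as
# long as the whole path is passed as a single string.
sub remove_entry {
    my ($root, $name) = @_;
    my $target = File::Spec->catfile($root, $name);

    if (-d $target) {
        remove_tree($target, { error => \my $errors });
        warn "Could not remove $target\n" if @$errors;
    }
    elsif (-e $target) {
        unlink $target or warn "Could not unlink $target: $!\n";
    }
}
```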

Related

The Perl script is deleting only the contents inside the directory and not the directory

I have this Perl code to delete the files inside a directory and, later, the directory itself.
find( sub {
    my $file = $File::Find::name;
    if ( -f $file ) {
        push( @file_list, $file );
    }
}, @find_dirs );

for my $file (@file_list) {
    my @stats = stat($file);
    if ( $now - $stats[9] > $AGE ) {
        unlink $file;
    }
}
But the above code deletes only the contents inside the directories and subdirectories, leaving behind all the empty folders.
Could anyone please help me change the code above so that it deletes the directories as well as the files?
unlink does not delete directories, only files.
Note: unlink will not attempt to delete directories unless you are
superuser and the -U flag is supplied to Perl. Even if these
conditions are met, be warned that unlinking a directory can inflict
damage on your filesystem. Finally, using unlink on directories is not
supported on many operating systems. Use rmdir instead.
You want rmdir, and you probably want to check with -d which one to use, unless you don't care about warnings.
I am only putting the code together, so you may upvote @simbabque, who answered first. Try:
finddepth( sub {
    my $file = $File::Find::name;
    my @stats = stat($file);
    if ( -f $file && $now - $stats[9] > $AGE ) {
        unlink $file;
    }
    elsif ( -d $file ) {
        rmdir $file;
    }
}, @find_dirs );
A few comments:
File::Find will find both files and directories.
-f checks for a file; -d for a directory.
rmdir will only remove a directory if the directory is empty. That is why files must be deleted first. finddepth takes care of this.
-f and -d are simple to use, but stat may also be used for such a check (see second field, mode.)
I have not tested the code; I cannot easily recreate your conditions.
EDIT: Now it uses finddepth instead of find because:
finddepth() works just like find() except that it invokes the &wanted function for a directory after invoking it for the directory's contents. It does a postorder traversal instead of a preorder traversal, working from the bottom of the directory tree up where find() works from the top of the tree down.
This should take care of removing the directories in order, deepest first. Some directories may still not be removed if files remain in them that do not match the delete condition. If you want them removed when empty regardless of their timestamp, then remove the if -d condition. The non-empty ones will remain. Directories that cannot be removed may issue a warning...

Recursively delete files from a directory in Perl

I am new to Perl scripting. I want to write a Perl script that deletes the previous backup files, extracts a new backup file from Dropbox, and renames it with a specific file name.
Example:
Backup location:
D:\Database\store_name\ containing .bak files
Actual folder data:
D:\Database\Mahavir Dhanya Bhandar\ containing .bak files
D:\Database\Patel General Store\ containing .bak files
...and so on
How can I write Perl script code which:
1. deletes the *.bak files recursively, and
2. extracts the new backup file from Dropbox and renames it with a specific file name?
Have you looked into walking your file tree? http://rosettacode.org/wiki/Walk_a_directory/Recursively. Combine this with simple file operations (copying, deleting, etc.) and you should be good.
use File::Find qw(find);
my $dir = 'D:/Database/Store_Name';   # forward slashes avoid backslash-escape problems in double quotes
find sub { unlink $File::Find::name if /\.bak$/ }, $dir;
and assuming that connectToDropbox() connects to your dropbox
use File::Copy;
use File::Find qw(find);
my $backup = connectToDropbox();
my $dir = 'D:/Database/Store_Name';
find sub { copy( $backup->getFile("file"), "newFile" ) }, $dir;
Of course, this assumes that you can already set up a connection and such to Dropbox. If not, there is a good CPAN library here you can check out.

Perl File::Copy::Recursive fcopy doesn't create directories on UNC Paths

When I use fcopy to copy files from one UNC path to another, it doesn't work if the target directory doesn't exist. But it works perfectly on a local path (creating that directory as needed).
use File::Copy::Recursive qw(fcopy rcopy dircopy fmove rmove dirmove);
use autodie qw'fcopy rmove';
# works: the folder is created and the file is copied
fcopy("/path/to/file", "/path/to/non/existing/path");
# works too: no need to create a folder
fcopy("//path/to/UNC/Share/File", "//path/to/existing/Share");
# doesn't work
fcopy("//path/to/UNC/Share/File", "//path/to/existing/Share/with/non/existing/folder");
The following example

my $file1 = "//server/existing/file";
if ( ! -f $file1 ) {
    print "$file1 does not exist";
    exit 2;
}
fcopy( $file1, "//server/targetDirectory/newFolder" );

dies with
can't fcopy('//server/existing/file', '//server/targetDirectory/newFolder'): No such file or directory at test.pl line 20
Is it not possible to create directories with rcopy on Samba shares using a UNC path, or is this a bug?
This is a bug: https://rt.cpan.org/Public/Bug/Display.html?id=43328. It does work if you use a drive letter to map the remote share, but that is not always convenient. The bug was reported in 2009, and someone posted a proposed solution in 2010, but no new version including a fix has been released yet. You can try the proposed solution by adjusting your local copy of File::Copy::Recursive, changing the beginning of sub pathmk to the following:
sub pathmk {
    my ( $volume, @parts ) = File::Spec->splitpath( shift() );
    my $nofatal = shift;
    @parts = File::Spec->splitdir( File::Spec->catdir(@parts) );
    my $pth = File::Spec->catdir( $volume, $parts[0] );
    my $zer = 0;
[EDIT] I've sent the maintainer of the package an email asking to release a new version with this fix included. I checked that the fix does not break any of the tests associated with the software package.
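If patching the module is not an option, a workaround is to create the target directory tree yourself before copying, so the module's buggy pathmk is never invoked. A sketch using only core modules; copy_with_path is an illustrative name, and plain File::Copy::copy stands in for fcopy here:

```perl
use strict;
use warnings;
use File::Basename qw(dirname);
use File::Copy qw(copy);
use File::Path qw(make_path);

# Copy $src to $dst, creating $dst's parent directories first.
# make_path handles the intermediate directories, so the copy no
# longer depends on File::Copy::Recursive creating them on UNC paths.
sub copy_with_path {
    my ($src, $dst) = @_;
    make_path( dirname($dst) );
    copy($src, $dst) or die "Cannot copy $src to $dst: $!";
}
```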

Checking existence of a file given a directory format in perl

I am struggling with a method of walking a directory tree to check existence of a file in multiple directories. I am using Perl and I only have the ability to use File::Find as I am unable to install any other modules for this.
Here's the layout of the file system I want to traverse:
Cars/Honda/Civic/Setup/config.txt
Cars/Honda/Pathfinder/Setup/config.txt
Cars/Toyota/Corolla/Setup/config.txt
Cars/Toyota/Avalon/Setup/
Note that the last Setup folder is missing a config.txt file.
Edit: also, in each of the Setup folders there are a number of other files as well that vary from Setup folder to Setup folder. There really isn't any single file to search against to get into the Setup folder itself.
So you can see that the file path stays the same except for the make and model folders. I want to find all of the Setup folders and then check to see if there is a config.txt file in that folder.
At first I was using the following code with File::Find
my $dir = '/test/Cars/';
find(\&find_config, $dir);

sub find_config {
    # find all Setup folders under the given top-level dir
    if ( $File::Find::dir =~ m/Setup/ ) {
        # build the path of config.txt whether it exists or not; we'll check next
        $config_filepath = $File::Find::dir . "/config.txt";
        # check existence of file; further processing
        ...
    }
}
You can obviously see the flaw in trying to use $File::Find::dir =~ m/Setup/ since it will return a hit for every single file in the Setup folder. Is there any way to use a -d or some sort of directory check rather than a file check? The config.txt is not always in the folder (I will need to create it if it doesn't exist) so I can't really use something like return unless ($_ =~ m/config\.txt/) since I don't know if it's there or not.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
Maybe File::Find is not the right method for something like this but I've been searching around for a while now without any good leads on working with directory names rather than file names.
File::Find finds directory names, too. You want to check for when $_ eq 'Setup' (note: eq, not your regular expression, which would also match XXXSetupXXX), and then see if there's a config.txt file in the directory ( -f "$File::Find::name/config.txt" ). If you want to avoid complaining about files named Setup, check that the found 'Setup' is a directory with -d.
I'm trying to find a way to use something like return unless ( <is a directory> and <the directory has a regex match of m/Setup/>).
use File::Spec::Functions qw( catfile );

my $dir = '/test/Cars/';
find(\&find_config, $dir);

sub find_config {
    return unless $_ eq 'Setup' and -d $File::Find::name;
    my $config_filepath = catfile $File::Find::name => 'config.txt';
    # check for existence etc.
}
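Since the question mentions creating config.txt when it is absent, the existence check inside the callback could be filled in like this. A sketch; creating an empty file is just one possible form of the "further processing" the question alludes to:

```perl
use strict;
use warnings;
use File::Find;
use File::Spec::Functions qw(catfile);

sub find_config {
    # only act on directories literally named "Setup"
    return unless $_ eq 'Setup' and -d $File::Find::name;
    my $config = catfile( $File::Find::name, 'config.txt' );
    unless ( -f $config ) {
        # create an empty config.txt where one is missing
        open my $fh, '>', $config
            or warn "Cannot create $config: $!\n";
    }
}

# e.g. find( \&find_config, '/test/Cars/' );
```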

How can I change the case of filenames in Perl?

I'm trying to create a process that renames all my filenames to Camel/Capital Case. The closest I have to getting there is this:
perl -i.bak -ple 's/\b([a-z])/\u$1/g;' *.txt # or similar .extension.
This creates a backup file (which I'll remove once I've verified it does what I want); but instead of renaming the file, it changes the text inside the file. Is there an easier way to do this? I have several office documents in various formats and, as I'm a bit anal-retentive, I would like them to look like this:
New Document.odt
Roffle.ogg
Etc.Etc
Bob Cat.flac
Cat Dog.avi
Is this possible with perl, or do I need to change to another language/combination of them?
Also, is there anyway to make this recursive, such that /foo/foo/documents has all files renamed, as does /foo/foo/documents/foo?
You need to use rename.
Here is its signature:
rename OLDNAME,NEWNAME
To make it recursive, use it along with File::Find
use strict;
use warnings;
use File::Basename;
use File::Find;

# by default, search just the current directory
my @directories = (".");

find(\&wanted, @directories);

sub wanted {
    # renaming goes here
}
This snippet will run the code inside wanted against every file that is found. You have to complete the code inside wanted to do what you want.
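For the simple case of renaming plain files in place, the wanted routine could be filled in like this. A sketch that capitalizes each word and lowercases the extension, matching the examples in the question; directories are deliberately left untouched to keep the traversal stable:

```perl
use strict;
use warnings;
use File::Find;

# Rename each plain file so that every word starts with a capital
# letter, e.g. "bob cat.flac" -> "Bob Cat.flac". File::Find chdirs
# into each directory during the walk, so $_ is the bare file name.
sub wanted {
    return unless -f $_;                      # only rename plain files
    ( my $new = $_ ) =~ s/\b([a-z])/\u$1/g;   # capitalize each word
    $new =~ s/\.(\w+)$/.\l$1/;                # keep the extension lowercase
    return if $new eq $_;                     # already in the desired case
    rename $_, $new
        or warn "Could not rename $_ to $new: $!\n";
}

# e.g. find( \&wanted, "." );
```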
EDIT: I tried to accomplish this task using File::Find, and I don't think you can easily achieve it. You can succeed by following these steps :
if the parameter is a dir, capitalize it and obtain all the files
for each file, if it's a dir, go back at the beginning with this file as argument
if the file is a regular file, capitalize it
Perl just got in my way while writing this script, so I wrote it in Ruby:
require "rubygems"
require "ruby-debug"
# camelcase files
class File
class << self
alias :old_rename :rename
end
def self.rename(arg1,arg2)
puts "called with #{arg1} and #{arg2}"
self.old_rename(arg1,arg2)
end
end
def capitalize_dir_and_get_files(dir)
if File.directory?(dir)
path_c = dir.split(/\//)
#base = path_c[0,path_c.size-1].join("/")
path_c[-1].capitalize!
new_dir_name = path_c.join("/")
File.rename(dir,new_dir_name)
files = Dir.entries(new_dir_name) - [".",".."]
files.map! {|file| File.join(new_dir_name,file)}
return files
end
return []
end
def camelize(dir)
files = capitalize_dir_and_get_files(dir)
files.each do |file|
if File.directory?(file)
camelize(file.clone)
else
dir_name = File.dirname(file)
file_name = File.basename(file)
extname = File.extname(file)
file_components = file_name.split(/\s+/)
file_components.map! {|file_component| file_component.capitalize}
new_file_name = File.join(dir_name,file_components.join(" "))
#if extname != ""
# new_file_name += extname
#end
File.rename(file,new_file_name)
end
end
end
camelize(ARGV[0])
I tried the script on my PC and it capitalizes all dirs, subdirs, and files by the rule you mentioned. I think this is the behaviour you want. Sorry for not providing a Perl version.
Most systems have the rename command ....
NAME
rename - renames multiple files
SYNOPSIS
rename [ -v ] [ -n ] [ -f ] perlexpr [ files ]
DESCRIPTION
"rename" renames the filenames supplied according to the rule specified as the first argument. The perlexpr argument is a Perl expression which
is expected to modify the $_ string in Perl for at least some of the filenames specified. If a given filename is not modified by the expression,
it will not be renamed. If no filenames are given on the command line, filenames will be read via standard input.
For example, to rename all files matching "*.bak" to strip the extension, you might say
rename 's/\.bak$//' *.bak
To translate uppercase names to lower, you’d use
rename 'y/A-Z/a-z/' *
OPTIONS
-v, --verbose
Verbose: print names of files successfully renamed.
-n, --no-act
No Action: show what files would have been renamed.
-f, --force
Force: overwrite existing files.
AUTHOR
Larry Wall
DIAGNOSTICS
If you give an invalid Perl expression you’ll get a syntax error.
Since Perl runs just fine on multiple platforms, let me warn you that FAT (and FAT32, etc) filesystems will ignore renames that only change the case of the file name. This is true under Windows and Linux and is probably true for other platforms that support the FAT filesystem.
Thus, in addition to Geo's answer, note that you may have to actually change the file name (by adding a character to the end, for example) and then change it back to the name you want with the correct case.
If you will only rename files on NTFS filesystems or only on ext2/3/4 filesystems (or other UNIX/Linux filesystems) then you probably don't need to worry about this. I don't know how the Mac OSX filesystem works, but since it is based on BSDs, I assume it will allow you to rename files by only changing the case of the name.
I'd just use the find command to recur the subdirectories and mv to do the renaming, but still leverage Perl to get the renaming right.
find /foo/foo/documents -type f \
-execdir bash -c 'mv "$0" \
"$(echo "$0" \
| perl -pe "s/\b([[:lower:]])/\u\$1/g; \
s/\.(\w+)$/.\l\$1/;")"' \
{} \;
Cryptic, but it works.
Another one:
find . -type f -exec perl -e'
    map {
        ( $p, $n, $s ) = m|(.*/)([^/]*)(\.[^.]*)$|;
        $n =~ s/(\w+)/ucfirst($1)/ge;
        rename $_, $p . $n . $s;
    } @ARGV
' {} +
Keep in mind that on case-remembering filesystems (FAT/NTFS), you'll need to rename the file to something else first, then to the case change. A direct rename from "etc.etc" to "Etc.Etc" will fail or be ignored, so you'll need to do two renames: "etc.etc" to "etc.etc~" then "etc.etc~" to "Etc.Etc", for example.
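That two-step workaround can be wrapped up like this. A sketch; rename_case and the temporary suffix are illustrative choices, not a standard API:

```perl
use strict;
use warnings;

# Rename through a temporary name when only the letter case changes,
# so case-insensitive (case-remembering) filesystems such as FAT and
# NTFS do not silently ignore or reject the rename.
sub rename_case {
    my ($old, $new) = @_;
    return rename($old, $new) if lc $old ne lc $new;  # ordinary rename
    my $tmp = "$old.rename-tmp";   # assumes this name is not taken
    rename $old, $tmp or return 0;
    rename $tmp, $new or return 0;
    return 1;
}
```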