How to check if a file exists with a wildcard? - perl

I need to verify that a file exists, but the file name changes: at least the last part of it does, because a time (hh-mm-ss) is appended.
So basically, I need to see if a file named yyyy-MM-dd-hh-mm-ss exists, irrespective of what the hh-mm-ss part is.
I'm trying to do a wildcard search but it doesn't find the file.
For example, I want to check if the file /home/httpd/doc/$user/$year/2018-08-20-* exists.
Is it possible to use * in the file name? Or is there another, better way?
I'm trying to check if a file exists, but with a wildcard. Here is my code:
opendir(DIR, "/home/httpd/doc/$user/2018") || die "Unable to open log/location";
while (<>) {
    if ($_ =~ /\/2018-08-20-\d+-\d+-\d+/) {
        print "file exists\n";
    }
}

It looks like you are trying to use the opendir-readdir pattern, which would look like
opendir(DIR, "/home/httpd/doc/$user/2018") || die "Unable to open log/location";
foreach my $file (readdir DIR) {
    # readdir returns bare file names, so match without a leading slash
    if ($file =~ /^2018-08-20-\d+-\d+-\d+/) {
        print "file exists\n";
    }
}
closedir DIR;
but a more concise way (glob has more edge cases, but it would probably work in your case) would be to use glob:
if (glob("/home/httpd/doc/$user/2018/2018-08-20-*")) {
    print "file exists\n";
}
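As a small follow-up, if you also want to know which files matched (not just whether any exists), calling glob in list context returns all the matching names; a minimal sketch using the same path as above:
my @matches = glob("/home/httpd/doc/$user/2018/2018-08-20-*");
if (@matches) {
    printf "found %d matching file(s), e.g. %s\n", scalar @matches, $matches[0];
}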

Related

Perl cannot find and open an existing file, even though I can print out exactly the same file name

I'm having the following problem:
I'm trying to traverse all the sub-directories with numeric names, e.g. 0/ 1/ 2/ ...,
and to check files named like combine_0_arc, combine_1_arc, ..., in which the number in the middle is the same as the name of the sub-folder the file lives in. Here's how I do it:
#!/usr/bin/perl -w
use strict;
opendir(DIR, "./") or die "Failed to open directory: $!\n";
my @DirName = readdir(DIR) or die "Unable to read current directory: $!\n";
# closedir(DIR);
foreach (@DirName) {
    chomp;
    my $CurrentDir = $_;
    next if ($CurrentDir eq ".");
    next if ($CurrentDir eq "..");
    if ($CurrentDir =~ /^\d+$/) {
        # print "Iteration directory: $CurrentDir\n";
        opendir(SUBDIR, $CurrentDir) or die "Unable to read current directory: $CurrentDir\n";
        my @SubDirFiles = readdir(SUBDIR);
        foreach (@SubDirFiles) {
            chomp;
            # if ($_ =~ /combine_0_arc/) { next; }
            if ($_ =~ /combine_\d+_arc$/) {
                my $UntestedArc = $_;
                # print "Current directory: $CurrentDir\n";
                # print `pwd`."\n";
                # print "Combine_arc_name:$UntestedArc\n";
                open (FH, "<", $UntestedArc) or die "Cannot open file $UntestedArc:$!\n";
            }
        }
    }
}
I'm getting the following error message:
Cannot open file combine_0_arc:No such file or directory
When I print out the folder name and the file name for each iteration, they look correct. I tried to chomp trailing spaces or carriage returns from each file name and folder name, but that didn't help. Can anybody explain what's happening? Thanks a lot!
readdir returns the bare file name, without the path, so add it:
foreach my $CurrentDir (@DirName) {
    # ...
    opendir my $SUBDIR, $CurrentDir;
    my @SubDirFiles = map { "$CurrentDir/$_" } readdir($SUBDIR);
    foreach my $UntestedArc (@SubDirFiles) {
        if ($UntestedArc =~ /combine_${CurrentDir}_arc$/) {
            # ...
        }
    }
}
where the regex uses that directory's name, not any number.
Notes
readdir doesn't add a newline, no need to chomp (it doesn't hurt though)
use lexical filehandles
declare the loop variable (topicalizer) in the foreach statement
This answer assumes that your current working directory is the one with the @DirName subdirectories.
readdir just returns the file names in the directory, i.e. the names relative to the directory given to the matching opendir. But what you need for open is either the absolute name of the file or the name relative to the directory you are currently in (the current working directory). Since opendir does not magically change the working directory (that could be done with chdir), the working directory is not the same as the directory scanned with opendir, and thus the relative file name you use with open cannot be found in the current working directory.
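As a minimal sketch of that fix (keeping the question's variable names, assuming the script is run from the directory holding the numeric sub-folders, and only sketching the loop body):
opendir(SUBDIR, $CurrentDir) or die "Unable to open $CurrentDir: $!\n";
foreach my $name (readdir SUBDIR) {
    next if $name eq '.' or $name eq '..';
    if ($name =~ /combine_${CurrentDir}_arc$/) {
        my $path = "$CurrentDir/$name";   # name relative to the current working directory
        open my $fh, '<', $path or die "Cannot open file $path: $!\n";
        # ... read from $fh ...
        close $fh;
    }
}
closedir SUBDIR;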

Perl not able to delete a file using unlink

I am using a Perl script that takes a directory name as input from the user and searches for files in it. For each file found, it reads the file's contents. If the contents contain the word "cricket", I should be able to delete the file with unlink. But after the code runs, the file containing "cricket" still exists in the directory. Please help. My code is:
use strict;
use warnings;
use File::Basename;
print "enter a directory name\n";
my $dir = <>;
print "you have entered $dir \n";
chomp($dir);
opendir DIR, $dir or die "cannot open directory $!";
while (my $file = readdir(DIR)) {
    next if ($file =~ m/^\./);
    my $filepath = "${dir}${file}";
    print "$filepath\n";
    print " $file \n";
    open(my $fh, '<', $filepath) or die "unable to open the $file $!";
    my $count = 0;
    while (my $row = <$fh>) {
        chomp $row;
        if ($row =~ /cricket/) {
            $count++;
        }
    }
    print "$count";
    if ($count == 0) {
        chomp($filepath);
        unlink $filepath;
        print " $filepath deleted";
    }
}
By your test if($count==0) {...} you'll only delete files if they don't contain "cricket". It should work as you describe if you change it to if($count) {...}.
Additionally you're creating the filepath by concatenating the dir and file names in a manner that will only work if the dir name the user entered includes a trailing slash (${dir}${file}): this would be less error-prone as $dir/$file, or, if you wanted to go to town:
use File::Spec;
my $filepath = File::Spec->catfile($dir, $file);
Additionally, as the comments point out, you're not closing the open file handle, whether or not you try to delete it. This is bad practice, however, on Linux at least it should still work. Use close($fh) before your deletion test.
Note also that "cricket" is case-sensitive so files with "Cricket" won't be deleted. Use $row =~ /cricket/i for case-insensitive search.
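Putting those points together, a minimal corrected sketch of the inner part of the loop (keeping the question's variable names, and building the path with a separator) might look like this:
my $filepath = "$dir/$file";
open(my $fh, '<', $filepath) or die "unable to open $filepath: $!";
my $count = grep { /cricket/i } <$fh>;   # count lines that mention "cricket"
close($fh);                              # close before trying to delete

if ($count) {                            # delete only files that DO contain "cricket"
    unlink $filepath or warn "could not delete $filepath: $!";
    print "$filepath deleted\n";
}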

Copy files from one folder to another based on similar file name

I'm trying to write a script that will copy files from one folder to another based on (part of) the file name. I have a few thousand text files in a folder, but I only need to find a few hundred of them, and searching for them one by one takes a lot of time.
copy seems like a good thing to use here, looping over the list of files I'm trying to find, but copy needs a full name and I only have part of each file name.
Example of the list of files (contents of the text file):
ABCDEF-A01
ADEWSD-B03
ABDDER-C23
Example of the file names:
GGI_1409506_ABCDEF-A01.txt,GGI_ADEWSD-B03.txt,DA_ABDDER-C23_12304.txt
I only have the ABCDEF-A01 part, not the full file name.
Expected result:
Be able to search through the folder and copy the files that match the list of files (one text file) to another location.
Anything that you can share? Info/ans/related posts? Thank you so much!
Try the code below in Perl. When running the program, pass as arguments the source directory path and the destination directory path, along with the list of file names that need to be searched. If the destination directory doesn't exist, the program will create it automatically, as shown below:
Code:
use strict;
use warnings;
use File::Copy;

my $source      = $ARGV[0];
my $destination = $ARGV[1];
my $listFiles   = $ARGV[2];

if (-f $destination) {
    print "A file with the same name as the destination directory already exists. Rename that file and run the program again";
    exit 0;
}
if (-d "$destination") {
    print "Directory where files need to be copied: $destination\n";
}
else {
    print "No directory found, hence created the directory $destination\n";
    mkdir("$destination");
}

opendir DIR, $source or die "cannot open dir $source: $!";
my @files = grep /(.*?)(\.txt)$/, (readdir DIR);

open my $fh,  '<', "$listFiles" or die "Cannot open the file names to search $listFiles - $!";
open my $out, '>', "$ARGV[1]\\NoMatch.txt" or die "Cannot write to the file NoMatch.txt - $!";
my @listFileNames = <$fh>;
my @listFiles = ();

foreach my $InputFiles (@files) {
    chomp($InputFiles);
    foreach my $list (@listFileNames) {
        chomp($list);
        if ($InputFiles =~ /$list/isg) {
            print "Files : $InputFiles copying\t";
            # readdir returns bare file names, so prefix the source directory
            copy("$source/$InputFiles", "$destination");
            print "Files : $InputFiles copied\n";
            push(@listFiles, $list);
        }
    }
}

my %seen  = ();
my $count = 0;
foreach my $list (@listFiles) {
    $seen{lc($list)} = 1;
    # print $list . "\n";
}
foreach my $listnames (@listFileNames) {
    if ($seen{lc($listnames)}) {
        # already copied, nothing to do
    }
    else {
        if ($count == 0) {
            print "\nFile names that did not match any text file are listed in NoMatch.txt in the destination folder\n";
        }
        print $out "$listnames\n";
        $count++;
    }
}
close($out);
close($fh);
closedir(DIR);
Create a batch file and put it in the source folder, together with the list of files you want to copy:
for /f %%f in (list.txt) do robocopy c:\source d:\dest %%f
Hope this helps
#!/usr/bin/perl -w
use strict;
use File::Copy;

my $source_directory = qq{};
my $new_directory    = "";

opendir(my $dh, $source_directory) || die;
while (readdir $dh) {
    # match names like ABCDEF-A01 (letters, a dash, a letter and digits)
    if ($_ =~ /[A-Z]+-[A-Z]\d+/) {
        # note: move() relocates the file; use copy() if the original should stay in place
        move("$source_directory/$_", "$new_directory/$_");
    }
}
closedir $dh;

How to check if a file is within a directory

I have a text file with a list of individual mnemonics (1000+) in it and a directory that also has page files in it. I want to see how many pages a given mnemonic is on.
Below is my code so far:
use strict;
use warnings;
use File::Find ();

my $mnemonics = "path\\path\\mnemonics.txt";
my $pages     = "path\\path\\pages\\";

open (INPUT_FILE, $mnemonics) or die "Cannot open file $mnemonics\n";
my @mnemonic_list = <INPUT_FILE>;
close (INPUT_FILE);

opendir (DH, $pages);
my @pages_dir = readdir DH;

foreach my $mnemonic (@mnemonic_list) {
    foreach my $page (@pages_dir) {
        if (-e $mnemonic) {
            print "$mnemonic is in the following page: $page";
        } else {
            print "File does not exist \n";
        }
    }
}
Basically, where I know that a name exists in a page, it isn't showing the correct output. I'm getting a lot of "File does not exist" when I know the file is there.
Also, instead of (-e) I tried using:
if ($name =~ $page)
and that didn't work either.
Please help!
Assuming that you want to search a directory full of text files and print the names of files that contain words from the words in mnemonics.txt, try this:
use strict; use warnings;
my $mnemonics = "path/mnemonics.txt";
my $pages = "path/pages/";
open (INPUT_FILE, $mnemonics) or die "Cannot open file $mnemonics\n";
chomp(my @mnemonic_list = <INPUT_FILE>);
close (INPUT_FILE);
local ($/, *FILE);            # set "slurp" mode
for my $filename (<$pages*>) {
    next if -d "$filename";   # ignore subdirectories
    open FILE, "$filename";
    binmode(FILE);
    $filename =~ s/.+\///;    # remove path from filename for output
    my $contents = <FILE>;    # "slurp" file contents
    for my $mnemonic (@mnemonic_list) {
        if ($contents =~ /$mnemonic/i) {
            print "'$mnemonic' found in file $filename\n";
        }
    }
    close FILE;
}
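One caveat with the match above: if a mnemonic can contain regex metacharacters (such as . or +), wrap it in \Q...\E so it is matched literally rather than as a pattern:
if ($contents =~ /\Q$mnemonic\E/i) {
    print "'$mnemonic' found in file $filename\n";
}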

Why can't I open files returned by Perl's readdir?

Well, I know this is another newbie question, but I'm very frustrated and I'm looking to be enlightened again. With the guidance of you guys, I've already learnt how to use the glob function to read the contents of each file in a directory. Now I'm trying the readdir-foreach combination to do the same thing, but I keep receiving a "Cannot open file: Permission denied" error. Why is this happening with the same directory, the same files, and the same me as Administrator? Can someone kindly show me what I'm doing wrong? Thanks.
The following code uses the glob function and it works:
#! perl
my $dir = 'f:/corpus/';
my @files = glob "$dir/*";
foreach my $file (@files) {
    open my $data, '<', "$file" or die "Cannot open FILE";
    while (<$data>) {
        ...
    }
}
The following code fails and the error message says "Cannot open FILE: Permission denied". But why?
#! perl
my $dir = 'f:/corpus/';
opendir (DIR, 'f:/corpus/') or die "Cannot open directory:$!";
my @files = readdir(DIR);
closedir DIR;
foreach my $file (@files) {
    open my $data, '<', "$file" or die "Cannot open FILE:$!";
    while (<$data>) {
        ...
    }
}
The readdir() function returns only the file's name, not a full path. So you are trying to open e.g. "foo.txt" instead of "f:\corpus\foo.txt".
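For example, prefixing $dir (which the question's second snippet already has) makes the open find the file; a minimal sketch:
foreach my $file (@files) {
    next if $file eq '.' or $file eq '..';   # skip the special entries (see also the next answer)
    open my $data, '<', "$dir/$file" or die "Cannot open $dir/$file: $!";
    # ... read from $data ...
    close $data;
}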
You should keep in mind that readdir returns directory names and file names. Most likely you are attempting to open one of the special directory entries . or .., which you generally need to filter out if you're using these functions:
foreach my $f (@files) {
    # skip special directory entries
    if ($f ne '.' && $f ne '..') {
        # ...
        print "$f\n";
    }
}
Also note Andy Ross' suggestion that this will only return the relative path, not the full path.