Can't open file error in Perl

Why can't my program open the file (i.e. why does it die)? I searched for this problem, but everything looks fine to me.
The funny thing is that this code worked before, and I don't think I have changed anything in the open call since then.
my $i;
my $regex = $ARGV[0];
for (@ARGV[1 .. $#ARGV]) {
    open (my $fh, "<", "$_") or die ("Can't open, $!");
    $i++;
    foreach (<$fh>) {
        print "Given regexp: $regex\nfile$i:\n line $.: $1\n" if $_ =~ /(\b$regex\b)/;
    }
}
OUTPUT:
Can't open Not a directory

"Not a directory" means you're supplying a path that treats something which is not a directory as if it were a directory.
For instance, if your argument is
a/b
and
a
exists but is not a directory, you will get this error.

Check your argument: every path component before the final file name must be an existing directory.
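As a quick demonstration (a throwaway sketch reusing the a and a/b names from the example above):
# 'a' is created as a plain file, so any path that treats it as a
# directory, such as 'a/b', fails with "Not a directory" (ENOTDIR).
open my $fh, '>', 'a' or die "Can't create 'a': $!";
close $fh;
open my $in, '<', 'a/b'
    or warn "Can't open 'a/b': $!\n";    # prints: Not a directory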


In Perl, why can't I open this file using single argument open?

The file is 52MB in size. It is in the same directory as the program.
$big = 'VIXhx.csv';
# tie @optLine, 'Tie::File', $big or die "Cant Tie to $big $!" ;
open $big or die "Cant open $big, $!," ;
Tie::File gave no error message.
Plain open gave the error message:
Cant open VIXhx.csv, No such file or directory, at C:\Python34\hsf\ETFs\VIX\qad.pl line 47.
(Yes, it's in the Python directory - but Perl works fine there)
I can open the file in the editor, so there doesn't seem to be a problem with the file itself.
I have a smaller file in the same program that opened with no problem in Tie::File.
$dat = 'ETF5Y.csv';
tie @datLine, 'Tie::File', $dat or die "Cant Tie to $dat $!" ;
Is it possible that Perl is unable to open a file if it's too large?
Please check perldoc -f open for how to open files. What you did ended up opening an empty filename:
strace perl -e '$big = "/etc/passwd"; open $big or die "Cant open $big, $!,"'
output
...
open("", O_RDONLY) = -1 ENOENT (No such file or directory)
write(2, "Cant open /etc/passwd, No such f"..., 64Cant open /etc/passwd, No such file or directory, at -e line 1.
See perldoc perlopentut:
Single Argument Open
Remember how we said that Perl's open took two arguments? That was a passive prevarication. You see, it can also take just one argument. If and only if the variable is a global variable, not a lexical, you can pass open just one argument, the filehandle, and it will get the path from the global scalar variable of the same name.
$FILE = "/etc/motd";
open FILE or die "can't open $FILE: $!";
while (<FILE>) {
# whatever
}
Therefore, if you want single argument open to do what you want, you would have to write your code as
$big = 'VIXhx.csv';
open big or die "Can't open '$big': $!";
# ^ <-- look, no dollar sign before filehandle
Alternatively, you could do something like this:
$big = 'VIXhx.csv';
*{$big} = \$big;
open $big and print <$big>;
if you want to keep the open $big.
But relying on global variables and action at a distance is not a good idea. Instead, use the three-argument form of open to specify the filehandle, the mode, and the file name separately, as in:
open my $vix_fh, '<', $vix_file
or die "Failed to open '$vix_file': $!";
By the way, you won't even find this section on "Single Argument Open" in recent Perl documentation. The following note should give you an idea why:
Why is this here? Someone has to cater to the hysterical porpoises. It's something that's been in Perl since the very beginning, if not before.
The single argument open can also be used to turn any Perl program into a quine.
I found the answer to my original question of why Tie::File didn't work.
It turns out that the file used a bare 0x0A (LF) as the line terminator, so Tie::File, expecting 0x0D0A (CRLF), read the whole 52MB file as one record.
I added recsep => "\n" to the tie statement, and everything works fine.
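For reference, a minimal sketch of that fix, reusing the $big and @optLine names from the code above:
use Tie::File;

my $big = 'VIXhx.csv';

# Tell Tie::File that records end in a bare LF ("\n") instead of the
# Windows-default CRLF, so the 52MB file is split into lines correctly.
tie my @optLine, 'Tie::File', $big, recsep => "\n"
    or die "Cant Tie to $big $!";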

open error - No such file or directory

I am using File::Find to run through a directory tree, and when I try to open the current file for reading I get "No such file or directory". This happens with ALL files in the directory tree.
Here's the sub I use in the find():
sub {
    if (-d) {
        return;
    }
    if (-f) {
        my $file = ${File::Find::name};
        open (my $IN, '<', '$file') or die "$!\n";
        while (<$IN>) {
            ### Do some formatting.
        }
        close $IN;
    }
}
It fails in the line:
open (my $IN, '<', '$file') or die "$!\n";
I thought it might be a matter of symbolic links, but even with the follow => 1 option I get this error.
By the way, without follow the error occurs on the first file of the first directory found, and with it the error occurs on the last file of the last directory (but in both cases it is the first file inspected by File::Find).
Problem solved. Apparently, replacing the single quotes with double quotes in the open line, or even better, not using any quotes, did the trick. The string literal '$file' produces the string $file, and there's clearly no file with this name.
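That is, the corrected line looks something like this:
# Pass the variable itself (or interpolate it inside double quotes);
# the single-quoted '$file' was a literal five-character string.
open my $IN, '<', $file or die "Can't open '$file': $!\n";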

How to extend a program that works for one file to act on every file in a directory

I wrote a program to check for misspellings or unused data in a text file. Now, I want to check all files in a directory using the same process.
Here are a few lines of the script that I run for the first file:
open MYFILE, 'checking1.txt' or die $!;
@arr_file = <MYFILE>;
close (MYFILE);
open FILE_1, '>text1' or die $!;
open FILE_2, '>Output' or die $!;
open FILE_3, '>Output2' or die $!;
open FILE_4, '>text2' or die $!;
for ($i = 0; $i <= $#arr_file; $i++) {
    if ( $arr_file[$i-1] =~ /\s+\S+\_name\s+ (\S+)\;/ ) {
        print FILE_1 "name : $i $1\n";
    }
...
I used only one file, checking1.txt, to run the script, but now I want to do the same processing for every file in all_file_directory.
Use an array to store the file names and then loop over them. At the end of each iteration, rename the output files or copy them somewhere so that they do not get overwritten in the next iteration.
my @files = qw(checking1.txt checking2.txt checking3.txt checking4.txt checking5.txt);
foreach my $filename (@files) {
    open (my $fh, "<", $filename) or die $!;
    # perform operations on $filename using filehandle $fh
    # rename output files
}
Now for the above to work, you need to make sure the files are in the same directory as the script. If not, then:
Provide the absolute path to each file in the @files array, or
Traverse the directory to find the desired files.
If you want to traverse the directory then see:
How do I read in the contents of a directory in Perl?
How can I recursively read out directories in Perl?
Also:
Use the 3-argument form of open
Always use strict; use warnings; in your Perl program
and give proper names to your variables. For example:
@arr_file = <MYFILE>;
should be written as
@lines = <MYFILE>;
If all your files are in the same directory, you can put the program inside that directory and run it from there.
To read the files from that directory, use glob:
while (my $filename = <*.txt>)   # change the file extension to whatever you want
{
    open my $fh, "<", $filename or die "Error opening $!\n";
    # do your stuff here
}
Why not use File::Find? It makes changing files in directories very easy; just supply the start directory.
It's not always the best choice, depending on your needs, but it's useful and easy almost every time I need to modify a lot of files at once.
Another option is to just loop through the files, but in that case you'll have to supply the file names yourself.
As mkHun pointed out, glob can also be helpful.
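For instance, a minimal sketch using File::Find, assuming the files to check live under all_file_directory (the directory name mentioned in the question):
use strict;
use warnings;
use File::Find;

find(sub {
    return unless -f && /\.txt$/;    # plain .txt files only
    open my $fh, '<', $_ or die "Can't open $File::Find::name: $!";
    while (my $line = <$fh>) {
        # check $line for misspellings or unused data here
    }
    close $fh;
}, 'all_file_directory');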

how to point this script to one folder for reading and another for writing

I can't seem to get this script to open files from one directory and write to another. Both directories exist. I've commented out what I tried. The funny thing is that it runs fine when I place it in the directory with the files to process. Here's the code:
use strict;
use warnings "all";
my $tmp;
my $dir = ".";
#my $dir = "Ask/Parsed/Html4/";
opendir(DIR, $dir) or die "Cannot open directory: $dir!\n";
my @files = readdir(DIR);
closedir(DIR);
open my $out, ">>output.txt" or die "Cannot open output.txt!\n";
#open my $out, ">>Ask/Parsed/Html5/output.txt" or die "Cannot open output.txt!\n";
foreach my $file (@files)
{
    if ($file =~ /html$/)
    {
        open my $in, "<$file" or die "Cannot open $file!\n";
        undef $tmp;
        while (<$in>)
        {
            $tmp .= $_;
        }
        print $out ">$file\n";
        print $out "$tmp\n";
        #print $out "===============";
        close $in;
    }
}
close $out;
The directories you use -- . and Ask/Parsed/Html4/ -- are relative paths, which means they are relative to your current working directory, and so it makes a difference where in the file system you are currently located when you run the script.
In addition, the files you are opening -- output.txt and $file -- have no path information, so Perl will look in your current working directory to find them.
There are a few ways to solve this.
You could cd to the directory where your files are before running the script, and open the directory as . as you currently do
You could achieve the same effect by calling chdir from within the script, which will change the current working directory and make the program ignore your location when you ran it
Or you could add an absolute directory path to the beginning of the file names, preferably using catfile from File::Spec::Functions
However I would choose to use glob -- which works in the same way as command-line filename expansion -- in preference to opendir / readdir as the resulting strings include the path (if one was specified in the parameter) and there is no need to separately filter the .html files.
I would also choose to undefine the input record separator $/ to read the whole file, rather than reading it line-by-line and concatenating them all.
Finally, if you are running version 10 or later of Perl 5 then it is simpler to use autodie rather than checking the success of every open, readline, close, opendir, readdir, and closedir etc.
Something like this
use strict;
use warnings 'all';
use 5.010;
use autodie;
my $dir = '/path/to/Ask/Parsed/Html4';
my @html = glob "$dir/*.html";
open my $out, '>>', "$dir/output.txt";
for my $file (@html) {
    my $contents = do {
        open my $in, '<', $file;
        local $/;
        <$in>;
    };
    print $out "> $file\n";
    print $out "$contents\n";
    print $out "===============";
}
close $out;
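If you prefer the catfile approach from the third option above, here is a minimal sketch, assuming the Ask/ tree lives under some absolute base path such as /path/to (adjust to suit):
use strict;
use warnings;
use File::Spec::Functions qw(catfile);

my $in_dir  = '/path/to/Ask/Parsed/Html4';
my $out_dir = '/path/to/Ask/Parsed/Html5';

open my $out, '>>', catfile($out_dir, 'output.txt')
    or die "Cannot open output.txt: $!\n";

for my $file (glob catfile($in_dir, '*.html')) {
    open my $in, '<', $file or die "Cannot open $file: $!\n";
    local $/;                          # slurp the whole file at once
    print $out ">$file\n", <$in>, "\n";
    close $in;
}
close $out;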
It is likely trying to access the files from wherever you are calling it from. If your files are located relative to the location of the script, use the following example to provide a full path:
use FindBin;
my $file = "$FindBin::Bin/Ask/Parsed/Html5/output.txt";
If your file is not relative to the script, provide the full path:
my $file = "/home/john.doe/Ask/Parsed/Html5/output.txt";
Note that readdir() only returns the file name. If you want to open the file, you must prepend the directory,
e.g.
open my $in, "<", "$dir/$file" or die "Cannot open $file!\n";
Note that best practice says you should be using the three-parameter version of open, as above; otherwise the open mode is parsed out of the file name string itself, and names beginning with characters such as >, < or | will not behave as you expect.

Perl : Cannot open a file with filename passed as argument

I am passing two filenames from a DOS batch file to a Perl script.
my $InputFileName = $ARGV[0];
my $OutputFileName = $ARGV[1];
Only the input file physically exists; the output file must be created by the script.
open HANDLE, $OutputFileName or die $!;
open (HANDLE, ">$OutputFileName);
open HANDLE, ">$OutputFileName" or die $!;
All three fail.
However the following works fine.
open HANDLE, ">FileName.Txt" or die $!;
What is the correct syntax?
Edit: The error message is: No such file or directory at Batchfile.pl line nn
The proper way is to use the three-parameter form of open (with the mode as a separate parameter) with lexical file handles. Also die doesn't have a capital D.
Like this
open my $out, '>', $OutputFileName or die $!;
but your last example should work assuming you have spelled die properly in your actual code.
If you are providing a path to the filename that doesn't exist then you also need to create the intermediate directories.
The die string will tell you the exact problem. What message do you get when this fails?
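If the output path does contain directories that don't exist yet, a minimal sketch of creating them first (using make_path from the core File::Path module and dirname from File::Basename) might look like this:
use File::Path qw(make_path);
use File::Basename qw(dirname);

my $OutputFileName = $ARGV[1];

# Create any missing directories in the output path, then open the file.
my $dir = dirname($OutputFileName);
make_path($dir) unless -d $dir;

open my $out, '>', $OutputFileName
    or die "Cannot create '$OutputFileName': $!";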
code:
my $file_name = $ARGV[0];
open (OUTPUT, "> $file_name") or die "unable to create or open $file_name: $!";
print OUTPUT "hello world";
close(OUTPUT);
command to execute:
perl perl_file.pl data.txt
Try it; it will work.