Unable to open text file in Perl

I am fetching some log files (in txt format) from another server and trying to parse them with my Perl script. The logs are fetched correctly, after which I set permissions on the log directory to 777.
After this I attempt to open the log files, one by one, for parsing via my Perl script. Now, the strange thing, and the problem, is that my script is sometimes able to open a file and sometimes NOT. To put it simply, it is at times unable to open the log files for parsing.
Also, I have put this Perl script in cron, and the chance of the open failing is greater when it runs via cron than when it is run manually, although it has run successfully in both cases previously. I don't understand where the issue lies.
Here is the code which I use for opening the files,
$inputDir = "/path/to/dir";
@inputFiles = <$inputDir/*>;
# inputFiles array is list of files in the log directory
foreach my $logFile (@inputFiles)
{
# just to ensure file name is text
$logFile = $logFile."";
# process file only if filename contains "NOK"
if(index($logFile,"NOK") > -1)
{
# opens the file
open($ifile, '<', $logFile) or die "Error: Unable to open file for processing.";
# file parsing takes place
}
}
close($ifile);
I want to re-iterate that this code HAS run successfully and I haven't changed any part of it. Yet it does not run reliably every time, because it's unable to open the log files at times. Any ideas?

You should include the error message $! and the file name $logFile in your die string to see why the open failed, and for which file.
open($ifile, '<', $logFile) or die "Error: Unable to open $logFile: $!";
Also, this line:
$logFile = $logFile."";
...is quite redundant. If a conversion to a string is necessary, Perl will handle it.

Just as an example, this is what your code should look like. You may like to try this version:
use strict;
use warnings;
my $inputDir = '/path/to/dir';
my @inputFiles = <$inputDir/*>;
foreach my $logFile (grep /NOK/, @inputFiles) {
    open my $ifile, '<', $logFile or die qq(Unable to open "$logFile": $!);
    # Process data from <$ifile>
}

Maybe opening some files fails because your program has too many open files. Your program opens every file in $inputDir and processes them in the loop, but only the last file opened is closed, after the loop.
EDIT: after reading TLP's comment and perldoc -f close and perldoc -f open, I see that TLP is right: the filehandle in $ifile is implicitly closed by a subsequent open($ifile, '<', $logFile). However, if the file-parsing code not shown by the topic creator creates another reference to $ifile, the file handle would stay open.
Moving the call to close into the if block should solve your problem:
$inputDir = "/path/to/dir";
@inputFiles = <$inputDir/*>;
# inputFiles array is list of files in the log directory
foreach my $logFile (@inputFiles)
{
# process file only if filename contains "NOK"
if(index($logFile,"NOK") > -1)
{
# opens the file
# added my to $ifile to keep local to this scope
open(my $ifile, '<', $logFile) or die "Error: Unable to open file for processing.";
# file parsing takes place
# close current file
close($ifile);
}
}

Related

Zipping a file with perl results in an invalid archive

I am currently trying to zip some files with perl. The resulting file is printed, so a user who calls the page which executes the script can download or open the zip file.
Looking at the size of the zip file it seems everything worked ok, but if I try to open the file on the server no contents are shown. If I open the file after downloading it, the archive is invalid.
Here's the code:
my $zip = Archive::Zip->new();
my $i;
foreach $i (@files)
{
my $fh = $zip->addFile("$directoryPath$i") if (-e "$directoryPath$i");
}
my $zipFilePath = "Test.zip";
die 'Cannot create $zip_file_name: $!\n' if $zip->writeToFileNamed("$zipFilePath") != AZ_OK;
open (DLFILE, "<$zipFilePath");
@fileholder = <DLFILE>;
close (DLFILE);
print "Content-Type:application/x-download\n";
print "Content-Disposition:attachment;filename=$zipFilePath\n\n";
print @fileholder;
Can you please tell me where the error is?
I am running the code using xampp on my local windows machine.
Edit: The same happens when I use
use strict;
use warnings;
use autodie;
Edit: The first problem is solved by ysth, thanks for that. Now the archive is not invalid after downloading, but still no files are shown if I open it, while the zip-file's size seems to be correct.
You are corrupting it here:
open (DLFILE, "<$zipFilePath");
@fileholder = <DLFILE>;
close (DLFILE);
by opening it such that it translates "\r\n" to just "\n".
Try this:
open( DLFILE, '<:raw', $zipFilePath );
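For reference, here is a fuller sketch of the download section with the read fixed and, as an extra assumption on my part, STDOUT also switched to binary mode, since on Windows the output stream is otherwise subject to the same "\n" to "\r\n" translation:
open( my $dlfh, '<:raw', $zipFilePath ) or die "Cannot open $zipFilePath: $!";
my @fileholder = <$dlfh>;   # raw bytes, no newline translation
close($dlfh);
binmode(STDOUT);            # keep the HTTP body byte-for-byte intact on Windows
print "Content-Type:application/x-download\n";
print "Content-Disposition:attachment;filename=$zipFilePath\n\n";
print @fileholder;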

Cannot open Local file /dir1/file1 : No such file or directory: PERL code

I am trying to read file contents and assign it to a variable. Pass the variable to the FTP so that it can ftp the file. The input contains multiple lines. Hence I am using a while loop.
When I execute the code in the UNIX environment it says
"Cannot open Local file <file name> : No such file or directory"
But I am able to see the file in the location.
Input File contents (cfilelistftpA):
sxa1:OUT047
axk1:OUT635
...
Here is the output on UNIX for ls
axk1: ls -lrt
total 596
-rw-rw-rw- 1 axk1 ptuser 599399 Jul 23 23:06 OUT635
sxa1@axk5hbz: pwd
/home/sxa1/udms/axk1
sxa1: ls -lrt
total 836
-rw-rw-rw- 1 sxa1 ptuser 844664 Jul 23 23:06 OUT047
sxa1: pwd
/home/sxa1/udms/sxa1
I am unable to figure out the issue.
Any help is much appreciated.
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Std;
use File::Copy;
use File::Path;
use Net::FTP;
# declare and initialize counter
my $lfile = "LogFile";
my $filelistftpA = "/home/sxa1/filelistFTPA";
my $cfilelistftpA = "cfilelistftpA";
my $newPath = "/home/sxa1/udms";
my $newUDrive = "X:\\udms";
my $host1 = "XXXX";
my $user1 = "XXXX";
my $password1 = "XXXXX";
my ( $drive1, $part4 ) = split /:/, $newUDrive;
open (LOGFILE, "> $lfile") or die "Could not open log file $!\n";
my $ftpfiles = Net::FTP->new($host1) or die "Can't open $host1\n";
$ftpfiles->login($user1, $password1) or die "can't log $user1 \n";
$ftpfiles->binary();
open (MYFILESA, "< $cfilelistftpA");
while (<MYFILESA>)
{
chomp;
my $fileline = $_;
my ($dirnameisA,$filenameisA) = split /:/, $fileline;
print LOGFILE "Dir name is: $dirnameisA\n";
print LOGFILE "File name is: $filenameisA\n";
print LOGFILE "$hour, $min\n";
# $ftpfiles->cwd($part4);
# $ftpfiles->cwd($dirnameisA);
my $dstFilesA = "$newPath/$dirnameisA/$filenameisA";
if (-e "$dstFilesA")
{
print LOGFILE "File Exists: $dstFilesA \n";
}
else
{
print LOGFILE "File does not Exists: $dstFilesA \n";
}
$ftpfiles->cwd($part4);
$ftpfiles->cwd($dirnameisA);
$ftpfiles->put("$dstFilesA");
print LOGFILE "READING from FILE: $_\n";
}
$ftpfiles->quit();
close (MYFILESA);
close(LOGFILE);
Output by executing "perl testftp.pl"
Cannot open Local file /home/sxa1/udms/sxa1/OUT047 : No such file or directory
at testftp.pl line 62
Cannot open Local file /home/sxa1/udms/axk1/OUT635 : No such file or directory
at testftp.pl line 62
logfile contents:
Dir name is: sxa1
File name is: OUT047
File does not Exists: /home/sxa1/udms/sxa1/OUT047
READING from FILE: sxa1:OUT047
Dir name is: axk1
File name is: OUT635
File does not Exists: /home/sxa1/udms/axk1/OUT635
READING from FILE: axk1:OUT635
Your code is looking for files in:
my $newPath = "/home/san/udms";
Your ls listings and pwd commands are looking for files in:
/home/sxa1/udms/axk1
If you get the spellings consistent, you stand more chance of it working; if you get them correct, it will improve the chances of it working.
The error Cannot open Local file you're getting is coming from Net::FTP when you do a put. This happens when sysopen cannot open the file name you've passed it. It's usually an indication that the file doesn't exist.
Go to that error message and copy that file name. The entire name including the directory. Now, paste that into an ls command:
$ ls /home/sxa1/udms/sxa1/OUT047
No such file or directory
If you get that error, then your file name you're trying to put is wrong. Take a look at it and see if you can figure it out. For example, I see sxa1 twice in the path. Is that correct?
If the file does exist, then take a look at other possible issues:
Is the file readable by you? If not, you're not going to be able to open it.
Is there any whitespace in the name? The Net::FTP->put command may list the file, but you're not seeing the actual name.
If you still can't figure it out, put more logging information into your program. Change these lines:
$ftpfiles->cwd($part4);
$ftpfiles->cwd($dirnameisA);
$ftpfiles->put("$dstFilesA");
To:
print qq(Changing to directory "$part4"\n);
$ftpfiles->cwd($part4) or warn qq(Can't change to directory "$part4");
print qq(Changing to directory "$dirnameisA"\n);
$ftpfiles->cwd($dirnameisA) or warn qq(Can't change to directory "$dirnameisA");
print qq(Attempting to "put" file "$dstFilesA"\n);
$ftpfiles->put("$dstFilesA") or warn qq(Can't put file "$dstFilesA");
I am checking the return status of each of these lines as they execute to see if I can figure out what could be going wrong. I am also putting quotation marks around the file names: maybe the file has a leading or trailing space, or maybe there is a hidden Windows carriage return because the person who created this file did so on Windows, and you are executing this on Unix.
See what this gives you.
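If a stray carriage return or whitespace in the list file does turn out to be the culprit, a small cleanup inside the while loop would remove it before the path is built. This is my own sketch, not part of the original program:
chomp( my $fileline = $_ );
$fileline =~ s/\r$//;                                  # drop a trailing CR left by a Windows editor
my ( $dirnameisA, $filenameisA ) = split /:/, $fileline;
s/^\s+|\s+$//g for $dirnameisA, $filenameisA;          # trim leading/trailing whitespace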
Also, you claim this is the program, but it's only 57 lines long. Net::FTP claims that this error is on line 62 of your program. Can you show us which line is giving you this error? I am assuming put is doing it, but that's only line #51.

Perl File pointers

I have a question concerning these two files:
1)
use strict;
use warnings;
use v5.12;
use externalModule;
my $file = "test.txt";
unless(unlink $file){ # unlink is UNIX equivalent of rm
say "DEBUG: No test.txt persent";
}
unless (open FILE, '>>'.$file) {
die("Unable to create $file");
}
say FILE "This name is $file"; # print newline automatically
unless(externalModule::external_function($file)) {
say "error with external_function";
}
print FILE "end of file\n";
close FILE;
and external module (.pm)
use strict;
use warnings;
use v5.12;
package externalModule;
sub external_function {
my $file = $_[0]; # first arguement
say "externalModule line 11: $file";
# create a new file handler
unless (open FILE, '>>'.$file) {
die("Unable to create $file");
}
say FILE "this comes from an external module";
close FILE;
1;
}
1; # return true
Now,
In the first perl script line 14:
# create a new file handler
unless (open FILE, '>>'.$file) {
die("Unable to create $file");
}
If I would have
'>'.$file
instead, then the string printed by the external module will not be displayed in the final test.txt file.
Why is that??
Kind Regards
'>' means open the file for output, possibly overwriting it ("clobbering"). >> means appending to it if it already exists.
BTW, it is recommended to use 3 argument form of open with lexical file-handles:
open my $FH, '>', $file or die "Cannot open $file: $!\n";
If you use >$file in your main function, it will write to the start of the file, and buffer output as well. So after your external function returns, the "end of file" will be appended to the buffer, and the buffer flushed -- with the file pointer still at position 0 in the file, so you'll just overwrite the text from the external function. Try a much longer text in the external function, and you'll see that the last part of it remains, with the first part getting overwritten.
This is very old syntax, and not recommended in modern Perl. The 3-argument version, which was "only" introduced in 5.6.1 about 10 years ago, is preferred. So is using a lexical variable for a file handle, rather than an uppercase bareword.
Anyway, >> means open for append, whereas > means open for write, which will remove any existing data in the file.
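A quick way to see the difference for yourself is a tiny standalone script like this (the file name is just an example):
use strict;
use warnings;
my $file = 'demo.txt';
open my $fh, '>', $file or die "Cannot open $file: $!";    # '>' truncates: the file starts out empty
print $fh "first line\n";
close $fh;
open $fh, '>>', $file or die "Cannot open $file: $!";      # '>>' appends: existing content is kept
print $fh "second line\n";
close $fh;
# demo.txt now contains both lines; had the second open used '>',
# it would contain only "second line"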
You're clobbering your file when you reopen it once more. The > means open the file for writing, and delete the old file if it exists and create a new one. The >> means open the file for writing, but append the data if the file already exists.
As you can see, it's very hard to pass FILE back and forth between your module and your program.
The latest syntax is to use lexically scoped variables for file handles:
use autodie;
# Here I'm using the newer three parameter version of the open statement.
# I'm also using '$fh' instead of 'FILE' to store the pointer to the open
# file. This makes it easier to pass the file handle to various functions.
#
# I'm also using "autodie". This causes my program to die due to certain
# errors, so if I forget to test, I don't cause problems. I can use `eval`
# to test for an error if I don't want to die.
open my $fh, ">>", $file; # No die if it doesn't open thx to autodie
# Now, I can pass the file handle to whatever my external module needs
# to do with my file. I no longer have to pass the file name and have
# my external module reopen the file
externalModule::external_function( $fh );
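On the module side, the sub then takes the handle instead of the file name and just writes to it. A sketch of how externalModule::external_function could be adapted to accept a handle (my illustration, not the original module):
package externalModule;
use strict;
use warnings;
use v5.12;

sub external_function {
    my ($fh) = @_;      # receive the already-open lexical file handle
    say {$fh} "this comes from an external module";
    return 1;           # don't close it here; the caller owns the handle
}
1;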

Can directory names and file names be variables in cgi perl scripting

I know that file names can be variables in Perl scripting. Does the same apply to CGI Perl scripting? When I use variables inside the open statement I get the error No such file or directory, but when I directly mention the path the file is opened for reading. These variables are passed from a form. The values are passed correctly and are not empty (checked by printing the variables).
Example:
$dir=abc;
$file=file1;
open (FILE, '/var/www/cgi-bin/$dir/$file')
or print "file cannot be opened $!\n";
Error:
file cannot be opened no such file or directory.
Use double quotes to interpolate variables:
open (FILE, "/var/www/cgi-bin/$dir/$file")
# here __^ and here __^
or print "file cannot be opened $!\n";
Also, ALWAYS
use strict;
use warnings;
By using single quotes, the variables aren't interpolated, so you're trying to open literally /var/www/cgi-bin/$dir/$file and it doesn't exist.
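Putting it together, a minimal modern version of the open (double quotes, a lexical handle and the three-argument form; the directory and file names are the ones from the question) would be:
use strict;
use warnings;
my $dir  = 'abc';
my $file = 'file1';
my $path = "/var/www/cgi-bin/$dir/$file";    # double quotes, so $dir and $file are interpolated
open my $fh, '<', $path
    or print "file $path cannot be opened: $!\n";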
You've got (and accepted) a good answer. I just wanted to add that you can make your error message a lot more helpful if you include the value of $file in the string.
my $file_path = '/var/www/cgi-bin/$dir/$file';
open (FILE, $file_path)
or print "file [$file_path] cannot be opened: $!\n";
Then the error would have been "file [/var/www/cgi-bin/$dir/$file] cannot be opened: no such file or directory" which would have made it obvious that the variables weren't being expanded.
Update: I was talking nonsense. The new version is better.

Perl to set a directory to open, open it, then print the directory opened?

Trying to troubleshoot a port of some Perl code from CentOS to Windows.
I really know nothing about Perl, and the code I'm porting is around 700-1000 lines. I'm 100% sure one of the issues I'm seeing is related to how file paths are handled on the OS the code is running on.
So I'm looking for a way to debug how each OS handles file paths, separately from the legacy code, which I cannot post to SO due to "IP" reasons.
So I'm looking for some Perl in which I can set a directory to open within the script (for example, C:\data\ or /home/data); the script then attempts to open the directory, prints whether it failed or succeeded, and prints the string it attempted to open, regardless of whether the open failed or not.
Open to suggestions, but that's the issue, and the solution I'm seeing.
Questions, feedback, requests - just comment, thanks!!
use IO::Dir;
my $dir = IO::Dir->new($dir_path) or
die "Could not open directory $dir_path: $!\n";
of course, where $dir_path is some path to a directory on your system that you want, either as a var or hard coded. The more 'old school' way would look like:
opendir my $dir, $dir_path or die "Could not open directory $dir_path: $!\n";
That won't print if the directory is opened, but the program will fail if it can't open it and then print the precise reason why, which is what the $! variable holds.
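If you want the script to report the attempted path whether or not it opened, something along these lines would do it; the directory is hard-coded here purely as an example:
use strict;
use warnings;
my $dir_path = 'C:\data';    # or '/home/data' on CentOS
if ( opendir my $dh, $dir_path ) {
    print "Succeeded opening directory: $dir_path\n";
    closedir $dh;
}
else {
    print "Failed to open directory: $dir_path ($!)\n";
}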
Is this what you're looking for?
use DirHandle;
my $dir = "test";
my $dh = new DirHandle($dir);
if($dh) {
print "open directory succeeded\n";
}
else {
print "open directory failed\n";
}
print $dir, "\n";
new DirHandle opens the directory and returns a handle to it. The handle will be undef if the open failed.