Failing to open Excel files for writing - perl

I am using open to write data into an Excel file. This is working fine with .txt files, but with .xls files it always fails.
This is the code I am using:
$filename = "abc.xls";
$fhandle = "ABC";
open( $fhandle, ">$filename" ) || die "cannot open file $filename";
The same code executes fine in another environment which has an older Perl version.
I need help figuring out how I can fix this.

Found the cause of the issue. The environment I was trying to create these files in already had files with the same names, but I did not have permission to edit those files. I deleted the old files and it worked like a charm.
Thanks a lot for your help, guys!
Shivam
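For anyone hitting the same symptom: a minimal sketch, using Perl's standard file-test operators and the same hypothetical filename, that flags an existing file you cannot write to before attempting the open:
use strict;
use warnings;

my $filename = 'abc.xls';

# An existing file that the current user cannot write to will make
# open(..., '>', ...) fail with "Permission denied".
if ( -e $filename && !-w $filename ) {
    die "'$filename' already exists and is not writable by this user\n";
}

open my $fh, '>', $filename or die "Cannot open '$filename' for writing: $!";
print {$fh} "some data\n";
close $fh or die "Error closing '$filename': $!";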

A file handle is a file handle. Not a string.
Also, you are typing the filename in the open call. Why don't you use the $filename variable you so neatly created two lines earlier?
I would try something like this:
$filename = "abc.xls";
open($fhandle, ">$filename") || die "cannot open file $filename";
Also, this xls file: Is it open in excel when you try to run the script? Excel will stop you from opening it in another application or script if you do.
Furthermore: please use strict and warnings. They will make your life much better.
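Putting those suggestions together, a minimal sketch (same filename as in the question) using strict, warnings, a lexical filehandle and the three-argument form of open might look like this:
use strict;
use warnings;

my $filename = 'abc.xls';

# Three-argument open with a lexical filehandle; $! carries the OS error.
open my $fhandle, '>', $filename
    or die "Cannot open file '$filename' for writing: $!";

print {$fhandle} "some data\n";
close $fhandle or die "Error closing '$filename': $!";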

Related

Unable to Downsample audio file in CGI perl script using sox

I am working on a CGI script where I receive an uploaded audio file, downsample it to 8000 Hz and then have it recognised later.
I am facing an error while downsampling the file. The relevant code is below:
1) Code for File Upload:
use CGI;
use strict;
use File::Copy qw(copy);
use CGI::Carp 'fatalsToBrowser';
my $PROGNAME = "file_upload.cgi";
my $cgi = new CGI();
print "Content-type: text/html\n\n";
my $upfile = $cgi->param('upfile');
# Get the basename in case we want to use it.
my $basename = GetBasename($upfile);
no strict 'refs';
if (! open(OUTFILE, ">../cgi-bin/upload/".$basename) ) {
    print "Can't open for writing - $!";
    exit(-1);
}
2) Code for downsampling:
my $source_file="/var/www/cgi-bin/upload/$upfile";
system("sox $source_file -r 8000 /var/www/cgi-bin/upload/temp.wav".";"."mv /var/www/cgi-bin/upload/temp.wav $source_file");
where:
$source_file is the path of the uploaded audio file
$upfile is the name of the uploaded wav file
temp.wav is the temporary downsampled file, which is moved over the original file using the mv command
Error
sox FAIL formats: can't open input file `/var/www/cgi-bin/upload/file1.wav': WAVE: RIFF header not found
file1.wav is the file I uploaded
Please help me understand why the sox command is failing even though it appears to be written correctly.
This isn't really an answer to your question as we don't have enough information yet.
Have you tried running the command from your Unix command line? I'd assume you get the same error. What do you get if you run file on the file that you have saved? How big is the file before and after you upload it?
You don't show the code that writes the uploaded file. I suspect there's a bug in that. If you add that to your question, we could help you find it.
Where is GetBasename() defined? Can we see the code?
Your sox command seems strange. You're running sox on a file called temp.wav and then copying that file over your uploaded file. Perhaps there are a couple of steps that you aren't telling us.
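One thing that would make that step easier to debug is to run sox and the move as two separate, checked calls instead of a single shell string. A minimal sketch, assuming the same paths as in the question (the uploaded filename here is hypothetical):
use strict;
use warnings;
use File::Copy qw(move);

my $upfile      = 'file1.wav';                        # hypothetical uploaded name
my $source_file = "/var/www/cgi-bin/upload/$upfile";
my $temp_file   = '/var/www/cgi-bin/upload/temp.wav';

# Run sox with a list of arguments (bypassing the shell) and check its exit status.
system('sox', $source_file, '-r', '8000', $temp_file) == 0
    or die "sox failed on '$source_file' (exit status $?)";

# Replace the original with the downsampled copy using File::Copy's move().
move($temp_file, $source_file)
    or die "Cannot move '$temp_file' over '$source_file': $!";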
Some other suggestions for improvement:
Use CGI->new, not new CGI. The latter has some strange corner cases that you will have real problems debugging if you ever come across them.
If you're loading the CGI module, then why not use its header method instead of writing your own (technically incorrect) header? (There is a short sketch after this list showing both of these points.)
no strict 'refs' is a really bad idea (and, as far as I can see, isn't needed here).
Please use the three-arg version of open() and lexical filehandles
open my $out_fh, '>', "../cgi-bin/upload/$basename"
Include the file path in your error message
my $file = "../cgi-bin/upload/$basename";
if (!open my $out_fh, '>', $file) {
print "Can't open file '$file' for writing - $!";
exit(-1);
}
You are loading the File::Copy module, but then moving your file using a shell command.
Allowing random users to upload files into a directory under your cgi-bin directory is a massive potential security hole. You should find another directory to store the uploaded files.
Oh, and then there's the whole - why on Earth would you be writing CGI programs in 2017!
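To make the CGI->new and header points concrete, here is a minimal sketch of the top of the upload script with those suggestions applied. File::Basename's basename() is only a stand-in for the question's GetBasename(), and the upload directory is a hypothetical path outside cgi-bin:
use strict;
use warnings;
use CGI;
use File::Basename qw(basename);

my $cgi = CGI->new;                  # class-method form rather than "new CGI"

# Let the CGI module print a correct header instead of hand-writing one.
print $cgi->header;                  # defaults to a text/html Content-Type header

my $upfile   = $cgi->param('upfile');
my $basename = basename($upfile);    # stand-in for the question's GetBasename()

# Hypothetical upload directory outside the cgi-bin tree.
my $file = "/var/www/uploads/$basename";
open my $out_fh, '>', $file
    or die "Can't open file '$file' for writing - $!";
binmode $out_fh;                     # wav data is binary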
The issue is resolved. The reason I was having trouble executing the sox and copy commands was where I had placed them in the code; basically a beginner's error. I was opening the file as described in the problem statement, but I put the copy and sox commands before I closed the filehandle, and hence they were not executing successfully.

Open File to Read from in Perl

I just started learning Perl today. I'm on the section of file input and output. This is a very basic question, and I've been searching on the internet for a couple of hours as to what I'm doing wrong, but I can't seem to find out why. I'm sure some of you think this question should be voted down, but if I could find the answer by myself through internet searching and troubleshooting, I wouldn't be asking it here.
My question is why can't I open the file that I'm referring to in my filepath?
open(my $in, "<", "ioFile.txt") or die "Can't open input.txt: $!";
The ioFile.txt is in the same directory as my Perl script. I've used multiple different filepaths to see which worked, and none have for me so far. I've tried using forward slashes instead of backslashes as well.
Any tips about opening this specific file or files in general in Perl would be greatly appreciated.
Edit:
It could be permissions on the file, but I do have read and write permissions on it, just not Full Control. I'm on Windows 7, by the way.
If you're not running the script while you are in the directory the script and the file you want to open are in, then you have to specify the full path to the file:
open my $in, '<', 'c:\path\to\ioFile.txt' or die "Can't open input.txt: $!";
Perl will look for the input file relative to the directory you run the script from, not the directory the script is in (again, unless you are in that directory when you run the script).
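If you want the path to resolve relative to the script itself, no matter where you run it from, a minimal sketch using the core FindBin module:
use strict;
use warnings;
use FindBin qw($RealBin);    # directory containing the running script

my $path = "$RealBin/ioFile.txt";
open my $in, '<', $path or die "Can't open '$path': $!";

while (my $line = <$in>) {
    print $line;
}
close $in;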

CGI log file empty

I feel like a real novice with this question, but sometimes huge experience can have a bad day.
I have this CGI script for uploading files in chunks. It is uploading my files just fine, however I can't get the log file to record any data, even though it used to work perfectly. The log file is zero bytes.
I added a couple of lines to greatly simplify debug at the start of the script. They are listed below. They open a new log file, write a simple line of text, then close it.
I figured it can't be a flush problem, as I'm closing the file. It must be opening the file, because the file exists, just at zero bytes. It must be running the CGI script, because my file does not upload otherwise. And it has permission to write to /tmp, as the file is created. Also, there are no errors in /etc/httpd/logs/errorlog.
open(LOGFILE, ">", "/tmp/olss2.log") or die "Can't open log file: $!";
$| = 1;
print LOGFILE "New Log Entry Started";
close LOGFILE;

How to open multiple files in Perl

Guys, I'm really confused now. I'm new to learning Perl. The book I've been reading sometimes uses Perl code and sometimes Linux commands.
Is there any connection between them (Perl code and Linux commands)?
I want to open multiple files using Perl. I know how to open a single file in Perl using:
open (MYFILE,'somefileshere');
and I know how to view multiple files in Linux using the ls command.
So how do I do this? Can I use ls in Perl? I also want to open only certain files (Perl files) which don't have a visible file extension (so I can't use *.txt or the like, I guess).
A little help, guys?
Use the system function to execute a Linux command, and glob to get a list of files.
http://perldoc.perl.org/functions/system.html
http://perldoc.perl.org/functions/glob.html
Like:
my @files = glob("*.h *.m"); # matches all files with a .h or .m extension
system("touch a.txt"); # linux command "touch a.txt"
Directory handles are also quite nice, particularly for iterating over all the files in a directory. Example:
opendir(my $directory_handle, "/path/to/directory/") or die "Unable to open directory: $!";
while (my $file_name = readdir $directory_handle) {
    next if $file_name =~ /^\./;          # Skip . and .. (and other dotfiles)
    next if $file_name =~ /some_pattern/; # Skip files matching pattern
    # readdir returns bare names, so prepend the directory when opening
    open(my $file_handle, '>', "/path/to/directory/$file_name")
        or do { warn "Could not open file '$file_name': $!"; next };
    # Write something to $file_name. See perldoc -f open.
    close $file_handle;
}
closedir $directory_handle;

Unable to open text file in Perl

I am fetching some log files (which are in txt format) from another server and trying to parse them using my Perl script. The logs are being fetched correctly after which I set permissions to 777 for the log directory.
After this I attempt to open the log files, one by one for parsing, via my Perl script. Now, the strange thing and the problem which happens is, my script is sometimes able to open the file and sometimes NOT. To put it simply, it's unable to open the log files for parsing at times.
Also, I have cronned this perl script and the chances of file open failing are greater when it runs via cron rather than manually, although they have run successfully in both cases previously. I don't understand where the issue lies.
Here is the code which I use for opening the files,
$inputDir = "/path/to/dir";
#inputFiles = <$inputDir/*>;
# inputFiles array is list of files in the log directory
foreach my $logFile(#inputFiles)
{
# just to ensure file name is text
$logFile = $logFile."";
# process file only if filename contains "NOK"
if(index($logFile,"NOK") > -1)
{
# opens the file
open($ifile, '<', $logFile) or die "Error: Unable to open file for processing.";
# file parsing takes place
}
}
close($ifile);
I want to reiterate that this code HAS run successfully and I haven't changed any part of it. Yet it does not run successfully every time, because it's unable to open the log file at times. Any ideas?
You should include the error message $! and the file name $logFile in your die string to see why the open failed, and for which file.
open($ifile, '<', $logFile) or die "Error: Unable to open $logFile: $!";
Also, this line:
$logFile = $logFile."";
...is quite redundant. If a conversion is necessary, Perl will handle it.
Just as an example, this is what your code could look like. You may like to try this version:
use strict;
use warnings;
my $inputDir   = '/path/to/dir';
my @inputFiles = <$inputDir/*>;
foreach my $logFile (grep /NOK/, @inputFiles) {
    open my $ifile, '<', $logFile or die qq(Unable to open "$logFile": $!);
    # Process data from <$ifile>;
}
Maybe opening some files fails because your program has too many open files. Your program opens all files in $inputDir and processes them in the loop; only after that does it close the last file opened.
EDIT: after reading TLP's comment and reading perldoc -f close and perldoc -f open, I see that TLP is right and the filehandle in $ifile is closed by a subsequent open($ifile, '<', $logFile). However, if the file-parsing code not shown by the topic creator creates another reference to $ifile, the filehandle would stay open.
Moving the call to close into the if block should solve your problem:
$inputDir = "/path/to/dir";
#inputFiles = <$inputDir/*>;
# inputFiles array is list of files in the log directory
foreach my $logFile(#inputFiles)
{
# process file only if filename contains "NOK"
if(index($logFile,"NOK") > -1)
{
# opens the file
# added my to $ifile to keep local to this scope
open(my $ifile, '<', $logFile) or die "Error: Unable to open file for processing.";
# file parsing takes place
# close current file
close($ifile);
}
}