I am trying to write out to a file whose name is built from variables (a name + the user ID + the date and time + the file extension).
I have read various things on Stack Overflow, which I have based my code on.
my $windowsfile = "winUserfile-$User_ID-$datetime.csv";
open winUserfile, ">>", $windowsfile) or die "$!";
print winUserfile "User_ID, Expression\n";
close winUserfile;
I would have assumed this would work, but I am getting a syntax error. Would anyone be able to help?
Your second line has a closing paren without the preceding opening one:
open winUserfile, ">>", $windowsfile) or die "$!";
You likely want to add the opening paren:
open(winUserfile, ">>", $windowsfile) or die "$!";
Or just not bother with them entirely here, as they're optional in this case
open winUserfile, ">>", $windowsfile or die "$!";
Also, it's bad style to use a bareword filehandle, as this creates a global symbol. Better to use a lexical one:
open my $winUserfile, ">>", $windowsfile or die "$!";
print $winUserfile "User_ID, Expression\n";
You don't then need to close it; the close will be automatic when the $winUserfile variable goes out of scope.
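For illustration, here is a minimal, self-contained sketch (the $User_ID and $datetime values are made up) showing the handle being closed automatically when it leaves its enclosing scope:

use strict;
use warnings;

# hypothetical values standing in for the question's variables
my $User_ID  = 42;
my $datetime = '2023-01-01_12-00';

my $windowsfile = "winUserfile-$User_ID-$datetime.csv";

{
    # lexical handle: only visible inside this block
    open my $winUserfile, '>>', $windowsfile or die "Can't open $windowsfile: $!";
    print $winUserfile "User_ID, Expression\n";
}   # $winUserfile goes out of scope here, so the file is flushed and closed automatically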
I like using the IO::All module for file I/O.
use IO::All;
my $windowsfile = "winUserfile-$User_ID-$datetime.csv";
"User_ID, Expression\n" >> io($windowsfile);  # '>>' appends; use '>' to overwrite
my $windowsfile = "winUserfile-$User_ID-$datetime.csv";
open (winUserfile, ">>$windowsfile") or die "$!";
print winUserfile "User_ID, Expression\n";
close winUserfile;
I have the following three lines:
rename($file_path, $file_path.'.bak');
open( my $file_IN_fh, '<' , $file_path.'.bak') || die "die message";
open( my $file_OUT_fh, '>' , $file_path) || die "die message";
It works great. It allows me to go through the in file while(<$file_IN_fh>), make a bunch of changes with a script (s///g, if() to determine if the line stays or not, etc), and write to the out file. In the end I get my edited file and the file name is unchanged.
My issue is that I am at a point where I no longer (currently) want the backup files, so I want to replace the code with something similar that won't create the backup file, and be able to comment the three lines back and forth over the years if my needs change.
How do I do this kind of in-place editing, not from the command line?
One basic way is to read the file line by line and write the desired output lines to a temporary file, which is then renamed so as to overwrite the original.
use File::Copy qw(move);
open my $fh, '<', $file or die "Can't open $file: $!";
open my $fh_out, '>', $outfile or die "Can't open $outfile: $!";
while (<$fh>) {
next if /line_to_skip/;
s/patt/repl/g;
print $fh_out $_;
}
close $_ for ($fh, $fh_out);
move ($outfile, $file) or die "Can't move $outfile to $file: $!";
This is what is normally done by tools that edit files "in place" (with additional safety, checks, and flexibility). Since $outfile is temporary, use File::Temp (a sketch follows below).
Add checks when close-ing files.
Note that this changes the file's inode number, which may matter for some applications.†
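For the temporary file, a sketch using File::Temp might look like this (the input filename is made up; the temporary file is created next to the original so the final move() is a plain rename on the same filesystem):

use strict;
use warnings;
use File::Basename qw(dirname);
use File::Copy qw(move);
use File::Temp qw(tempfile);

my $file = 'data.txt';    # hypothetical input file

# create the temporary file in the same directory as the original
my ($fh_out, $outfile) = tempfile('tmpXXXXXX', DIR => dirname($file), UNLINK => 0);

open my $fh, '<', $file or die "Can't open $file: $!";
while (<$fh>) {
    next if /line_to_skip/;
    s/patt/repl/g;
    print $fh_out $_;
}
close $fh     or die "Can't close $file: $!";
close $fh_out or die "Can't close $outfile: $!";

move($outfile, $file) or die "Can't move $outfile to $file: $!";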
If the file isn't huge, you can simplify this by reading it in first:
open my $fh, '<', $file or die "Can't open $file: $!";
my @lines = <$fh>;
open $fh, '>', $file or die "Can't open $file for writing: $!";
for (@lines) {
next if /line_to_skip/;
s/patt/repl/g;
print $fh $_;
}
close $fh;
This preserves the inode number, since the > mode truncates the existing inode's data rather than creating a new file.
† If this is indeed a problem, you can still keep the same inode. After the temporary file is written, open it for reading and open the original file for writing; that truncates the contents of that inode. Then copy the temporary file to the original. Close handles and delete the temporary file.
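A rough sketch of that inode-preserving finish, reusing the $file and $outfile from the snippets above and replacing the final move() with a copy-back:

# copy the finished temporary file back over the original without replacing its inode
open my $tmp_in,   '<', $outfile or die "Can't open $outfile: $!";
open my $orig_out, '>', $file    or die "Can't open $file: $!";   # '>' truncates the existing inode

while (my $line = <$tmp_in>) {
    print $orig_out $line;
}

close $tmp_in   or die "Can't close $outfile: $!";
close $orig_out or die "Can't close $file: $!";

unlink $outfile or die "Can't remove $outfile: $!";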
If the file is huge, then I'd question why you'd want to avoid the temporary file. Otherwise, I'd suggest just loading the file into memory, make modifications, then write it back out.
use File::Slurp qw( read_file write_file );
my $in = read_file($qfn, array_ref => 1);
my @out;
while (defined( $_ = shift(@$in) )) {
s/a/b/g; # For example.
push @out, $_ if /c/; # For example.
}
write_file($qfn, \@out);
I avoided using expensive splice by using two arrays.
Note that using Tie::File might save one line of code, but this will be 30x faster[1], and probably use less memory (despite memory-saving being Tie::File's goal). Tie::File is never the answer!!!
This is not necessarily representative of all Tie::File uses, but I have indeed timed Tie::File taking 30x longer than the alternative at some basic task. That means that 2 seconds worth of work would have taken 1 minute with Tie::File!
Take a look at the Tie::File module. It is a core module and so shouldn't need installing, and the code is as simple as
use Tie::File;
tie my @file, 'Tie::File', $filepath or die $!;
Thereafter the array @file will hold the contents of the file, one line per element, and any changes to the array will be reflected in the file. All array operations such as push, splice, etc. will work fine.
Note that line one of the file is in element zero of the array etc.
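For instance, a small sketch of the kinds of in-place edits described above (the filename and patterns are made up):

use strict;
use warnings;
use Tie::File;

my $filepath = 'data.txt';    # hypothetical file
tie my @file, 'Tie::File', $filepath or die $!;

s/patt/repl/g for @file;                    # every substitution is written straight back to the file
@file = grep { !/line_to_skip/ } @file;     # drop unwanted lines
push @file, 'a new last line';              # append a line to the file

untie @file;    # flush and release the file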
Once a file handle is closed is it possible to reopen it without creating a new file handle?
ex (This doesn't work):
use strict;
use warnings;
# Open and write to a file
open my $fh, '>', 'file.txt' or die $!;
print $fh 'Hi!';
close $fh;
# Reopen the file and read its contents
open $fh, '<' or die $!;
my #contents = <$fh>;
close $fh;
print #contents;
I'm trying to avoid doing this:
open my $fh2, '<', 'file.txt' or die $!;
You cannot "reopen" it. The best solution is to avoid closing it if you need it later. Remember that you can always use seek if you need to rewind to some portion of the file.
# open for read/write
open my $fh, '+<', 'file.txt' or die $!;
# do something to file, like writing
# go to the beginning
seek $fh, 0, 0;
# do something else, like reading
I don't know that you can reopen a closed handle, but you can reuse the $fh variable as a filehandle since it's been closed. However, you're missing the name of the file when you open again.
open $fh, '<', 'file.txt' or die $!;
I have a folder called ALLEQData in which I have 100 files that end in '.ndk' for example, 'jan05.ndk', 'feb05.ndk' etc.
Using a Perl script I would like to open every single file ending in '.ndk', read the information contained in that file and place it in an output file.
Before I only needed to open one file and read it, for which I used:
my $filename = "jan76_dec10.ndk";
open FILEEQ, "<$filename"
or die "can't open '$filename' for reading: $!";
close FILEEQ;
$icount = 0;
for ($j=0; $j<@equ_file; $j++) .....etc
Then read over the information. I can read and sort the information into the output that I want.
What I am not sure how to do is how to open all of the files that end in '.ndk', one by one, do the reading and sorting, close that file, then move on to the next one.
Hope this is clear enough.
Use glob:
my @filenames = glob('*.ndk');
for my $filename (@filenames) {
open my $fh, '<', $filename
or die "can't open '$filename' for reading: $!";
# read/sort file
close $fh;
}
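Since your files live in a folder (ALLEQData) and everything should end up in a single output file, a sketch along these lines may be closer to what you need (the output filename is made up):

use strict;
use warnings;

my @filenames = glob('ALLEQData/*.ndk');

open my $out, '>', 'sorted_output.txt'
    or die "can't open 'sorted_output.txt' for writing: $!";

for my $filename (@filenames) {
    open my $fh, '<', $filename
        or die "can't open '$filename' for reading: $!";
    while (my $line = <$fh>) {
        # read/sort/filter the line here, then write it out
        print $out $line;
    }
    close $fh;
}
close $out;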
Take a look at Perl's globbing mechanism.
You could also read filenames from the command line:
while ($#ARGV > -1) {
my $filename = shift;
open my $fh, '<', $filename
or die "can't open $filename for reading: $!";
# ...
close $fh;
}
Then call your perl script with a wild card expression:
your-script.pl *.ndk
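As a shorthand, Perl's empty diamond operator <> reads line by line from every file named on the command line, so the explicit shift/open loop can often be reduced to something like:

while (my $line = <>) {
    # $ARGV holds the name of the file currently being read
    print "$ARGV: $line";
}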
Below is the Perl script that I wrote today. It reads the content from one file and writes it to another file. It works, but not completely.
#---------------------------------------------------------------------------
#!/usr/bin/perl
open IFILE, "text3.txt" or die "File not found";
open OFILE, ">text4.txt" or die "File not found";
my $lineno = 0;
while(<IFILE>)
{
@var=<IFILE>;
$lineno++;
print OFILE "@var";
}
close(<IFILE>);
close(<OFILE>);
#---------------------------------------------------------------------------
The issue is that it reads and writes the contents, but not all of them.
text3.txt has four lines. The above script reads only from the second line onwards and writes to text4.txt. So, finally, I get only three lines (lines 2 to 4) of text3.txt.
What is wrong with the above program? I don't have any idea how to check the execution flow of Perl scripts. Kindly help me.
I'm completely new to programming. I believe learning all this will help me change my career path.
Thanks in Advance,
Vijay
<IFILE> reads one line from IFILE (only one because it's in scalar context). So while(<IFILE>) reads the first line, then the <IFILE> in list context within the while block reads the rest. What you want to do is:
# To read each line one by one:
while(!eof(IFILE)) { # check if end of file is reached instead of reading a line
my $line = <IFILE>; # scalar context, reads only one line
print OFILE $line;
}
# Or to read the whole file at once:
my @content = <IFILE>; # list context, read whole file
print OFILE @content;
The problem is that this line...
while(<IFILE>)
...reads one line from text3.txt, and then this line...
@var=<IFILE>;
...reads ALL of the remaining lines from text3.txt.
You can do it either way, by looping with while or all at once with @var=<IFILE>, but trying to do both won't work.
This is how I would have written the code in your question.
#!/usr/bin/perl
use warnings;
use strict;
use autodie;
# don't need to use "or die ..." when using the autodie module
open my $input, '<', 'text3.txt';
open my $output, '>', 'text4.txt';
while(<$input>){
my $lineno = $.;
print {$output} $_;
}
# both files get closed automatically when they go out of scope
# so no need to close them explicitly
I would recommend always putting use strict and use warnings at the beginning of all Perl files. At least until you know exactly why it is recommended.
I used autodie so that I didn't have to check the return value of open manually. ( autodie was added to Core in version 5.10.1 )
I used the three argument form of open because it is more robust.
It is important to note that while (<$input>){ ... } gets transformed into while (defined($_ = <$input>)){ ... } by the compiler. Which means that the current line is in the $_ variable.
I also used the special $. variable to get the current line number, rather than trying to keep track of the number myself.
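As a small illustration of $., here is a hypothetical variant of the loop above that writes each line out prefixed with its line number:

#!/usr/bin/perl
use warnings;
use strict;
use autodie;

open my $input,  '<', 'text3.txt';
open my $output, '>', 'text4.txt';

while(<$input>){
    # $. holds the line number of the line just read from $input
    print {$output} "$.: $_";
}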
There are a couple of questions you might want to think about. If you are strictly copying a file, you could use the File::Copy module.
If you are going to process the input before writing it out, you might also consider whether you want to keep both files open at the same time or instead read the whole content of the first file (into memory) first, and then write it to the outfile.
This depends on what you are doing underneath. Also, if you have a huge binary file, each "line" in the while-loop might end up huge, so if memory is indeed an issue you might want to use more low-level, stream-based reading. More info on I/O: http://oreilly.com/catalog/cookbook/chapter/ch08.html
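If that is the case, a rough sketch of chunk-based copying (the buffer size is arbitrary) could look like this:

use strict;
use warnings;

open my $in,  '<:raw', 'text3.txt' or die "Unable to open input: $!";
open my $out, '>:raw', 'text4.txt' or die "Unable to open output: $!";

my $buffer;
while (read $in, $buffer, 64 * 1024) {   # read up to 64 KiB at a time
    print {$out} $buffer or die "Write failed: $!";
}

close $in;
close $out;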
My suggestion would be to use the cleaner PBP suggested way:
#!/usr/bin/perl
use strict;
use warnings;
use English qw(-no_match_vars);
my $in_file = 'text3.txt';
my $out_file = 'text4.txt';
open my $in_fh, '<', $in_file or die "Unable to open '$in_file': $OS_ERROR";
open my $out_fh, '>', $out_file or die "Unable to open '$out_file': $OS_ERROR";
while (<$in_fh>) {
# $_ is automatically populated with the current line
print { $out_fh } $_ or die "Unable to write to '$out_file': $OS_ERROR";
}
close $in_fh or die "Unable to close '$in_file': $OS_ERROR";
close $out_fh or die "Unable to close '$out_file': $OS_ERROR";
OR just print out the whole in-file directly:
#!/usr/bin/perl
use strict;
use warnings;
use English qw(-no_match_vars);
my $in_file = 'text3.txt';
my $out_file = 'text4.txt';
open my $in_fh, '<', $in_file or die "Unable to open '$in_file': $OS_ERROR";
open my $out_fh, '>', $out_file or die "Unable to open '$out_file': $OS_ERROR";
local $INPUT_RECORD_SEPARATOR; # Slurp mode, read in all content at once, see: perldoc perlvar
print { $out_fh } <$in_fh> or die "Unable to write to '$out_file': $OS_ERROR";
close $in_fh or die "Unable to close '$in_file': $OS_ERROR";
close $out_fh or die "Unable to close '$out_file': $OS_ERROR";
In addition if you just want to apply a regular expression or similar to a file quickly, you can look into the -i switch of the perl command: perldoc perlrun
perl -p -i.bak -e 's/foo/bar/g' text3.txt; # replace all foo with bar in text3.txt and save original in text3.txt.bak
When you're closing the files, use just
close(IFILE);
close(OFILE);
When you surround a file handle with angle brackets like <IFILE>, Perl interprets that to mean "read a line of text from the file inside the angle brackets". Instead of reading from the file, you want to close the actual file itself here.
I am getting this error while executing my Perl script. Please, tell me how to rectify this error in Perl.
print() on closed filehandle MYFILE
This is the code that is giving the error:
sub return_error
{
$DATA= "Sorry this page is corrently being updated...<p>";
$DATA.= " Back ";
open(MYFILE,">/home/abc/xrt/sdf/news/top.html");
print MYFILE $DATA;
close(MYFILE);
exit;
}
I hope that now I'm clearer.
You want to do some action on MYFILE after you (or the interpreter itself because of an error) closed it.
According to your code sample, the problem could be that open doesn't really open the file; the script may have no permission to write to it.
Change your code to the following to see if there was an error:
open(MYFILE, ">", "/home/abc/xrt/sdf/news/top.html") or die "Couldn't open: $!";
Update
ysth pointed out that -w is not really good at checking if you can write to the file, it only ‘checks that one of the relevant flags in the mode is set’. Furthermore, brian d foy told me that the conditional I've used isn't good at handling the error. So I removed the misleading code. Use the code above instead.
It appears that the open call is failing. You should always check the status when opening a filehandle.
my $file = '/home/abc/xrt/sdf/news/top.html';
open(MYFILE, ">$file") or die "Can't write to file '$file' [$!]\n";
print MYFILE $DATA;
close MYFILE;
If the open is unsuccessful, the built-in variable $! (a.k.a. $OS_ERROR) will contain the OS-dependent error message, e.g. "Permission denied".
It's also preferable (for non-archaic versions of Perl) to use the three-argument form of open and lexical filehandles:
my $file = '/home/abc/xrt/sdf/news/top.html';
open(my $fh, '>', $file) or die "Can't write to file '$file' [$!]\n";
print {$fh} $DATA;
close $fh;
An alternate solution to saying or die is to use the autodie pragma:
#!/usr/bin/perl
use strict;
use warnings;
use autodie;
open my $fh, "<", "nsdfkjwefnbwef";
print "should never get here (unless you named files weirdly)\n";
The code above produces the following error (unless a file named nsdfkjwefnbwef exists in the current directory):
Can't open 'nsdfkjwefnbwef' for reading: 'No such file or directory' at example.pl line 7
This:
open(MYFILE,">/home/abc/xrt/sdf/news/top.html");
In modern Perl, it could be written as:
open(my $file_fh, ">", "/home/abc/xrt/sdf/news/top.html") or die($!);
This way you get a $variable restricted to the scope, no "funky business" if you have weird filenames (e.g. starting with ">"), and error handling (you can replace die with warn or with error-handling code).
Once you close $file_fh, or it simply goes out of scope, you can no longer print to it.
I had this problem when my files were set to READ-ONLY.
Check this also, before giving up! :)
Check that the open worked
if(open(my $FH, ">", "filename") || die("error: $!"))
{
print $FH "stuff";
close($FH);
}
If you use a global symbol MYFILE as your filehandle, rather than a local lexical ($myfile), you will invariably run into issues if your program is multithreaded, e.g. if it is running via mod_perl. One process could be closing the filehandle while another process is attempting to write to it. Using $myfile will avoid this issue as each instance will have its own local copy, but you will still run into issues where one process could overwrite the data that another is writing. Use flock() to lock the file while writing to it.
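A minimal sketch of that flock() pattern (the log filename is made up):

use strict;
use warnings;
use Fcntl qw(:flock);

open my $myfile, '>>', '/tmp/example.log' or die "Can't open log: $!";
flock($myfile, LOCK_EX) or die "Can't lock: $!";   # wait for an exclusive lock

print {$myfile} "one complete record\n";

flock($myfile, LOCK_UN) or die "Can't unlock: $!"; # closing the handle would also release the lock
close $myfile;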
Somewhere in your script you will be doing something like:
open MYFILE, "> myfile.txt";
# do stuff with myfile
close MYFILE;
print MYFILE "Some more stuff I want to write to myfile";
The last line will throw an error because MYFILE has been closed.
Update
After seeing your code, it looks like the file you are trying to write to can't be opened in the first place. As others have already mentioned, try doing something like:
open MYFILE, "> myfile.txt" or die "Can't open myfile.txt: $!\n";
Which should give you some feedback on why you can't open the file.