I am completely new to this, and it should be the easiest thing to do, but for some reason I cannot get my local text file to print. After trying multiple times with different code, I ended up with the following, but it still doesn't print.
I have searched various threads for days to solve this and have had no luck. Please help. Here is my code:
#!/usr/bin/perl
$newfile = "file.txt";
open (FH, $newfile);
while ($file = <FH>) {
    print $file;
}
I updated my code to the following:
#!/usr/bin/perl
use strict; # Always use strict
use warnings; # Always use warnings.
open(my $fh, "<", "file.txt") or die "unable to open file.txt: $!";
# Above we open the file using the three-argument form of open,
# or die with an error message if it cannot be opened.
while (<$fh>) { # Read the file line by line.
    print $_;   # Print the current line.
}
close $fh; # Close the file
system('C:\Users\RSS\file.txt');
It returns the following: my first report generated by perl. I do not know where this is coming from. Nowhere do I have a print "my first report generated by perl."; statement and it definitely is not in my text file.
My text file is full of various emails, addresses, phone numbers and snippets of emails.
Thank you all for your help. I figured out my problem. I somehow managed to kick myself out of my directory and did not realize it.
This is most likely a combination of a failure to open the file, and a failure to check the return value of open.
If you are completely new to perl, I warmly recommend reading the excellent "perlintro" man page, using either man perlintro or perldoc perlintro on the command line, or taking a look here: https://perldoc.perl.org/perlintro.html.
The "Files and I/O" section there gives a good and concise way of doing this:
open(my $in, "<", "input.txt") or die "Can't open input.txt: $!";
while (<$in>) {  # assigns each line in turn to $_
    print "Just read in this line: $_";
}
This version will give you an explanation and abort if anything goes wrong while trying to open the file. For example, if there is no file named file.txt in the current working directory, your version will quietly fail to open the file, and afterwards it will quietly fail to read from the closed file handle.
Also, always adding at least one of these to your perl scripts will save you a lot of trouble in the long run:
use warnings; # or use the -w command line switch to turn warnings on globally
use diagnostics;
These won't catch the failure to open the file, but will alert on the failed read.
In the first example here you can see that without the diagnostics module, the code fails without any error messages. The second example shows how the diagnostics module changes this.
$ perl -le 'open FH, "nonexistent.txt"; while(<FH>){print "foo"}'
$ perl -le 'use diagnostics; open FH, "nonexistent.txt"; while(<FH>){print "foo"}'
readline() on closed filehandle FH at -e line 1 (#1)
(W closed) The filehandle you're reading from got itself closed sometime
before now. Check your control flow.
By the way, the legendary "Camel Book" is basically the perl man pages formatted for paper printing, so reading the perldocs in the order listed in perldoc perl will give you a high level of understanding of the language in a reasonably accessible and inexpensive manner.
Happy hacking!
This is simple and includes explanations.
use strict; # Always use strict
use warnings; # Always use warnings.
open(my $fh, "<", "file.txt") or die "unable to open file.txt: $!";
# Above we open the file using the three-argument form of open,
# or die with an error message if it cannot be opened.
while (<$fh>) { # Read the file line by line.
    print $_;   # Print the current line.
}
close $fh; # Close the file
There is then also the case where you are trying to open a file which is not in the location you think it is. So consider using the full path if the file is not in the same directory.
open(my $fh, "<", 'F:\Workdir\file.txt') or die "unable to open < input.txt: $!";
EDIT: After your comments, it seems that you are opening an empty file. Please add this at the bottom of that same script and rerun it. It will open the file in C:\Users\RSS so you can verify that it actually contains data.
system('C:\Users\RSS\file.txt');
First of all, as you are starting out, it is better to enable all warnings with 'use warnings', and to disable constructs which can lead to uncertain behavior or are difficult to debug with the 'use strict' pragma.
As you are dealing with a file stream, it is always recommended to check whether you were able to open the stream, so use croak or die; both terminate the program with a given message.
Instead of reading inside the while condition, I would recommend checking for end of file, so the loop breaks once the end is reached. Usually, when reading a line you will use it for further processing, so it is a good idea to remove the line ending using chomp.
A sample for reading a file in perl can be as follows:
#!/usr/bin/perl
use strict;
use warnings;
my $newfile = "file.txt";
open (my $fh, $newfile) or die "Could not open file '$newfile' $!";
while (!eof($fh))
{
    my $line = <$fh>;
    chomp($line);
    print $line, "\n";
}
I am trying to read in a bunch of similar files and process them one by one. Here is the code I have. But somehow the perl script doesn't read in the files correctly. I'm not sure how to fix it. The files are definitely readable and writable by me.
#!/usr/bin/perl
use strict;
use warnings;
my @olap_f = `ls /full_dir_to_file/*txt`;
foreach my $file (@olap_f) {
    my %traits_h;
    open(IN, '<', $file) || die "cannot open $file";
    while (<IN>) {
        chomp;
        my @array = split /\t/;
        my $trait = $array[4];
        $traits_h{$trait}++;
    }
    close IN;
}
When I run it, the error message (something like below) showed up:
cannot open /full_dir_to_file/a.txt
You have newlines at the end of each filename:
my @olap_f = `ls ~dir_to_file/*txt`;
chomp @olap_f; # Remove newlines
Better yet, use glob to avoid launching a new process (and having to trim newlines):
my @olap_f = glob "~dir_to_file/*txt";
Also, use $! to find out why a file couldn't be opened:
open(IN,'<',$file) || die "cannot open $file: $!";
This would have told you
cannot open /full_dir_to_file/a.txt
: No such file or directory
which might have made you recognize the unwanted newline.
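Putting those pieces together, a corrected version of the loop might look like the sketch below (the directory, the tab-delimited format, and the fifth-column index are taken from the question):
#!/usr/bin/perl
use strict;
use warnings;

my @olap_f = glob "/full_dir_to_file/*txt";   # no external ls, no trailing newlines

foreach my $file (@olap_f) {
    my %traits_h;
    open(my $in, '<', $file) or die "cannot open $file: $!";
    while (<$in>) {
        chomp;
        my @array = split /\t/;
        my $trait = $array[4];   # fifth tab-separated column, as in the question
        $traits_h{$trait}++;
    }
    close $in;
}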
I'll add a quick plug for IO::All here. It's important to know what's going on under the hood but it's convenient sometimes to be able to do:
use IO::All;
my #olap_f = io->dir('/full_dir_to_file/')->glob('*txt');
In this case it's not shorter than @cjm's use of glob but IO::All does have a few other convenient methods for working with files as well.
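For example, slurping a file takes one call (a sketch; the file name is illustrative, and it relies on IO::All's usual context-sensitive slurp):
use IO::All;

my @lines    = io('input.txt')->slurp;   # list context: one element per line
my $contents = io('input.txt')->slurp;   # scalar context: the whole file as one string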
I was trying out an elementary Perl/CGI script to keep track of visitors coming to a web page. The Perl code looks like this:
#!/usr/bin/perl
#KEEPING COUNT OF VISITORS IN A FILE
use CGI ':standard';
print "content-type:text/html\n\n";
#opening file in read mode
open (FILE,"<count.dat");
$cnt= <FILE>;
close(FILE);
$cnt=$cnt+1;
#opening file to write
open(FILE,">count.dat");
print FILE $cnt;
close(FILE);
print "Visitor count: $cnt";
The problem is that the web page does not increment the count of visitors on each refresh. The count remains at the initial value of $cnt, i.e. 1. Any ideas where the problem lies?
You never test whether the attempt to open the file handle works. Given a file that I had permission to read from and write to, containing a single number and nothing else, the code behaved as intended. If the file did not exist, the count would always be 1; if it was read-only, it would remain at whatever value the file started with.
More general advice:
use strict; and use warnings; (and correct code based on their complaints)
Use the three argument call to open as per the first example in the documentation
When you open a file always || handle_the_error_in($!);
Don't use a file to store data like this, you have potential race conditions.
Get the name of the language correct
Here's an alternate solution that uses only one open() and creates the file if it doesn't already exist. Locking eliminates a potential race condition among multiple updaters.
#!/usr/bin/env perl
use strict;
use warnings;
use Fcntl qw(:DEFAULT :flock);
my $file = 'mycount';
sysopen(my $fh, $file, O_RDWR|O_CREAT) or die "Can't open '$file' $!\n";
flock($fh, LOCK_EX) or die "Can't lock $file: $!\n";
my $cnt = <$fh>;
$cnt=0 unless $cnt;
$cnt++;
seek $fh, 0, 0;
print {$fh} $cnt;
close $fh or die "Can't close $file: $!\n";
print "Visitor count: $cnt\n";
A few potential reasons:
'count.dat' is not being opened for reading. Always test with or die $!; at minimum to check if the file opened or not
The code is not being executed and you think it is
The most obvious thing you may have forgotten is to change the permissions of the file count.dat.
Do this:
sudo chmod 777 count.dat
That should do the trick
You will need to close the webpage and reopen it again. Just refreshing the page won't increment the count.
I would like to take input from a text file in Perl. Though a lot of info is available on the net, it is still very confusing how to do the simple task of printing every line of a text file. So how do I do it? I am new to Perl, thus the confusion.
eugene has already shown the proper way. Here is a shorter script:
#!/usr/bin/perl
print while <>
or, equivalently,
#!/usr/bin/perl -p
on the command line:
perl -pe0 textfile.txt
You should start learning the language methodically, following a decent book, not through haphazard searches on the web.
You should also make use of the extensive documentation that comes with Perl.
See perldoc perltoc or perldoc.perl.org.
For example, opening files is covered in perlopentut.
First, open the file:
open my $fh, '<', "filename" or die $!;
Next, use the while loop to read until EOF:
while (<$fh>) {
    # each line's contents are automatically stored in the $_ variable
}
close $fh or die $!;
# open the file and associate with a filehandle
open my $file_handle, '<', 'your_filename'
    or die "Can't open your_filename: $!\n";
while (<$file_handle>) {
    # $_ contains each record from the file in turn
}
Below is the Perl script that I wrote today. It reads the content from one file and writes it to another file. It works, but not completely.
#---------------------------------------------------------------------------
#!/usr/bin/perl
open IFILE, "text3.txt" or die "File not found";
open OFILE, ">text4.txt" or die "File not found";
my $lineno = 0;
while(<IFILE>)
{
    @var=<IFILE>;
    $lineno++;
    print OFILE "@var";
}
close(<IFILE>);
close(<OFILE>);
#---------------------------------------------------------------------------
The issue is that it reads and writes the contents, but not all of them.
text3.txt has four lines. The above script reads only from the second line onward and writes to text4.txt. So I finally get only three lines (lines 2 to 4) of text3.txt.
What is wrong with the above program? I have no idea how to trace the execution flow of Perl scripts. Kindly help me.
I'm completely new to programming. I believe learning all this will help me change my career path.
Thanks in Advance,
Vijay
<IFILE> reads one line from IFILE (only one because it's in scalar context). So while(<IFILE>) reads the first line, then the <IFILE> in list context within the while block reads the rest. What you want to do is:
# To read each line one by one:
while (!eof(IFILE)) {      # check if end of file is reached instead of reading a line
    my $line = <IFILE>;    # scalar context, reads only one line
    print OFILE $line;
}
# Or to read the whole file at once:
my @content = <IFILE>;     # list context, reads the whole file
print OFILE @content;
The problem is that this line...
while(<IFILE>)
...reads one line from text3.txt, and then this line...
@var=<IFILE>;
...reads ALL of the remaining lines from text3.txt.
You can do it either way, by looping with while or all at once with @var=<IFILE>, but trying to do both won't work.
This is how I would have written the code in your question.
#!/usr/bin/perl
use warnings;
use strict;
use autodie;
# don't need to use "or die ..." when using the autodie module
open my $input, '<', 'text3.txt';
open my $output, '>', 'text4.txt';
while (<$input>) {
    my $lineno = $.;
    print {$output} $_;
}
# both files get closed automatically when they go out of scope
# so no need to close them explicitly
I would recommend always putting use strict and use warnings at the beginning of all Perl files. At least until you know exactly why it is recommended.
I used autodie so that I didn't have to check the return value of open manually. (autodie was added to core in version 5.10.1.)
I used the three argument form of open because it is more robust.
It is important to note that while (<$input>){ ... } gets transformed into while (defined($_ = <$input>)){ ... } by the compiler. Which means that the current line is in the $_ variable.
I also used the special $. variable to get the current line number, rather than trying to keep track of the number myself.
There are a couple of questions you might want to think about. If you are strictly copying a file, you could use the File::Copy module.
If you are going to process the input before writing it out, you might also consider whether you want to keep both files open at the same time or instead read the whole content of the first file (into memory) first, and then write it to the outfile.
This depends on what you are doing underneath. Also, if you have a huge binary file, each "line" in the while loop might end up huge, so if memory is indeed an issue you might want to use lower-level, stream-based reading. More info on I/O: http://oreilly.com/catalog/cookbook/chapter/ch08.html
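If all you need is a straight copy with no per-line processing, the core File::Copy module mentioned above does it in one call. A minimal sketch using the file names from the question:
use strict;
use warnings;
use File::Copy;   # core module

copy('text3.txt', 'text4.txt') or die "Copy failed: $!";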
My suggestion would be to use the cleaner PBP suggested way:
#!/usr/bin/perl
use strict;
use warnings;
use English qw(-no_match_vars);
my $in_file = 'text3.txt';
my $out_file = 'text4.txt';
open my $in_fh, '<', $in_file or die "Unable to open '$in_file': $OS_ERROR";
open my $out_fh, '>', $out_file or die "Unable to open '$out_file': $OS_ERROR";
while (<$in_fh>) {
    # $_ is automatically populated with the current line
    print { $out_fh } $_ or die "Unable to write to '$out_file': $OS_ERROR";
}
close $in_fh or die "Unable to close '$in_file': $OS_ERROR";
close $out_fh or die "Unable to close '$out_file': $OS_ERROR";
OR just print out the whole in-file directly:
#!/usr/bin/perl
use strict;
use warnings;
use English qw(-no_match_vars);
my $in_file = 'text3.txt';
my $out_file = 'text4.txt';
open my $in_fh, '<', $in_file or die "Unable to open '$in_file': $OS_ERROR";
open my $out_fh, '>', $out_file or die "Unable to open '$out_file': $OS_ERROR";
local $INPUT_RECORD_SEPARATOR; # Slurp mode, read in all content at once, see: perldoc perlvar
print { $out_fh } <$in_fh> or die "Unable to write to '$out_file': $OS_ERROR";
close $in_fh or die "Unable to close '$in_file': $OS_ERROR";
close $out_fh or die "Unable to close '$out_file': $OS_ERROR";
In addition if you just want to apply a regular expression or similar to a file quickly, you can look into the -i switch of the perl command: perldoc perlrun
perl -p -i.bak -e 's/foo/bar/g' text3.txt; # replace all foo with bar in text3.txt and save original in text3.txt.bak
When you're closing the files, use just
close(IFILE);
close(OFILE);
When you surround a file handle with angle brackets like <IFILE>, Perl interprets that to mean "read a line of text from the file inside the angle brackets". Instead of reading from the file, you want to close the actual file itself here.
I am getting this error while executing my Perl script. Please tell me how to rectify this error in Perl.
print() on closed filehandle MYFILE
This is the code that is giving the error:
sub return_error
{
$DATA= "Sorry this page is corrently being updated...<p>";
$DATA.= " Back ";
open(MYFILE,">/home/abc/xrt/sdf/news/top.html");
print MYFILE $DATA;
close(MYFILE);
exit;
}
I hope that now I'm clearer.
You are trying to do something with MYFILE after you (or the interpreter itself, because of an error) closed it.
According to your code sample, the problem could be that open doesn't really open the file; the script may have no permission to write to it.
Change your code to the following to see if there was an error:
open(MYFILE, ">", "/home/abc/xrt/sdf/news/top.html") or die "Couldn't open: $!";
Update
ysth pointed out that -w is not really good at checking if you can write to the file, it only ‘checks that one of the relevant flags in the mode is set’. Furthermore, brian d foy told me that the conditional I've used isn't good at handling the error. So I removed the misleading code. Use the code above instead.
It appears that the open call is failing. You should always check the status when opening a filehandle.
my $file = '/home/abc/xrt/sdf/news/top.html';
open(MYFILE, ">$file") or die "Can't write to file '$file' [$!]\n";
print MYFILE $DATA;
close MYFILE;
If the open is unsuccessful, the built-in variable $! (a.k.a. $OS_ERROR) will contain the OS-dependent error message, e.g. "Permission denied".
It's also preferable (for non-archaic versions of Perl) to use the three-argument form of open and lexical filehandles:
my $file = '/home/abc/xrt/sdf/news/top.html';
open(my $fh, '>', $file) or die "Can't write to file '$file' [$!]\n";
print {$fh} $DATA;
close $fh;
An alternate solution to saying or die is to use the autodie pragma:
#!/usr/bin/perl
use strict;
use warnings;
use autodie;
open my $fh, "<", "nsdfkjwefnbwef";
print "should never get here (unless you named files weirdly)\n";
The code above produces the following error (unless a file named nsdfkjwefnbwef exists in the current directory):
Can't open 'nsdfkjwefnbwef' for reading: 'No such file or directory' at example.pl line 7
This:
open(MYFILE,">/home/abc/xrt/sdf/news/top.html");
In modern Perl, it could be written as:
open(my $file_fh, ">", "/home/abc/xrt/sdf/news/top.html") or die($!);
This way you get a $variable restricted to the scope, there is no "funky business" with weird filenames (e.g. starting with ">"), and you get error handling (you can replace die with warn or with error-handling code).
Once you close $file_fh, or it simply goes out of scope, you can no longer print to it.
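A minimal sketch that reproduces the warning from the question (the file name is illustrative; warnings must be enabled to see the message):
use strict;
use warnings;

open(my $file_fh, '>', 'demo.txt') or die $!;
print $file_fh "this works\n";
close $file_fh;
print $file_fh "too late\n";   # warns: print() on closed filehandle $file_fh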
I had this problem when my files were set to READ-ONLY.
Check this also, before giving up! :)
Check that the open worked
if (open(my $FH, ">", "filename") || die("error: $!"))
{
    print $FH "stuff";
    close($FH);
}
If you use a global symbol MYFILE as your filehandle, rather than a local lexical ($myfile), you will invariably run into issues if your program is multithreaded, e.g. if it is running via mod_perl. One process could be closing the filehandle while another process is attempting to write to it. Using $myfile will avoid this issue as each instance will have its own local copy, but you will still run into issues where one process could overwrite the data that another is writing. Use flock() to lock the file while writing to it.
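As a sketch of that last point, an exclusive lock around the write could look like this (the file name and record are illustrative):
use strict;
use warnings;
use Fcntl qw(:flock);

open(my $myfile, '>>', 'shared.log') or die "Can't open shared.log: $!";
flock($myfile, LOCK_EX) or die "Can't lock shared.log: $!";
print $myfile "one record, written while holding the lock\n";
close $myfile;   # closing the handle releases the lock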
Somewhere in your script you will be doing something like:
open MYFILE, "> myfile.txt";
# do stuff with myfile
close MYFILE;
print MYFILE "Some more stuff I want to write to myfile";
The last line will throw an error because MYFILE has been closed.
Update
After seeing your code, it looks like the file you are trying to write to can't be opened in the first place. As others have already mentioned try doing something like:
open MYFILE, "> myfile.txt" or die "Can't open myfile.txt: $!\n"
Which should give you some feedback on why you can't open the file.