I am learning Perl and have looked up this question, but I haven't been able to get it to work for me, even though the script terminates without error.
I enter a name that should already be in the file (stored as name-0-0-0), but the script skips the while loop altogether.
open FILE, '+>>userinfo.txt';
print("What is your name?");
$name = <>;
chomp $name;
while (<FILE>) {
    chomp;
    ($nameRead, $wins, $losses, $cats) = split("-");
    if ($nameRead eq $name) {
        print("Oh hello $name, your current record is $wins wins - $losses losses - $cats ties");
        print("Would you like to play again? type y for yes or n for no\n");
        $bool = <>;
        if ($bool == "y") {
            print("Okay let's play!");
            play();
            exit();
        }
        else {
            printf("well fine goodbye!");
            exit();
        }
    }
}
Well it seems my problem was indeed related to the +>>. I am trying to add on to the file, but I wanted to be able to write, not just append. I changed it to +< and everything worked great. Thanks guys I really appreciate it!
Your primary problem is that you have chosen an arcane open mode for userinfo.txt, which will allow you to open an existing file for both read and write but create a new file if it doesn't exist.
You must always check whether a file open has succeeded, and it looks like all you want to do is read from this file, so you want
open FILE, '<', 'userinfo.txt' or die $!;
You must also always add
use strict;
use warnings;
to the top of your program, and declare all variables with my at their first point of use.
Once you have made these changes you will most likely understand yourself what is going wrong, but if you have further problems please post your modified code.
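For reference, here is a minimal sketch of what the read-only loop might look like with those changes applied (the name-wins-losses-ties layout comes from your code; the rest is just illustrative):
use strict;
use warnings;

open my $fh, '<', 'userinfo.txt' or die "Can't open userinfo.txt: $!";

print "What is your name? ";
my $name = <STDIN>;
chomp $name;

while (my $line = <$fh>) {
    chomp $line;
    my ($nameRead, $wins, $losses, $cats) = split /-/, $line;
    if ($nameRead eq $name) {
        print "Oh hello $name, your record is $wins wins - $losses losses - $cats ties\n";
        last;
    }
}
close $fh;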
It appears you're using the wrong syntax to open the file for reading. Try
use autodie qw(:all);
open my $FILE, '<', '/path/to/file';
The syntax you're using (+>>) opens the file for appending, with the read/write position at the end of the file.
Why are you opening the file in +>> mode? That mode opens your file for input and output, but sets the filehandle cursor to the end of the file. Even experienced Perl programmers rarely have a need to do that.
Since the filehandle is positioned at the end of the file when you open it, you won't get anything when you attempt to read from it.
Is there a reason you aren't just saying open FILE, '<', 'userinfo.txt' ?
The following reads the file a line at a time until it reaches end of file:
while (<FILE>) {
...
}
You might want to remove the while loop entirely and just do everything inside it.
Edit/Alternative Solution:
The real reason nothing was happening is that +>> opens the file for read/append as you'd expect, but it immediately sets the cursor to the end of the file. So when execution reaches the while (<FILE>) { ... } loop, there's nothing left to read.
One solution would be to reset the file cursor position:
open FILE, '+>>userinfo.txt';
seek(FILE,0,0); # set the file cursor at the top
I am a Perl novice, but I have read "Learning Perl" by Schwartz, foy, and Phoenix and have a weak understanding of the language. I am still struggling, even after using the book and the web.
My goal is to be able to do the following:
Search a specific folder (current folder) and grab filenames with full path. Save filenames with complete path and current foldername.
Open a template file and insert the filenames with full path at a specific location (e.g. using substitution) as well as current foldername (in another location in the same text file, I have not gotten this far yet).
Save the new modified file to a new file in a specific location (current folder).
I have many files/folders that I want to process, and I plan to copy the Perl program to each of these folders so it can create the new file in each one.
This is how far I have gotten:
use strict;
use warnings;
use Cwd;
use File::Spec;
use File::Basename;
my $current_dir = getcwd;
open SECONTROL_TEMPLATE, '<secontrol_template.txt' or die "Can't open SECONTROL_TEMPLATE: $!\n";
my @secontrol_template = <SECONTROL_TEMPLATE>;
close SECONTROL_TEMPLATE;
opendir(DIR, $current_dir) or die $!;
my @seq_files = grep {
    /gz/
} readdir (DIR);
open FASTQFILENAMES, '> fastqfilenames.txt' or die "Can't open fastqfilenames.txt: $!\n";
my @fastqfiles;
foreach (@seq_files) {
    $_ = File::Spec->catfile($current_dir, $_);
    push(@fastqfiles, $_);
}
print FASTQFILENAMES @fastqfiles;
open (my ($fastqfilenames), "<", "fastqfilenames.txt") or die "Can't open fastqfilenames.txt: $!\n";
my @secontrol;
foreach (@secontrol_template) {
    $_ =~ s/#/$fastqfilenames/eg;
    push(@secontrol, $_);
}
open SECONTROL, '> secontrol.txt' or die "Can't open SECONTROL: $!\n";
print SECONTROL @secontrol;
close SECONTROL;
close FASTQFILENAMES;
My problem is that I cannot figure out how to use my list of files to replace the "#" in my template text file:
my @secontrol;
foreach (@secontrol_template) {
    $_ =~ s/#/$fastqfilenames/eg;
    push(@secontrol, $_);
}
The substitute function will not replace the "#" with the list of files listed in $fastqfilenames. I get the "#" replaced with GLOB(0x8ab1dc).
Am I doing this the wrong way? Should I avoid the substitution because this cannot be done, and instead insert the list of files ($fastqfilenames) into the template.txt file some other way? Instead of $fastqfilenames, can I substitute with the contents of a file (e.g. s/A/{r file.txt ...)? Any suggestions?
Cheers,
JamesT
EDIT:
This made it all better.
foreach (@secontrol_template) {
    s/#/$fastqfilenames/g;
    push @secontrol, $_;
}
And as both answers point out, $fastqfilenames was a filehandle. Replacing this:
open (my ($fastqfilenames), "<", "fastqfilenames.txt") or die "Can't open fastqfilenames.txt: $!\n";
with this:
my $fastqfilenames = join "\n", @fastqfiles;
made it all good. Thanks both of you.
$fastqfilenames is a filehandle. You have to read the information out of the filehandle before you can use it.
However, you have other problems.
You are printing all of the filenames to a file, then reading them back out of the file. This is not only a questionable design (why read from the file again, since you already have what you need in an array?), it also won't even work:
Perl buffers file I/O for performance reasons. The lines you have written to the file may not actually be there yet, because Perl is waiting until it has a large chunk of data saved up, to write it all at once.
You can override this buffering behavior in a few different ways (closing the file handle being the simplest if you are done writing to it), but as I said, there is no reason to reopen the file again and read from it anyway.
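For illustration, a small sketch of the flushing point (the handle and filename here are placeholders): closing the handle flushes it, or you can turn on autoflush for that handle.
use IO::Handle;   # provides the autoflush method on older perls

open my $out, '>', 'fastqfilenames.txt' or die "Can't open: $!";
$out->autoflush(1);           # flush after every print on this handle
print $out "some line\n";     # now written out immediately

# Alternatively, simply closing the handle flushes anything still buffered.
close $out or die "Can't close: $!";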
Also note, the /e option in a regex replacement evaluates the replacement as Perl code. This is not necessary in your case, so you should remove it.
Solution: Instead of reopening the file and reading it, just use the @fastqfiles variable you previously created when replacing in the template. It is not clear exactly what you mean by replacing # with the filenames.
Do you want to replace each # with a list of all the filenames together? If so, you probably need to join the filenames together in some way before doing the replacement.
Do you want to create a separate version of the template file for each filename? If so, you need an inner for loop that goes over each filename for each template line. And you will need something other than a simple replacement, because the replacement would change the original string the first time through. If you are on Perl 5.14 or later, you can use the /r option to replace non-destructively: push(@secontrol, s/#/$file_name/gr); otherwise, copy to another variable before doing the replacement.
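For instance, a small sketch of the joined-list approach, reusing the arrays from the question (the copy-then-substitute step is just one way to avoid clobbering the template lines):
# Join all filenames into one string, then substitute it for each "#".
my $filename_list = join "\n", @fastqfiles;

my @secontrol;
foreach my $line (@secontrol_template) {
    (my $new = $line) =~ s/#/$filename_list/g;   # copy first, then substitute
    push @secontrol, $new;
}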
$_ =~ s/#/$fastqfilenames/eg;
$fastqfilenames is a file handle, not the file contents.
In any case, I recommend the Text::Template module for this kind of work (text substitution in files).
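For instance, a rough sketch (this assumes you change the template to use Text::Template's { ... } placeholders, e.g. {$fastqfilenames}, instead of the bare #):
use strict;
use warnings;
use Text::Template;

my @fastqfiles = ('/path/a.fastq.gz', '/path/b.fastq.gz');   # placeholder data

my $template = Text::Template->new(SOURCE => 'secontrol_template.txt')
    or die "Couldn't construct template: $Text::Template::ERROR";

my $text = $template->fill_in(HASH => { fastqfilenames => join("\n", @fastqfiles) })
    or die "Couldn't fill in template: $Text::Template::ERROR";

open my $out, '>', 'secontrol.txt' or die "Can't open secontrol.txt: $!";
print $out $text;
close $out;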
I have a Perl Script which performs a specific operation and based on the result, it should update a file.
Basic overview is:
Read a value from the file handle, FILE.
Perform some operation and then compare the result with the value stored in the input file.
If there is a change, update the file corresponding to the file handle.
When I say update, I mean overwrite the existing value in the input file with the new one.
An overview of the script:
#! /usr/bin/perl
use warnings;
use diagnostics;
$input=$ARGV[0];
open(FILE,"+<",$input) || die("Couldn't open the file, $input with error: $!\n");
# perform some operation and set $new_value here.
while (<FILE>) {
    chomp $_;
    $old_value = $_;
    if ($new_value != $old_value) {
        print FILE $new_value, "\n";
    }
}
close FILE;
However, this appends the $new_value to the file instead of overwriting it.
I have read the documentation for this filehandle mode in several places, and everywhere it says read/write mode without append.
I am not sure, why it is unable to overwrite. One reason I could think of is, since I am reading from the handle in the while loop and trying to overwrite it at the same time, it might not work.
Thanks.
Your guess is right. You first read the file, so the file pointer is positioned at the end of the old value. I haven't tried this myself, but you can probably seek the file pointer back to 0 before printing:
seek(FILE, 0, 0);
You should add truncate to your program along with seek.
if ( $new_value != $old_value )
{
    seek( FILE, 0, 0 );
    truncate FILE, 0;
    print FILE $new_value, "\n";
}
Since the file is opened for reading and writing, writing a shorter $new_value will leave some of the $old_value in the file. truncate will remove it.
See perldoc -f seek and perldoc -f truncate for details.
You have to close the filehandle and open a different one (or the same one, if you like) on the output file, like this:
close FILE;
open FILE, ">$input" or die $!;
...
close FILE;
That should do the trick.
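Fleshed out a little, a sketch of that approach (assuming, as in the question, that the file holds a single value):
open FILE, '<', $input or die $!;
my $old_value = <FILE>;
chomp $old_value;
close FILE;

# ... compute $new_value ...

if ($new_value != $old_value) {
    open FILE, '>', $input or die $!;   # reopen for writing; this truncates the file
    print FILE $new_value, "\n";
    close FILE;
}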
My script ends up appending the changes I wanted to make to the end of the file instead of applying them in place.
open (INCONFIG, "+<$text") or die $!;
@config = <INCONFIG>;
foreach (@config)
{
    if ( $_ =~ m/$checker/ )
    {
        $_ = $somethingnew;
    }
    print INCONFIG $_;
}
close INCONFIG or die;
Ultimately I wanted to rewrite the whole text again, but with certain strings modified if they matched the search criterion. But so far it only appends ANOTHER COPY of the entire file (with changes) to the bottom of the old file.
I know that I can just close the file and parse it back in through another write filehandle, but I was hoping to learn what I did wrong and how to fix it.
As I understand open, using read/write access for a text file isn't a good idea. After all, a file is just a byte stream: updating a part of the file with something of a different length is the stuff headaches are made of ;-)
Here is my approach: try to emulate the -i "inplace" switch of perl. So essentially we write to a backup file, which we will later rename. (On *nix systems, there is some magic with open filehandles keeping deleted files available, so we don't have to create a new file. Let's do it anyway.)
my $filename = ...;
my $tempfile = "$filename.tmp";

open my $inFile,  '<', $filename or die $!;
open my $outFile, '>', $tempfile or die $!;

while (my $line = <$inFile>) {
    $line = doWeirdSubstitutions($line);
    print $outFile $line;
}

close $inFile  or die $!;
close $outFile or die $!;

rename $tempfile, $filename
    or die "rename failed: $!";   # will break under weird circumstances.
# No separate cleanup needed: the rename has already moved the temp file over the original.
Untested, but obvious code. Does this help with your problem?
Your problem is a misunderstanding of what +< "open for update" does. It is discussed in the Perl Tutorial at
Mixing Reads and Writes.
What you really want to do is copy the old file to a new file and then rename it after the fact. This is discussed in the perlfaq5 mentioned by daxim. Plus there are entire modules dedicated to doing this safely, such as File::AtomicWrite. These help with the issue of your program aborting and leaving you with a clobbered file.
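As a rough sketch of the File::AtomicWrite approach (the filename and contents here are placeholders; see the module's documentation for the full interface):
use strict;
use warnings;
use File::AtomicWrite;

my $new_contents = "the updated configuration\n";

# Writes to a temporary file and renames it into place only on success,
# so an aborted run cannot leave you with a half-written file.
File::AtomicWrite->write_file({
    file  => 'config.txt',
    input => \$new_contents,
});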
As others pointed out, there are better ways :)
But if you really want to read and write using +<, you should remember that, after reading the file, you're at the end of the file... That explains why your output is appended after the original content.
What you need to do is reset the file-pointer to the beginning of the file, using seek:
seek(INCONFIG, 0, 0);
Then start writing...
perlopentut says this about mixing reads and writes:
In fact, when it comes to updating a file, unless you're working on a
binary file as in the WTMP case above, you probably don't want to use
this approach for updating. Instead, Perl's -i flag comes to the
rescue.
Another way is to use the Tie::File module. The code reduces to just this:
tie my @config, 'Tie::File', $text or die $!;
s/$checker/$somethingnew/g for @config;
But remember to back the file up before you modify it until you have debugged your program.
I'm new to Perl, so sorry if this is obvious, but I looked up how to open a file and use the flags, and for the life of me they don't seem to work right. I narrowed it down to these lines of code.
if ($flag eq "T") {
    open xFile, ">", "$lUsername\\$openFile";
}
else {
    open xFile, ">>", "$lUsername\\$openFile";
}
Both of these branches seem to delete the contents of my file. I also checked that the flag is formatted correctly, and it is; I know for a fact I've gone down both conditions.
EDIT: codepaste of a larger portion of my code http://codepaste.net/n52sma
New to Perl? I hope you're using use strict and use warnings.
As others have stated, you should be using a test to make sure your file is open. However, that's not really the problem here. In fact, I used your code, and it seems to work fine for me. Maybe you should try printing some debugging messages to see if this is doing what you think it's doing:
use strict;
use warnings;
use autodie; #Will stop your program if the "open" doesn't work.
my $lUsername = "ABaker";
my $openFile = "somefile.txt";
if ($flag eq "T") {
print qq(DEBUG: Flag = "$flag": Deleting file "$lUsername/$openFile");
open xFile, ">" , "$lUsername/$openFile";
}
else {
print qq(DEBUG: Flag = "$flag": Appending file "$lUsername/$openFile");
open xFile, ">>", "$lUsername/$openFile";
}
You want to use strict and warnings in order to make sure you're not having issues with variable names. The use strict forces you to declare your variables first. For example, are you setting $Flag, but then using $flag? Maybe $flag is set the first time through, but you're setting $Flag the second time through.
Anyway, the DEBUG: statements will give you a better idea of what your error could be.
By the way, in Perl, you're checking if $flag is set to T and not t. If you want to test against both t and T, test whether uc $flag eq 'T' and not just $flag eq 'T'.
@Ukemi
I reformatted to comply with use strict. I also added print statements to make sure I was truncating when I want to, and not when I don't. It is still deleting the file, although now sometimes it's simply not writing. I'm going to post a larger portion of my code at a link; I'd really appreciate it if you gave it a once-over.
Are you seeing it say Truncating, but the file is empty? Are you sure the file already existed? There's a reason why I put the flag and everything in my debug statements. The more you print, the more you know. Try the following section of code:
my $file = "$lUsername/$openFile";   # Use forward slashes instead of backslashes.
my $File;                            # Declare the filehandle before the branches so it stays in scope.

if ($flag eq "T") {
    print qq(Flag = "$flag". Truncating file "$file"\n);
    open $File, '>', $file
        or die qq(Unable to open file "$file" for writing: $!\n);
}
else {
    print qq(Flag = "$flag". Appending to file "$file"\n);
    if (not -e $file) {
        print qq(File "$file" does not exist. Will create it\n);
    }
    open $File, '>>', $file
        or die qq(Unable to open file "$file" for appending: $!\n);
}
Note I'm printing out the flag and the name of the file in quotes. This will allow me to see if there are any hidden characters in my file name.
I'm using the qq(...) method to quote strings, so I can use the quotation marks in my print statements.
Also note I'm checking for the existence of the file when I truncate. This way, I make sure the file actually exists.
This should point out any possible errors in your logic. The other thing you can do is to stop your program when you finish writing out the file and verify that the file was written out as expected.
print "Write to file now:\n";
my $writeToFile = <>;
printf $File "$writeToFile";
close $File;
print "DEBUG: Temporary stop. Examine file\n";
<STDIN>; #DEBUG:
Now, if you see it saying it's appending to the file, and the file exists, and you still see the file being overwritten, we'll know the problem lies in your actual open xFile, ">>", $file statement.
You should use the three-argument version of open, lexical filehandles, and check whether there might have been an error:
# Writing to file (clobbering it if it exists)
open my $file , '>', $filename
or die "Unable to write to file '$filename': $!";
# Appending to file
open my $file , '>>', $filename
or die "Unable to append to file '$filename': $!";
>> does not clobber or truncate. Either you ended up in the "then" clause when you expected to be in the "else" clause, or the problem is elsewhere.
To check what $flag contains:
use Data::Dumper;
local $Data::Dumper::Useqq = 1;
print(Dumper($flag));
For your reference, here are some basic file-handling techniques.
open FILE, "filename.txt" or die $!;
The command above will associate the FILE filehandle with the file filename.txt. You can use the filehandle to read from the file. If the file doesn't exist - or you cannot read it for any other reason - then the script will die with the appropriate error message stored in the $! variable.
open FILEHANDLE, MODE, EXPR
The available modes are the following:
read (<) - this mode reads the file.
write (>) - this mode creates a new file; if the file already exists, it truncates and overwrites it.
append (>>) - this mode appends to the file if it already exists, otherwise it creates a new one.
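For example, a small sketch with placeholder filenames:
open my $in,  '<',  'data.txt' or die "Can't read data.txt: $!";    # read
open my $out, '>',  'out.txt'  or die "Can't write out.txt: $!";    # write (truncate/create)
open my $log, '>>', 'log.txt'  or die "Can't append log.txt: $!";   # append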
If this is confusing, you can use the File::Slurp module instead.
Below is some sample code using the File::Slurp module.
use strict;
use File::Slurp;
my $read_mode = read_file("test.txt");                          # read file contents
write_file("test2.txt", $read_mode);                            # write to a file
my @all_files = read_dir("/home/desktop", keep_dot_dot => 0);   # read a directory
write_file("test2.txt", {append => 1}, "@all_files");           # append mode
I am new to Perl and am trying to read from and write to a CSV file, but nothing happens; can someone help me find the problem? I am able to read without a problem using '<', but I am unable to write.
use strict;
use warnings;
use Text::CSV_XS;
my $file = 'file.csv';
my $csv = Text::CSV_XS->new();
open (InCSV, '+>>', $file) or die $!;
while (<InCSV>) {
    if ($csv->parse($_)) {
        my @columns = $csv->fields();
        if ($columns[1] eq "01") {
            my $str = "Selected $columns[6] \n ";
            push(@columns, $str);
            print InCSV join(",", @columns), "\n";
        }
    } else {
        my $err = $csv->error_input;
        print "Failed to parse line: $err";
    }
}
close InCSV;
Opening a file in +>> mode will seek to the end of the file, so there will be nothing to read unless you seek back to the beginning (or the middle) of the file. To open in read/write mode with the file cursor at the beginning of the file, use +< mode.
That said, you probably want to rethink your approach to this problem. It looks like you are trying to read a row of data, modify it, and write it back to the file. But the way you have done it, you are overwriting the next row of data rather than the row you have just read, and anyway the new data is longer (has more bytes) than the old data. This is certain to corrupt your data file.
Some better approaches might be to
read and process all the data first, then close and overwrite the input with the processed data (a sketch of this approach follows below)
write data to a temporary file while you are processing it, then overwrite the input with the temporary file (see also the Perl interpreter's in-place editing mode, perl -i)
use a module like Tie::File to handle the line-based I/O for this task
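A minimal sketch of the first approach, keeping the column layout from your code (field 1 equal to "01" selects a row, field 6 is the value noted in the appended column):
use strict;
use warnings;
use Text::CSV_XS;

my $file = 'file.csv';
my $csv  = Text::CSV_XS->new({ binary => 1, eol => "\n" });

# Read and process everything first.
open my $in, '<', $file or die "Can't read $file: $!";
my @rows;
while (my $row = $csv->getline($in)) {
    push @$row, "Selected $row->[6]" if $row->[1] eq "01";
    push @rows, $row;
}
close $in;

# Then overwrite the input with the processed data.
open my $out, '>', $file or die "Can't write $file: $!";
$csv->print($out, $_) for @rows;
close $out;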