Fallback open file in Perl

I was trying to write a program where perl opens one file, but falls back to another if that file does not exist or cannot be opened for some reason. The relevant line was:
open(my $fh,"<","/path/to/file") or open (my $fh,"<","/path/to/alternate/file") or die
Eventually, I figured out that:
open(my $fh,"<","/path/to/file") or open ($fh,"<","/path/to/alternate/file") or die
worked. What is the difference between these two statements, why doesn't the first work, and is the second the right way to do this, or are there still some problems with it?
Edit: If it matters, I'm using perl 5.12, and the first fails in the case that "/path/to/file" exists. My inclination is that the second open should not run if the first open is successful, so why is $fh being overwritten by the second?

my declares a variable. If you use it twice with the same name in the same scope, later mentions of that name refer to the second one, not the first. Your code will trigger a "my" variable ... masks earlier declaration in same statement warning (if you enable warnings, as you should). So if the first open succeeds, it sets a $fh variable that isn't accessible later, and the second variable is left in an undocumented, undefined state, because its declaration wasn't actually executed. (See the "Here be dragons" warning in perldoc perlsyn, and realize that A or B is equivalent to B unless A.)
Your "working" code is also broken; while my returns the newly declared variable, which can then be set, the scope of a lexical (where later mentions of it find the variable) doesn't actually begin until the following statement. So your first $fh is the lexical that will be accessed on later lines, but the second is actually a global variable (or an error, if you are using strict as you should).
Correct code is:
my $fh;
open $fh, ... or open $fh, ...;
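Filled in with the paths from the question, that might look like this (the die message here is just an example):
my $fh;
open $fh, "<", "/path/to/file"
    or open $fh, "<", "/path/to/alternate/file"
    or die "Can't open either file: $!";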

Others have said why the existing code doesn't work, but have also offered versions that have race conditions: the state of the file might change between when you checked it and when you opened it. It's fairly benign in your case, but it can produce subtle bugs and security holes. In general, you check if you can open a file by trying to open a file.
Here's a more general way which scales to multiple files, lets you know which file opened, and contains no race conditions.
use Carp;

sub try_open {
    my @files = @_;

    for my $file (@files) {
        if ( open my $fh, "<", $file ) {
            return { fh => $fh, file => $file };
        }
    }

    croak "Can't open any of @files";
}
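A hypothetical call might look like this (the paths are just placeholders):
my $opened = try_open("/path/to/file", "/path/to/alternate/file");
print "Reading from $opened->{file}\n";

my $fh = $opened->{fh};
while ( my $line = <$fh> ) {
    print $line;
}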

Related

Perl: `die` did not work upon opening a nonexistent gz file using gzip

The following script creates a gzipped file named "input.gz". Then the script attempts to open "input.gz" using gzip -dc. Intuitively, die should be triggered if a wrong input file name is provided. However, as in the following script, the program will not die even if a wrong input file name is provided ("inputx.gz"):
use warnings;
use strict;
system("echo PASS | gzip -c > input.gz");
open(IN,"-|","gzip -dc inputx.gz") || die "can't open input.gz!";
print STDOUT "die statment was not triggered!\n";
close IN;
The output of the script above was
die statment was not triggered!
gzip: inputx.gz: No such file or directory
My question is: why wasn't the die statement triggered even though gzip quit with an error? And how can I make the die statement trigger when a wrong file name is given?
It's buried in perlipc, but this seems relevant (emphasis added):
Be careful to check the return values from both open() and close(). If you're writing to a pipe, you should also trap SIGPIPE. Otherwise, think of what happens when you start up a pipe to a command that doesn't exist: the open() will in all likelihood succeed (it only reflects the fork()'s success), but then your output will fail--spectacularly. Perl can't know whether the command worked, because your command is actually running in a separate process whose exec() might have failed. Therefore, while readers of bogus commands return just a quick EOF, writers to bogus commands will get hit with a signal, which they'd best be prepared to handle.
Use IO::Uncompress::Gunzip to read gzipped files instead.
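For example, a minimal sketch of reading the file from the question with that module (the error text is mine):
use IO::Uncompress::Gunzip qw(gunzip $GunzipError);

my $z = IO::Uncompress::Gunzip->new("input.gz")
    or die "Can't open input.gz: $GunzipError\n";

while ( defined( my $line = $z->getline() ) ) {
    print $line;
}
$z->close();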
The open documentation is explicit about open-ing a process, since that is indeed different:
If you open a pipe on the command - (that is, specify either |- or -| with the one- or two-argument forms of open), an implicit fork is done, so open returns twice: in the parent process it returns the pid of the child process, and in the child process it returns (a defined) 0. Use defined($pid) or // to determine whether the open was successful.
For example, use either
my $child_pid = open(my $from_kid, "-|") // die "Can't fork: $!";
or
my $child_pid = open(my $to_kid, "|-") // die "Can't fork: $!";
(with code following that shows one use of this, which you don't need here). The main point is to check for definedness: by design we get undef if the open for a process fails, not just any "false" value.
While this should be corrected, keep in mind that the open call fails if fork itself fails, which is rare; in most cases when a "command fails" the fork was successful but something later wasn't. So in such cases we just don't get to the // die message, but hopefully end up seeing messages from the shell, the command, or the OS.
That is alright, though, as long as informative messages do get emitted by some part of the process. Wrap the whole thing in an eval and you'll have manageable error reporting.
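A sketch of wrapping it all in an eval (the die messages and the structure are mine, just to illustrate the idea):
my $lines = eval {
    open(my $in, '-|', 'gzip', '-dc', $file) // die "Can't fork: $!\n";
    my @lines = <$in>;
    close $in or die $? ? "gzip exited with status " . ($? >> 8) . "\n"
                        : "Error closing pipe: $!\n";
    \@lines;                         # return the lines on success
};
if (not $lines) {
    warn "Reading through gzip failed: $@";
}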
But it is in general difficult to ensure that you get all the right messages, and in some cases not possible. One good approach is to use a module for running and managing external commands. Among their many other advantages, such modules also usually handle errors much more nicely. If you need to handle the process's output right as it is emitted, I recommend IPC::Run (which I'd recommend otherwise as well); a minimal sketch is shown at the end of this answer.
Read what the linked docs say for specific examples of what you need, and for much useful insight.
In your case
# Check input, depending on how it is given;
# consider String::ShellQuote if needed
my $file = ...;

my @cmd = ('gzip', '-dc', $file);

my $child_pid = open(my $in, '-|', @cmd)
    // die "Can't fork for '@cmd': $!";

while (<$in>) {
    ...
}

close $in or die "Error closing pipe: $!";
Note a few other points:
the "list form" of the command bypasses the shell
a lexical filehandle (my $fh) is much better than a typeglob (IN)
print the actual error in the die statement, found in the $! variable
check close for a good final check on how it all went
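For completeness, here is a minimal sketch with IPC::Run, as mentioned above (assuming the module is installed; variable names are mine):
use IPC::Run qw(run);

my @cmd = ('gzip', '-dc', $file);
my ($in, $out, $err) = ('', '', '');

run \@cmd, \$in, \$out, \$err
    or die "Error running '@cmd' (exit " . ($? >> 8) . "): $err";

print $out;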

Perl replacement operator doesn't work under Windows when patterns contain slashes

I want to replace a string with a path:
my $somedir = "D:/somedir/someotherdir";
system("perl -pi.bak -e \"s{STRING_TO_BE_REPLACED}{$somedir}\" $file");
but under Windows it replaces the string with random symbols instead of slashes.
What's the problem?
I think it's got something to do with a syntax detail needed on Windows, but can't test now.
However, as you are in a Perl script, why go out with system and run another Perl interpreter? It is far more complex and inefficient since it involves a syscall or a shell, and starts another program. Also, it is far harder to get it right -- you need to deal with syntax details, quoting and escaping, for system, your system's command interpreter, the other instance of Perl, and the regex.
The code below reads the whole file into an array first, which is fine if files aren't huge. In general it is better to process a file line by line, and how to do what you need in that way is discussed in detail in a perlfaq5 page. See the comment at the end, with the link.
use warnings 'all';
use strict;

# your code ...

open my $fh, '<', $file or die "Can't open $file: $!";
my @lines = <$fh>;

# Change @lines in-place. See the comment
s/STRING_TO_BE_REPLACED/$somedir/ for @lines;

open $fh, '>', $file or die "Can't open $file for write: $!";
print $fh @lines;
close $fh;
When we open $fh the second time it is first closed and then re-opened, so there is no need for an explicit close. When an existing file is opened for writing ('>') it is clobbered, so this replaces its contents.
It's more to write but it is better.
Comment on the in-place change to @lines: this uses the fact that when iterating over an array, if we change the loop variable (here $_), the change is made in the original element; the loop variable is an alias for the array element. perlsyn says:
If any element of LIST is an lvalue, you can modify it by modifying VAR inside the loop. Conversely, if any element of LIST is NOT an lvalue, any attempt to modify that element will fail. In other words, the foreach loop index variable is an implicit alias for each item in the list that you're looping over.
This has the benefit of not copying data and not touching elements that don't change so it is more efficient, potentially a lot more. However, it relies on a subtle property and thus it may be tricky and error prone, so I do not recommend it as a general practice.
To copy the array, with modifications, to a new one:
my @lines_new;
foreach my $line (@lines) {
    $line =~ s{STRING_TO_BE_REPLACED}{$somedir};
    push @lines_new, $line;
}
This also changes @lines. If it needs to be kept intact, do (my $new_line = $line) =~ s/.../ instead. Then write @lines_new to $file. Somewhere in between these two is
@lines = map { s{STRING_TO_BE_REPLACED}{$somedir}; $_ } @lines;
which is what was posted originally. However, since the map changes elements of @lines and copies data to build the output list, while the whole statement also overwrites the array, on reflection I think it makes more sense to do either the in-place change or an explicit copy to a new array.
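Spelled out, that non-destructive variant might look like this (a sketch):
my @lines_new;
foreach my $line (@lines) {
    (my $new_line = $line) =~ s{STRING_TO_BE_REPLACED}{$somedir};
    push @lines_new, $new_line;
}
# @lines is untouched; write @lines_new to $file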
In principle it is better not to read the whole file at once but rather to process it line by line, unless the file is small enough. To do that, open the original file for reading and a new one for writing; after you copy the file over (with changes), move the new one to replace $file. See the topic in perlfaq5.
The copied file is temporary, to be used to overwrite $file, so it can be named using the core module File::Temp to avoid accidents. To move a file, use move from the core module File::Copy.
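A sketch of that line-by-line approach, using File::Temp and File::Copy as suggested (details such as the error messages are mine):
use File::Temp ();
use File::Copy qw(move);

open my $in, '<', $file or die "Can't open $file: $!";
my $tmp = File::Temp->new( UNLINK => 0 );  # temporary file, to be moved over $file

while (my $line = <$in>) {
    $line =~ s{STRING_TO_BE_REPLACED}{$somedir};
    print $tmp $line;
}
close $in;
close $tmp or die "Can't close temporary file: $!";

move( $tmp->filename, $file ) or die "Can't move temporary file over $file: $!";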

perl's open() fails sometimes when file name ends with whitespace

I'm facing a problem with Perl's open() function, related to files whose names end with whitespace. If I use open() with 2 arguments (filehandle and filename) and the filename ends with whitespace, open() fails. The error message says that the file cannot be found, although the file exists. No such thing happens when the opening mode is specified, e.g., if I state explicitly that the file is opened for reading. Here is some sample code:
use warnings;
use strict;
my $file = '/tmp/test_with_ending_space ';
open WRITE, ">", $file or die "open with mode got error: $!";
print WRITE "my open() test\n";
close WRITE;
# open() with mode
open READ, "<", $file or die "open with mode got error: $!";
while (<READ>) {
    print;
}
close READ;

# open() without mode
open READ1, $file or die "open without mode got error: $!";
while (<READ1>) {
    print;
}
close READ1;
And here is the output from such code:
marius@mariusm-PC:~/perl$ ./test.pl
my open() test
open without mode got error: No such file or directory at ./test.pl line 21.
No such thing happens with "usual" filenames, i.e., when the filename ends with some other character.
Any ideas if this is a known problem? If yes, is there a way to work around it?
And just in case, before you start telling me "be nice, specify the mode and tell your open() how to open the file": unfortunately, this issue is present in some core modules, e.g., IO::File::open() (that's where I got stuck originally). The last call in that function is open($fh, $file), i.e., it calls the native open() without any particular mode.
It's documented in open:
The filename passed to the one- and two-argument forms of
open() will have leading and trailing whitespace deleted
Read the following paragraphs for more details.
@choroba gave the "why", but you also asked for a workaround.
Well, this is VERY kludgey, but if you're desperate and can't change the open() calls, this will work. First, detect if the filename ends with whitespace (I assume you can handle that). If it does, create a temp symlink to the file (without trailing whitespace!), and open the symlink.
WFM on my (old) Solaris 2.6 box.
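A rough sketch of that kludge (the link name is made up, and this of course only helps on systems with symlinks):
my $file = '/tmp/test_with_ending_space ';

if ( $file =~ /\s\z/ ) {
    my $link = "/tmp/open_workaround_$$";   # hypothetical temporary link name
    symlink $file, $link or die "Can't create symlink: $!";
    open my $fh, $link or die "Can't open $file via $link: $!";  # two-arg open now succeeds
    print while <$fh>;
    close $fh;
    unlink $link;
}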

In Perl, why does print not generate any output after I close STDOUT?

I have the code:
open(FILE, "<$new_file") or die "Cant't open file \n";
@lines = <FILE>;
close FILE;
open(STDOUT, ">$new_file") or die "Can't open file\n";
$old_fh = select(OUTPUT_HANDLE);
$| = 1;
select($old_fh);
for (@lines) {
    s/(.*?xsl.*?)xsl/$1xslt/;
    print;
}
close(STDOUT);
STDOUT -> autoflush(1);
print "file changed";
After closing STDOUT, the program does not output the last print, print "file changed";. Why is this?
Edit: I want the message printed to the console, not to the file.
I suppose it is because print's default filehandle is STDOUT, which at that point is already closed. You could reopen it, or print to another filehandle, for example, STDERR.
print STDERR "file changed";
It's because you've closed the filehandle stored in STDOUT, so print can't use it anymore. Generally speaking opening a new filehandle into one of the predefined handle names isn't a very good idea because it's bound to lead to confusion. It's much clearer to use lexical filehandles, or just a different name for your output file. Yes you then have to specify the filehandle in your print call, but then you don't have any confusion over what's happened to STDOUT.
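A sketch of that approach, leaving STDOUT alone and writing to a lexical handle instead (reusing @lines and $new_file from the question):
open my $out, '>', $new_file or die "Can't open $new_file: $!";
for (@lines) {
    s/(.*?xsl.*?)xsl/$1xslt/;
    print $out $_;
}
close $out;

print "file changed\n";  # STDOUT was never redirected, so this goes to the console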
A print statement outputs the string to STDOUT, which is the default output filehandle.
So the statement
print "This is a message";
is the same as
print STDOUT "This is a message";
In your code, you have closed STDOUT and are then printing the message, which will not work. Reopen the STDOUT filehandle or do not close it; when the script ends, the filehandles are closed automatically. To keep the final message on the console, save STDOUT before redirecting it and restore it afterwards:
&", STDOUT;">
open OLDOUT, ">&", \*STDOUT or die "Can't dup STDOUT: $!";
close STDOUT;
open STDOUT, ">", $new_file or die "Can't open file: $!";
...
close STDOUT;
open STDOUT, ">&", \*OLDOUT or die "Can't restore STDOUT: $!";
print "file changed";
You seem to be confused about how file IO operations are done in perl, so I would recommend you read up on that.
What went wrong?
What you are doing is:
Open a file for reading
Read the entire file and close it
Open the same file for overwrite (the original file is truncated), using the STDOUT filehandle.
Juggle around the default print handle in order to set autoflush on a file handle which is not even opened in the code you show.
Perform a substitution on all lines and print them
Close STDOUT then print a message when everything is done.
Your biggest mistake is trying to reopen the default output filehandle STDOUT. I assume this is because you did not know how print works, i.e., that you can supply a filehandle to print to: print FILEHANDLE "text". Or that you did not know that STDOUT is a pre-defined filehandle.
Your other errors:
You did not use use strict; use warnings;. No program you write should be without these. They will prevent you from doing bad things, and give you information on errors, and will save you hours of debugging.
You should never "slurp" a file (read the entire file into a variable) unless you really need to, because this is inefficient and slow, and for huge files it will cause your program to crash due to lack of memory.
Never reassign the default file handles STDIN, STDOUT, STDERR, unless A) you really need to, B) you know what you are doing.
select sets the default filehandle for print; read the documentation. This is rarely something you need to concern yourself with. The variable $| sets autoflush on (if set to a true value) for the currently selected filehandle. So what you did actually accomplished nothing, because OUTPUT_HANDLE is a non-existent filehandle. If you had skipped the select statements, it would have set autoflush for STDOUT. (But you wouldn't have noticed any difference.)
print uses print buffers because it is efficient. I assume you are trying to autoflush because you think your prints get caught in the buffer, which is not true. Generally speaking, this is not something you need to worry about. All the print buffers are automatically flushed when a program ends.
For the most part, you do not need to explicitly close file handles. File handles are automatically closed when they go out of scope, or when the program ends.
Using lexical file handles, e.g. open my $fh, ... instead of global, e.g. open FILE, .. is recommended, because of the previous statement, and because it is always a good idea to avoid global variables.
Using three-argument open is recommended: open FILEHANDLE, MODE, FILENAME. This is because you otherwise risk meta-characters in your file names corrupting your open statement.
The quick fix:
Now, as I said in the comments, this -- or rather, what you intended, because this code is wrong -- is pretty much identical to the idiomatic usage of the -p command line switch:
perl -pi.bak -e 's/(.*?xsl.*?)xsl/$1xslt/' file.txt
This short little snippet actually does all that your program does, but does it much better. Explanation:
-p switch automatically assumes that the code you provide is inside a while (<>) { } loop, and prints each line, after your code is executed.
-i switch tells perl to do an in-place edit on the file, saving a backup copy in "file.txt.bak".
So, that one-liner is equivalent to a program such as this:
$^I = ".bak"; # turns inplace-edit on
while (<>) { # diamond operator automatically uses STDIN or files from #ARGV
s/(.*?xsl.*?)xsl/$1xslt/;
print;
}
Which is equivalent to this:
my $file = shift; # first argument from @ARGV -- the arguments
open my $fh, "<", $file or die $!;
open my $tmp, ">", "/tmp/foo.bar" or die $!; # not sure where tmpfile is
while (<$fh>) {                    # read lines from original file
    s/(.*?xsl.*?)xsl/$1xslt/;
    print $tmp $_;                 # print line to tmp file
}
rename($file, "$file.bak") or die $!;    # save backup
rename("/tmp/foo.bar", $file) or die $!; # overwrite original file
The inplace-edit option actually creates a separate file, then copies it over the original. If you use the backup option, the original file is first backed up. You don't need to know this information, just know that using the -i switch will cause the -p (and -n) option to actually perform changes on your original file.
Using the -i switch with the backup option activated is not required (except on Windows), but recommended. A good idea is to run the one-liner without the option first, so the output is printed to screen instead, and then adding it once you see the output is ok.
The regex
s/(.*?xsl.*?)xsl/$1xslt/;
You search for a string that contains "xsl" twice. The usage of .*? is good in the second case, but not in the first. Any time you find yourself starting a regex with a wildcard string, you're probably doing something wrong. Unless you are trying to capture that part.
In this case, though, you capture it and remove it, only to put it back, which is completely useless. So the first order of business is to take that part out:
s/(xsl.*?)xsl/$1xslt/;
Now, removing something and putting it back is really just a magic trick for not removing it at all. We don't need magic tricks like that, when we can just not remove it in the first place. Using look-around assertions, you can achieve this.
In this case, since you have a variable length expression and need a look-behind assertion, we have to use the \K (mnemonic: Keep) option instead, because variable length look-behinds are not implemented.
s/xsl.*?\Kxsl/xslt/;
So, since we didn't take anything out, we don't need to put anything back using $1. Now, you may notice, "Hey, if I replace 'xsl' with 'xslt', I don't need to remove 'xsl' at all." Which is true:
s/xsl.*?xsl\K/t/;
You may consider using options for this regex, such as /i, which causes it to ignore case and thus also match strings such as "XSL FOO XSL". Or the /g option which will allow it to perform all possible matches per line, and not just the first match. Read more in perlop.
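For example, with both options added the substitution might be written as:
s/xsl.*?xsl\K/t/gi;  # all matches on the line, ignoring case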
Conclusion
The finished one-liner is:
perl -pi.bak -e 's/xsl.*?xsl\K/t/' file.txt

How to append to a file?

I am trying to append some text to the end of a file with a .conf extension on Mac OS X. I am using the following code to do that:
open NEW , ">>$self->{natConf}";
print NEW "$hostPort = $vmIP";
where
$self->{natConf} = \Library\Preferences\VMware Fusion\vmnet8\nat.conf
So basically this is a .conf file. Even though it's not returning any error, it is not appending anything to the end of the file. I checked all the permissions, and read-write privileges have been provided. Is there anything I am missing here?
First of all, use strict and use warnings. These would have produced errors and warnings for your code.
On Mac OS the path delimiter is /, as on other Unix-like systems, not \.
To assign a string to a variable, use quotation marks.
Do not use the two-argument form of open; use the three-argument form. It is also considered bad practice to use bareword filehandles.
use strict;
use warnings;
# your code here
$self->{natConf} = '/Library/Preferences/VMware Fusion/vmnet8/nat.conf';
# more code here
open my $fh, '>>', $self->{natConf} or die "open failed: $!\n";
print $fh "$hostPort = $vmIP";
close $fh;
# rest of code here
Suffering from buffering? Call close NEW when you are done writing to it, or call (*NEW)->autoflush(1) on it after you open it to force Perl to flush the output after every print.
Also check the return values of the open and print calls. If either of these functions fail, they will return false and set the $! variable.
And I second the recommendation about using strict and warnings.
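A minimal sketch of those checks, building on the corrected open above (with a newline added to the printed line):
open my $fh, '>>', $self->{natConf} or die "open failed: $!";
print {$fh} "$hostPort = $vmIP\n"   or die "print failed: $!";
close $fh                           or die "close failed: $!";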