Perl script messes with file descriptor in MATLAB

I use a Perl script to replace some strings in a data file. The Perl script is called from within a MATLAB program, which writes to the data file both before the Perl script is executed and after its execution.
My MATLAB program should then write to the data file, but for some reason it does not.
Here is a minimal example:
Matlab code:
f = fopen('output.txt','a');
fprintf(f,'This is written\n');
perl('replace.perl','output.txt');
fprintf(f,'This is not\n');
[fname perm] = fopen(f)
type('output.txt');
fclose(f);
perl script:
#!/usr/bin/perl -i
while(<>){
s/This/This here/;
print;
}
close;
The variables fname and perm are correctly assigned. The output of type is only "This here is written".
I am quite new to Perl, so I am probably making some rookie mistake in the script that I can't find.
Thanks for helping.

The secret is in the -i. In-place editing in perl, and in many other programs, is accomplished by opening the original file for reading, opening a temp file for writing, then unlinking the original file and renaming the temp file to the original file's name.
Now after running your perl script, poor matlab is left holding a file handle to a now unlinked file. You continue writing, but there's no easy way to see what was written... Even if the file had not changed out from under matlab, you would have had to deal with the fact that matlab was about to write to a spot that would now no longer be the end of the file.
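To make the mechanism concrete, here is a rough sketch of that dance in plain Perl. This is not Perl's exact implementation of -i, just the idea; the filenames and the substitution are taken from the example above:
open my $in,  '<', 'output.txt'     or die "read: $!";
open my $out, '>', 'output.txt.tmp' or die "write: $!";
while (my $line = <$in>) {
    $line =~ s/This/This here/;   # same substitution as replace.perl
    print {$out} $line;
}
close $in;
close $out;
rename 'output.txt.tmp', 'output.txt' or die "rename: $!";
# 'output.txt' is now a different file (different inode); any handle opened on
# the old file, such as MATLAB's, still points at the unlinked original.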
In the end, you need to be very careful of having two programs/users/computers writing to the same file at the same time. Close the matlab file handle before calling perl. Reopen it for appending later if that proves to be really necessary.

Related

Is this a standard Perl language construction or a customization: open HANDLE, ">$fname"

Not a Perl guru; working with an ancient script, I ran into a construct I didn't recognize that yields results I don't expect. Curious whether this is standard language or a PM customization of sorts:
open FILE1, ">./$disk_file" or die "Can't open file: $disk_file: $?";
From the looks of this, the file is to be opened for writing, but the log error says that the file is not found. Perl's file I/O expects 3 parameters, not 2. The log doesn't have the die output, instead saying: "File not found".
Confused a bit here.
EDIT: Made it work using the answers below. It seems I was running a cached version of the .pl for some time, instead of the newly-edited one. Finally it caught up with a 2-param open; thanks y'all for your help!
That is the old 2-argument form of open. The second argument is a bit magical:
if it starts with '>' the remainder of the string is used as the name of a file to open for writing
if it starts with '<' the remainder of the string is used as the name of a file to open for reading (this is the default if '<' is omitted)
if it ends with '|' the string up to that point is interpreted as a command which is executed with its STDOUT connected to a pipe which your script will open for reading
if it starts with '|' the string after that point is interpreted as a command which is executed with its STDIN connected to a pipe which your script will open for writing
This is a potential security vulnerability: if your script accepts a filename as user input, the user can add a '|' at the beginning or end to trick your script into running a command.
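A small sketch of the difference, using an invented, attacker-style value for the filename:
my $user_input = 'ls -al |';   # attacker-controlled "filename"
# 2-argument form: the trailing '|' makes Perl run `ls -al` and pipe its output to us.
open my $fh2, $user_input or die "open failed: $!";
# 3-argument form: the mode is explicit, so the string is treated as a literal
# filename (which almost certainly doesn't exist, so this dies instead of running a command).
open my $fh3, '<', $user_input or die "open failed: $!";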
The 3-argument form of open was added in (I think) version 5.8 so it has been a standard part of Perl for a very long time.
The FILE1 part is known as a bareword filehandle - which is a global. Modern style would be to use a lexical scalar like my $file1 instead.
See perldoc -f open for the details but, in brief...
Perl's open() will accept either two or three parameters (there's even a one-parameter version - which no-one ever uses). The two-parameter version is a slightly older style where the open mode and the filename are joined together in the second parameter.
So what you have is equivalent to:
open FILE1, '>', "./$disk_file" or die "Can't open file: $disk_file: $?";
A couple of other points.
We prefer to use lexical variables as filehandles these days (so, open my $file1, ... instead of open FILE1, ...).
I think you'll find that $! will be more useful in the error message than $?. $? contains the error from a child process, but there's no child process here.
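Putting those two points together, a sketch of the same open modernized (assuming $disk_file is set elsewhere, as in your script):
open my $file1, '>', "./$disk_file"
    or die "Can't open file: $disk_file: $!";   # $! carries the OS error text
print {$file1} "whatever you were writing\n";
close $file1 or die "Can't close $disk_file: $!";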
Update: And none of this seems to be causing the problems that you're seeing. That seems to be caused by a file actually not being in the expected place. Can you please edit your question to add the exact error message that you're seeing?
The other answers here are correct: that's the two-argument syntax. They've done a good job covering why and how you should ideally change it, so I won't rehash that here.
However, they haven't tried to help you fix it, so let me try that...
This is a guess, but I suspect $disk_file contains a filename with a path (e.g. my_logs/somelog.log), and the directory part (my_logs in my entirely guessed example) doesn't exist, so the open is throwing an error. You could create that directory, or alter whatever sets that variable so it's writing to a location that does exist.
Bear in mind these paths will be relative to wherever you're running the script from, not relative to the script itself, so if there's a log directory (or whatever) in the same dir as the script, you may want to cd to the script's dir first.
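If the file really should live next to the script rather than under the current working directory, one common approach is the core FindBin module. A sketch, where the my_logs path is still just my guess:
use FindBin qw($Bin);                        # $Bin = directory containing the running script
my $disk_file = "$Bin/my_logs/somelog.log";  # guessed layout; adjust to yours
open my $file1, '>', $disk_file
    or die "Can't open file: $disk_file: $!";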

How do I make a perl script run another perl script?

I am writing a large Perl script, which needs to utilize other existing Perl scripts. The problem is the main script needs to reference many different scripts from different folders. For example the main script would be contained in:
/perl/programs/io
It may need to run a script which is stored in:
/perl/programs/tools
Note that there are other orthogonal folders besides tools so I need to be able to access any of them on the fly.
Currently this is what I got:
my $mynumber = '../tools/convert.pl bin2dec 1011';
In theory it should move back from the io directory, then enter the appropriate tools directory and call the convert.pl script, passing it the parameters.
All this actually does, though, is store the string in the single quotes into $mynumber.
I like to assign the output of a command to an array so I can loop through the array to find errors or other messages. For example, if I'm making a zip file to email to someone, I want to check whether the zip program had any errors before I continue to make and send the email.
@msgs = `zip -f myfile.zip *.pl`; # Use backticks
You can also assign the output to a scalar:
$msg = `ls -al *.pl`; # Use backticks
To run any system command or script, all you have to do is use `backticks`. When I first saw them in another programmer's Perl code, I mistook these strange quotes for single quotes.
Backticks are also nice because they return the text written to STDOUT to your Perl script, so the output can be assigned to a variable, something I have found impossible when using system("").
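For instance, a sketch of checking both the captured output and the exit status; the /error|warning/i pattern is just an illustration:
my @msgs = `zip -f myfile.zip *.pl`;
die "zip exited with status ", $? >> 8, "\n" if $? != 0;   # $? holds the child's status
my @problems = grep { /error|warning/i } @msgs;            # scan the captured lines
print "zip said:\n", @problems if @problems;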
The answer to the similar question does not work with my version of Perl. The line
use IPC::System::Simple qw(system capture);
throws some errors. However, just using system works, like this:
my $mynumber = system($^X, "../tools/convert.pl", 'bin2dec', '1011');
I can use the above form, without assigning the result to anything, to execute scripts that return no value and are only passed arguments.
This seems to be the easiest way to do what I need, and the entire programs folder can be moved anywhere and it will still work, since no parent directories above programs are used.
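One caveat worth adding: system returns the child's exit status, not its output, so $mynumber above will hold something like 0 rather than the converted number. If you actually need what convert.pl prints, a backtick-based sketch (same relative path as in the question) would be:
my $mynumber = `$^X ../tools/convert.pl bin2dec 1011`;   # $^X is the perl running this script
die "convert.pl failed: $?" if $? != 0;
chomp $mynumber;                                         # strip the trailing newline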

Storing output of perl script to a file

I"m calling a perl script stored in one PC, for example with name machine1 from say machine2 using the command:
system("perl /CC/builds/123.pl\n");
Now, I need the log of the whole Perl script's execution to be stored in a file, say 123.txt, created on machine1. Can a text file be opened at the beginning of the Perl script that stores only the output of the lines executed?
Please help.
Thanks,
Ramki
I'm quite new at this and I don't know if I'm understanding your question, but why don't you use "backticks" instead of "system"? It would let you store the output in a variable, and then you can do whatever you want with that variable.
About backticks: What's the difference between Perl's backticks, system, and exec?
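A minimal sketch of that idea, reusing the script path from the question (the 123.txt location is an assumption):
my $output = `perl /CC/builds/123.pl 2>&1`;   # capture STDOUT and STDERR
open my $log, '>', '/CC/builds/123.txt'
    or die "Can't open log file: $!";
print {$log} $output;
close $log;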

Accessing a file in perl

In my script I am dealing with opening files and writing to files. I found that there is something wrong with a file I try to open: the file exists, it is not empty, and I am passing the right path to the file handle.
I know that my question might sound weird, but while I was debugging my code I put the following command in my script to check some files:
system ("ls");
Then my script worked well; when it's removed, it does not work correctly anymore.
my @unique = ("test1","test2");
open(unique_fh,">orfs");
print unique_fh @unique ;
open(ORF,"orfs")or die ("file doesnot exist");
system ("ls");
while(<ORF>){
split ;
}
@neworfs = @_ ;
print @neworfs ;
Perl buffers the output when you print to a file. In other words, it doesn't actually write to the file every time you say print; it saves up a bunch of data and writes it all at once. This is faster.
In your case, you couldn't see anything you had written to the file, because Perl hadn't written anything yet. Adding the system("ls") call, however, caused Perl to write your output first (the interpreter is smart enough to do this, because it thinks you might want to use the system() call to do something with the file you just created).
How do you get around this? You can close the file before you open it again to read it, as choroba suggested. Or you can disable buffering for that file. Put this code just after you open the file:
my $fh = select (unique_fh);   # remember the currently selected output handle
$|=1;                          # turn on autoflush for unique_fh
select ($fh);                  # restore the previous selection
Then anytime you print to the file, it will get written immediately ($| is a special variable that sets the output buffering behavior).
Closing the file first is probably a better idea, although it is possible to have a filehandle for reading and writing open at the same time.
You did not close the filehandle before trying to read from the same file.
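A sketch of the script with that fix applied: write, close, then reopen for reading, using lexical filehandles and 3-argument open (the whitespace split mirrors what the original seems to intend):
use strict;
use warnings;

my @unique = ("test1", "test2");

open my $out, '>', 'orfs' or die "Can't write orfs: $!";
print {$out} "$_\n" for @unique;
close $out;                           # flush and release the handle before re-reading

open my $in, '<', 'orfs' or die "File does not exist: $!";
my @neworfs;
while (my $line = <$in>) {
    push @neworfs, split ' ', $line;  # split on whitespace, as the original intended
}
close $in;
print "@neworfs\n";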

Invoking a Perl script with arguments from Visual SlickEdit and in turn passing the output as parameters to Visual SlickEdit for further processing

I am trying to invoke a Perl script from SlickEdit through a .e file, which I will use to create a macro. The Perl script is basically an XML parser which parses a .xml file whose structure is defined according to a .dtd file.
Here is the .e file I am using to invoke the Perl script:
_command void invoke(){
shell("C:\\Users\\anits\\Desktop\\trial.pl");
}
The Perl script to be invoked looks like this:
use XML::Simple;
use Data::Dumper;
open logfile,">test.txt";
#sub process{
$xml = new XML::Simple (KeyAttr=>[]);# read XML file
my $error =$xml->XMLin("trial.xml");
print "There are " . scalar(#{$error->{problem}}) . " problems.\n";
foreach my $var (#{$error->{problem}}) {
print logfile $var->{name}."\n";
}
close logfile;
#args = ("C:/Program Files (x86)/SlickEdit 2009/win/vs.exe","C:/Users/anits/Desktop/test.txt");
system(#args) == 0 or die "system #args failed: $?";
As you can see, my Perl script should open a txt file back in SlickEdit, but I don't get any output. So please help me with that.
If the XML parsing can be done using Slick-C, please suggest a way to do it.
Thanks, and I hope my question is clear now.
As daxim pointed out, your question is not exactly clear, so my answer is going to be somewhat general.
I am not sure if it's the cleanest way (as it uses a fixed temp file), but I have written (in a past job) SlickEdit code that invoked scripts, had the external scripts output to a well-known temp file, then had the temp file parsed by the SlickEdit code afterwards.
I don't remember exactly why, but under Windows I also wrapped execution of the script in a batch file (which implemented redirection to a named file; maybe I had quoting issues).
This is generally what it would look like:
static str mytmp='somepath'
<construct command_string, which will direct output to the well known file location 'somepath'>
shell(command_string, 'qp');
get(mytmp);
top();
get_line(line);
do work with line;
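On the Perl side, a sketch of the same pattern; the temp path is invented, and the XML structure is taken from the question. The script just writes its results to the well-known file and exits, and the SlickEdit macro above reads that file, so no system() call to vs.exe is needed:
use strict;
use warnings;
use XML::Simple;

my $tmp = 'C:/Users/anits/Desktop/test.txt';   # the agreed well-known location (assumed)
my $xml = XML::Simple->new(KeyAttr => []);
my $error = $xml->XMLin('trial.xml');          # same input file as in the question

open my $log, '>', $tmp or die "Can't open $tmp: $!";
print {$log} "There are ", scalar(@{ $error->{problem} }), " problems.\n";
print {$log} $_->{name}, "\n" for @{ $error->{problem} };
close $log;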