How can I import a text file into MATLAB using a Perl script? - matlab

I am trying to import a huge text file (~5 million lines) into MATLAB. I tried this:
aaa = perl('importFile.pl',fileName);
where "importFile.pl" is
use strict;
use warnings;
while (my $row = <>) {
chomp $row;
print "$row\n";
}
but nothing happens. What is my mistake? Or can you suggest a similar (and fast) solution?
MATLAB R2014a, 64-bit

I am not super familiar with Perl, but I have worked with it a little in the past. I'm guessing there is an issue with your loop, but as I said, I'm no Perl monk! Here is a program I have used to read a file and print its contents:
#!/usr/bin/perl
use strict;
use warnings;
#Opens a file handle
open my $fh, '<', '/home/user/Desktop/ScriptForChangesets/ToBeRead.txt' or die "Can't open file: $!";
#Prints the file contents
print do{local $/; <$fh>};
Hope this helps!
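If the problem is the hard-coded path, a variation that takes the file name as its first argument (the way the MATLAB call perl('importFile.pl', fileName) passes it in) may be closer to what you need. This is only a sketch; whatever the script prints should come back in aaa as one character array:
#!/usr/bin/perl
#Sketch only: the file name is expected as the first argument,
#as passed by perl('importFile.pl', fileName) from MATLAB.
use strict;
use warnings;
my $file = shift @ARGV or die "Usage: importFile.pl FILENAME\n";
open my $fh, '<', $file or die "Can't open '$file': $!";
#Slurp and print the whole file in one go instead of line by line
print do { local $/; <$fh> };
close $fh;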

Related

Most efficient way to write over file after reading

I'm reading in some data from a file, manipulating it, and then overwriting it to the same file. Until now, I've been doing it like so:
open (my $inFile, $file) or die "Could not open $file: $!";
$retString .= join ('', <$inFile>);
...
close ($inFile);
open (my $outFile, '>', $file) or die "Could not open $file: $!";
print $outFile $retString;
close ($outFile);
However I realized I can just use the truncate function and open the file for read/write:
open (my $inFile, '+<', $file) or die "Could not open $file: $!";
$retString .= join ('', <$inFile>);
...
truncate $inFile, 0;
print $inFile $retString;
close ($inFile);
I don't see any examples of this anywhere. It seems to work well, but am I doing it correctly? Is there a better way to do this?
The canonical way to read an entire file's contents is to (temporarily) set the input record separator $/ to undef, after which the next readline will return all of the rest of the file. But I think your original technique of simply reopening the file for writing is clearer and less involved than using truncate.
Note that all the following examples make use of the autodie pragma, which avoids the need to explicitly test the status of open, close, truncate, seek, and many other related calls that aren't used here.
Opening in read/write mode and using truncate would look like this
use strict;
use warnings;
use autodie;
use Fcntl 'SEEK_SET';
my ($file) = @ARGV;
open my $fh, '+<', $file;
my $ret_string = do { local $/; <$fh>; };
# Modify $ret_string;
seek $fh, 0, SEEK_SET;
truncate $fh, 0;
print $fh $ret_string;
close $fh;
whereas a simple second open would be
use strict;
use warnings;
use autodie;
my ($file) = @ARGV;
open my $fh, '<', $file;
my $ret_string = do { local $/; <$fh>; };
# Modify $ret_string;
open $fh, '>', $file;
print $fh $ret_string;
close $fh;
There is a third way, which is to use the equivalent of the edit-in-place -i command-line option. If you set the built-in variable $^I to anything other than undef and pass a file on the command line to the program then Perl will transparently rename by appending the value of $^I and open a new output file with the original name.
If you set $^I to the empty string '' then the original file will be deleted from the directory and will disappear when it is closed (note that Windows doesn't support this, and you have to specify a non-null value). But while you are testing your code it is best to set it to something else so that you have a route of retreat if you succeed in destroying the data.
That mode would look like this
use strict;
use warnings;
$^I = '.old';
my $ret_string = do { local $/; <>; };
# Modify $ret_string;
print $ret_string;
Note that the new output file is selected as the default output, and if you want to print to the console you have to write an explicit print STDOUT ....
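For example, a line-by-line sketch of that mode (the s/foo/bar/ substitution is only a placeholder) with an explicit print STDOUT for console output:
use strict;
use warnings;
$^I = '.old';
while (<>) {
    s/foo/bar/;                          # placeholder per-line edit
    print;                               # goes to the new version of the file
    print STDOUT "rewrote line $.\n";    # goes to the console
}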
I would recommend using $INPLACE_EDIT:
use strict;
use warnings;
my $file = '...';
local @ARGV = $file;
local $^I = '.bak';
while (<>) {
# Modify the line;
print;
}
# unlink "$file$^I"; # Optionally delete backup
For additional methods for editing a file, just read perlfaq5 - How do I change, delete, or insert a line in a file, or append to the beginning of a file?.
I would change the way you're reading the file, not how you open it. Joining lines is less efficient than reading the whole file at once:
$retString .= do { local $/; <$inFile> };
As for truncate, you might want to seek to the beginning of the file first, as perldoc suggests:
The position in the file of FILEHANDLE is left unchanged. You may want to call seek before writing to the file.
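Applied to the truncate version from the question, that advice amounts to one extra seek call; roughly:
use strict;
use warnings;
use Fcntl 'SEEK_SET';

my ($file) = @ARGV;
open (my $inFile, '+<', $file) or die "Could not open $file: $!";
my $retString = join ('', <$inFile>);
# ... modify $retString here ...
seek ($inFile, 0, SEEK_SET);   # rewind to the start of the file
truncate ($inFile, 0);         # then discard the old contents
print $inFile $retString;
close ($inFile);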
The Tie::File module may help if you are changing some lines in a file.
Like this
use strict;
use warnings;
use Tie::File;
tie my @source, 'Tie::File', 'file.txt' or die $!;
for my $line (@source) {
# Modify $line here
}
untie @source;
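For instance, a purely hypothetical in-place edit that uppercases every line of file.txt:
use strict;
use warnings;
use Tie::File;

tie my @source, 'Tie::File', 'file.txt' or die $!;
for my $line (@source) {
    $line = uc $line;   # assigning to $line writes the change back to the file
}
untie @source;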

Write to multiple output files using input file line numbers

Yes I know maybe the title is not very clear. If you have suggestions about changing it, please tell me.
This is my problem:
I have one file (let's call it File_1) containing several lines, each of which is a sentence.
For ex:
File_1 content:
*Line 1*: I'm very happy today.
*Line 2*: We're going to the cinema.
*Line 3*: I'll write on stackoverflow today!
And so on..
My question is, how can I create several files from the content of each line?
So, for example, File_1 would be split into:
*Line_1_content.txt* : I'm very happy today.
*Line_2_content.txt* : We're going to the cinema.
And so on...
I really don't know how to do this; I've never faced this kind of problem before.
I would appreciate your help.
use strict;
use warnings;
use autodie;
while ( <> ) {
open my $fh, '>', "${.}.txt";
print $fh $_;
}
$. corresponds to the number of the current line.
You call this program like this: perl prog.pl your_file
If I understand the question correctly, something like:
use strict;
use warnings;
use autodie;
open(my $fh, '<', 'example.txt');
while (my $line = <$fh>) {
open(my $line_fh, '>', "line_${.}_file.txt");
print $line_fh $line;
close($line_fh);
}
close($fh);
Should do what you want.
(Although doing this maybe doesn't make a lot of sense, as the comments point out.)
You don't even need a program :-)
perl -ne 'open my $fh, ">", "Line_${.}_content.txt"; print $fh $_' your_file_here
Or, perhaps,
perl -pe 'open my $fh, ">", "Line_${.}_content.txt"; select $fh' your_file_here

Writing a script in Perl to convert all files in a folder to another format

The question is two-fold:
I'm writing a Perl script (I'm not too experienced with Perl) and can get it to convert one file at a time from CSV to ASCII. I want a loop that takes all the CSVs in a folder and converts them to ASCII/txt.
Is Perl the best language to be attempting this with? I assumed yes, since I can successfully do it one file at a time, but I'm having a very hard time figuring out a way to loop it.
I was trying to figure out how to load all the files into an array and then run the loop for each one, but my googling has reached its limit and I'm out of ideas.
Here's my working script:
#!/usr/bin/env perl
use strict;
use warnings;
use autodie;
use open IN => ':encoding(UTF-16)';
use open OUT => ':encoding(ascii)';
my $buffer;
open(my $ifh, '<', 'Software_compname.csv');
read($ifh, $buffer, -s $ifh);
close($ifh);
open(my $ofh, '>', 'Software_compname.txt');
print($ofh $buffer);
close($ofh);
Just add the following loop to your script and give it a glob pattern for the files to process as its argument:
for my $input_file (glob shift) {
(my $output_file = $input_file) =~ s/csv$/txt/ or do {
warn "Invalid file name: $input_file\n";
next;
};
my $buffer;
open my $ifh, '<', $input_file;
read $ifh, $buffer, -s $ifh;
close $ifh;
open my $ofh, '>', $output_file;
print {$ofh} $buffer;
close $ofh;
}
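Putting that loop together with the encoding layers from your working script, a complete version might look something like this. The convert.pl name and the example invocation are just illustrations, and the pattern has to be quoted so the shell passes it to Perl's glob unexpanded:
#!/usr/bin/env perl
# Sketch: run as, e.g.,  perl convert.pl '*.csv'
use strict;
use warnings;
use autodie;
use open IN  => ':encoding(UTF-16)';
use open OUT => ':encoding(ascii)';

for my $input_file (map { glob } @ARGV) {
    (my $output_file = $input_file) =~ s/csv$/txt/ or do {
        warn "Invalid file name: $input_file\n";
        next;
    };
    open my $ifh, '<', $input_file;
    my $buffer = do { local $/; <$ifh> };   # slurp the whole file
    close $ifh;
    open my $ofh, '>', $output_file;
    print {$ofh} $buffer;
    close $ofh;
}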
If you want to do it from the Perl side only, I would suggest using File::Find::Rule:
use File::Find::Rule;
my @files = File::Find::Rule->file()
->name('*.in')
->in(my @input_directories = ('.'));
# etc.
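A rough sketch of how that list could feed the conversion (the '*.csv' pattern is my assumption; the per-file work would be the same slurp-and-rewrite shown above):
use strict;
use warnings;
use File::Find::Rule;

my @files = File::Find::Rule->file()
                            ->name('*.csv')
                            ->in('.');

for my $input_file (@files) {
    print "would convert: $input_file\n";   # plug in the read/write code from above
}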

Why is my Perl program not reading from the input file?

I'm trying to read in this file:
Oranges
Apples
Bananas
Mangos
using this:
open (FL, "fruits");
@fruits;
while(<FL>){
chomp($_);
push(@fruits,$_);
}
print @fruits;
But I'm not getting any output. What am I missing here? I'm trying to store all the lines in the file into an array and print out all the contents on a single line. Why isn't chomp removing the newlines from the file, like it's supposed to?
You should always use:
use strict;
use warnings;
at the beginning of your scripts.
Also use the three-argument form of open and lexical filehandles, and test the open for failure, so your script becomes:
#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;
my @fruits;
my $file = 'fruits';
open my $fh, '<', $file or die "unable to open '$file' for reading :$!";
while(my $line = <$fh>){
chomp($line);
push #fruits, $line;
}
print Dumper \@fruits;
I'm guessing that you have DOS-style newlines (i.e., \r\n) in your fruits file. The chomp command normally only removes the Unix-style \n.
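If that's the case, one option (just a sketch of the same script with the line-ending handling made explicit) is to strip either style of newline with a regex instead of chomp:
use strict;
use warnings;

my @fruits;
open my $fh, '<', 'fruits' or die "unable to open 'fruits' for reading: $!";
while (my $line = <$fh>) {
    $line =~ s/\r?\n\z//;   # removes \n or \r\n, where chomp would leave the \r behind
    push @fruits, $line;
}
close $fh;
print "@fruits\n";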
You're not opening any file. FL is a filehandle that is never opened, and therefore you can't read from it.
The first thing you need to do is put use warnings at the top of your program to help you with these problems.
#!/usr/bin/env perl
use strict;
use warnings;
use IO::File;
use Data::Dumper;
my $fh = IO::File->new('fruits', 'r') or die "$!\n";
my @fruits = grep { s/\n// } $fh->getlines;
print Dumper \@fruits;
that's nice and clean
You should check open for errors:
open( my $FL, '<', 'fruits' ) or die $!;
while(<$FL>) {
...
1) You should always report I/O errors: open() or die "Can't open file $f: $!";
2) You probably started the program from a different directory than the one where the "fruits" file is.
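A tiny diagnostic sketch for point 2, to confirm which directory the script is actually running from:
use strict;
use warnings;
use Cwd;

my $found = -e 'fruits' ? 'yes' : 'no';
print "current directory: ", getcwd(), "\n";
print "'fruits' exists here: $found\n";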

What's the easiest way to write to a file using Perl?

Currently I am using
system("echo $panel_login $panel_password $root_name $root_pass $port $panel_type >> /home/shared/ftp");
What is the easiest way to do the same thing using Perl? That is, a one-liner.
Why does it need to be one line? You're not paying by the line, are you? This is probably too verbose, but it took a total of two minutes to type it out.
#!/usr/bin/env perl
use strict;
use warnings;
my @values = qw/user secret-password ftp-address/;
open my $fh, '>>', 'ftp-stuff' # Three argument form of open; lexical filehandle
or die "Can't open [ftp-stuff]: $!"; # Always check that the open call worked
print $fh "@values\n"; # Quote the array and you get spaces between items for free
close $fh or die "Can't close [ftp-stuff]: $!";
You might find IO::All to be helpful:
use IO::All;
#stuff happens to set the variables
io("/home/shared/ftp")->write("$panel_login $panel_password $root_name $root_pass $port $panel_type");
EDIT (By popular and editable demand)
http://perldoc.perl.org/functions/open.html
In your case you would have to :
#21st century perl.
my $handle;
open ($handle,'>>','/home/shared/ftp') or die("Can't open /home/shared/ftp");
print $handle "$panel_login $panel_password $root_name $root_pass $port $panel_type";
close ($handle) or die ("Unable to close /home/shared/ftp");
Alternatively, you could use the autodie pragma (as @Chas Owens suggested in the comments).
That way, the explicit check (the or die(...) part) doesn't need to be written by hand.
Hope to get it right this time. If so, will erase this Warning.
Old deprecated way
Use print (not one liner though). Just open your file before and get a handle.
open (MYFILE,'>>/home/shared/ftp');
print MYFILE "$panel_login $panel_password $root_name $root_pass $port $panel_type";
close (MYFILE);
http://perl.about.com/od/perltutorials/a/readwritefiles_2.htm
You might want to use the simple File::Slurp module:
use File::Slurp;
append_file("/home/shared/ftp",
"$panel_login $panel_password $root_name $root_pass ".
"$port $panel_type\n");
It's not a core module though, so you'll have to install it.
(open my $FH, ">", "${filename}" and print $FH "Hello World" and close $FH)
or die ("Couldn't output to file: ${filename}: $!\n");
Of course, it's impossible to do proper error checking in a one liner... That should be written slightly differently:
open my $FH, ">", "${filename}" or die("Can't open file: ${filename}: $!\n");
print $FH "Hello World";
close $FH;
For advanced one-liners like this, you could also use the psh command from Psh, a simple pure Perl shell.
psh -c '{my $var = "something"; print $var} >/tmp/out.txt'
I use FileHandle. From the POD:
use FileHandle;
$fh = new FileHandle ">> FOO"; # modified slightly from the POD, to append
if (defined $fh) {
print $fh "bar\n";
$fh->close;
}
If you want something closer to a "one-liner," you can do this:
use FileHandle;
my $fh = FileHandle->new( '>> FOO' ) || die $!;
$fh->print( "bar\n" );
## $fh closes when it goes out of scope
You can do a one-liner like this one:
print "$panel_login $panel_password $root_name $root_pass $port $panel_type" >> io('/home/shared/ftp');
You only need to add the IO::All module to your code, like this:
use IO::All;
Some good reading about editing files with perl:
FMTYEWTK About Mass Edits In Perl