matlab cannot read text file containing ^* as power of 10

I need to read text files into Matlab. In the text files there are numbers like 5.875489^*-6, which is indeed 0.000005875489. Matlab cannot read this format, and since there are too many files, I cannot change the format in all the files manually. So I wonder if there are any tips to make Matlab read the files as they are?
Any help and guidance is highly appreciated.
Marilla.

As pointed out by @vu1p3n0x, it would probably be easier to replace ^* by e using a replace-all. Alternatively, if that is impractical, you could read in the mantissa and exponent separately and perform the exponentiation in Matlab:
fid = fopen('data.txt');                 % file name is just a placeholder
Raw = textscan(fid, '%f^*%f');           % mantissa and exponent as two numeric fields
fclose(fid);
Result = Raw{1}.*10.^Raw{2};             % recombine: mantissa .* 10.^exponent
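For the replace-all route, a rough sketch (the file name is a placeholder, and it assumes the file contains nothing but whitespace-separated numbers) would be to read the whole file as text, swap ^* for e, and parse the result:
txt  = fileread('data.txt');        % read the entire file as one character vector
txt  = strrep(txt, '^*', 'e');      % 5.875489^*-6  ->  5.875489e-6
nums = sscanf(txt, '%f');           % parse everything into a column vector of doubles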

Related

Writing matlab vector to a file that is matlab readable

For matlab: Is there a way to write the value of a vector to a file that can later be opened and read by another matlab program?
Specifically: I have a matlab program that computes a binary-valued vector zvector with 10^7 entries. I want to write zvector as data to an output file so that it can be emailed and easily read as input to another matlab program. Ideally, the output file would be called “Output.m” and would look like:
zvector=[
0
1
1
…
0
1
];
I like the .m format because it is easy to use for matlab input. I have experimented with matlab’s write() and fwrite() commands, with no success. I observe that these generate files that cannot be easily read as matlab-recognizable inputs (at least, I do not know how to read from them). Is there a way to accomplish my goals? Thanks.
PS: I am interested in the easiest way. If this involves a different type of file format (not a .m format) that is fine. However, in that case, can you provide both the writing and reading commands? Thanks again.
Thanks to @edwinksl for pointing me in the right direction with MAT files. I do not know the accepted practice here, but on Stack Exchange Math it is encouraged to answer your own question if a hint from the comments got you all the way there. So I will answer my own question.
The MAT format does this well. Here are example script files for reading and writing in the MAT format (see also the links in the comments above for more documentation):
***Script file OutputTest.m:
filename = 'TestFile.mat';
TestVector=[1 1 0 1];
save(filename, 'TestVector');
***Script file InputTest.m:
filename = 'TestFile.mat';
file = load(filename);
z = file.TestVector;
z    % display the loaded vector
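If you only need one variable out of the MAT file, load can also be told which variable to pull in (just a small variation on the script above):
S = load('TestFile.mat', 'TestVector');   % load only the named variable
z = S.TestVector;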

Loading huge binary file partially into Matlab

I have a huge binary file with double precision numbers and I would like to load parts of it into Matlab. Is there a way to do this?
One way would be if I could convert it to a .mat file (without loading it in Matlab first), but I haven't been able to figure out how (or if it's actually possible).
Any ideas?
PS: I was thinking of using C++ to do the conversion, but it turns out this is really problematic because I'm using a Linux version of C++ (through Cygwin) and a Windows version of Matlab.
If you know what parts of the file you want to load, you can use fseek followed by fread (both preceded by fopen, of course).
For example, jump a few thousand bytes into a file and read a certain number of bytes, as doubles:
fid = fopen('binary.dat','r');
fseek(fid, 3000, 'bof');           % jump 3000 bytes past the beginning of the file
A = fread(fid, N, 'double');       % read N doubles (N is whatever count you need)
fclose(fid);                       % don't forget to close the file
See the section of documentation called Reading Portions of a File for more information.
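If it is easier to think in element indices than raw bytes, the same idea looks like this (a sketch; k0, N, and the file name are placeholders, and each double occupies 8 bytes):
k0  = 5001;                        % index of the first double to read (1-based, hypothetical)
N   = 1000;                        % how many doubles to read (hypothetical)
fid = fopen('binary.dat', 'r');
fseek(fid, (k0-1)*8, 'bof');       % byte offset of element k0
A = fread(fid, N, 'double');       % read N doubles starting there
fclose(fid);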

Simplest way to read space delimited text file matlab

Ok, so I'm struggling with the most mundane of things. I have a space-delimited text file with a header in the first row and a row per observation, and I'd like to open that file in matlab. If I do this in R I have no problem at all; it'll create the most basic matrix and voila!
But MATLAB seems to be annoying with this...
Example of the text file:
"picFile" "subjCode" "gender"
"train_1" 504 "m"
etc.
Can I get something like a matrix at all? I would then like to have MATLAB pull out some data by doing data(1,2) for example.
What would be the simplest way to do this?
It seems like having to write a loop using f-type functions is just a waste of time...
If you have a sufficiently new version of Matlab (R2013b+, I believe), you can use readtable, which is very much like how R does it:
T = readtable('data.txt','Delimiter',' ')
There are many functions for manipulating tables and converting back and forth between them and other data types such as cell arrays.
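If you want data(1,2)-style access afterwards, tables support both positional and named indexing. A quick sketch (the column name assumes readtable turns the quoted header "subjCode" into a variable of that name; check T.Properties.VariableNames to be sure):
T = readtable('data.txt', 'Delimiter', ' ');
T{1, 2}          % contents of row 1, column 2 (brace indexing), like data(1,2)
T.subjCode(1)    % same value, referenced by column name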
There are some other options in the data import and export section of the Statistics toolbox that should work in older versions of Matlab:
tblread: output in terms of separate variables for strings and numbers
caseread: output in terms of a char array
tdfread: output in terms of a struct
Alternatively, textscan should be able to accomplish what you need and probably will be the fastest:
fid = fopen('data.txt');
header = textscan(fid,'%s',3); % Optionally save header names
C = textscan(fid,'%s%d%s','HeaderLines',1); % Skip the rest of the header line, then read the data
fclose(fid); % Don't forget to close file
C{:}
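Individual values then come out of the cell array returned by textscan by indexing into the corresponding column (just an illustration of the layout; 504 is the sample value from the question):
subjCode = C{2};    % second column (%d) as a numeric vector
subjCode(1)         % first observation, e.g. 504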
Found a way to solve my problem.
Because I don't have the latest version of MATLAB and cannot use readtable, which would be the preferred option, I ended up using textread and specifying the format of each column.
Tedious, but maybe the "simplest" way I could find:
[picFile subCode gender]=textread('data.txt', '%s %f %s', 'headerlines',1);
T=[picFile(:) subCode(:) gender(:)]
The textscan solution by @horchler seems pretty similar. Thanks!

Convert binary data to jpeg

I have spent the last couple of hours scouring the Internet for a solution to my problem, and while I have seen some "answers" on other forums, none of them suit my needs...
I have a binary file, which I am creating in Matlab using fwrite (although, if someone has a better way to generate a binary file in Matlab, I'm open to suggestions). Back to my problem: I have this binary file, and I want to convert it to a JPEG. Never mind where the binary data comes from, I just want to generate a JPEG image of the binary data.
Is this even possible? Like I said, there are lots of "solutions" out there to similar problems, but none match up to my needs.
I can write code in C++, if necessary, but for simplicity, I'd like to stay in Matlab.
Any help will be appreciated.
EDIT:
from binary to array:
fid = fopen('yourfilename.bin');
% read the entire file as characters
% transpose so that B is a row vector
B = fread(fid, '*char')';
fclose(fid);
Reshape to an array according to the image dimensions:
C=reshape(B,512,512); % or whatever dimension you have
Once you have the array of characters, just use:
D=int32(str2num(C));
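To actually end up with a JPEG, one possibility (a sketch, assuming D is meant as grayscale intensity data) is to rescale the values to [0,1] and hand the array to imwrite:
I = double(D);
I = (I - min(I(:))) ./ (max(I(:)) - min(I(:)));   % rescale to the [0,1] range
imwrite(I, 'output.jpg');                         % write as a grayscale JPEG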

How do you save data to a text file in a given format?

I want to save a matrix to a text file, so I can read it by another program. Right now I use:
save('output.txt', 'A','-ascii');
But this saves my file as
6.7206983e+000 2.5896414e-001
6.5710723e+000 4.9800797e-00
6.3466334e+000 6.9721116e-001
5.9975062e+000 1.3346614e+000
6.0224439e+000 1.8127490e+000
6.3466334e+000 2.0517928e+000
6.3965087e+000 1.9721116e+000
But I would like to have them saved without the "e-notation" and not with all the digits. Is there an easy way to do this?
Edit: Thank you! That works just fine. Sorry, but I think I messed up your edit by using the rollback.
I would use the fprintf function, which will allow you to define for yourself what format to output the data in. For example:
fid = fopen('output.txt', 'wt');
fprintf(fid,'%0.6f %0.6f\n', A.');
fclose(fid);
This will output the matrix A with 6 digits of precision after the decimal point. Note that you also have to use the functions fopen and fclose.
Ditto gnovice's solution, if you need performance & custom formatting.
dlmwrite gives you some control (global, not on a per-field basis) over formatting. But it suffers from lower performance. I ran a test a few years ago and dlmwrite was something like 5-10x slower than the fopen/fprintf/fclose solution. (Edit: I'm referring to large matrices, like a 15x10000 matrix.)
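For reference, a dlmwrite call along these lines (the delimiter and precision values are only examples) gives fixed-point output similar to the fprintf version above, with the performance caveat just mentioned:
dlmwrite('output.txt', A, 'delimiter', ' ', 'precision', '%0.6f');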