I have a .dat binary file. Currently I can open it and read in, say, the first 10 values in MATLAB with the following code:
binaryFile = fopen(filepath,'r');
data = fread(binaryFile, 10);
What I want is to be able to read in, say, the values from the 100th to the 500th, rather than starting from the beginning. My reason is that I do not want to read into memory data that I won't use.
How to do this in MATLAB?
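One way to do this (a minimal sketch using fseek, assuming the default uint8 precision of the fread call above, where each value is one byte) is to seek past the values you don't need before reading:

binaryFile = fopen(filepath,'r');
fseek(binaryFile, 99, 'bof');   % skip the first 99 values (1 byte each at the default precision)
data = fread(binaryFile, 401);  % read values 100 through 500 inclusive
fclose(binaryFile);

If each value were stored as a double instead, the offset would be 99*8 bytes and the fread call would pass 'double' as the precision.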
I generated and saved a large number of data files using Octave, and now I need to open them in MATLAB as part of analyzing them. MATLAB spits out this error.
Error using load
Unable to read MAT-file
[file path]/PhPar_40.mat: not a binary MAT-file.
Try LOAD -ASCII to read as text.
Trying its suggestion of load -ASCII then gives this error.
Error using load
Number of columns on line 2 of ASCII file
[filepath]/PhPar_40.mat must be the same as previous lines.
I (now) understand that Octave is capable of saving in a MATLAB-readable format, but re-creating these data files would take an inordinate amount of time and really isn't an option. Is there a way to get MATLAB to read these files?
MATLAB can't open these files because they were not saved by Octave in a MATLAB-compatible format. Try re-saving them in Octave with the following command:
save -mat7-binary '[filepath]/PhPar_40.mat' 'm'
If you have a large number of files, you can place them all in a folder and then run a loop that loads each one and re-saves it in the correct format automatically. The loop would look like this:
files = dir('[filepath_read]/*.mat');
for index = 1:numel(files)
  m = load(fullfile('[filepath_read]', files(index).name));  % load the saved variables into the struct m
  save('-mat7-binary', fullfile('[filepath_write]', files(index).name), 'm');
endfor
Once you have converted all the files to the right format, load them in MATLAB. I hope this solves your problem.
I have a 37,000,000x1 double array saved in a MAT-file under a structure labelled r. I can point to this file using matfile(...) and then just use the find(...) command to find all values above a threshold val.
This finds all the values greater than or equal to 0.004, but given the size of my data, this takes some time.
I want to reduce the time, and have considered using .bin files (apparently they are better than .txt files in terms of not losing precision?), but I'm not knowledgeable about the syntax/method.
I've managed to save the data into the bin file, but what is the quickest way to search through this large file?
The only output data I want are the actual values greater than my specified value.
Is using a .bin file best? Or a MAT-file?
I don't want to load the entire file into MATLAB. I want to conserve MATLAB's memory, as other programs may need the space, and I don't want memory errors again.
As @OlegKomarov points out, a 37,000,000-element array of doubles is not very big. Your real problem may be that you don't have enough RAM and/or are using a 32-bit version of MATLAB. The find function will require additional memory for the input and the output array of indices.
If you want to load and process your data in chunks, you can use the matfile function. Here's a small example:
fname = [tempname '.mat']; % Use temp directory file for example
matObj = matfile(fname,'Writable',true); % Create MAT-file
matObj.r = rand(37e4,1e2); % Write random data to r variable in file
szR = size(matObj,'r'); % Get dimensions of r variable in file
idx = [];
for i = 1:szR(2)
idx = [idx;find(matObj.r(:,i)>0.999)+(i-1)*szR(1)]; % Linear indices of values greater than 0.999
end
delete(fname); % Delete example file
This will save you memory, but it is definitely not faster than storing everything in memory and calling find once. File access is always slower (though it will help a bit if you have an SSD). The code above uses dynamic memory allocation for the idx variable, but the memory is only re-allocated a few times in large chunks, which can be quite fast in current versions of MATLAB.
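If that reallocation does become a bottleneck, one alternative (a sketch, not part of the original answer) is to collect the per-column results in a cell array and concatenate once at the end:

idxCells = cell(szR(2),1);
for i = 1:szR(2)
    idxCells{i} = find(matObj.r(:,i)>0.999) + (i-1)*szR(1); % linear indices into r
end
idx = vertcat(idxCells{:}); % single concatenation instead of growing idx in the loop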
I have large .bin files (10 GB to 60 GB) that I want to import into MATLAB; each binary file represents the output of two sensors, thus there are two columns of data. Here is a more manageably sized example of my data.
You will notice that there is a .txt version of the data. I need to import the .bin files directly into MATLAB; I can't use the .txt version because converting larger files takes hours.
The problem I have is that the .bin file has header information that I can't seem to interpret properly, and thus I cannot extract the data in MATLAB; every time I try, I seem to get gibberish values.
This is all the information I have about the binary header:
Loading Labview Binary Data into Matlab
LabVIEW Data Logger: Binary Header File Format
Any help/advice would be much appreciated; I have been trying to solve this problem for days now.
P.S. Someone has already written a function to solve this problem but it does not seem to work with my binary data (could be something to do with the dimensions/size of my data): http://www.mathworks.co.uk/matlabcentral/fileexchange/27195-load-labview-binary-data
Below is the code that I am using to import my data; I believe that d1 and d2 are the dimensions of my binary data. d2 is probably incorrect for the example file in the Dropbox because it has been truncated.
The problem I have is that the code extracts my data, and I know it is correct because I can check it against the .txt file (also in the Dropbox); however, there are seemingly random bad values between the good data points. These bad values result from the following strings scattered throughout the binary file: "NI_ChannelName", "Sensor A", "Sensor B", "NI_UnitDescription", and "Volts".
clear all
clc
fname = 'RTL5_57.bin';
fid = fopen(fname,'r','ieee-be'); % open the file as big-endian
d1 = fread(fid,4);                % first dimension field (read as 4 raw bytes)
trash = fread(fid,2,'double');    % skip two header values
d2 = fread(fid,4);                % second dimension field (read as 4 raw bytes)
trash = fread(fid,1,'double');    % skip one header value
data = fread(fid,'double');       % read the rest of the file as doubles
I suppose you will need to change the data format; see the MATLAB help.
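As a hedged illustration of what changing the data format might look like: LabVIEW typically stores array dimensions as 4-byte big-endian integers, so one assumption worth testing is that the four bytes read into d1 and d2 above are each a single int32 rather than four separate values:

fid = fopen(fname,'r','ieee-be');
d1 = fread(fid,1,'int32');     % dimension 1 as one big-endian int32 (assumption)
trash = fread(fid,2,'double');
d2 = fread(fid,1,'int32');     % dimension 2 as one big-endian int32 (assumption)
trash = fread(fid,1,'double');
data = fread(fid,'double');
fclose(fid);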
https://decibel.ni.com/content/docs/DOC-39038
Scope:
1) Write a binary file in MATLAB and read it into LabVIEW.
2) Write a binary file in LabVIEW and read it into MATLAB.
Background:
IMPORTANT:
You must know three things about the binary data in the file before you can read the data:
1) the binary format (precision) that was used to store the data
2) the exact number of values in the file to read
3) the endianness
Binary files have no notion of rows or columns. Think of the data as one long row (or one long column) that needs to be mapped onto a 2D array.
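For example, for a file holding two interleaved channels of doubles (a sketch under that assumption, with a hypothetical file name), the shape argument of fread does the mapping:

fid = fopen('twochannel.bin','r'); % 'twochannel.bin' is a hypothetical file name
A = fread(fid,[2 Inf],'double');   % 2 rows, filled column by column
A = A.';                           % transpose so each channel becomes a column
fclose(fid);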
Resources on data in binary format.
http://cse.unl.edu/~sincovec/Matlab/Lesson%2024/Binary/CS211%20Lesson%2024%20-%20Binary%20File%20Input-Output.htm
I have 20 text files each containing a vector of size 180. How can I access each text file and assign the vector to a variable?
If you want to do it manually use Import Data under File (or use uiimport).
If you want to automate it use fid = fopen(filename) and then var = textscan(fid, 'format'), where format depends on how your vectors are structured. Spend some time reading doc textscan and doc fopen; everything you probably need to know is in those two pages. If your data is nicely structured, look at doc importdata.
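A minimal loop along those lines (assuming hypothetical file names data1.txt through data20.txt, each holding 180 whitespace-separated numbers):

vectors = zeros(180,20);
for k = 1:20
    fid = fopen(sprintf('data%d.txt',k),'r'); % hypothetical naming scheme
    c = textscan(fid,'%f');                   % read all numeric values in the file
    fclose(fid);
    vectors(:,k) = c{1};                      % store the k-th vector as the k-th column
end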
My knowledge of matlab is merely on a need to know basis, so this is probably an elementary question. Nevertheless here it comes:
I have got a file containing data (16-bit integers) stored in binary format. How do I read it into a vector/array in MATLAB? How do I write this data to a file in MATLAB? Is there any smart tweak to increase performance when reading/writing a huge amount of data (gigabytes)?
As Bill the Lizard wrote, you can use fread to load the data into a vector. I just want to expand a little on his answer.
Reading Data
>> fid=fopen('data.bin','rb') % opens the file for reading
>> A = fread(fid, count, 'int16') % reads _count_ elements and stores them in A.
The commands fopen and fread default to your machine's native byte order, which is little-endian[1] on most modern hardware. If your file is big-endian encoded you will need to change the fread call to
>> A = fread(fid, count, 'int16', 'ieee-be');
Also, if you want to read the whole file set
>> count=inf;
and if you want to read the data into a matrix with n rows use
>> count=[n inf];
Writing Data
As for writing the data to a file: the fwrite command in Bill's answer will write to a binary file. If you want to write the data to a text file instead, you can use dlmwrite
>> dlmwrite('data.csv',A,',');
References
[1] http://en.wikipedia.org/wiki/Endianness
Update
The machine format (i.e., ieee-be, ieee-le, vaxd, etc.) of the binary data can be specified in either the fopen or the fread command in MATLAB. Details of the supported machine formats can be found in MATLAB's documentation of fopen.
Scott French's comment to Bill's answer suggests reading the data into an int16 variable. To do this use

>> A = int16(fread(fid,count,precision,machineFormat));

where count is the size/shape of the data to be read, precision is the data format, and machineFormat is the encoding of each byte.
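An alternative (a small sketch using fread's class-preserving precision syntax) reads the values directly as int16 and avoids the extra cast:

>> A = fread(fid,count,'int16=>int16',machineFormat); % returns an int16 array directly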
See the fseek command to move around within a file. For example,
>> fseek(fid,0,'bof');
will rewind the file to the beginning where bof stands for beginning of file.
Assuming you know how many values you have stored in the file, you can do something like this to read the data into an array.
fid = fopen('data.bin','rb')   % open the file for binary reading
A = fread(fid, count, 'int16') % count is the number of int16 values to read
To write data to a file do this:
fid = fopen('data.bin','w')     % open (or create) the file for writing
count = fwrite(fid, A, 'int16') % write the array A as int16 values
The fwrite function returns the number of elements (not bytes) written to the file.
As far as performance tuning goes, you can read data in chunks to only use as much as you need to process. This is the same in any language, and there's no way to speed it up that's specific to Matlab.
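For example, a chunked read might look like this (a sketch assuming int16 data; the chunk size here is an arbitrary value to tune to your memory budget):

fid = fopen('data.bin','rb');
chunkSize = 1e6;                          % number of int16 values per chunk
while ~feof(fid)
    chunk = fread(fid,chunkSize,'int16'); % returns fewer values at the end of the file
    if isempty(chunk), break; end
    % ... process chunk here ...
end
fclose(fid);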
I usually hate seeing links in a response, but this looks pretty close:
http://www.mathworks.com/support/tech-notes/1400/1403.html
As to the second part of performance tuning, it's been 6 years since I've used Matlab, so I don't know.
HTH