Saving an array in MATLAB efficiently

I want to save an array efficiently in MATLAB. I have an array of size 3000 by 9000. If I save this array in a MAT file using just the save function, it consumes around 214 MB. If I use fwrite with the float data type, the file comes to around 112 MB. Is there any other way I can further reduce the hard disk space consumed when saving this array in MATLAB?

I would suggest writing the data in binary mode and then applying a compression algorithm such as bzip2.
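For instance, a minimal sketch of that approach, assuming the 3000 x 9000 array is called A, and using MATLAB's built-in gzip in place of bzip2 (MATLAB has no built-in bzip2 compressor):
fid = fopen('data.bin', 'w');
fwrite(fid, A, 'single');   % 4 bytes per element instead of 8
fclose(fid);
gzip('data.bin');           % produces data.bin.gz; delete data.bin if desired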

There are a few ways to reduce the required memory:
1. Reducing precision
Rather than the double you normally have, consider using a single, or perhaps even a uint8 or logical. Writing with fwrite (as you did) also does this, but you may want to compress the result afterwards, since fwrite does not create a compressed file.
2. Utilizing a pattern
If your matrix has a certain pattern, this can sometimes be leveraged to store the data, or at least the information needed to recreate it, more efficiently. The most common example is a matrix that can be stored as a few vectors, for example when it is sparse. Both ideas are illustrated in the sketch below.
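A minimal sketch of both ideas, assuming the 3000 x 9000 array from the question is called A (the sparse part uses a hypothetical thresholding just to produce a mostly-zero matrix):
A = rand(3000, 9000);           % about 216 MB in memory as double
As = single(A);                 % idea 1: halve the bytes per element
save('data_single.mat', 'As');
B = A .* (A > 0.999);           % hypothetical matrix that is mostly zeros
S = sparse(B);                  % idea 2: store only the nonzeros and their indices
save('data_sparse.mat', 'S');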

Related

loading large csv files in Matlab

I have CSV files of 6 GB in size and I tried using the import function in Matlab to load them, but it failed due to a memory issue. Is there a way to reduce the size of the files?
I think the number of columns is causing the problem. I have 133076 rows by 2329 columns. I had another file with the same number of rows but only 12 columns, and Matlab could handle that. However, once the number of columns increases, the files get really big.
Ultimately, if I can read the data column-wise so that I have 2329 column vectors of length 133076, that would be great.
I am using Matlab 2014a.
Numeric data are by default stored by Matlab in double precision format, which takes up 8 bytes per number. Data of size 133076 x 2329 therefore take up 2.3 GiB in memory. Do you have that much free memory? If not, reducing the file size won't help.
If the problem is not that the data themselves don't fit into memory, but is really about the process of reading such a large csv-file, then maybe using the syntax
M = csvread(filename,R1,C1,[R1 C1 R2 C2])
might help, which allows you to only read part of the data at one time. Read the data in chunks and assemble them in a (preallocated!) array.
If you do not have enough memory, another possibility is to read chunkwise and then convert each chunk to single precision before storing it. This reduces memory consumption by a factor of two.
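A sketch of that chunked approach, assuming a hypothetical file name data.csv and the dimensions from the question (note that csvread uses zero-based row and column offsets):
nRows = 133076; nCols = 2329;
chunk = 10000;                          % rows per chunk, tune to your memory
M = zeros(nRows, nCols, 'single');      % preallocate in single precision
for r1 = 0:chunk:nRows-1
    r2 = min(r1 + chunk - 1, nRows - 1);
    M(r1+1:r2+1, :) = single(csvread('data.csv', r1, 0, [r1 0 r2 nCols-1]));
end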
And finally, if you don't process the data all at once, but can implement your algorithm such that it uses only a few rows or columns at a time, that same syntax may help you to avoid having all the data in memory at the same time.

Read and represent mp3 files using memmapfile in matlab

I have to analyze bioacoustic audio files using Matlab. Eventually I want to be able to find anomalies in the audio. That's the reason I need a way to represent the audio that lets me extract and compare features. I'm dealing with mp3 files of up to 150 MB. These files are too large for Matlab to read into its memory, so I want to use the memmapfile() function. I used the following code and a small mp3 file to find out how it actually works.
[testR, ~] = audioread('test.mp3');
testM = memmapfile('test.mp3');
disp(testM.Data);
disp(testR);
The actual values of testM.Data and testR are different: audioread() returns a 7483391 x 2 matrix and memmapfile() a 4113874 x 1 vector.
I'm not really sure how memmapfile() works; I expected these to be equal to each other. Is there a way to read mp3 files in the same format audioread() does using memmapfile()? And what does memmapfile actually return in the case of an audio file? Maybe it's also usable in the vector format for anomaly detection?
Thanks in advance!
NOTE: The original files were in WAV IMA ADPCM format with sizes from 1.5 up to 2.5 GB. Since Matlab can't deal with that format and the size of the files, I converted them to 8-bit mp3 files.
I think the problem is that memmapfile by default reads the raw bytes of the file as uint8, without decoding them, while the audioread function decodes the mp3 into audio samples.
As you can see here, you can specify the format of the data when you read it with memmapfile, so try "playing" with different values. From the documentation I read that you can read data in double format, so try modifying the memmapfile data format and the audioread data format.
One last thing: memmapfile always returns the data as an N-by-1 vector, so if you want the original shape you need to use something like reshape.
Anyway, if you work with big data I suggest you try something different from memmapfile, because it is very, very slow.
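A small sketch of working with the mapped bytes, reusing test.mp3 from the question. Keep in mind that this maps the encoded mp3 bitstream, not decoded samples, so the values will not match audioread's output:
m = memmapfile('test.mp3');                    % default Format is 'uint8'
rawBytes = m.Data;                             % N-by-1 vector of raw file bytes
% Reinterpret pairs of bytes as int16 (same idea as passing 'Format','int16',
% but robust to a trailing odd byte):
vals16 = typecast(rawBytes(1:2*floor(numel(rawBytes)/2)), 'int16');
% memmapfile output is always N-by-1; reshape recovers a two-column layout:
twoCols = reshape(vals16(1:2*floor(numel(vals16)/2)), [], 2);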

Efficient way to store single matrices generated in a loop in Matlab?

I would like to know whether there is a way to reduce the amount of memory used by the following piece of code in Matlab:
n=3;
T=100;
r=T*2;
b=80;
BS=1000;
bsuppostmp_=cell(1,BS);
bslowpostmp_=cell(1,BS);
bsuppnegtmp_=cell(1,BS);
bslownegtmp_=cell(1,BS);
for w=1:BS
bsuppostmp_{w}= randi([0,1],n*T,2^(n-1),r,b);
bslowpostmp_{w}=randi([0,3],n*T,2^(n-1),r,b);
bsuppnegtmp_{w}=randi([0,4],n*T,2^(n-1),r,b);
bslownegtmp_{w}=randi([0,2],n*T,2^(n-1),r,b);
end
I have decided to use cells of matrices because after this loop I need to call separately each single matrix in another loop.
If I run this code I get the error message "Your system has run out of application memory".
Do you know a more efficient (in terms of memory) way to store each single matrix?
Let's refer to the documentation page Strategies for Efficient Use of Memory:
Because simple numeric arrays (comprising one mxArray) have the least overhead, you should use them wherever possible. When data is too complex to store in a simple array (or matrix), you can use other data structures.
Cell arrays are comprised of separate mxArrays for each element. As a result, cell arrays with many small elements have a large overhead.
I doubt that the overhead for the cell array is really large here ...
Let me give a possible explanation: what if Matlab cannot use the swap file when the 4D arrays are stored in a cell array? When storing large numeric arrays, there is no out-of-memory error because Matlab uses the swap file to cache each variable when the memory used becomes too big. But if each 4D array is stored in one big cell array, Matlab sees it as a single variable and cannot split it between RAM and the swap file. I don't work at MathWorks, so I don't know whether this is right; it's just an idea about this topic, and I would be glad to know the real reason.
So my advice is the same as the other comments: free each matrix as soon as you are done with it. There are not many ways to store many dense arrays: one big array (NOT recommended here, since Matlab makes it contiguous, so you would hit out-of-memory even sooner), a cell array, or a struct array (and if I understand the documentation correctly, the overhead can be equivalent). In any case the amount of data over all the 4D arrays is really large: each array is 300 x 4 x 200 x 80 doubles, about 154 MB, so the four cell arrays of 1000 matrices each ask for roughly 600 GB in total. The best thing to do is to keep memory constantly as low as possible by discarding data once it has been used, and keeping in memory only the results of the computation (in case they take less memory ...).
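As a sketch of that advice, combined with the precision idea from the first question: the values here are small integers in [0, 4], so randi can generate them directly as uint8 (an 8x saving over double), and each matrix can be processed and discarded inside the loop. The sum is just a stand-in for whatever per-matrix computation you actually do:
n = 3; T = 100; r = T*2; b = 80; BS = 1000;
results = cell(1, BS);
for w = 1:BS
    % generate directly as uint8: about 19 MB per array instead of about 154 MB
    bsuppos  = randi([0,1], n*T, 2^(n-1), r, b, 'uint8');
    bslowpos = randi([0,3], n*T, 2^(n-1), r, b, 'uint8');
    % keep only the (small) result of the computation
    results{w} = sum(bsuppos(:), 'double') + sum(bslowpos(:), 'double');
    clear bsuppos bslowpos   % free the big arrays before the next pass
end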

error reading a large binary file in MATLAB

I have to read in a large binary file whose size is 92,504 KB. When I use the fread command, MATLAB gives me the error:
Error using fread
Out of memory. Type HELP MEMORY for your options.
I also tried restarting MATLAB so that any virtual memory in use would be cleared, but the problem persists.
How can I solve this problem of reading the data?
The problem is the code that you are using to read the data:
[data,count] = fread(fid,'uint8');
The above line tells matlab to read in as many uint8s as it can and put them into a vector.
The trouble is that matlab will put it into a vector of doubles. So rather than a vector where each element is one byte, you have a vector where each element is 8 bytes. This ends up making your 92 MB of data take up 92*8 = 736 MB, which is probably bigger than the maximum possible array size shown by the memory command.
The solution here is to tell matlab to put the data you are reading into a vector of uint8 which can be achieved as follows:
[data,count] = fread(fid,'*uint8');
This method for reading in the data tells matlab that the output vector should be the same type as the input data. You can read more about it in the precision section of the fread documentation.
On a 32-bit system, you may have very little memory available to MATLAB. The fread command you are using reads the entire file at once. This is probably a bad idea, since your system does not have enough memory. A better way would be to read the file part by part. See
A = fread(fileID, sizeA)
in the link below [1]. You can put this code inside a loop, as sketched after this answer. In case you want to read the whole file at once, what I would recommend is to use a 64-bit system with 3 GB of RAM.
[1] http://www.mathworks.in/help/matlab/ref/fread.html
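A minimal sketch of such a loop, assuming a hypothetical file name data.bin and keeping the '*uint8' trick from the first answer so the data stays at one byte per element:
fid = fopen('data.bin', 'r');
chunkSize = 1e6;                              % elements per chunk
while ~feof(fid)
    chunk = fread(fid, chunkSize, '*uint8');  % reads up to chunkSize bytes
    % ... process chunk here, keeping only what you need ...
end
fclose(fid);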

Alternatives to Matlab's Mat File Format

I'm finding that writing and reading the native MAT file format becomes very, very slow with larger data structures of about 1 GB in size. In addition, we have other, non-Matlab software that should be able to read and write these files. So I would like to find an alternative format for serializing Matlab data structures. Ideally this format would ...
be able to represent an arbitrary Matlab structure in a file.
have faster I/O than MAT files.
have I/O libraries for other languages like Java, Python and C++.
Simplifying your data structures and using the new v7.3 MAT file format, which is a variant of HDF5, might actually be the best approach. The HDF5 format is open and already has I/O libraries for your other languages. And depending on your data structure, they may be faster than the old binary mat files.
Simplify the data structures you're saving, preferring large arrays of primitives to complex container structures.
Try turning off compression if your data structures are still complex.
Try the v7.3 MAT file format by passing "-v7.3" to save (see the sketch after this list)
If using a network file system, consider saving and loading to a temporary dir on a fast local drive and copying to/from the network
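A quick way to test the format suggestion, using a hypothetical 1 GB primitive array (timings will vary by system and data):
A = rand(1, 125e6);                               % 125M doubles, about 1 GB
tic; save('test_v7.mat',  'A');          tV7  = toc;   % default v7, compressed
tic; save('test_v73.mat', 'A', '-v7.3'); tV73 = toc;   % HDF5-based v7.3
fprintf('v7: %.1f s   v7.3: %.1f s\n', tV7, tV73);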
For large data structures, your MAT file I/O speed may be determined more by the internal structure of the data you're writing out than by the size of the resulting MAT file itself. (In my experience, this has usually been the major factor in slow MAT files.) When you say "arbitrary Matlab structure", that suggests you might be using cells, structs, or objects to build complex data structures. That slows down MAT I/O because there is per-array overhead in MAT file I/O, and the members of cell and struct arrays (container types) all count as separate arrays. For example, 5,000 strings stored in a cellstr are much, much slower to save than the same 5,000 strings stored in a 2-D char array. And objects have even more overhead. As a test, try writing out a 1 GB file that contains just a 1 GB primitive array of random uint8s, and see how long that takes. From there, see if you can simplify your data to reduce the total mxArray count, even if that means reshaping it for serialization. (My experience with this is mostly with the v7 format; the newer HDF5-based format may have less per-element overhead.)
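A sketch of the cellstr-versus-char comparison described above (hypothetical sizes; the point is the per-mxArray overhead, not the exact numbers):
c = repmat({'example string'}, 5000, 1);   % 5,000 separate mxArrays
m = char(c);                               % one 5000-by-14 char array
tic; save('as_cellstr.mat', 'c'); tCell = toc;
tic; save('as_char.mat',    'm'); tChar = toc;
fprintf('cellstr: %.3f s   char: %.3f s\n', tCell, tChar);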
If your data files live on the network, you could also try doing the save and load operations on temporary files on fast local drives, and separately using copy operations to move them to and from the network. At least on Windows networks, I've seen speedups of up to 2x from doing this, possibly due to optimizations the full-file copy operation can do that the MAT I/O code can't.
It would probably be a substantial effort to come up with an alternate file format that supported fully arbitrary Matlab data structures and was portable to other languages. I'd try making smaller changes around your use of the existing format first.
The MAT format has changed across Matlab versions. v7.3 uses the HDF5 format, which has built-in compression and other features, and it can take a long time to read/write. However, you can force Matlab to use previous formats, which are faster (but might take more space).
See here:
http://www.mathworks.com/help/matlab/import_export/mat-file-versions.html