Passing variables to another MATLAB version - matlab

I have to work with an old-fashioned program, and I have to extract its data using 32-bit MATLAB. After extracting the data, I will do some heavy processing on it. To speed up the processing, I'd like to use 64-bit MATLAB instead of 32-bit.
How can I transfer the data from the 32-bit MATLAB to the 64-bit one without saving and loading? Is there any other, faster way?
Thanks,
Mina

You can use memmapfile to share the data between the two MATLAB processes via a memory-mapped file.
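A minimal sketch of the idea, assuming both sessions run on the same machine and agree on the file name, element type, and dimensions (all placeholders here):

% In the 32-bit MATLAB session: dump the extracted data to a flat binary file.
data = single(rand(1000, 1000));                 % stand-in for the extracted data
fid = fopen('shared.dat', 'w');
fwrite(fid, data, 'single');
fclose(fid);

% In the 64-bit MATLAB session: map the same file instead of load()-ing it.
m = memmapfile('shared.dat', 'Format', {'single', [1000 1000], 'x'});
result = sum(m.Data.x(:));                       % heavy processing goes here

Nothing is copied until you index into m.Data.x, so the 64-bit session only pays for the parts it actually touches.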

Related

Train a huge Inception model with Keras

I need to train an Inception model on more than 400,000 images.
I know I can't load them all into memory, since the dataset is too big.
So I will have to train on batches rather than whole epochs, and load every batch from disk.
But won't that be very slow?
Do you know of a different way of doing it?
I also want to apply different, random transformations to my images during training.
I looked at the ImageDataGenerator class, but it's incompatible with the images I have.
So, is there a way to do that without the generator?
Thanks!
You can use the fit_generator method (https://keras.io/models/model/#fit_generator) of the model. This still loads images from disk, but it does so in parallel with training and has less overhead. You can write your own generator to apply the transformations you want (https://wiki.python.org/moin/Generators).
If you need faster file access, take a look at HDF5. You can store the images in HDF5 to provide faster indexing and loading for your program. (http://www.h5py.org/)
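A rough sketch of such a custom generator (the path/label arrays, the random-flip augmentation, and the 299x299 Inception input size are illustrative assumptions; labels is assumed to be a NumPy array):

import numpy as np
from keras.preprocessing import image

def batch_generator(image_paths, labels, batch_size=32):
    # fit_generator expects an endless stream of (inputs, targets) batches
    n = len(image_paths)
    while True:
        idx = np.random.permutation(n)            # reshuffle on every pass
        for start in range(0, n - batch_size + 1, batch_size):
            batch_idx = idx[start:start + batch_size]
            batch_x = np.stack([
                image.img_to_array(
                    image.load_img(image_paths[i], target_size=(299, 299)))
                for i in batch_idx])
            flips = np.random.rand(batch_size) < 0.5   # random horizontal flip
            batch_x[flips] = batch_x[flips][:, :, ::-1, :]
            yield batch_x / 255.0, labels[batch_idx]

# model.fit_generator(batch_generator(paths, y, 32),
#                     steps_per_epoch=len(paths) // 32, epochs=10)

The endless while True loop matters: fit_generator keeps pulling batches across epochs, so the generator must never run out.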

Saving an array in MATLAB efficiently

I want to save an array efficiently in MATLAB. I have an array of size 3000 by 9000. If I save this array in a MAT-file using just the save function, it consumes around 214 MB. If I use fwrite with the float data type, it comes to around 112 MB. Is there any other way I can further reduce the disk space consumed when I save this array in MATLAB?
I would suggest writing the data in binary mode and then compressing it with an algorithm such as bzip2.
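As a quick sketch (MATLAB ships a gzip wrapper, so gzip stands in for bzip2 here; the array and file name are placeholders):

A = rand(3000, 9000);                     % the array from the question
fid = fopen('data.bin', 'w');
fwrite(fid, single(A), 'single');         % raw binary at float precision
fclose(fid);
gzip('data.bin');                         % writes a compressed data.bin.gz

Note that near-random data compresses poorly, so the gain depends on your array's content.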
There are a few ways to reduce the required disk space:
1. Reducing precision
Rather than the double you normally have, consider using a single, or perhaps even a uint8 or logical. Writing with fwrite at float precision, as you did, also does this, but you may want to compress further afterwards, since fwrite does not create a compressed file.
2. Utilizing a pattern
If your matrix has a certain structure, this can sometimes be leveraged to store the data more efficiently, or at least the information needed to recreate the data. The most common example is a matrix that is storable as a few vectors, for example when it is sparse. Both options are sketched below.
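A short sketch of both options (file names and the sparsity threshold are arbitrary):

A = rand(3000, 9000);                     % stand-in for your array

As = single(A);                           % 1. reduce precision: 8 -> 4 bytes per element
save('full_single.mat', 'As');

S = sparse(A .* (A > 0.999));             % 2. exploit structure: store only the nonzeros
save('sparse.mat', 'S');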

Error reading a large binary file in MATLAB

I have to read in a large binary file whose size is 92,504 KB. When I use the fread command, MATLAB gives me this error:
Error using fread Out of memory. Type HELP MEMORY for your options.
I also tried restarting MATLAB, so that any memory in use would be released, but the problem persists.
How can I solve this problem of reading the data?
The problem is the code that you are using to read the data:
[data,count] = fread(fid,'uint8');
The above line tells MATLAB to read in as many uint8 values as it can and put them into a vector.
The trouble is that MATLAB will put them into a vector of doubles. So rather than a vector where each element is one byte, you have a vector where each element is 8 bytes. This makes your 92 MB of data take up 92*8 = 736 MB, which is probably bigger than the maximum possible array size shown by the memory command.
The solution is to tell MATLAB to put the data you are reading into a vector of uint8, which can be achieved as follows:
[data,count] = fread(fid,'*uint8');
This form tells MATLAB that the output vector should have the same type as the input data. You can read more about it in the precision section of the fread documentation.
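A quick way to see the difference yourself (the file name is a placeholder):

fid = fopen('bigfile.bin', 'r');
raw = fread(fid, 1e6, '*uint8');          % 1e6 bytes stay uint8: about 1 MB
frewind(fid);
dbl = fread(fid, 1e6, 'uint8');           % same bytes promoted to double: about 8 MB
fclose(fid);
whos raw dbl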
On a 32-bit system, you may have very little memory available to MATLAB. The fread command you are using reads the entire file at once. This is probably a bad idea, since your system does not have enough memory. A better way is to read the file part by part. See
A = fread(fileID, sizeA)
in the link below [1]. You can put this call inside a loop, as sketched below. If you do want to read the whole file at once, I would recommend a 64-bit system with at least 3 GB of RAM.
[1] http://www.mathworks.in/help/matlab/ref/fread.html
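A minimal sketch of that loop, where the file name and chunk size are placeholders to tune:

fid = fopen('bigfile.bin', 'r');
chunkSize = 1e6;                          % elements per read; tune to your memory
while ~feof(fid)
    [chunk, count] = fread(fid, chunkSize, '*uint8');
    % process chunk(1:count) here before it is overwritten on the next pass
end
fclose(fid);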

Large data file in MATLAB doesn't load/import

I have been trying to load a data file (CSV) into 64-bit MATLAB running on Windows 7 (64-bit), but I get memory-related errors. The file size is about 3 GB, containing a date (dd/mm/yyyy hh:mm:ss) in the first column and bid and ask prices in the other two columns. The memory command returns the following:
Maximum possible array: 19629 MB (2.058e+010 bytes) *
Memory available for all arrays: 19629 MB (2.058e+010 bytes) *
Memory used by MATLAB: 522 MB (5.475e+008 bytes)
Physical Memory (RAM): 16367 MB (1.716e+010 bytes)
* Limited by System Memory (physical + swap file) available.
Can somebody here please explain: if the maximum possible array size is 19.6 GB, why would MATLAB throw a memory error while importing a data array that is only about 3 GB? Apologies if this is a simple question to the experienced, as I have little experience in process/application memory management.
I would greatly appreciate it if someone could also suggest a solution for loading this dataset into the MATLAB workspace.
Thank you.
I am no expert in memory management, but from experience I can tell you that you will run into all kinds of problems when importing/exporting 3 GB text files. Parsing text forces MATLAB to build intermediate copies and convert every field to doubles, so peak memory usage can be several times the file size on disk.
I would either use an external tool to split your data before you read it (or read it in blocks, as sketched below), or look into storing the data in another format that is better suited to large datasets. Personally, I have used HDF5 in the past; it is designed for large sets of data and is also supported by MATLAB.
In the meantime, these links may help:
Working with a big CSV file in MATLAB
Handling Large Data Sets Efficiently in MATLAB
I've posted before showing how to use memmapfile() to read huge text files in MATLAB. This technique may help you as well.
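A hedged sketch of reading the file in blocks with textscan, assuming the file name and the date/time/bid/ask layout described in the question:

fid = fopen('ticks.csv', 'r');
blockSize = 1e5;                          % rows per block; tune to your memory
while ~feof(fid)
    % date and time parse as strings, bid and ask as doubles
    block = textscan(fid, '%s %s %f %f', blockSize, 'Delimiter', ',');
    bid = block{3};  ask = block{4};
    % process or accumulate this block before reading the next one
end
fclose(fid);

This way the whole 3 GB file never has to sit in memory at once.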

Matlab: Is it possible to save in the workspace a vector that contains 4 million values?

I am calculating something, and the result is a vector with 4 million elements. The calculation is not finished yet; I estimate it will take another two and a half hours. When it has finished, will I be able to save the vector?
If it is not possible, what can I do?
Thank you.
On 32-bit Windows you can have at most a double array of roughly 155-200 million elements. Check the limits for other OSs on the MathWorks support page.
Yes, just use the save command. If you only need it for later MATLAB computations, then it is best to save it in .mat format.
save('SavedFile.mat','largeVector')
You can then load your file whenever you need it using the load function.
load('SavedFile.mat')
On my machine it takes 0.01 s to create a random vector with 4 million elements, and whos shows that it occupies (only) 32 MB.
It would take only a few seconds to save that amount of data with MATLAB. If you are working with R2007b or later, it may be better to save with the '-v7.3' option; that format is based on HDF5 and handles large variables well, though there can be some performance/disk-usage trade-offs.
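A small sketch of the whole round trip with the '-v7.3' option mentioned above (the file name is arbitrary):

largeVector = rand(4e6, 1);               % 4 million doubles = 32 MB in memory
tic;
save('SavedFile.mat', 'largeVector', '-v7.3');
fprintf('saved in %.2f s\n', toc);
clear largeVector
load('SavedFile.mat');                    % restores largeVector into the workspace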