How can I resolve an out-of-memory error in MATLAB?

I want to calculate two covariance matrices of size (10304,10304). MATLAB creates the first one, but when it runs the second command, this error occurs:
>> j=ones(10000,10000);
>> jj=ones(10000,10000);
??? Out of memory. Type HELP MEMORY for your options.
My laptop has 2 GB of RAM, of which 1 GB is still free. I am using Windows 7 and 32-bit MATLAB 2009b.
How can I resolve this error?

A 10k-by-10k array of doubles uses 1e8 * 8 bytes, which corresponds to 800 MB. MATLAB needs those 800 MB to be contiguous in its address space. Most likely, your 1 GB of free memory is somewhat fragmented, so MATLAB cannot fit the new array into RAM.
Use the command MEMORY to find out the maximum variable size that MATLAB can handle at a given moment.
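For example (memory is available on Windows only; the field names below come from its documented return structure):
[userview, sysview] = memory;              % Windows-only diagnostic
userview.MaxPossibleArrayBytes / 2^20      % largest contiguous array, in MB
userview.MemAvailableAllArrays / 2^20      % memory available for all arrays, in MB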

Try using sparse matrices; in that case MATLAB stores only the nonzero entries instead of allocating the entire array.
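A rough sketch of the memory difference (note this only pays off if the matrix really is mostly zeros, which a dense covariance matrix is not):
S = spalloc(10304, 10304, 100000);   % reserve room for up to 100k nonzeros
S(:, 1) = 1;                         % only assigned entries consume memory
whos S                               % a few hundred KB, versus ~850 MB for the full matrix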

Try either of these two options to slightly increase the memory available to the Matlab.exe process.
1- Give the Matlab.exe task a higher priority. You can do that by going to Task Manager, selecting the Processes tab, right-clicking the Matlab.exe task, selecting Set Priority, and choosing a higher priority (say, High); this tells Windows to allocate more resources to this process.
2- Increase the page file size for your applications in general. You can do this by right-clicking My Computer -> Properties -> Advanced System Settings -> Advanced -> Performance -> Virtual Memory (Change...). Then untick "Automatically manage paging file size" and set the initial and maximum page file size to, say, 10000 MB.

Go to MATLAB -> File -> Preferences -> General -> Java Heap Memory and increase the level. This solved my problem.

Related

When generating a network of 500 nodes, the BehaviorSpace experiment goes wrong; how can I solve it?

I use the NW extension to generate small-world networks and run an experiment in BehaviorSpace. In the experiment, I set ["nb-nodes" 50 100 500] (nb-nodes: the number of nodes).
Everything goes well until n = 500: CPU usage gets too high, causing the program to become unresponsive. But when I run a single simulation with n = 500 in the interface, it keeps working. Only when I try to do it in BehaviorSpace does it go wrong.
How can I solve it?
Solution:
Thanks to Jasper and Steve Railsback for their advice, it helped a lot :)
In the FAQ (http://ccl.northwestern.edu/netlogo/docs/faq.html#how-big-can-my-model-be-how-many-turtles-patches-procedures-buttons-and-so-on-can-my-model-contain) it says:
"If you are using BehaviorSpace, note that doing runs in parallel will multiply your RAM usage accordingly"
So I just reduced the number of parallel runs from 16 to 8. I know it will slow the program down, but it also reduces RAM usage, and it works.
Of course, adding RAM is another way, and it can accommodate higher computing demands.
I agree that you should start with increasing NetLogo's memory allocation as directed in the FAQ that Jasper referred you to. (At least in Windows, you must edit the NetLogo.cfg file using administrator privileges.) You can start by doubling or quadrupling the allocation, but the only real limitation is how much RAM you have on your machine.
We keep notes and a publication on NetLogo performance issues here:
http://www.railsback-grimm-abm-book.com/jasss-models/

How to store a giant matrix in MATLAB

I'm using a MATLAB script to create and store a large matrix of floating-point numbers. When I tried running this program on my personal laptop, it ended hours later with an "out of memory" message. Supposedly, MATLAB has a limit on the maximum-sized array it can store, which makes sense.
My question is: how do I store a large matrix in MATLAB? Specifically, I'm using a 64-bit Linux OS, and I need to store a 5-6 GB matrix.
I am not an expert in this, but as I understand it, the simplest solution would be to get more RAM. However, you could try to check the available memory at the time of the error with
dbstop if error
memory
This should tell you how much memory is available to MATLAB, how much is currently used, and how large your biggest array can be. If you exceed this, I don't think there is a software solution besides storing the data in multiple smaller files.
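A minimal sketch of that keep-it-on-disk idea using matfile (assumes R2011b or later; the sizes and file name are hypothetical): the matrix lives in a v7.3 MAT-file and only one block is ever in RAM:
nrows = 80000; ncols = 10000;                 % hypothetical size: ~6.4 GB of doubles
m = matfile('bigmatrix.mat', 'Writable', true);
m.M(nrows, ncols) = 0;                        % preallocate the variable on disk
blk = 500;                                    % columns per block (~320 MB in RAM)
for j = 1:blk:ncols
    cols = j:min(j+blk-1, ncols);
    m.M(1:nrows, cols) = rand(nrows, numel(cols));   % write one block at a time
end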
If you get the "Out of memory: Java Heap Space" error, you can increase the memory available to Java under (Home -> Preferences -> General -> Java Heap Memory).
Also check whether your array size is limited to a certain percentage of your available memory under (Home -> Preferences -> Workspace -> MATLAB array size limit) and set it to 100%.
Similar question in the MATLAB forum

How can I use only part of my total RAM during MATLAB computation?

I would like to dedicate 8 GB of RAM instead of the full 12 GB to a very long computation, in order to keep the remainder free for another task. Is it possible?
Is there maybe a MATLAB command that forces the maximum limit of memory usage?
I would like to work with 2 separate editors.
See here for a possible solution to "limit the memory of a process on Windows":
Set Windows process (or user) memory limit
MATLAB has no command to limit its memory usage; it will acquire as much memory as the computation needs. On some operating systems you can limit a process's memory usage, for example using ulimit on Linux. But be aware: when MATLAB needs more than 8 GB, it will not merely slow down when it reaches the limit, it will throw an exception and stop computing.

MATLAB Preallocation

I'm running a simulation of a diffusion-reaction equation in MATLAB, and I preallocate the memory for all of my vectors beforehand. However, during the loop, in which I solve a system of equations using bicg, the amount of memory that MATLAB uses keeps increasing.
For example:
concentration = zeros(N, iterations+1);   % preallocate (the +1 covers column t+1, written below)
for t = 1:iterations
    concentration(:,t+1) = bicg(matrix, concentration(:,t));   % iterative solve of matrix*x = b
end
As the program runs, the amount of memory MATLAB is using increases, which seems to suggest that the matrix, concentration, is increasing in size as the program continues, even though I pre-allocated the space. Is this because the elements in the matrix are becoming doubles instead of zeros? Is there a better way to pre-allocate the memory for this matrix, so that all of the memory the program requires will be pre-allocated at the start? It would be easier for me that way, because then I would know from the start how much memory the program will require and if the simulation will crash the computer or not.
Thanks for all your help, guys. I did some searching around and didn't find an answer, so I hope I'm not repeating a question.
EDIT:
Thanks Amro and stardt for your help guys. I tried running 'memory' in MATLAB, but the interpreter said that command is not supported for my system type. I re-ran the simulation though with 'whos concentration' displayed every 10 iterations, and the allocation size of the matrix wasn't changing with time. However, I did notice that the size of the matrix was about 1.5 GB. Even though that was the case, system monitor was only showing MATLAB as using 300 MB (but it increased steadily to reach a little over 1 GB by the end of the simulation). So I'm guessing that MATLAB pre-allocated the memory just fine and there are no memory leaks, but system monitor doesn't count the memory as in use until MATLAB starts writing values to it in the loop. I don't know why that would be, as I would imagine that writing zeros would trigger the system monitor to see that memory as 'in use,' but I guess that's not the case here.
Anyway, I appreciate your help with this. I would vote both of your answers up as I found them both helpful, but I don't have enough reputation points to do that. Thanks guys!
I really doubt it's a memory leak, since most "objects" in MATLAB clean up after themselves once they go out of scope. AFAIK, MATLAB does not use a GC per se, but a deterministic approach to managing memory.
Therefore I suspect the issue is more likely caused by memory fragmentation: when MATLAB allocates memory for a matrix, it has to be contiguous. So when a function is repeatedly called, creating and deleting matrices, the fragmentation becomes a noticeable problem over time...
One thing that might help you debug is the undocumented profile on -memory option, which tracks allocation in the MATLAB profiler. Check out the memory-monitoring tool by Joe Conti as well. Also, this page has some useful information.
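A minimal sketch of such a profiling session (my_simulation is a hypothetical stand-in for your own code):
profile on -memory     % undocumented switch: also record memory per line
my_simulation();       % hypothetical stand-in for the code under test
profile viewer         % the report gains allocated/freed memory columns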
I am assuming that you are watching MATLAB's memory usage in, for example, the Task Manager on Windows. The memory usage is probably increasing due to the execution of bicg() and variables that have not yet been released after it ends. The memory allocated to the concentration matrix stays the same. You can type
whos concentration
before and after your "for" loop to see how much memory is allocated to that variable.

MATLAB "out of memory" error

When I run a sample script in MATLAB, it says:
Out of memory. Type HELP MEMORY for your options.
When I type "memory", it reports:
Maximum possible array: 156 MB (1.638e+008 bytes) *
Memory available for all arrays: 740 MB (7.756e+008 bytes) **
Memory used by MATLAB: 1054 MB (1.105e+009 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
Is there any way to get around this error? I'm using Windows XP x32 with MATLAB 2009a.
pack performs memory defragmentation: it saves the workspace to disk, clears it, and reloads it. It might help you a bit as far as contiguous memory is concerned.
Remember, when MATLAB says it's out of memory, it means it's out of contiguous memory, so rebooting or restarting MATLAB may work.
But I'd recommend optimizing your code and identifying how you're eating up so much memory. It could be an ill-designed recursive loop, or a bad indexing scheme (using doubles instead of logicals to index a huge matrix), as illustrated below.
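A hedged illustration of the indexing point (the sizes are arbitrary):
x = rand(1e7, 1);      % 80 MB of doubles
mask = x > 0.5;        % logical mask: 1 byte per element (10 MB)
idx = find(mask);      % double indices: 8 bytes each (~40 MB of temporaries)
y1 = x(mask);          % logical indexing, no index array needed
y2 = x(idx);           % same result, but pays for the temporary idx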
I practically lived with memory errors for a while since I was dealing with huge datasets, but there's always a workaround, ask specific questions and you'll be surprised.
Problem fixed.
Under Windows XP x32, I managed to almost double the amount of memory available to MATLAB by editing boot.ini to add the switches /3GB /USERVA=3030:
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /3GB /USERVA=3030
Together with reducing our array sizes, this completely fixed the problem :)
I could also have fixed the problem by upgrading to Windows x64 or Windows 7 x64. That likewise doubles the amount of memory available to MATLAB, even if you stick with 32-bit MATLAB: under 64-bit Windows, a 32-bit large-address-aware process such as MATLAB gets a full 4 GB of virtual address space, even on systems with only 4 GB of physical RAM installed.
Try this, it works well for me.
Go to Home -> Preferences -> General -> Java Heap Memory and allocate as much memory as you want.
In the Preferences window, also go to "Workspace" (below the Java Heap Memory entry) and find "MATLAB array size limit".
Make sure 'Limit the maximum array size to a percentage of RAM' is unchecked, because you want to extend the usable memory and don't need this cap.
Done.
What are you attempting to allocate when it runs out of memory (OOM)? Do you have code to reproduce? A wide range of problems can cause out of memory errors.
To diagnose, use "dbstop if all error" to set a breakpoint on errors. The out-of-memory error will trigger this, and you can use dbup, dbdown, and whos() to see what's consuming memory. Often an OOM is caused by a bad array size or index calculation, not just by big data structures. E.g., this will trigger an OOM in pretty much any 32-bit MATLAB:
>> x = 1;
>> x(2^30) = 2
??? Out of memory. Type HELP MEMORY for your options.
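A sketch of that diagnostic session (the K>> lines show what you would type at the debug prompt):
dbstop if all error    % break on any error, even ones caught by try/catch
% run the failing code; when the OOM hits, you land at a K>> debug prompt:
%   K>> whos           % list variable sizes in the current workspace
%   K>> dbup           % move up the call stack and inspect again
%   K>> dbquit         % leave debug mode when finished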
I faced a similar error while running an (old) C file in MATLAB using mex.
I found my solution at this issue on GitLab.
First, uncheck the option "Limit the maximum Array Size to a % of RAM" located under Preferences -> Workspace, as also indicated in this earlier answer.
Once applied, compile your C file in the command window using
mex filename.c -compatibleArrayDims