When I run a sample script in MATLAB, it says:
Out of memory. Type HELP MEMORY for your options.
When I type "memory", it reports:
Maximum possible array: 156 MB (1.638e+008 bytes) *
Memory available for all arrays: 740 MB (7.756e+008 bytes) **
Memory used by MATLAB: 1054 MB (1.105e+009 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
Is there any way to get around this error? I'm using Windows XP x32 with MATLAB 2009a.
pack performs memory defragmentation by saving your workspace variables to disk and reloading them. It might help you a bit with the amount of contiguous memory available.
Remember, when MATLAB says it's out of memory, it means it's out of contiguous memory, so rebooting or restarting MATLAB may work.
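If you want to try it, a minimal sketch of the sequence (pack can only be run from the command prompt and works best with as little as possible in the workspace; the variable name is just illustrative):
>> clear someBigTemporary    % free whatever you can first
>> pack                      % saves variables to disk and reloads them into contiguous blocks
>> memory                    % check whether "Maximum possible array" has grown (Windows only)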
But, I'd recommend optimizing your code and identifying how you're eating up so much memory. It could be an ill-designed recursive loop, or a bad indexing function (using doubles instead of logicals to index a huge matrix).
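To illustrate the indexing point: a double index vector costs 8 bytes per element, while a logical mask costs 1 byte per element. A rough sketch (names are made up):
A = rand(1e7, 1);             % a large vector, ~80 MB
idxDouble = find(A > 0.5);    % ~5e6 doubles, ~40 MB of indices
idxLogical = A > 0.5;         % 1e7 logicals, ~10 MB mask
whos idxDouble idxLogical     % compare the Bytes column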
I practically lived with memory errors for a while, since I was dealing with huge datasets, but there's always a workaround. Ask specific questions and you'll be surprised.
Problem fixed.
Under Windows XP x32, I managed to almost double the amount of memory available to MATLAB by editing boot.ini to add the switches /3GB /USERVA=3030:
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /3GB /USERVA=3030
Together with reducing our array sizes, this completely fixed the problem :)
I could have also fixed the problem by upgrading to Windows x64 or Windows 7 x64. This also roughly doubles the amount of memory available to MATLAB, even if you stick with MATLAB x32 and don't upgrade to MATLAB x64. Windows x64 is just far more memory efficient, even on systems with only 4 GB of physical RAM installed.
Try this; it works well for me.
Go to Home -> Preferences icon -> General -> Java Heap Memory and allocate as much memory as you want.
In the Preferences window, go to "Workspace" (outside the Java Heap Memory section) and find "MATLAB array size limit".
Make sure 'Limit the maximum array size to a percentage of RAM' is unchecked; since you want to extend the available memory, you don't need this limit.
Done.
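After restarting MATLAB, one way to verify the new heap size took effect (a quick check, assuming the usual Sun/Oracle JVM underneath):
>> double(java.lang.Runtime.getRuntime.maxMemory) / 2^20    % maximum Java heap, in MB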
What are you attempting to allocate when it runs out of memory (OOM)? Do you have code to reproduce? A wide range of problems can cause out of memory errors.
To diagnose, use "dbstop if all error" to set a breakpoint on errors. The out-of-memory error will trigger this, and you can use dbup, dbdown, and whos() to see what's consuming memory. Often an OOM is caused by a bad array size or index calculation, not just by big data structures. E.g. this will trigger an OOM in pretty much any 32-bit MATLAB:
>> x = 1;
>> x(2^30) = 2
??? Out of memory. Type HELP MEMORY for your options.
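The debugging session then looks roughly like this (my_script is a placeholder for whatever triggers the error):
>> dbstop if all error
>> my_script
??? Out of memory. Type HELP MEMORY for your options.
K>> whos    % list variables and their sizes in the current stack frame
K>> dbup    % move up one frame and run whos again to find the oversized array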
I faced a similar error while running an (old) C file in MATLAB using mex.
I found my solution at this issue on GitLab.
First, uncheck the option "Limit the maximum Array Size to a % of RAM" located under Preferences -> Workspace, as also indicated in this earlier answer.
Once applied, run your C file in the command window using
mex filename.c -compatibleArrayDims
Related
I'm using a matlab script to create and store a large matrix of floating point numbers. When I tried running this program on my personal laptop, the program ended hours later with the message 'out of memory'. Supposedly, Matlab has a limit for the maximum-sized array it can store, which makes sense.
My question is: how to store large matrix in matlab? Specifically, I'm using a 64-bit linux OS, and I need to store a 5-6 GB matrix.
I am not an expert in this, but as I understand it the simplest solution would be to get more RAM. However, you could try to check the available memory at the time of the error with
dbstop if error
memory
This should tell you how much memory is available for MATLAB, how much is currently used, and how large your biggest array can be. If you exceed this, I don't think there is a software solution besides storing the data in multiple smaller files.
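If you do go the multiple-smaller-pieces route, one sketch (assuming R2011b or newer for matfile; file and variable names are made up) is to write the matrix to a v7.3 MAT-file block by block so it never has to fit in RAM at once:
m = matfile('bigmatrix.mat', 'Writable', true);
m.M(100000, 8000) = 0;                  % creates a ~6 GB double matrix on disk
for i = 1:100
    rows = (i-1)*1000 + (1:1000);
    m.M(rows, :) = rand(1000, 8000);    % write one 1000-row block at a time
end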
If you get the "Out of memory: Java heap space" error, you can increase the memory available for Java under Home -> Preferences -> General -> Java Heap Memory.
Also check whether your array size is limited to a certain percentage of your available memory under Home -> Preferences -> Workspace -> MATLAB array size limit, and set it to 100%.
Similar question in Matlab forum
I have several "out of memory" problems using MATLAB. I don't understand exactly whether MATLAB can use all the RAM of my computer or not. This is the problem: my computer has 4 GB of RAM and 2 GB of swap (my OS is Linux/Ubuntu 12.10), but MATLAB only uses up to 2.6 GB and then shows the warning: "out of memory".
Is it possible to fix this and allow Matlab to use all the "available" memory?
Thanks.
It sounds like you're running 32-bit Linux and/or 32-bit MATLAB.
If you allow for enough swap, a process can take up to its virtual address space worth of memory.
Generally for 32-bit Linux you're limited to 3 GB of address space for any process (the last GB is kernel memory space). It's entirely possible, depending on usage patterns, that at 2.6 GB the next request for memory can't complete because there isn't enough /contiguous/ memory to satisfy it. This is especially common when growing large arrays, as sketched below.
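A sketch of the array-growing pattern that aggravates this, versus preallocating one contiguous block:
x = [];
for i = 1:1e5
    x(end+1) = i;      % growing: each append may reallocate a fresh contiguous block
end
x = zeros(1, 1e5);     % preallocating: one contiguous allocation up front
for i = 1:1e5
    x(i) = i;
end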
Upgrading to a 64-bit version of Linux/Windows/macOS with 64-bit MATLAB should solve this problem, but even so, using 3 GB+ of virtual address space on a system with 4 GB of RAM is probably going to start making things very slow.
Some googling brought up this:
MATLAB will use as much memory as your OS lets it use;
the only way to increase the amount of memory MATLAB can use is to reduce
the amount of memory occupied by other applications or to give the OS more
memory to partition out to the applications.
So no, there's no easy way to tell MATLAB to use more memory.
You either have to buy more memory, optimize your code, run your scripts/functions with less output stored at once, or reduce the memory used by other processes that are running.
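Typical code-side reductions look like this (a sketch; it assumes your data tolerates single precision, and the variable names are made up):
data = single(data);             % halves storage compared to double
clearvars -except data results   % drop temporaries you no longer need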
Here are some helpful links though:
memory management functions
memory allocation
related discussion on the mathworks forum
I'm not sure what is going on. I am running my neural network simulations on my laptop, which has MATLAB R2013a on it.
The code runs fast on my desktop (R2012a though), but very, very slowly on the laptop. I profiled it because this seems abnormal; here are the screenshots I took of the functions spending the most time:
This is located in the codeHints.m file, so it isn't something I wrote. Is there any way I can disable this? I googled it but maybe I am not searching for the right things... I couldn't find anything. I can't get any work done because it is so slow :(
Would appreciate some advice!
Update: I have also attempted to run it on my desktop at work (same MATLAB version as laptop, also 8GB of RAM), and I get the same issue. I checked the resource monitor and it seems like the process is triggering a lot of memory faults (~40/sec), even though not even half of my RAM is being used.
I typed in "memory" in MATLAB and got the following information:
Maximum possible array: 11980 MB (1.256e+10 bytes) *
Memory available for all arrays: 11980 MB (1.256e+10 bytes) *
Memory used by MATLAB: 844 MB (8.849e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
So it seems like there should be sufficient room. I will try to put together a sample file.
Update #2: I ran my code on 2012a on the work computer with the following "memory" info:
Maximum possible array: 10872 MB (1.140e+10 bytes) *
Memory available for all arrays: 10872 MB (1.140e+10 bytes) *
Memory used by MATLAB: 846 MB (8.874e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
The run with more iterations than above (15000 as opposed to 10000) completed much faster, and there are no extraneous calls for memory allocation.
So it seems to me that it is an issue exclusively with 2013a. For now I will use 2012a (because I need this finished), but if anyone has ideas on what to do with 2013a to stop those calls to codeHints, I would appreciate it.
Though this would scream memory problems at first sight, it seems like your tests have made a lack of memory improbable. In this case the only reasonable explanation I can think of is that the computer is actually trying to do two different things, thus taking more time.
Some possibilities:
Actually not using exactly the same inputs
Actually not using exactly the same functions
The first point can be detected by putting some breakpoints in the codes whilst running it on 2 computers and verifying that the inputs are exactly the same. (Consider using visdiff if you have a lot of variables)
The second one could almost only be caused by an overloaded zeros function shadowing the builtin. Make sure to stop at this line and see which function is actually being called.
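A quick way to check for shadowing with the standard which command:
>> which zeros -all    % lists every zeros on the path; the builtin should come first
Once stopped at that line in the debugger, K>> which zeros shows which zeros that frame would actually call.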
If both these points don't solve the problem, try reducing the code as much as possible until you have only one or a few lines that create the difference. If it turns out that the difference just comes from this one line, try using the zeros function with the right size input on both computers and time the result with timeit (a File Exchange submission).
If you find that you are using the builtin function on both computers, with plenty of memory, and there still is a huge performance difference, it is probably time to contact MathWorks support and hear what they have to say about it.
I have a problem that occurs when I run this command for an RBF (radial basis function) neural network:
net = newrb(T, D);
I get the error
??? Error using ==> unknown
Out of memory. Type HELP MEMORY for your options.

Error in ==> dist>apply at 119
z = zeros(S,Q);

Error in ==> dist at 90
result = apply(a,b,c);

Error in ==> newrb>designrb at 143
P = radbas(dist(p',p)*b);

Error in ==> newrb at 127
[w1,b1,w2,b2,tr] = designrb(p,t,goal,spread,mn,df);

Error in ==> boiler_weight at 38
net = newrb(T, D);
I'm working with 2 GB RAM
Virtual Memory Initial size 4 GB & Maximum size 8 GB
I tried:
Maximizing the virtual memory.
Editing boot.ini under Windows XP x32 to add the switches /fastdetect /3GB /USERVA=3030 (which reportedly almost doubles the amount of memory available to MATLAB).
pack (for memory defragmentation).
But all of this was of no use.
Any help, please?
Thanks in advance.
I don't have a fix, but here are some debugging techniques for OOMs in Matlab that seem germane.
Pack doesn't work nearly as well as its documentation says it does. If memory is fragmented at a low level (not uncommon), you must restart MATLAB to fix it. "memory" and "feature memstats" will give some indication of low-level fragmentation. Try restarting and running from a fresh MATLAB session to see whether it's fragmentation or really peak memory usage.
Try a "dbstop if all error" so you break in to the debugger when you run out of memory. Then you can examine the stack frames with dbup and dbdown to see what's holding down memory, and see if there are any surprisingly large arrays. OOMs are sometimes from miscomputed indexes or array sizes that end up allocating extra-big arrays.
The undocumented "profile on -memory" option can tell you about memory usage during execution, which may help.
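Usage is simply the following (it's undocumented, so behavior may vary by release; my_function is a placeholder):
>> profile on -memory
>> my_function
>> profile viewer    % the report gains memory allocation columns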
And your data set might just be too big. See if you can break it into smaller parts and loop over them, reducing peak memory requirements.
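The chunked loop usually looks something like this (a sketch; data and process are placeholders for your own variables and computation):
blockSize = 1e5;
out = zeros(size(data));                    % preallocate the result
for s = 1:blockSize:numel(data)
    idx = s : min(s + blockSize - 1, numel(data));
    out(idx) = process(data(idx));          % work on one block at a time
end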
Good luck.
Maybe one of the solutions offered by The MathWorks solves your issue:
http://www.mathworks.com/support/tech-notes/1100/1107.html
I want to calculate 2 covariance matrices with size (10304,1034), and MATLAB creates the first one, but when it runs the second command, this error occurs:
>> j=ones(10000,10000);
>> jj=ones(10000,10000);
??? Out of memory. Type HELP MEMORY for your options.
My laptop's RAM is 2GB, but it still has 1 GB free. I am using Windows 7 and 32-bit MATLAB 2009b.
How can I resolve this error?
A 10k-by-10k array of doubles uses 1e8*8 bytes, which corresponds to 800 MB. MATLAB needs these 800 MB to be contiguous. Most likely, your 1 GB of free memory is a little fragmented, so MATLAB cannot fit the new array into RAM.
Use the command MEMORY to find out the maximum variable size that MATLAB can handle at a given moment.
Try to use sparse matrices; in that case MATLAB allocates space only for the nonzero elements rather than the entire matrix.
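For example (each stored nonzero costs roughly 12-16 bytes for its value plus indices, depending on platform):
>> S = sparse(10000, 10000);    % all-zero 10k-by-10k matrix: only a few KB
>> S(5000, 5000) = 1;
>> whos S
Note this only pays off if the matrix is mostly zeros; ones(10000,10000) stored as sparse would actually be larger than the full version.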
Either of these two options may slightly increase the memory available to the MATLAB.exe process.
1- Give higher priority to the MATLAB.exe task. You can do that by going to Task Manager, the Processes tab, right-clicking the MATLAB.exe task, selecting priority, and setting it to a higher priority (say, real time); this tells Windows to allocate more resources to this process.
2- Increase the page file size of your applications in general. You can do this by right-clicking My Computer -> Properties -> Advanced System Settings -> Advanced -> Performance -> Virtual Memory (Change...). Then remove the tick from 'Automatically manage paging file size...' and set the initial and maximum page size to, say, 10000 MB.
Go to MATLAB -> File -> Preferences -> General -> Java Heap Memory and increase the level. This solved my problem.