I have a problem that occurs when I run the following command to create an RBF (radial basis function) neural network:
net = newrb(T, D);
I get the error
??? Error using ==> unknown
Out of memory. Type HELP MEMORY for your options.
Error in ==> dist>apply at 119
z = zeros(S,Q);
Error in ==> boiler_weight at 38
result = apply(a,b,c);
Error in ==> dist at 90
boiler_weight
Error in ==> newrb>designrb at 143
P = radbas(dist(p',p)*b);
Error in ==> newrb at 127
[w1,b1,w2,b2,tr] = designrb(p,t,goal,spread,mn,df);
I'm working with 2 GB of RAM, and virtual memory with an initial size of 4 GB and a maximum size of 8 GB.
I have tried:
maximizing the virtual memory;
under Windows XP x32, editing boot.ini to add the switch /3GB /USERVA=3030 (so the boot line ends with /fastdetect /3GB /USERVA=3030), which almost doubled the amount of memory available to MATLAB;
pack (for memory defragmentation);
but all of this was to no avail.
Any help, please?
Thanks in advance.
I don't have a fix, but here are some debugging techniques for OOMs in Matlab that seem germane.
Pack doesn't work nearly as well as its documentation says it does. If memory is fragmented at a low level (not uncommon), you must restart MATLAB to fix it. "memory" and "feature memstats" will give some indication of low-level fragmentation. Try restarting and running from a fresh MATLAB session to see whether it's fragmentation or genuinely peak memory usage.
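For reference, both checks can be run straight from the command window on Windows MATLAB (feature memstats is undocumented, so treat its output as informational only):
memory                  % largest contiguous block vs. total memory available
feature('memstats')     % undocumented, Windows only: report on virtual memory use and the largest free blocks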
Try a "dbstop if all error" so you break in to the debugger when you run out of memory. Then you can examine the stack frames with dbup and dbdown to see what's holding down memory, and see if there are any surprisingly large arrays. OOMs are sometimes from miscomputed indexes or array sizes that end up allocating extra-big arrays.
The undocumented "profile on -memory" option can tell you about memory usage during execution, which may help.
And your data set might just be too big. See if you can break it into smaller parts and loop over them, reducing peak memory requirements.
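As a rough, illustrative sketch of that idea applied to the pairwise-distance step newrb is choking on (this does not make newrb itself work in chunks; it only shows the block-by-block pattern, and blk is an arbitrary block size):
Q = size(p, 2);                      % p stands for the input matrix you pass to newrb
blk = 500;                           % illustrative block width
for k = 1:blk:Q
    cols = k:min(k + blk - 1, Q);
    Dk = dist(p', p(:, cols));       % distances from all points to this block of points only
    % ...process or accumulate Dk here, then let it be overwritten on the next pass...
end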
Good luck.
Maybe one of the solutions offered by The MathWorks solves your issue:
http://www.mathworks.com/support/tech-notes/1100/1107.html
My code is:
function eigs_mem_test
N = 20000;
density = 0.2;
numOfModes = 250;
A = sprand(N, N, density);                % random sparse N-by-N matrix, ~20% nonzeros
profile -memory on                        % undocumented option: also record memory statistics
eigs(A, numOfModes, 0.0)                  % the 250 eigenvalues closest to 0
profile off
profsave(profile('info'), 'eigs_test')    % save the HTML report to ./eigs_test
profview                                  % open the interactive profiler report
end
This returns a profiler report which says that MATLAB allocated 18014398508117708.00 Kb, or roughly 1.8e10 Gb -- completely impossible. How did this happen? The code finishes with correct output, and in htop I can see the memory usage vary quite a bit, but it stays under 16G.
For N = 2000, I get sensible results (i.e. 0.2G allocated.)
How can I profile this case effectively, if I want to obtain an upper bound on memory used for large sparse matrices?
I use MATLAB R2017a.
I cannot reproduce your issue in R2017b, with 128GB of RAM on my machine. Here is the result after running your example code:
Notably, the function peaked at 14726148Kb, or ~1.8GB. I'm more confused by the units MATLAB has used here: I saw nearer 14GB of usage in the task manager, which matches your large observed usage (14726148 interpreted as KB is about 14GB). I can only think the profiler is meant to state KB (kilobytes) instead of Kb (kilobits).
Ridiculously large, unexpected values like this are often the result of overflow, so this could be an internal overflow bug.
You could use whos to get the size of a variable in memory:
w = whos('A');         % get details of variable A
sizeInBytes = w.bytes; % size of A in memory, in bytes
This doesn't necessarily tell you how much memory a function like eigs in your example uses though. You could poll memory within your function to get the current usage.
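A rough sketch of that polling idea, reusing A and numOfModes from your example (the memory function is Windows-only, and this captures retained usage rather than the transient peak inside eigs):
before = memory;                       % struct whose MemUsedMATLAB field is in bytes
d = eigs(A, numOfModes, 0.0);
after = memory;
fprintf('MATLAB memory grew by ~%.2f GB across the eigs call\n', ...
    (after.MemUsedMATLAB - before.MemUsedMATLAB) / 2^30);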
I'll resist exploring this further, since the question of how to profile for memory usage has already been asked and answered.
N.B. I'm not sure why my machine was ~100x slower than yours; I assume the image of your memory usage didn't come from actually running your example code? Or my RAM is awful...
I'm not sure what is going on. I am running my neural network simulations on my laptop, which has MATLAB R2013a on it.
The code runs fast on my desktop (R2012a, though), but very, very slowly on the laptop. Because this seems abnormal, I ran it through the profiler; here are the screenshots I took of the functions spending the most time doing something:
This is located in the codeHints.m file, so it isn't something I wrote. Is there any way I can disable this? I googled it but maybe I am not searching for the right things... I couldn't find anything. I can't get any work done because it is so slow :(
Would appreciate some advice!
Update: I have also attempted to run it on my desktop at work (same MATLAB version as laptop, also 8GB of RAM), and I get the same issue. I checked the resource monitor and it seems like the process is triggering a lot of memory faults (~40/sec), even though not even half of my RAM is being used.
I typed in "memory" in MATLAB and got the following information:
Maximum possible array: 11980 MB (1.256e+10 bytes) *
Memory available for all arrays: 11980 MB (1.256e+10 bytes) *
Memory used by MATLAB: 844 MB (8.849e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
So it seems like there should be sufficient room. I will try to put together a sample file.
Update #2: I ran my code on 2012a on the work computer with the following "memory" info:
Maximum possible array: 10872 MB (1.140e+10 bytes) *
Memory available for all arrays: 10872 MB (1.140e+10 bytes) *
Memory used by MATLAB: 846 MB (8.874e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
The run with more iterations than above (15000 as opposed to 10000) completed much faster and there are no extraneous calls for memory allocation:
So it seems to me that it is an issue exclusively with 2013a. For now I will use 2012a (because I need this finished), but if anyone has ideas on what to do with 2013a to stop those calls to codeHints, I would appreciate it.
Though this would scream memory problems at first sight, your tests seem to make a lack of memory improbable. In that case, the only reasonable explanation I can think of is that the computer is actually trying to do two different things, thus taking more time.
Some possibilities:
Actually not using exactly the same inputs
Actually not using exactly the same functions
The first point can be detected by putting some breakpoints in the code while running it on the two computers and verifying that the inputs are exactly the same (consider using visdiff if you have a lot of variables).
The second one could almost only be caused by an overloaded zeros. Make sure to stop at this line and see which function is actually being called.
If neither of these points solves the problem, try reducing the code as much as possible until you have only one or a few lines that create the difference. If the difference does come down to that one line, try calling zeros with the right size input on both computers and time it with the timeit File Exchange submission (a sketch follows below).
If you find that you are using the built-in function on both computers, with plenty of memory, and there is still a huge performance difference, it is probably time to contact MathWorks support and hear what they have to say about it.
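A minimal sketch of that timing comparison (the matrix size here is purely illustrative; use the size that actually appears in your code):
f = @() zeros(5000, 5000);   % the suspect allocation, wrapped for timing
t = timeit(f);               % robust timing of the anonymous function
fprintf('zeros(5000,5000): %.4f s\n', t);
which zeros -all             % also confirms whether the builtin or an overload is being called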
I have a small question regarding MATLAB memory consumption.
My Architecture:
- Linux OpenSuse 12.3 64bit
- 16 GB of RAM
- Matlab 2013a 64 bit
I handle a matrix of doubles of size 62 x 11969100 (called y).
When I try the following:
a = bsxfun(@minus,y,-1)
or simply
a = minus(y, -1)
I get an Out of memory error (in both cases).
I've just computed the RAM space needed for the matrix:
62 x 11969100 x 8 bytes = 5.53 GB
Where am I wrong?!
Thanks a lot!
I'm running on Win64, with 16GB RAM.
Starting with a fresh MATLAB, with only a couple of other inconsequential applications open, my baseline memory usage is about 3.8GB. When I create y, that increases to 9.3GB (9.3-3.8 = 5.5GB, about what you calculate). When I then run a = minus(y, -1), I don't run out of memory, but it goes up to about 14.4GB.
You wouldn't need much extra memory to have been taken away (1.6GB at most) for that to cause an out of memory error.
In addition, when MATLAB stores an array, it requires a contiguous block of memory to do so. If your memory was a little fragmented - perhaps you had a couple of other tiny variables that happened to be stored right in the middle of one of those 5.5GB blocks - you would also get an out of memory error (you can sometimes avoid that issue with the use of pack).
The output of memory on a Windows platform:
>> memory
Maximum possible array: 2046 MB (2.145e+009 bytes) *
Memory available for all arrays: 3226 MB (3.382e+009 bytes) **
Memory used by MATLAB: 598 MB (6.272e+008 bytes)
Physical Memory (RAM): 3561 MB (3.734e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
The output of computer on Linux/Mac:
>> [~,maxSize] = computer
maxSize =
2.814749767106550e+14 % Max. number of elements in a single array
with some hacks (found here):
>> java.lang.Runtime.getRuntime.maxMemory
ans =
188416000
>> java.lang.Runtime.getRuntime.totalMemory
ans =
65011712
>> java.lang.Runtime.getRuntime.freeMemory
ans =
57532968
As you can see, aside from memory limitations per variable, there are also limitations on the total storage for all variables. This is no different between Windows and Linux.
The important thing to note is that, for example, on my Windows machine it is impossible to create two 1.7GB variables, even though I have enough RAM and neither one is limited by the maximum variable size.
Since carrying out the minus operation will assign a result of equal size to a new variable (a in your case, or ans when not assigning to anything), there need to be at least two of these humongous things in memory.
My guess is you run into the second limit of total memory space available for all variables.
bsxfun is vectorized for efficiency. Typically vectorized solutions require more than just minimal memory.
You could try using repmat or, if that does not work, a simple for loop.
In general I believe the for loop will require the least memory.
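For instance, a minimal sketch of the loop approach, overwriting y in place one block of columns at a time so that no second full-size array is ever materialised (the block width is illustrative):
blockSize = 1e5;                            % illustrative number of columns per block
nCols = size(y, 2);
for k = 1:blockSize:nCols
    cols = k:min(k + blockSize - 1, nCols);
    y(:, cols) = y(:, cols) + 1;            % same result as minus(y, -1), but in place
end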
I want to calculate 2 covariance matrices with size (10304,1034), and MATLAB creates the first one, but when it runs the second command, this error occurs:
>> j=ones(10000,10000);
>> jj=ones(10000,10000);
??? Out of memory. Type HELP MEMORY for your options.
My laptop's RAM is 2GB, but it still has 1 GB free. I am using Windows 7 and 32-bit MATLAB 2009b.
How can I resolve this error?
A 10k-by-10k array of doubles uses 1e8*8 bytes, which corresponds to 800MB. MATLAB needs these 800MB to be contiguous. Most likely, your 1GB free memory is a little fragmented, so MATLAB cannot fit the new array into RAM.
Use the command MEMORY to find out the maximum variable size that MATLAB can handle at a given moment.
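For example, the same information is available programmatically (Windows only):
user = memory;                              % struct of memory statistics
fprintf('Largest possible array: %.0f MB\n', user.MaxPossibleArrayBytes / 1e6);
% ones(10000) needs 1e8 * 8 bytes = 800 MB of contiguous address space to succeed.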
Try using sparse matrices; in that case MATLAB doesn't allocate the entire array.
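Note that sparse storage only pays off when most entries are zero; a matrix of all ones gains nothing, but a mostly-zero matrix of the same size is cheap (an illustrative sketch):
j = spalloc(10000, 10000, 50000);   % 10000-by-10000 sparse zeros with room for ~50k nonzeros (~1 MB, not 800 MB)
whos j                              % confirm the actual storage used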
Try either of these two options to slightly increase the memory available to the MATLAB.exe process.
1- Give higher priority to the MATLAB.exe task. You can do that by going to Task Manager, Processes tab, right-clicking the MATLAB.exe task, selecting Priority and setting it to a higher priority (say, real time); this tells Windows to allocate more resources to this process.
2- Increase the page file size for your applications in general. You can do this by right-clicking My Computer -> Properties -> Advanced System Settings -> Advanced -> Performance -> Virtual Memory (Change...). Then remove the tick from the 'Automatic ...' option and set the initial and maximum page file size to, say, 10000 MB.
Go to Matlab --> File --> Preferences --> General --> Java Heap Memory and increase the level. This solved my problem.
When I run a sample script in MATLAB, it says:
Out of memory. Type HELP MEMORY for your options.
When I type "memory", it reports:
Maximum possible array: 156 MB (1.638e+008 bytes) *
Memory available for all arrays: 740 MB (7.756e+008 bytes) **
Memory used by MATLAB: 1054 MB (1.105e+009 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
Is there any way to get around this error? I'm using Windows XP x32 with MATLAB 2009a.
pack does a memory defragmentation. It might help you a bit as far as the contiguous memory available.
Remember, when MATLAB says it's out of memory, it means it's out of contiguous memory, so rebooting or restarting MATLAB may work.
But, I'd recommend optimizing your code and identifying how you're eating up so much memory. It could be an ill-designed recursive loop, or a bad indexing function (using doubles instead of logicals to index a huge matrix).
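To illustrate the indexing point: a double index vector costs 8 bytes per selected element, whereas a logical mask costs 1 byte per element of the array being indexed (the sizes below are purely illustrative):
x    = rand(1e7, 1);
mask = x > 0.5;          % logical mask: ~10 MB (1 byte per element)
idx  = find(mask);       % double indices: ~40 MB here (8 bytes per selected element)
a = x(mask);             % identical results,
b = x(idx);              % very different index memory footprints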
I practically lived with memory errors for a while since I was dealing with huge datasets, but there's always a workaround, ask specific questions and you'll be surprised.
Problem fixed.
Under Windows XP x32, I managed to almost double the amount of memory available to MATLAB by editing boot.ini to add the switch /3GB /USERVA=3030
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /3GB /USERVA=3030
Together with reducing our array sizes, this completely fixed the problem :)
I could have also fixed the problem by upgrading to Windows x64 or Windows 7 x64. That also doubles the amount of memory available to MATLAB, even if you stick with 32-bit MATLAB and don't upgrade to 64-bit MATLAB. Windows x64 is simply far more memory efficient, even on systems with only 4 GB of physical RAM installed.
Try this; it works well for me.
Go to Home -> Preferences icon -> General -> Java Heap Memory and allocate as much memory as you want.
In the Preferences window, go to "Workspace" (outside the Java Heap Memory level) and find "Matlab array size limit".
Make sure to uncheck 'Limit the maximum array size to a percentage of RAM', because you want to extend memory, so you don't need this feature.
Done.
What are you attempting to allocate when it runs out of memory (OOM)? Do you have code to reproduce? A wide range of problems can cause out of memory errors.
To diagnose, use "dbstop if all error" to set a breakpoint on errors. The out of memory will trigger this, and you can use dbup, dbdown, and whos() to see what's consuming memory. Often an OOM is caused by a bad array size or index calculation, not just by big data structures. E.g. this will trigger an OOM in pretty much any 32-bit MATLAB.
>> x = 1;
>> x(2^30) = 2
??? Out of memory. Type HELP MEMORY for your options.
I faced a similar error while running an (old) C file in MATLAB using mex.
I found my solution at this issue on GitLab.
First, uncheck the option "Limit the maximum Array Size to a % of RAM" located under Preferences -> Workspace, as also indicated in this earlier answer.
Once applied, run your C file in the command window using
mex filename.c -compatibleArrayDims