Why would Matlab's `rand(1,1e9)` make 64-bit Windows 7 unresponsive? - matlab

When I start a new Matlab session, the rand(1,1e9); command causes my 64-bit Windows 7 to become unresponsive. That means that once every few minutes it might respond to a mouse click or some such from a few minutes back, but otherwise I can't even flip between apps, can't invoke the task manager, and if the task manager was running before the rand(1,1e9); command, I can't even scroll to Matlab on the Processes tab. I don't get an out-of-memory message. Clicking on Matlab's "X" icon to close the app doesn't do anything. Ctrl-C doesn't do anything, and neither does Break in combination with any 1-, 2-, or 3-key combination of Shift, Ctrl, and Alt.
It might be informative to know that rand(1,1e8); (10x fewer doubles) doesn't cause these problems and finishes in (relatively) no time at all.
The memory info from the memory command is:
>> memory
Maximum possible array: 12782 MB (1.340e+10 bytes) *
Memory available for all arrays: 12782 MB (1.340e+10 bytes) *
Memory used by MATLAB: 674 MB (7.068e+08 bytes)
Physical Memory (RAM): 8070 MB (8.462e+09 bytes)
* Limited by System Memory (physical + swap file) available.
On the few occasions on which I can kill Matlab, the OS remains barely responsive as described above, even when the task manager shows the memory usage go from 7.9GB to nearly zero.
How can Matlab's rand(1,1e9); cause this persistent OS unresponsiveness? Assuming the problem is memory related, how can I ensure that Matlab plays nicely with the OS (and/or vice-versa) when these limits are being bumped up against?
Please note that this is not a question about how to avoid bumping against the memory limits, as I know I can code around them. It's about how to avoid the loss of control when the limits are bumped up against so that I can decide whether I want to interrupt or kill the operation and/or app. For example, Matlab's Memory Allocation page shows that 1e9 doubles takes 8e9 bytes, so together with other memory requirements, I'm probably bumping up against the real and swap memory shown above. No matter if I get an error, or the command simply takes much, much longer, I would still want Matlab to respond to keyboard requests to break. I would also want the rest of the OS to be responsive, including the use of the task manager to kill Matlab in the event that it doesn't respond to a break request.
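For reference, the kind of "coding around it" I mean is a pre-allocation guard along these lines (the 50% safety margin is an arbitrary choice of mine, and the memory command is Windows-only):
n = 1e9;                          % number of doubles requested
needed = 8 * n;                   % bytes needed for a double array
m = memory;
if needed < 0.5 * m.MaxPossibleArrayBytes
    x = rand(1, n);
else
    error('Refusing to allocate %.3g bytes; only %.3g reported available.', ...
        needed, m.MaxPossibleArrayBytes);
end
But that only avoids the limit; it doesn't give me back control once the limit has actually been hit.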

Related

aborting the MATLAB code when the RAM is FULL

Is there any MATLAB command that lets us abort the MATLAB code when 90% of the RAM is full due to a huge amount of data?
I am asking because I do not want to restart the computer every time MATLAB gets stuck and the computer hangs.
As far as I know, you can't do that "automatically": if MATLAB hangs, it hangs.
However, in your code you can always add a memory check somewhere (e.g. inside a memory-heavy iterative function).
If you do something like
maxmem = 2e9;  % about 2 GB, in bytes
% put this check inside the memory-heavy code
mem = memory;
if mem.MemUsedMATLAB > maxmem
    exit;  % or some other thing you may want to do
end
this will exit MATLAB when MATLAB's memory use exceeds about 2 GB (the value is in bytes, so keep that in mind when putting in your own value).
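For illustration, here is roughly how that check might sit inside a memory-heavy loop; heavyStep and numIterations are just placeholders for your own code:
maxmem = 2e9;                     % about 2 GB, in bytes
for k = 1:numIterations
    heavyStep(k);                 % the memory-heavy work
    mem = memory;
    if mem.MemUsedMATLAB > maxmem
        save('emergency_state.mat', 'k');  % optionally save something first
        exit;                     % or error/return, depending on what you want
    end
end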
Adding this answer to SO as suggested by @Ander Biguri; the answer is based purely on this link.
Using Matlab's try (with options, as described in the linked answer), you can monitor your memory usage as:
tryOptions.watchdog.virtualAddressSpace = 7e9 ;  % 7 GB of virtual address space
tryOptions.watchdog.execTime = 1800 ;            % execution time limit, in seconds
try tryOptions
    ...
catch  % use the try/catch combo to monitor your memory usage and kill the process if you need to
end
Other useful tools which may help:
T = evalc('feature(''memstats'')') ;        % capture the memstats report as text
char(regexp(T, '(?<=Use:\s*)\d+', 'match')) % pull out the "In Use" figures
feature memstats outputs the current stats of your memory; you can add breakpoints in your code (at the beginning of a major operation) to monitor your memory usage and decide whether you want to continue executing.

MATLAB and clearing the swap space

In the debugging mode I stop at some breakpoint and do some matrix manipulation in order to test the program. These manipulations are computationally expensive, so MATLAB uses the swap space on my Linux system. Then, after continuing the program run, the swap space is almost full, so MATLAB crashes. Is there a way I could clean the swap while in the debugging mode? Doing clear all and clear classes only affects the RAM, but does not affect the swap.
You can't. Swap isn't special, so just work through this as a regular out-of-memory issue. If you free up memory, you'll indirectly free up the swap that's being used to back it (or avoid having to use swap to supplement it).
Swap space is just an OS-managed backing store for virtual memory. From a normal program's point of view, swap is RAM (just slow RAM) and you don't manage it separately. (Well... you can "wire" pages to prevent them from being swapped out and so on, or use OS APIs to directly manipulate swap, but those are low-level platform-specific details, (like, below malloc), and not exposed to you as a Matlab M-code programmer, and not what you want to do here.) If your Matlab program runs out of memory, that means it's used up or fragmented its process's virtual memory, not something in particular about your swap space. (Unless there's a low-level bug somewhere.)
When this happens, you may need to look elsewhere in your Matlab program (e.g. in global variables, figure handle properties, or other levels of the function call stack) to find additional data that hasn't been cleared yet, or just restart the Matlab process to fix memory fragmentation (which can happen if your code fills up the memory with lots of small arrays).
Like @siliconwafer suggests, memory, whos, and feature memstats are good tools for debugging this. And if you're stopped inside the debugger, realize that you can't actually clear everything until you dbquit out of it.
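For example, a quick way to see where the memory went is to list the variables in the current workspace sorted by size (a sketch; run it at the prompt, or after dbquit if you were stopped in the debugger):
s = whos;                                % variables visible in this workspace
[~, order] = sort([s.bytes], 'descend'); % largest first
for i = order
    fprintf('%12d bytes   %s\n', s(i).bytes, s(i).name);
end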
Doing large matrix operations inside the debugger is not necessarily a recoverable operation: if you've modified arrays held in local variables in the stack frame(s) you're working on, but there are still copies of them held in other variables or frames, Matlab's copy-on-write mechanism needs to hold on to both copies of the arrays, and you might be out of luck for that run of the program if you hit your RAM limits.
If clear all and clear classes after exiting the debugger are not recovering enough memory for you, that smells like either memory fragmentation or a C-level memory leak (like in a MEX file). In either case, you need to restart Matlab to resolve it. Avoid the use of large cellstr arrays or other arrays-of-small-arrays to reduce fragmentation. And take a good hard look at your C code if you're using any custom MEX functions.
Or you just might not have enough memory to do the operations you're doing.

MATLAB - codeHints taking 99.9% of runtime (R2013a)

I'm not sure what is going on. I am running my neural network simulations on my laptop, which has MATLAB R2013a on it.
The code runs fast on my desktop (R2012a though), but very, very slow on the laptop. I profiled it because this seemed abnormal; here are the screenshots I took of the functions spending the most time:
This is located in the codeHints.m file, so it isn't something I wrote. Is there any way I can disable this? I googled it but maybe I am not searching for the right things... I couldn't find anything. I can't get any work done because it is so slow :(
Would appreciate some advice!
Update: I have also attempted to run it on my desktop at work (same MATLAB version as laptop, also 8GB of RAM), and I get the same issue. I checked the resource monitor and it seems like the process is triggering a lot of memory faults (~40/sec), even though not even half of my RAM is being used.
I typed in "memory" in MATLAB and got the following information:
Maximum possible array: 11980 MB (1.256e+10 bytes) *
Memory available for all arrays: 11980 MB (1.256e+10 bytes) *
Memory used by MATLAB: 844 MB (8.849e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
So it seems like there should be sufficient room. I will try to put together a sample file.
Update #2: I ran my code on 2012a on the work computer with the following "memory" info:
Maximum possible array: 10872 MB (1.140e+10 bytes) *
Memory available for all arrays: 10872 MB (1.140e+10 bytes) *
Memory used by MATLAB: 846 MB (8.874e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
The run with more iterations than above (15000 as opposed to 10000) completed much faster and there are no extraneous calls for memory allocation:
So it seems to me that it is an issue exclusively with 2013a. For now I will use 2012a (because I need this finished), but if anyone has ideas on what to do with 2013a to stop those calls to codeHints, I would appreciate it.
Though this would scream memory problems at first sight, your tests seem to have made a lack of memory improbable. In this case the only reasonable explanation that I can think of is that the computer is actually trying to do 2 different things, thus taking more time.
Some possibilities:
Actually not using exactly the same inputs
Actually not using exactly the same functions
The first point can be detected by putting some breakpoints in the code whilst running it on the 2 computers and verifying that the inputs are exactly the same. (Consider using visdiff if you have a lot of variables.)
The second one could almost only be caused by zeros having been overloaded. Make sure to stop at this line and see which function is actually being called.
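A quick way to check for that is to list every zeros visible on the path; anything listed above the built-in is shadowing or overloading it:
which zeros -all    % shows the built-in plus any overloads/shadows on the path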
If both these points don't solve the problem, try reducing the code as much as possible till you have only one or a few lines that create the difference. If it turns out that the difference comes from just that one line, try calling the zeros function with the right size input on both computers and time the result with the timeit File Exchange submission.
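For instance, timing just the allocation in isolation could look like this (the size is a placeholder; substitute the one from your code):
f = @() zeros(5000);   % placeholder size
t = timeit(f);         % runs f several times and returns a robust estimate
fprintf('zeros(5000) took %.4f s\n', t);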
If you find that you are using the built-in function on both computers, with plenty of memory, and there still is a huge performance difference, it is probably time to contact MathWorks support and hear what they have to say about it.

Why doesn't Matlab use swap, but instead errors "Out of memory"?

I was wondering why Matlab doesn't use swap, but instead throws the error "Out of memory".
Shouldn't Matlab just slow down instead of throwing an "Out of memory" error?
Is this Java related?
added:
I know "out of memory" means it's out of contiguous memory. Doesn't swap have contiguous memory, or? I'm confused...
It is not about MATLAB. What happens when you try allocate more memory than exists in your hardware is an OS specific behavior.
On Linux, by default the OS will 'optimistically' allocate almost anything you want, i.e. swap space is also counted as allocatable memory. You will get what you want - no OOM error, but slow computations with swap-allocated data. This 'feature' is called overcommit. You can change this behavior by modifying the overcommit settings in Linux (have a look e.g. here for a brief summary).
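If you want to inspect the current setting from within MATLAB on a Linux box, something like this should do (standard procfs path; changing it requires root and is only shown as a comment):
[~, mode] = system('cat /proc/sys/vm/overcommit_memory');  % 0 = heuristic, 1 = always, 2 = never
disp(strtrim(mode))
% to change it (in a shell, as root):  sysctl vm.overcommit_memory=1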
Overcommit is probably not the best way to solve larger problems in MATLAB, since the entire OS starts to work really slowly. It definitely cannot be compared to optimized 'out-of-core' implementations that consciously use the hard disk in computations.
This is how it is on Linux. I do not know how to change the memory allocation behavior on Windows, but I doubt you really want to do that. You need more RAM.
And you do confuse things - swap has nothing to do with contiguous memory. Memory allocated by the OS is 'virtual memory', which is contiguous in the process's address space regardless of whether it maps to contiguous physical pages.
Edit For transparent 'out-of-core' operations on large matrices using disk space as extra memory, you might want to have a look at the VVAR File Exchange project. This class pretends to be a usual MATLAB class, but it operates on an underlying HDD file. Note that the usual array size limitations of MATLAB still apply.
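As a rough illustration of the idea only (this is not VVAR's actual interface, which I haven't checked), MATLAB's own matfile object gives comparable disk-backed, partial access to a large array:
m = matfile('bigdata.mat', 'Writable', true);  % file name is arbitrary
m.A(1e6, 1) = 0;        % creates/grows A on disk without holding it all in RAM
chunk = m.A(1:1000, 1); % read back only the slice you need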

If only segmentation is enabled

beginners question:
"If" paging is disabled and only segmentation is enabled (CR0.PE is set) then does that mean if a program is loaded in memory (RAM), its whole binary image is loaded and none of its "part" is swapped out, becoz a program is broken into fixed size chunks only when paging is enabled (which then can be swapped out). And if it's true this will reduce the number of processes that run in memory of a particular size of RAM, say 2 GB?
Likely, but not necessarily.
It depends on the operating system...
You could write an operating system that uses a segment to map a part of the program into memory. When the program accesses memory outside the segment, you get a segmentation fault. As the segmentation fault is then passed to the operating system, it could swap in some data from disk and modify segmentation information, before returning control to the program.
However, this is probably difficult and expensive to do, and I do not know of any operating system that acts in this way.
As to the number of processes - you need to split the available memory into contiguous parts, one for each process. This is easy if processes do not grow; if they do, you need padding and may need to copy processes around, which is rather expensive...