MATLAB - codeHints taking 99.9% of runtime (R2013a)

I'm not sure what is going on. I am running my neural network simulations on my laptop, which has MATLAB R2013a on it.
The code runs fast on my desktop (which has R2012a), but very, very slowly on the laptop. Because this seemed abnormal, I ran it through the profiler; here are the screenshots I took of the functions spending the most time:
The time is being spent in codeHints.m, so it isn't something I wrote. Is there any way I can disable this? I googled it, but maybe I am not searching for the right things... I couldn't find anything. I can't get any work done because it is so slow :(
Would appreciate some advice!
Update: I have also attempted to run it on my desktop at work (same MATLAB version as laptop, also 8GB of RAM), and I get the same issue. I checked the resource monitor and it seems like the process is triggering a lot of memory faults (~40/sec), even though not even half of my RAM is being used.
I typed in "memory" in MATLAB and got the following information:
Maximum possible array: 11980 MB (1.256e+10 bytes) *
Memory available for all arrays: 11980 MB (1.256e+10 bytes) *
Memory used by MATLAB: 844 MB (8.849e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
So it seems like there should be sufficient room. I will try to put together a sample file.
Update #2: I ran my code on 2012a on the work computer with the following "memory" info:
Maximum possible array: 10872 MB (1.140e+10 bytes) *
Memory available for all arrays: 10872 MB (1.140e+10 bytes) *
Memory used by MATLAB: 846 MB (8.874e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
A run with more iterations than above (15,000 as opposed to 10,000) completed much faster, and there were no extraneous calls for memory allocation.
So it seems to me that it is an issue exclusively with 2013a. For now I will use 2012a (because I need this finished), but if anyone has ideas on what to do with 2013a to stop those calls to codeHints, I would appreciate it.

Though this would scream memory problems at first sight, your tests seem to have made a lack of memory improbable. In that case, the only reasonable explanation I can think of is that the computer is actually trying to do two different things, and is therefore taking more time.
Some possibilities:
Actually not using exactly the same inputs
Actually not using exactly the same functions
The first point can be detected by putting some breakpoints in the code while running it on the two computers and verifying that the inputs are exactly the same. (Consider using visdiff if you have a lot of variables.)
The second one could almost only be caused by an overloaded zeros function. Make sure to stop at this line and see which function is actually being called.
If neither of these points solves the problem, try reducing the code as much as possible until you have only one or a few lines that create the difference. If it turns out that the difference comes from just that one line, try calling the zeros function with the right size input on both computers and time the result with the timeit File Exchange submission (see the sketch below).
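For concreteness, here is a minimal sketch of those two checks; the size N below is an assumed placeholder, not something taken from the question:
% List every zeros on the path; anything other than the built-in means zeros is shadowed or overloaded
which zeros -all
% Time the allocation itself on both machines; timeit is a File Exchange
% submission for R2013a (it became a built-in function in R2013b)
N = 5000;                    % placeholder size, adjust to your simulation
f = @() zeros(N, N);
t = timeit(f)                % typical execution time in seconds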
If you find that you are using the builtin function on both computers, with plenty of memory available, and there still is a huge performance difference, it is probably time to contact MathWorks support and hear what they have to say about it.

Related

Why would Matlab's `rand(1,1e9)` make 64-bit Windows 7 unresponsive?

When I start a new Matlab session, the rand(1,1e9); command causes my 64-bit Windows 7 to become unresponsive. That means once every few minutes, it might respond to a mouse click or some such from a few minutes back, but otherwise, I can't even flip between apps, can't invoke the task manager, and if the task manager was running before the rand(1,1e9); command, can't even scroll to Matlab on the Processes tab. I don't get an out-of-memory message. Clicking on Matlab's "X" icon to close the app doesn't do anything. Ctrl-C doesn't do anything, and neither does Break in unison with any 1-, 2-, and 3-key combination of Shift, Ctrl, and Alt.
It might be informative to know that rand(1,1e8); (10x fewer doubles) doesn't cause these problems and finishes in (relatively) no time at all.
The memory info from the memory command is:
>> memory
Maximum possible array: 12782 MB (1.340e+10 bytes) *
Memory available for all arrays: 12782 MB (1.340e+10 bytes) *
Memory used by MATLAB: 674 MB (7.068e+08 bytes)
Physical Memory (RAM): 8070 MB (8.462e+09 bytes)
* Limited by System Memory (physical + swap file) available.
On the few occasions in which I can kill Matlab, the OS remains hardly responsive as described above, even when the task manager shows the memory usage go from 7.9GB to nearly zero.
How can Matlab's rand(1,1e9); cause this persistent OS unresponsiveness? Assuming the problem is memory related, how can I ensure that Matlab plays nicely with the OS (and/or vice versa) when these limits are being bumped up against?
Please note that this is not a question about how to avoid bumping against the memory limits, as I know I can code around them. It's about how to avoid the loss of control when the limits are bumped up against so that I can decide whether I want to interrupt or kill the operation and/or app. For example, Matlab's Memory Allocation page shows that 1e9 doubles takes 8e9 bytes, so together with other memory requirements, I'm probably bumping up against the real and swap memory shown above. No matter if I get an error, or the command simply takes much, much longer, I would still want Matlab to respond to keyboard requests to break. I would also want the rest of the OS to be responsive, including the use of the task manager to kill Matlab in the event that it doesn't respond to a break request.
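This is not what the question asks for, but as a rough sketch of the arithmetic above, the size of the request can be checked against what the memory command reports before committing to the allocation. The field name below is from the struct that memory returns on Windows, and the 0.8 safety factor is an arbitrary assumption:
n = 1e9;                                 % requested number of doubles
bytesNeeded = 8 * n;                     % 1e9 doubles = 8e9 bytes

m = memory;                              % Windows-only
if bytesNeeded < 0.8 * m.MemAvailableAllArrays
    x = rand(1, n);
else
    warning('Requesting %.1f GB would exhaust memory; skipping.', bytesNeeded/2^30);
end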

Growing memory usage in MATLAB

I use MATLAB for programming some meta-heuristics. Recently, I have been working on an algorithm for solving an industrial engineering problem. My problem with MATLAB is getting "out of memory" errors. Now I'm trying some suggestions from MathWorks and Stack Overflow (I hope they will work). However, there is one thing I did not understand.
During the run of the algorithm in MATLAB (it takes 4000-5000 CPU seconds for a medium-sized problem), even though I preallocate variables, the code does not demand dynamic array resizing, and no new variables are added, I observe that the memory usage of the algorithm grows continuously. The main function calls some other functions written by me. What could be the reason for the increase in memory usage?
The computer I use for running the algorithm has 8 GB of memory and 64-bit Windows 8 installed.
The only way to figure this out is to see where the memory is going.
I think you may accidentally store results that you don't need, or that you underestimate the size of your output/intermediate variables.
Here is how I would proceed:
Turn on dbstop if error
Run the code till you get the out of memory error
See how much memory is being used (make sure to check all workspaces)
Probably you now know where the extra memory is going. If you don't find much memory being used, continue with this:
Check the memory command to see how much memory is still available
Carefully look at the line being executed, perhaps you actually need a huge amount of memory for it
If all else fails share your findings here and others can help you look for it.
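A minimal sketch of that workflow at the command prompt (nothing here is specific to the algorithm in the question):
dbstop if error          % stop in the workspace where the error is thrown
% ... run the algorithm until the out-of-memory error fires ...
% then, at the K>> debug prompt:
whos                     % variable sizes in the current workspace
dbup                     % move up the call stack and run whos again there
memory                   % remaining memory (Windows only)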
The reason for the memory usage growth was CPLEX. I tried many alternatives, but I couldn't find any solution more useful than increasing virtual memory to several hundred GB. If you don't have special reasons to insist on CPLEX (commercial usage, licensing, etc.), I would suggest that anyone who encounters this problem use Gurobi. It is free and unlimited for academic use, and it integrates fully with MATLAB. That's the solution I found for my problem with CPLEX. I hope it works for everybody.

Deployed Matlab application using significantly more memory than Matlab scripts

I was testing a stand-alone application we developed in Matlab when I noticed that its memory usage, according to Windows Task Manager, was peaking several times above 16 GB. I decided to run Matlab's profiler with profile -memory on on the scripts behind the compiled version to see where the memory peaks were occurring, using the exact same input. However, the highest peak memory it found was 2400860.00 KB, or about 1/4 as much, for the function that essentially acts as the program's main().
Thus, I was wondering if people have noticed huge memory usage differences between running a compiled Matlab program and running the original scripts in Matlab. I noticed it took a lot longer running in Matlab, but I figured that was due to the profiler keeping track of all of the memory allocations and deallocations, rather than reading and writing to a swap space on disk.
To give a quick answer to this question: yes, compiled MATLAB applications run with more overhead than MATLAB scripts.
This is because deployed MATLAB applications start a runtime version of MATLAB, called the MCR (MATLAB Compiler Runtime), which is loaded into memory. The MCR runs with more overhead than MATLAB itself.
One thing that I have found useful in situations like this is to recompile and see if that helps at all. If it doesn't, you could try to lower the memory usage by running calculations in segments (see the sketch after the link below).
This might be helpful for better memory usage: http://www.mathworks.com/help/matlab/matlab_prog/strategies-for-efficient-use-of-memory.html
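As an illustration of the calculations-in-segments idea, here is a generic sketch; processChunk and combineResults are hypothetical placeholders for whatever the application actually computes:
nItems    = 1e6;                           % assumed total amount of work
chunkSize = 1e4;                           % tune so each segment fits comfortably in RAM
results   = cell(ceil(nItems/chunkSize), 1);

for k = 1:numel(results)
    idx        = (k-1)*chunkSize + 1 : min(k*chunkSize, nItems);
    results{k} = processChunk(idx);        % placeholder for the real per-segment work
end

finalResult = combineResults(results);     % placeholder reduction step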
Source:
http://www.mathworks.com/matlabcentral/newsreader/view_thread/306814
Matlab executable too slow
Comment if you have questions.

Matlab Preallocation

I'm running a simulation of a diffusion-reaction equation in MATLAB, and I pre-allocate the memory for all of my vectors beforehand. However, during the loop, in which I solve a system of equations using bicg, the amount of memory that MATLAB uses keeps increasing.
For example:
concentration = zeros(N, iterations);
for t = 1:iterations
    concentration(:,t+1) = bicg(matrix, concentration(:,t));
end
As the program runs, the amount of memory MATLAB is using increases, which seems to suggest that the matrix, concentration, is increasing in size as the program continues, even though I pre-allocated the space. Is this because the elements in the matrix are becoming doubles instead of zeros? Is there a better way to pre-allocate the memory for this matrix, so that all of the memory the program requires will be pre-allocated at the start? It would be easier for me that way, because then I would know from the start how much memory the program will require and if the simulation will crash the computer or not.
Thanks for all your help, guys. I did some searching around and didn't find an answer, so I hope I'm not repeating a question.
EDIT:
Thanks Amro and stardt for your help guys. I tried running 'memory' in MATLAB, but the interpreter said that command is not supported for my system type. I re-ran the simulation though with 'whos concentration' displayed every 10 iterations, and the allocation size of the matrix wasn't changing with time. However, I did notice that the size of the matrix was about 1.5 GB. Even though that was the case, system monitor was only showing MATLAB as using 300 MB (but it increased steadily to reach a little over 1 GB by the end of the simulation). So I'm guessing that MATLAB pre-allocated the memory just fine and there are no memory leaks, but system monitor doesn't count the memory as in use until MATLAB starts writing values to it in the loop. I don't know why that would be, as I would imagine that writing zeros would trigger the system monitor to see that memory as 'in use,' but I guess that's not the case here.
Anyway, I appreciate your help with this. I would vote both of your answers up as I found them both helpful, but I don't have enough reputation points to do that. Thanks guys!
I really doubt it's a memory leak, since most "objects" in MATLAB clean up after themselves once they go out of scope. AFAIK, MATLAB does not use a GC per se, but a deterministic approach to managing memory.
Therefore I suspect the issue is more likely caused by memory fragmentation: when MATLAB allocates memory for a matrix, that memory has to be contiguous. Thus, when a function is repeatedly called, creating and deleting matrices over time, the fragmentation becomes a noticeable problem...
One thing that might help you debug is the undocumented profile on -memory option, which tracks allocations in the MATLAB profiler (see the sketch below). Check out the memory monitoring tool by Joe Conti as well. Also, this page has some useful information.
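A sketch of how the memory profiling might be invoked; the option placement varies across write-ups, and the form below matches the one used in the deployed-application question above (run_my_simulation is a placeholder):
profile -memory on
run_my_simulation();     % placeholder for the code under test
profile viewer           % the report gains allocated/freed/peak memory columns
profile off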
I am assuming that you are watching the memory usage of MATLAB in, for example, the Task Manager on Windows. The memory usage is probably increasing due to the execution of bicg() and variables that have not been garbage collected after it ends. The memory allocated to the concentration matrix stays the same. You can type
whos concentration
before and after your "for" loop to see how much memory is allocated to that variable.
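A small sketch of that check, using the variable names from the question (the MB conversion is just for readability):
info = whos('concentration');
fprintf('Before loop: %.1f MB\n', info.bytes/2^20);

for t = 1:iterations
    concentration(:,t+1) = bicg(matrix, concentration(:,t));
end

info = whos('concentration');
fprintf('After loop:  %.1f MB\n', info.bytes/2^20);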

MATLAB "out of memory" error

When I run a sample script in MATLAB, it says:
Out of memory. Type HELP MEMORY for your options.
When I type "memory", it reports:
Maximum possible array: 156 MB (1.638e+008 bytes) *
Memory available for all arrays: 740 MB (7.756e+008 bytes) **
Memory used by MATLAB: 1054 MB (1.105e+009 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
Is there any way to get around this error? I'm using Windows XP x32 with MATLAB 2009a.
pack performs a memory defragmentation. It might help you a bit as far as contiguous memory is concerned.
Remember, when MATLAB says it's out of memory, it means it's out of contiguous memory, so rebooting or restarting MATLAB may work.
But, I'd recommend optimizing your code and identifying how you're eating up so much memory. It could be an ill-designed recursive loop, or a bad indexing function (using doubles instead of logicals to index a huge matrix).
I practically lived with memory errors for a while since I was dealing with huge datasets, but there's always a workaround; ask specific questions and you'll be surprised.
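To make the doubles-versus-logicals indexing point above concrete, here is a small sketch; the sizes are arbitrary, but the point is that a double index vector costs 8 bytes per element while a logical mask costs 1 byte per element:
A    = rand(1e7, 1);
mask = A > 0.5;          % logical mask, ~10 MB
idx  = find(mask);       % double subscripts, ~40 MB for roughly 5e6 hits

whos mask idx            % compare the Bytes column
B = A(mask);             % logical indexing avoids carrying the extra index array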
Problem fixed.
Under Windows XP x32, I managed to almost double the amount of memory available to MATLAB by editing boot.ini to add the switch /3GB /USERVA=3030
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /3GB /USERVA=3030
Together with reducing our array sizes, this completely fixed the problem :)
I could also have fixed the problem by upgrading to 64-bit Windows (e.g., Windows 7 x64). That also doubles the amount of memory available to MATLAB, even if you stick with MATLAB x32 and don't upgrade to MATLAB x64. Windows x64 is just far more memory-efficient, even on systems with only 4 GB of physical RAM installed.
Try this; it works well for me.
Go to Home -> Preferences -> General -> Java Heap Memory and allocate as much memory as you need.
In the Preferences window, go to "Workspace" (outside the Java Heap Memory pane) and find "MATLAB array size limit".
Make sure 'Limit the maximum array size to a percentage of RAM' is unchecked; since you want to extend the available memory, you don't need this limit.
Done.
What are you attempting to allocate when it runs out of memory (OOM)? Do you have code to reproduce? A wide range of problems can cause out of memory errors.
To diagnose, use "dbstop if all error" to set a breakpoint on errors. The out-of-memory error will trigger it, and you can use dbup, dbdown, and whos() to see what's consuming memory. Often an OOM is caused by a bad array size or index calculation, not just by big data structures. E.g., this will trigger an OOM in pretty much any 32-bit MATLAB.
>> x = 1;
>> x(2^30) = 2
??? Out of memory. Type HELP MEMORY for your options.
I faced a similar error while running an (old) C file in MATLAB using mex.
I found my solution at this issue on GitLab.
First, uncheck the option "Limit the maximum Array Size to a % of RAM" located under Preferences -> Workspace, as also indicated in this earlier answer.
Once applied, compile your C file from the Command Window using
mex filename.c -compatibleArrayDims