Matlab function memory consumption

How can I determine the peak memory consumption of a function in MATLAB?
For example:
A = rand(1000, 1000); % A takes N MB
func(A);              % some operation on A during which memory consumption can grow

You can start the profiler with the memory switch:
profile -memory
which I think is actually undocumented... It works on R2010; can anyone verify that it still works on R2013a?
Anyway, when profiling with this switch, the profiler GUI will include basic memory info, of which I think you'll find the last column ("Peak Memory") most interesting.
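For the original example, a minimal sketch (func is just the placeholder name from the question):
profile -memory on          % enable profiling with memory statistics
A = rand(1000, 1000);       % ~8 MB of doubles
func(A);                    % the operation whose peak memory we want to see
profile off
profview                    % open the report; look at the "Peak Memory" column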

Related

Matlab R2017a memory profiler gives a ridiculous number for allocated memory

My code is:
function eigs_mem_test
    N = 20000;
    density = 0.2;
    numOfModes = 250;
    A = sprand(N, N, density);

    profile -memory on
    eigs(A, numOfModes, 0.0)
    profile off

    profsave(profile('info'), 'eigs_test')
    profview
end
And this returns a report saying that MATLAB allocated 18014398508117708.00 Kb, i.e. about 1.8e10 GB -- completely impossible. How did this happen? The code finishes with correct output, and in htop I can see the memory usage vary quite a bit while staying under 16 GB.
For N = 2000, I get sensible results (about 0.2 GB allocated).
How can I profile this case effectively if I want to obtain an upper bound on the memory used for large sparse matrices?
I use MATLAB R2017a.
I cannot reproduce your issue in R2017b, with 128GB of RAM on my machine. Here is the result after running your example code:
Notably, the function peaked at 14726148 Kb, which would be ~1.8 GB if read literally as kilobits. I'm more confused by the units MATLAB has used here: I saw nearer 14 GB of usage in the task manager, which matches your large observed usage (14726148 KB is about 14 GB), so I can only think the profiler means KB (kilobytes) where it prints Kb (kilobits).
Ridiculously large, unexpected values like this are often the result of overflow (the reported figure is suspiciously close to 2^54 ≈ 1.8e16), so this could be an internal overflow bug.
You could use whos to get the number of bytes a variable occupies in the workspace
w = whos('A');          % get details of variable A
sizeInBytes = w.bytes;  % bytes A occupies in memory
This doesn't necessarily tell you how much memory a function like eigs in your example uses, though. You could poll memory within your function to get the current usage.
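For example, a rough sketch of that polling approach (Windows only, since memory is not available on other platforms; the eigs arguments are taken from the question):
mem0 = memory;                              % MATLAB's memory usage before the call
d = eigs(A, numOfModes, 0.0);
mem1 = memory;                              % ... and after it
deltaMB = (mem1.MemUsedMATLAB - mem0.MemUsedMATLAB) / 2^20;
fprintf('MemUsedMATLAB grew by about %.0f MB\n', deltaMB);
Note that this only captures the difference between two points in time, not the true peak reached inside eigs.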
I'll resist exploring this further, since the question of how to profile for memory usage has already been asked and answered.
N.B. I'm not sure why my machine was ~100x slower than yours; I assume the image of your memory usage didn't come from actually running your example code? Or my RAM is awful...

aborting the MATLAB code when the RAM is FULL

Is there any MATLAB command that lets us abort the MATLAB code when 90% of the RAM is full due to a huge amount of data?
I am asking because I do not want to restart the computer every time MATLAB gets stuck and the computer hangs.
As far as I know, you can't do that "automatically": if MATLAB hangs, it hangs.
However, in your code you can always add a memory check somewhere (e.g. inside a memory-heavy iterative function).
If you do
maxmem = 2e9;                      % about 2 GB of RAM (MemUsedMATLAB is reported in bytes)
% this goes inside the memory-heavy code
mem = memory;
if mem.MemUsedMATLAB > maxmem
    exit;                          % or some other thing you may want to do
end
this will exit MATLAB when its memory usage goes above about 2 GB (the value is in bytes, so make sure you account for that when putting in your own value). Note that the memory function is only available on Windows.
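Depending on your workflow, you may prefer raising an error (e.g. error('Memory budget exceeded')) instead of calling exit, so that a surrounding try/catch can save partial results rather than killing the whole MATLAB session.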
Adding this answer to SO as suggested by @Ander Biguri; the answer is purely based on this link.
Using Matlab try (as an option), you can monitor your memory usage as:
tryOptions.watchdog.virtualAddressSpace = 7e9;   %// 7 GB memory limit
tryOptions.watchdog.execTime = 1800;             %// execution time limit, 1800 seconds
try tryOptions
    ...
catch  %// use the try and catch combo to monitor your memory usage and kill the process if you need to
end
Other useful tools which may help:
T = evalc('feature(''memstats'')') ;
str2mat(regexp(T, '(?<=Use:\s*)\d+', 'match'))
memstats outputs the current stats of your memory; you can add breakpoints in your code (at the beginning of a major operation) to monitor your memory usage and decide whether you want to continue executing.

Find Time and Memory after program running in MATLAB

Is there any way in MATLAB, after a program has finished running, to find the memory and time it used?
Also, if the workspace is saved and then loaded again, is it possible to see the time and memory for it?
Thanks.
For the time consumption, would the profiler work? It slows execution a bit, but it is nice for debugging. Otherwise, try to enclose the section you want to time with tic/toc.
For memory consumption there was, and I think still is, no really convenient way to do this; however, something may have happened here. This is how MathWorks answered a few years ago. You can try whos, but that one only works inside the current scope. Also, memory can be used to see MATLAB's total memory consumption.
The time taken to load a file can be measured by enclosing the load with the usual tic/toc commands. The size of a saved file on disk can be seen using dir on the file, but the size in MATLAB's memory could be different. I guess the safest way is to check the size before saving if it will be loaded under the same execution; otherwise it may be convenient to log the size somehow.
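For instance, a small sketch of timing a load and checking the size on disk (assuming a previously saved file called results.mat):
tic;
load('results.mat');                        % load the saved workspace
loadTime = toc;                             % elapsed time in seconds

d = dir('results.mat');                     % file info, including size on disk
fprintf('Loaded %.1f MB in %.2f s\n', d.bytes / 2^20, loadTime);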
Don't know if I got your question correctly, but if you need to trace the time your function takes, there are two ways:
The functions
tic;
t = toc
work like a stopwatch: tic starts the counting, and toc tells you how long has passed since the last tic.
If you need a more in-depth analysis of the timings, MATLAB also offers the profile function.
I suggest you go through the MATLAB documentation on how to use it...
Hope I helped.
S.
For execution time between code lines use:
tic;
% ... code to time goes here ...
t = toc;                               % elapsed time in seconds
disp(['Execution time: ' num2str(t)])
To know and show the memory usage of variables, you can use whos:
whos                  % lists every variable in the workspace with its size in bytes
S = whos;             % struct array containing the info for the current workspace
S.bytes               % bytes used by each variable (one output per variable)
To calculate the total storage, you can make a loop:
Memory = 0;
S = whos;
for k = 1:length(S)
    Memory = Memory + S(k).bytes;
end
disp(['Total memory used by variables in storage (Bytes): ' num2str(Memory)])
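The loop can also be collapsed into a one-liner, since S.bytes expands to a comma-separated list:
Memory = sum([S.bytes]);   % total bytes used by all workspace variables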
You might prefer to look at the whos page in the MathWorks documentation.

MATLAB Parallel Parfor Memory Usage Interrogation

I am new here when it comes to asking questions, though I have found solutions to many problems here before.
I cannot seem to find an answer for this particular problem, so I thought I would join up and ask.
I am using the Parallel Computing Toolbox to run multiple simulations at once; the code I am developing is to be deployed on a single core, so there is no need to convert the algorithm itself to run in parallel.
The data structures created by each of the simulations are large, and running 8 simulations at once uses all of the available RAM in my machine (4 GB).
I am currently looking at reducing the memory used by each simulation, and was wondering if anybody knew how to get memory usage info from each of the instances of the function.
So far I have been calling:
parfor i = 1:8
    [IR(:, i), Data(i)] = feval(F, NX, NY(i), SR, NS, i);
end
And inside function F
[usr, sys] = memory;
format short eng;
TEST.Mem = usr.MemUsedMATLAB;
But this understandably is returning the memory being used by all 8 instances of F.
I would like to get the information from each instance of F.
Note: The data structure TEST is returned as Data to the top level function.
Thanks in advance for any help.
You can use the MATLAB profiler to get a hint at the memory usage:
% start profiler
profile -memory on;
% start your simulation
my_sim();
% look at profiler results
profile viewer
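If you still want a per-instance number from inside the loop, here is a rough sketch, under the assumption that each worker runs in its own MATLAB process (and that you are on Windows, since memory is Windows-only), so the reading reflects that worker's process rather than the combined total:
memUsed = zeros(1, 8);                          % per-iteration memory readings (bytes)
parfor i = 1:8
    [IR(:, i), Data(i)] = feval(F, NX, NY(i), SR, NS, i);
    usr = memory;                               % per-process memory info on this worker
    memUsed(i) = usr.MemUsedMATLAB;
end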

Matlab Preallocation

I'm running a simulation of a diffusion-reaction equation in MATLAB, and I pre-allocate the memory for all of my vectors beforehand. However, during the loop in which I solve a system of equations using BICG, the amount of memory that MATLAB uses keeps increasing.
For example:
concentration = zeros(N, iterations+1);   % +1 column, since the loop writes to column t+1
for t = 1:iterations
    concentration(:, t+1) = bicg(matrix, concentration(:, t));
end
As the program runs, the amount of memory MATLAB uses increases, which seems to suggest that the matrix concentration is growing as the program continues, even though I pre-allocated the space. Is this because the elements in the matrix are becoming doubles instead of zeros? Is there a better way to pre-allocate the memory for this matrix, so that all of the memory the program requires is pre-allocated at the start? It would be easier for me that way, because then I would know from the start how much memory the program will require and whether the simulation will crash the computer or not.
Thanks for all your help, guys. I did some searching around and didn't find an answer, so I hope I'm not repeating a question.
EDIT:
Thanks Amro and stardt for your help guys. I tried running 'memory' in MATLAB, but the interpreter said that command is not supported for my system type. I re-ran the simulation though with 'whos concentration' displayed every 10 iterations, and the allocation size of the matrix wasn't changing with time. However, I did notice that the size of the matrix was about 1.5 GB. Even though that was the case, system monitor was only showing MATLAB as using 300 MB (but it increased steadily to reach a little over 1 GB by the end of the simulation). So I'm guessing that MATLAB pre-allocated the memory just fine and there are no memory leaks, but system monitor doesn't count the memory as in use until MATLAB starts writing values to it in the loop. I don't know why that would be, as I would imagine that writing zeros would trigger the system monitor to see that memory as 'in use,' but I guess that's not the case here.
Anyway, I appreciate your help with this. I would vote both of your answers up as I found them both helpful, but I don't have enough reputation points to do that. Thanks guys!
I really doubt it's a memory leak, since most "objects" in MATLAB clean up after themselves once they go out of scope. AFAIK, MATLAB does not use a GC per se, but a deterministic approach to managing memory.
Therefore I suspect the issue is more likely caused by memory fragmentation: when MATLAB allocates memory for a matrix, it has to be contiguous, so when the function is repeatedly called, creating and deleting matrices, the fragmentation becomes a noticeable problem over time...
One thing that might help you debug is the undocumented profile on -memory, which tracks allocations in the MATLAB profiler. Check out the memory monitoring tool by Joe Conti as well. Also, this page has some useful information.
I am assuming that you are watching MATLAB's memory usage in, for example, the Task Manager on Windows. The memory usage is probably increasing due to the execution of bicg() and variables that have not been garbage collected after it ends. The memory allocated to the concentration matrix stays the same. You can type
whos concentration
before and after your "for" loop to see how much memory is allocated to that variable.
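A quick sketch of that check, with the loop body as in the question:
concentration = zeros(N, iterations+1);
whos concentration                          % size and bytes before the loop
for t = 1:iterations
    concentration(:, t+1) = bicg(matrix, concentration(:, t));
end
whos concentration                          % bytes should be unchanged afterwards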