Find time and memory usage after a program runs in MATLAB

Is there any way in MATLAB to find out, after a program has finished running, how much time and memory it used?
Also, if the workspace is saved and then loaded again, is it possible to see the time and memory for that?
Thanks.

For the time consumption, would the profiler work? It slows execution a bit, but it is nice for debugging. Otherwise, try enclosing the section you want to time with tic/toc.
For memory consumption there was, and I think still is, no really convenient way to do this, although something may have changed recently. This is how MathWorks answered a few years ago. You can try whos, but that only works inside the current scope. memory can also be used to see MATLAB's total memory consumption.
The time taken to load a file can be measured by enclosing the load with the usual tic/toc commands. The size of a saved file on disk can be seen by calling dir on the file, but the size in MATLAB's memory may differ. The safest approach is probably to check the size before saving if the file will be loaded in the same session; otherwise it may be convenient to log the size somehow.
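For example, a minimal sketch of both measurements (the file name workspace.mat is assumed here for illustration):

```matlab
tic
load('workspace.mat')                        % load the previously saved workspace
fprintf('Load time: %.2f s\n', toc)
d = dir('workspace.mat');                    % query the file on disk
fprintf('Size on disk: %d bytes\n', d.bytes) % on-disk size, which may differ in memory
```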

I don't know if I understood your question correctly, but if you need to measure the time your function takes, there are two ways:
the functions
tic;
t = toc
work like a stopwatch: tic starts the counting and toc tells you how long has passed since the last tic.
If you need a more in-depth analysis of the timings, MATLAB also offers the profile function.
I suggest you go through the MATLAB documentation on how to use it.
Hope that helps.
S.

For execution time between code lines use:
tic;
% ... code to time ...
t = toc;
disp(['Execution time: ' num2str(t) ' s'])
To see and display the memory usage of variables you can use whos:
whos
S = whos; % struct array containing the info for every variable in the current workspace
[S.bytes] % bytes per variable; the brackets collect the comma-separated list into a vector
To calculate the total storage, you can use a loop:
Memory = 0;
S = whos;
for k = 1:length(S)
    Memory = Memory + S(k).bytes;
end
disp(['Total memory used by variables in the workspace (bytes): ' num2str(Memory)])
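The same total can be computed without the loop, since the bytes fields of the struct array returned by whos can be concatenated and summed:

```matlab
S = whos;                     % info on every variable in the current workspace
totalBytes = sum([S.bytes]);  % concatenate all bytes fields and sum them
disp(['Total memory used by variables in the workspace (bytes): ' num2str(totalBytes)])
```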
You might prefer to look at the whos page in the MathWorks documentation.

Related

Matlab Process Memory Leak Over 16 Days

I'm running a real-time data assimilation program written in Matlab, and there seems to be a slow memory leak. Over the course of about 16 days, the average memory usage has increased by about 40% (see the figure below) from about 1.1GB to 1.5GB. The program loops every 15 minutes, and there is a peak in memory usage for about 30 seconds during the data assimilation step (visible in the figure).
At the end of each 15 minute cycle, I'm saving the names, sizes, and types of all variables in the currently active workspace to a .mat file using the whos function. There are just over 100 variables, and after running the code for about 16 days, there is no clear trend in the amount of memory used by any of the variables.
Some variables are cleared at the end of each cycle, but some of them are not. I'm also calling close all to make sure there are no figures sitting in memory, and I made sure that when I'm writing ASCII files, I always fclose(fileID) the file.
I'm stumped...I'm wondering if anyone here has any suggestions about things I should look for or tools that could help track down the issue. Thanks in advance!
Edit, system info:
RHEL 6.8
Matlab R2014b
I figured out the problem. It turns out that the figure handles were hidden, and close('all') only works on figures that are visible. I assume they're hidden because the figures are created outside the scope of where I was trying to close the figures. The solution was to replace close('all') with close all hidden, which closes all figures including those with hidden handles.
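A minimal sketch of the behavior described above (a figure created with a hidden handle survives close('all')):

```matlab
f = figure('HandleVisibility', 'off'); % handle hidden from the root's child list
close('all')                           % f stays open: close cannot see it
close all hidden                       % closes f as well, including hidden handles
```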
I'll go ahead and restate what @John and @horchler mentioned in their comments, in case their suggestions can help people with similar issues:
Reusing existing figures can increase performance and reduce the potential for memory leaks.
Matlab has an undocumented memory profiler that could help debug performance related issues.
For processes that are running indefinitely, it's good practice to separate data collection/processing and product generation (figures etc). The first reads in and processes the data and saves it to a DB or file. The second allows you to "view/access/query" the data.
If you are calling compiled mex functions in your code, the memory leak could be coming from the Fortran or C/C++ code. Not cleaning up a single variable could cause a leak, and would explain linear memory growth.
The Matlab function whos is great for looking at the size in memory of each variable in the workspace. This can be useful for tracking down which variable is the culprit of a memory leak.
Thanks @John and @horchler!

aborting the MATLAB code when the RAM is FULL

Is there any MATLAB command that lets us abort MATLAB code when 90% of the RAM is full due to a huge amount of data?
I am asking because I do not want to restart the computer every time MATLAB gets stuck and the machine hangs.
As far as I know, you can't "automatically" do that; if MATLAB hangs, it hangs.
However, in your code you can always add a memory check somewhere (e.g. inside a memory-heavy iterative function).
If you do
maxmem = 2e9; % about 2 GB of RAM
%% // this inside the memory-heavy code
mem = memory;
if mem.MemUsedMATLAB > maxmem
    exit; % // or some other thing you may want to do
end
this will exit MATLAB when it is using about 2 GB of RAM (the value is in bytes, so make sure you account for that when putting in your own limit).
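The same check can be wrapped in a small helper so it can be dropped into any memory-heavy loop (the function name here is made up, and note that memory is only available on Windows):

```matlab
function checkMemoryLimit(maxBytes)
% Abort MATLAB if its current memory use exceeds maxBytes.
    mem = memory; % note: the memory function is Windows-only
    if mem.MemUsedMATLAB > maxBytes
        fprintf('Memory limit exceeded (%d bytes), exiting.\n', mem.MemUsedMATLAB);
        exit;
    end
end
```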
Adding this answer to SO as suggested by @Ander Biguri; the answer is purely based on this link.
Using MATLAB's try (as an option), you can monitor your memory usage as
tryOptions.watchdog.virtualAddressSpace = 7e9; %// 7 GB of memory
tryOptions.watchdog.execTime = 1800; %// execution time, 1800 seconds
try tryOptions
...
catch %// use the try/catch combo to monitor your memory usage and kill the process if you need to.
Other useful tools which may help:
T = evalc('feature(''memstats'')');
char(regexp(T, '(?<=Use:\s*)\d+', 'match')) % char replaces the deprecated str2mat
memstats outputs the current stats of your memory; you can add breakpoints in your code (at the beginning of a major operation) to monitor your memory usage and decide whether you want to continue executing.

Why would saving to a folder called temp cause data loading to slow down in a for loop in Matlab?

IMPORTANT UPDATE
I just discovered that after restarting MATLAB and the computer, this simplified code no longer reproduces the problem for me either. I am sorry for taking up your time with a script that didn't work. However, the old problem still persists in my original script if I save anything in any folder (that I have tried) in the inner for loop. For my purposes, I have worked around it by simply not making this save unless I absolutely need it. The original script has the following structure in terms of for loops and use of save and load:
load() % .mat files, size 365x92x240
for day = 1:365
    load() % .mat files, size 8x92x240
    for type = 1:17
        load() % .mat files, size 17x92x240
        load() % .mat files, size 92x240
        for step = 1:8
            % only calculations
        end
        save() % .mat files, size 8x92x240
    end
    save() % .mat files, size 8x92x240
end
% the loads and saves below sit outside the for loops and do not seem to affect the behavior described above
load() % .mat files, size 8x92x240
save() % .mat files, size 2920x92x240
load()
save() % .mat files, size 365x92x240
load()
save() % .mat files, size 12x92x240
If run in full, the script saves approx. 10 GB and loads approx. 2 GB of data.
The entire script is rather lengthy and makes a lot of saves and loads, so it would unfortunately be impractical to share it all here before I have managed to reproduce the problem in a reduced version. As I frustratingly discovered that the very same code can behave differently from time to time, it immediately became more tedious than anticipated to find a simplification that consistently reproduces the behavior. I will get back as soon as I have a manageable piece of code that produces the problem.
PREVIOUS PROBLEM DESCRIPTION
(NB: the code below does not reliably reproduce the described problem.)
I just learnt the hard way that, in MATLAB, you can't name a saving folder temp in a for loop without slowing down data loading in the next round of the loop. My question is: why?
If you are interested in reproducing the problem yourself, please see the code below. To run it, you will also need a MAT-file called anyData.mat to load, and two folders for saving, one called temp and the other called temporary.
clear all; clc; close all; profile off;
profile on
endDay = 2; % last day index used in the loop below
tT = zeros(1, endDay+1);
tTD = zeros(1, endDay+1);
for day = 0:endDay
    tic
    T = importdata('anyData.mat');
    tT(day+1) = toc; % loading time in seconds
    tic
    TD = importdata('anyData.mat');
    tTD(day+1) = toc;
    for type = 0:1
        saveFile = ones(92, 240);
        save('AnyFolder\temporary\saveFile.mat', 'saveFile') % leads to fast data loading
        % save('AnyFolder\temp\saveFile.mat', 'saveFile') % leads to slow data loading
    end % end of type
end % end of day
profile off
profile report
plot(tT)
You will see on the y-axis of the plot that data loading takes significantly longer when, in the inner for loop, you save to temp rather than temporary. Does anyone out there know why this occurs?
There are two things here.
Storage during a for loop is an expensive operation, as it usually opens a file stream and closes it before moving on. You might not be able to avoid this.
The second thing is the speed of storage and its cache. Most likely, other programs use the temp folder for their own temporary files and have a garbage collector or cleanup software watching it. If you start opening and closing file streams to this folder, you have to send a request to get exclusive write access to it, which again adds to the time.
If you are doing image-processing operations on multiple images, you can run into a bottleneck when writing to the hard drive, due to its speed, its cache, and the memory currently available to MATLAB.
I can't reproduce the problem, suspect it's system and data-size specific. But some general comments which could help you out of the predicament:
As pointed out by the commenters and the answers above, file I/O within a double for loop can be extremely parasitic, especially where you only need part of the data in the file, where other system operations delay the process, or where the data files are large enough to require virtual memory (Windows) / swap space (Linux) just to load them. In the latter case, you could be in a situation where you're moving a file from one part of the hard disk to another every time you open it!
I assume that you're loading/saving because you don't have c. 10 GB of RAM to hold everything in memory for computation. The actual problem is not described, so I can't be certain, but I think you might find the matfile class useful... TMW documentation. It is used to map variables directly to/from a MAT-file. This:
reduces file stream opening and closing IOPS
allows arbitrarily large variable sizes (governed by disk size, not memory)
allows you to read/write partially (i.e. write only some elements of an array without loading the whole file)
in the case that your mat file is too large to be held in memory, avoids loading it into swap space which would be extremely cumbersome.
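A minimal sketch of partial access with matfile (the file name results.mat and the variable name daily are made up here for illustration):

```matlab
m = matfile('results.mat', 'Writable', true);
m.daily(365, 92, 240) = 0;           % preallocate on disk by writing the last element
slice = m.daily(1, :, :);            % read just one day's slice from disk
m.daily(2, :, :) = ones(1, 92, 240); % overwrite just one slice, leaving the rest untouched
```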
Hope this helps.
Tom

MATLAB Parallel Parfor Memory Usage Interrogation

I am new here to asking questions, though I have found solutions to many problems here before.
For this particular problem I cannot seem to find an answer, so I thought I would join up and ask.
I am using the Parallel computing toolbox to run multiple simulations at once, the code I am developing is to be deployed on a single core so there is no need for converting the algorithm to parallel.
The data structures being created by each of the simulations are large, and running 8 simulations at once is using all of the available RAM in my machine (4GB).
I am currently looking at reducing the memory used by each simulation, and was wondering if anybody knew how to get memory usage info from each of the instances of the function.
So far I have been calling:
parfor i = 1:8
    [IR(:, i), Data(i)] = feval(F, NX, NY(i), SR, NS, i);
end
And inside function F
[usr, sys] = memory;
format short eng;
TEST.Mem = usr.MemUsedMATLAB;
But this understandably is returning the memory being used by all 8 instances of F.
I would like to get the information from each instance of F.
Note: The data structure TEST is returned as Data to the top level function.
Thanks in advance for any help.
You can use the MATLAB profiler to get a hint at the memory usage:
% start the profiler with memory tracking
profile -memory on;
% run your simulation
my_sim();
% look at the profiler results
profile viewer
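As a hedged alternative sketch: since whos called inside a function body lists only that function's workspace, each call of F could report its own variables' footprint instead of the process-wide figure (this reuses the TEST struct from the question, with a made-up field name):

```matlab
% inside function F, after the large data structures have been built:
S = whos;                        % variables in this function's workspace only
TEST.VarBytes = sum([S.bytes]);  % total bytes held by this instance's variables
```

Since TEST is returned as Data, each of the 8 instances then carries its own figure back to the top level.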

Matlab Preallocation

I'm running a simulation of a diffusion-reaction equation in MATLAB, and I preallocate the memory for all of my vectors beforehand. However, during the loop in which I solve a system of equations using bicg, the amount of memory that MATLAB uses keeps increasing.
For example:
concentration = zeros(N, iterations + 1); % one extra column: the loop writes into column t+1
for t = 1:iterations
    concentration(:, t+1) = bicg(matrix, concentration(:, t));
end
As the program runs, the amount of memory MATLAB is using increases, which seems to suggest that the matrix, concentration, is increasing in size as the program continues, even though I pre-allocated the space. Is this because the elements in the matrix are becoming doubles instead of zeros? Is there a better way to pre-allocate the memory for this matrix, so that all of the memory the program requires will be pre-allocated at the start? It would be easier for me that way, because then I would know from the start how much memory the program will require and if the simulation will crash the computer or not.
Thanks for all your help, guys. I did some searching around and didn't find an answer, so I hope I'm not repeating a question.
EDIT:
Thanks Amro and stardt for your help guys. I tried running 'memory' in MATLAB, but the interpreter said that command is not supported for my system type. I re-ran the simulation though with 'whos concentration' displayed every 10 iterations, and the allocation size of the matrix wasn't changing with time. However, I did notice that the size of the matrix was about 1.5 GB. Even though that was the case, system monitor was only showing MATLAB as using 300 MB (but it increased steadily to reach a little over 1 GB by the end of the simulation). So I'm guessing that MATLAB pre-allocated the memory just fine and there are no memory leaks, but system monitor doesn't count the memory as in use until MATLAB starts writing values to it in the loop. I don't know why that would be, as I would imagine that writing zeros would trigger the system monitor to see that memory as 'in use,' but I guess that's not the case here.
Anyway, I appreciate your help with this. I would vote both of your answers up as I found them both helpful, but I don't have enough reputation points to do that. Thanks guys!
I really doubt it's a memory leak, since most "objects" in MATLAB clean up after themselves once they go out of scope. AFAIK, MATLAB does not use a GC per se, but a deterministic approach to managing memory.
Therefore I suspect the issue is more likely caused by memory fragmentation: when MATLAB allocates memory for a matrix, it has to be contiguous. Thus, as the function is repeatedly called, creating and deleting matrices, the fragmentation becomes a noticeable problem over time...
One thing that might help you debug is the undocumented profile on -memory, which tracks memory allocation in the MATLAB profiler. Check out the memory monitoring tool by Joe Conti as well. This page also has some useful information.
I am assuming that you are watching MATLAB's memory usage in, for example, the Task Manager on Windows. The memory usage is probably increasing due to the execution of bicg() and variables of its own that have not yet been freed after it returns. The memory allocated to the concentration matrix stays the same. You can type
whos concentration
before and after your for loop to see how much memory is allocated to that variable.
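A minimal sketch of that check (N and iterations are placeholder sizes here):

```matlab
N = 1000; iterations = 500;                % placeholder sizes for illustration
concentration = zeros(N, iterations + 1);  % preallocate, including the final column
w = whos('concentration');
fprintf('Allocated: %d bytes\n', w.bytes)  % 1000 * 501 * 8 bytes for doubles
% ... run the bicg loop ...
w = whos('concentration');                 % same value afterwards: no reallocation
```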