I'm using the MATLAB profiler to observe memory usage with the commands
profile -memory on
profile clear
% my code
profile report
and I got a table of memory statistics. I want to ask:
1. What is the meaning of Allocated Memory, Freed Memory, Self Memory, and Peak Memory?
2. What is the meaning of negative Self Memory?
After a quick google, it would seem that no one knows, except perhaps MathWorks, and they aren't telling. (I jest, but in truth I found very little information on the subject.)
Logically, however, I would interpret the column names as follows:
Allocated memory = the total amount of memory allocated within the function and any it calls.
Freed memory = the total amount of memory released within the function and any it calls.
Peak Memory = the maximum amount of memory in use at any one time during the execution of the function.
Self Memory = the amount of memory used by the function, but not including any functions it calls.
I would hypothesize that a negative 'Self Memory' indicates that the function frees more memory than it allocates. This can happen when the function takes ownership of a piece of data passed to it, which it then clears. E.g.:
function A()
    foo = B();
    clear foo
end

function foo = B()
    foo = rand(10000,10000);
end
In the example above, the data is created in the call to B, and since MATLAB employs lazy copy-on-write memory management, this case works pretty much as pass-by-reference for the return value. So B allocates the memory, and A frees it.
Indeed, running that code with the profiling method in the question produces output that supports my hypothesis.
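For reference, a minimal driver to reproduce this (assuming A and B above are saved somewhere on the MATLAB path):

profile -memory on
profile clear
A();
profile report

In the resulting report, B should show roughly 800 MB allocated (a 10000x10000 double array occupies 8e8 bytes), while A's Self Memory should come out negative, since the clear inside A releases memory that was allocated inside B.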
Related
I have an app written in Swift which works fine initially, but over time the app gets sluggish. I have opened an Instruments profiling session using the Allocations and Leaks instruments.
What I have found is that the allocations increase dramatically while doing something that should only overwrite the current data.
The memory in question is in the group <non-object>.
Opening this group gives hundreds of different allocations, with the responsible library in each case being libvDSP. From this I conclude that a vDSP call is not releasing memory properly. However, double-clicking on any of these does not present me with any code, only raw disassembly that I do not understand.
The function that calls vDSP is wrapped like this:
func outOfPlaceComplexFourierTransform(
    setup: FFTSetup,
    resultSize: Int,
    logSize: UInt,
    direction: FourierTransformDirection) -> ComplexFloatArray {
    let result = ComplexFloatArray.zeros(count: resultSize)
    self.useAsDSPSplitComplex { selfPointer in
        result.useAsDSPSplitComplex { resultPointer in
            vDSP_fft_zop(
                setup,
                &selfPointer,
                ComplexFloatArray.strideSize,
                &resultPointer,
                ComplexFloatArray.strideSize,
                logSize,
                direction.rawValue)
        }
    }
    return result
}
This is called from another function:
var mags1 = ComplexFloatArray.zeros(count: measurement.windowedImpulse!.count)
mags1 = (measurement.windowedImpulse?.outOfPlaceComplexFourierTransform(
    setup: fftSetup,
    resultSize: mags1.count,
    logSize: UInt(logSize),
    direction: ComplexFloatArray.FourierTransformDirection(rawValue: 1)!))!
Within this function, mags1 is manipulated and overwrites an existing array. It was my understanding that mags1 would be deallocated once this function has finished, as it is only available inside this function.
This function is called many times per second at times. Any help would be appreciated, as what should only take 5 MB very quickly grows by two hundred megabytes in a couple of seconds.
Any pointers to either further investigate the source of the leak, or to properly deallocate this memory once finished would be appreciated.
I cannot believe I solved this so quickly after posting. (I genuinely had spent several hours pulling my hair out.)
Not shown in the code here: I was creating a new FFTSetup every time this function was called. Obviously this is memory intensive, and that memory was not being reused. Creating the setup once and reusing it for every call fixed the problem.
In Instruments, looking at the call tree, I was able to see the function using this memory.
I recently wrote some code using Matlab's OOP. In each class object I store some measurement data as a property and define the methods for evaluating it. With an average data set, a single class object uses about 32 MB of memory.
Now I am writing a GUI that should process these objects.
In the first step I load a set of objects from a saved .mat file (about 200 objects, 2 GB on the hard disk) and store them in the handles struct. They fill the RAM, using about 6-7 GB once loaded. This is no problem.
But when I close the GUI, it seems that I can't free the used memory.
I tried different approaches, with no success.
Setting the data fields to "empty" in the destructor of the class:
function delete(obj)
    obj.timeVector = [];
    obj.valueVector = [];
end
Trying to free it in the figure_CloseRequestFcn:
function figure_CloseRequestFcn(hObject, eventdata, handles)
    handles.data = [];
    handles = rmfield(handles, 'data');
    guidata(hObject, handles);
    clear handles;
    pack;  % Matlab issues a warning that pack can only be
           % used from the command line, but that did
           % not work either
    delete(hObject);
end
Any ideas, besides closing Matlab every time after working with the GUI?
I found the answer in the Matlab Bug Report Center. The bug seems to exist since R2011b.
Summary
Storing objects in MAT-files can cause a memory leak and prevent the object class from being cleared
Description
After storing an instance of a class, 'MyClass', in a MAT-file, calling clear classes may result in the warning:
Warning: Objects of 'MyClass' class exist. Cannot clear this class or any of its superclasses.
This warning persists, even if you have cleared all instances of the class in the workspace.
The warning may occur for one MAT-file format, and not for another.
Workaround
Under some circumstances, switching to a different MAT-file format may eliminate the warning.
http://www.mathworks.ch/support/bugreports/857319
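For example, one might try an explicit format flag when saving (file and variable names here are hypothetical):

save('data.mat', 'objArray', '-v7')    % default binary MAT-file format
save('data.mat', 'objArray', '-v7.3')  % HDF5-based format, required for variables over 2 GB

Per the workaround above, one of the formats may avoid the stale-class warning where the other does not.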
Edit:
I tried older formats for saving, but that does not work either; I get an "Error closing file" (http://www.mathworks.ch/matlabcentral/answers/18098-error-using-save-error-closing-file). So Matlab does not support saving class objects that well. I will have to live with the memory issues and restart Matlab after every use of the GUI.
Based on your memory screenshots, there is definitely memory that is not being cleared. There is a small chance that you have found a fundamental flaw in Matlab's garbage collection, but it is much more likely that the ~6 GB of memory-resident data is still reachable via some chain of references. Based on personal experience, here are a few ways that memory which you thought was cleared can still be reachable:
Timer objects: if one of the callback functions of a timer references this data (or a copy), then that data is still reachable. You need to call delete(t) on that timer (see the sketch after this list).
Persistent variables in functions: I often cache data in a persistent variable within a function; this clearly allows access to that data in the future, so it will not be cleared. You need to call clear FUNCTIONNAME to clear the associated persistent variables.
In GUI objects, as either data or within callback functions: the figures and any persistent variables need to be cleared.
Any static methods or constant properties in classes which can retain data: these can either be cleared individually within the class, or by force using clear CLASSNAME.
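As an illustration of the timer case, a minimal hypothetical sketch:

bigData = rand(5000);                                  % roughly 200 MB of doubles
t = timer('TimerFcn', @(~,~) disp(mean(bigData(:))));  % the callback captures bigData
clear bigData   % the workspace copy is gone, but the timer's callback still holds one
delete(t)       % destroying the timer releases the captured copy
clear t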
Some tips for finding stale links to data (again, based on personal mistakes):
Look at the exact number of bytes being lost after each call, using x = memory; to get an exact count (see the sketch below). Is it consistent? Is it a number that you recognize? Sometimes I can find the leak after realizing that it is exactly 238263232 bytes, and therefore a 29782904-element double array, which must be from function xyz.
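A minimal sketch of that measurement (memory is Windows-only, and runGuiOnceAndClose is a hypothetical stand-in for the operation you suspect of leaking):

m1 = memory;
runGuiOnceAndClose();
m2 = memory;
lost = m2.MemUsedMATLAB - m1.MemUsedMATLAB;
fprintf('Lost %d bytes (%g doubles)\n', lost, lost/8);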
See which objects are actually being deleted. Within your delete(obj) function, add a detailed display of which objects are being deleted, and, by inference, which are not. For a given non-deleted object, where could it still be referenced from? You should not need to clear data in the delete(obj) function as you are doing; MATLAB should handle that for you. Use the delete function as a debugging tool instead, as shown below.
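For example, a destructor instrumented along those lines (a minimal sketch):

function delete(obj)
    % Debugging aid: log every destruction. Objects of a class that
    % never prints here are still referenced from somewhere.
    fprintf('Deleting a %s object\n', class(obj));
end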
Matlab has a garbage collector so you don't need to manually manage memory. After closing the GUI, all the memory will be freed except for what is in your workspace. You can clear the workspace variables using clear.
One thing I've noticed on Windows (not sure about other platforms) is that Matlab's GUI sometimes retains extra memory (maybe 100 MB, but not multiple GB like you are seeing). Simply minimizing and then restoring the GUI will free this excess memory.
I have posted here a function that I use to get the Accelerate framework FFT:
Setup the accelerator framework for fft on the iPhone
It is working great.
The thing is that I use it in real time, so for each new audio buffer I call this function with the new buffer.
I get a memory warning, probably because of these lines:
A.realp = (float *) malloc(nOver2 * sizeof(float));
A.imagp = (float *) malloc(nOver2 * sizeof(float));
Questions:
1. Do I have any alternative to malloc'ing them again and again? (Don't forget I have to feed it a new buffer many times a second.)
2. How exactly do I free them? (code lines, please)
3. Could the warning be caused by the FFT being heavy on the system?
Any way to get rid of this warning would help me a lot.
Thanks a lot.
These things should be done once, at the start of your program:
Allocate memory for buffers, using code like float *buffer = malloc(NumberOfElements * sizeof *buffer);.
Create an FFT setup, using code like FFTSetup setup = vDSP_create_fftsetup(log2n, FFT_RADIX2);.
Also test the return values. If malloc or vDSP_create_fftsetup returns 0, write an error message and exit the program or take other appropriate error-handling action.
These things should be done once, at the end of your program:
Destroy the FFT setup, using code like vDSP_destroy_fftsetup(setup);.
Release the memory for the buffers, using code like free(buffer);.
In the middle of your program, while you are processing samples, the code should use the existing buffers and setup. So the variables pointing to the buffers and the setup must be visible to that code. You can either pass them in as parameters (perhaps grouped together in a struct) or make them global (which should be only a temporary solution for small programs).
Your program should be arranged so that it is never necessary to allocate memory or create an FFT setup while samples are being processed.
All memory that is allocated should be freed eventually.
If you are malloc'ing and never freeing, you will run out of memory. Make sure to 'free' your memory using free().
*Note: free() doesn't actually erase any memory. It simply tells the system that we're done with the memory and it's available for other allocations.
// Example:
// allocating memory
int *intpointer;
intpointer = malloc(sizeof(int));
// ... do stuff...
// 'Freeing' it when you are done
free(intpointer);
I am looking for a Unix equivalent, in MATLAB, of the data returned by memory on the Windows platform.
I am aware of the possibility of using unix('vm_stat'), but the specific piece of information I require is the largest contiguous free memory block.
This information is returned by memory as follows:
[userview, ~] = memory;
a = userview.MaxPossibleArrayBytes
Does anybody know how to write a unix command that could return this same information?
Call the command free and parse the results. This works on Linux:
[r,w] = unix('free | grep Mem');
stats = str2double(regexp(w, '[0-9]*', 'match'));
memsize = stats(1)/1e6;
freemem = (stats(3)+stats(end))/1e6;
The output is in gigabytes. The last number free returns is 'cached' memory used by the OS, e.g. for dynamic libraries. It can in general be reclaimed, but you can decide to leave it out and just use what free reports as 'free' - the third numeric field in the output.
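For reference, the line being parsed looks roughly like this on older Linux systems (all numbers purely illustrative):

Mem:      16326428   8449948   7876480    123456    234567   3456789

so stats(1) is the total, stats(3) the free memory, and stats(end) the cached memory.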
Edit: On Linux, memory allocation within MATLAB's mxMalloc/mxCalloc most likely simply calls malloc and friends. To get a hint that this is the case, do the following experiment: in a MEX file, allocate an array using the following code and return it to MATLAB:
double *rout = calloc(M*N, sizeof(double));
pargout[0] = mxCreateNumericMatrix(0, 0, mxDOUBLE_CLASS, mxREAL);
mxSetM(pargout[0], M);
mxSetN(pargout[0], N);
mxSetData(pargout[0], rout);
mexMakeMemoryPersistent(rout);
You can normally use the variable returned to MATLAB. You can even clear it - this does not cause any problems. If MATLAB indeed simply uses malloc, there is no way that I know of in which it could enforce physically contiguous memory.
Note that you cannot run the above code on Windows - it crashes MATLAB. Of course, you should not do this in your own code anyway; it merely illustrates the point.
I have a function that's taking a long time to run. When I profile it, I find that over half the time (26 out of 50 seconds) is not accounted for in the line-by-line timing breakdown, and I can show that the time is spent after the function finishes running but before it returns control, by the following method:
ts1 = tic;
disp ('calling function');
functionCall(args);
disp (['control returned to caller - ', num2str(toc(ts1))]);
The first line of the function I call is ts2 = tic, and the last line is
disp (['last line of function- ', num2str(toc(ts2))]);
The result is
calling function
last line of function - 24.0043
control returned to caller - 49.857
Poking around on the interwebs, I think this is a symptom of the way MATLAB manages memory: it deallocates on function return, and sometimes this takes a long time. The function does allocate some large (~1 million element) arrays. It also works with handles, but does not create any new handle objects or store handles explicitly. My questions are:
Is this definitely a memory management problem?
Is there any systematic way to diagnose what causes a problem in this function, as opposed to others which return quickly?
Are there general tips for reducing the amount of time MATLAB spends cleaning up on a function exit?
You are right, it seems to be time spent on garbage collection. I am afraid it is a fundamental MATLAB flaw; it has been known for years, but MathWorks has not solved it even in the newest MATLAB version, 2010b.
You could try setting variables manually to [] before leaving the function - i.e. doing the garbage collection manually, as sketched below. This technique also helps against memory leaks in older MATLAB versions. MATLAB will then spend its time not on end but on myVar = [];.
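A minimal sketch of the manual-cleanup idea (the function body is hypothetical):

function out = processChunk(in)
    work = zeros(1e6, 1);   % large temporary array
    out = sum(work) + in;
    work = [];              % deallocate here rather than implicitly at 'end'
end

The cleanup cost now shows up on the work = []; line in the profiler instead of disappearing into the untimed gap after the function returns.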
You could alleviate the problem by working without any kind of references - anonymous functions, nested functions, handle classes - and by not using cellfun and arrayfun.
If you have arrived at the "performance barrier" of MATLAB, then maybe you should simply change the environment. I do not see much sense in starting a new project in MATLAB today, except if you are using SIMULINK. Python rocks for technical computing, and with C# you can also do many of the things MATLAB does, using free libraries. And both are real programming languages and are free, unlike MATLAB.
I discovered a fix to my specific problem that may be applicable in general.
The function that was taking a long time to exit was called on a basic (value) object that contained a vector of handle objects. When I changed the definition of the basic object to extend handle, I eliminated the lag on the close of the function.
What I believe was happening is this: when I passed the basic object to my function, MATLAB created a copy of that object (MATLAB passes by value by default). This doesn't take a lot of time, but when the function exited, it destroyed the object copy, which forced it to look through the vector of handle objects to make sure there weren't any orphans that needed to be cleaned up. I believe it is this operation that was taking MATLAB a long time.
When I changed the object I was passing to a handle, no copy was made in the function workspace, so no cleanup of the object was required at the end.
This suggests a general rule to me:
If a function is taking a long time to clean up its workspace on exiting, and you are passing a lot of data or complex structures by value, try encapsulating the arguments to that function in a handle object.
This avoids the duplication and hence the time-consuming cleanup on exit; a sketch follows below. The downside is that your function can now unexpectedly change your inputs, because MATLAB doesn't have the ability to declare an argument const, as in C++.
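A minimal sketch of such a wrapper (class and property names are hypothetical; the classdef must live in its own file, DataWrapper.m):

classdef DataWrapper < handle
    properties
        items    % e.g. the vector of handle objects from the question
    end
end

Usage:

w = DataWrapper();
w.items = largeVectorOfHandles;
functionCall(w);    % w is a handle: no copy on entry, no copy to destroy on exit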
A simple fix could be this: pre-allocate the large arrays and pass them as arguments to your functionCall(). This moves the deallocation issue back to the caller of functionCall(), but if functionCall is invoked many times for each call of its parent, this will speed up your code.
workArr = zeros(1,1e6); % allocate once
...
functionCall(args,workArr); % call with extra argument
...
functionCall(args,workArr); % call again, no realloc of workArr needed
...
Inside functionCall you can take care of initializing and/or resetting workArr, for instance:
workArr(:) = 0; % reset the work array in place