Setup
I have created a Matlab handle class called "Participant" for reading and operating on certain research data. I have created multiple instances of this object and saved them to the hard disk with no problem. I have also checked my problematic instance to ensure that it is functional in Matlab. There do not seem to be any bugs in the instance itself.
The problem
However, with certain instances, for no reason that is clear to me, Matlab gets stuck in an infinite loop writing to disk. This is evident from the output .mat file's modification date, which keeps changing every minute, and from the fact that my Matlab session slows down tremendously.
The code to create the participant is
myparticipant = participant([basedir ,p_folder{p_num}]);
Methods tried
I have tried saving to disk by right-clicking in the workspace, which results in the problem above.
Using the save function, I get:
save('test.mat', 'myparticipant')
Error using save
Error closing file test.mat.
The file may be corrupt.
and of course the file won't load afterwards.
Any insight would be appreciated as I'm not sure how to start approaching this issue.
Thanks to excaza's comment I was able to spot the issue. As I explained in my comment response, the problem was that, because I was using a handle class, the size of my data as shown in working memory appeared much smaller than it really was. My data was actually bigger than 2 GB. In that case, you have to use Matlab's '-v7.3' flag when saving to file! Adding that flag did it for me.
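For reference, a minimal sketch of the fix (myparticipant is the variable from the question; the v7.3 MAT-file format is HDF5-based and supports variables larger than 2 GB):
save('test.mat', 'myparticipant', '-v7.3')  % required for variables > 2 GB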
Related
I'm writing a UEFI application that should be able to write a lot of data to disk.
I'm aware of the FAT-32 file size limitations, number of files per directory, etc. That should not be the problem. The memory region I'm trying to write is marked as usable by the memory map and I can read/write to it without problems, but after a certain amount of data my VM just reboots without any error message.
The following line causes problems:
uefi_call_wrapper(handle->Write, 3, handle, size, content);
handle is initialized a few lines earlier, size is always at most 128 MiB, and content is a valid memory region with read/write access.
I've already rewritten the whole thing on Windows with EDK2 and got the same problem.
Can anyone help me with this?
Thank you in advance and have a nice evening
Assuming that handle is a pointer to EFI_FILE_PROTOCOL, the BufferSize parameter to Write is passed by reference. When the function returns, BufferSize contains the number of bytes written. You didn't give enough context in your question, but it looks like you are passing it by value.
Hi guys, and thank you for your answers. The size argument is a pointer. I've just found the solution to the problem: I didn't know I had to reset the watchdog timer.
After calling
uefi_call_wrapper(ST->BootServices->SetWatchdogTimer, 4, 0, 0, 0, NULL);
everything works as expected.
Cheers!
Say I have a project which is comprised of:
A main script that handles all of the running of my simulation
Several smaller functions
A couple of structs containing the data
Within the script I access the functions many times inside for loops (some over a thousand times during the minute-long simulation). Each function also looks up data contained in the structs as part of its calculations; these are usually parameters that are fixed over the course of the simulation but need to be varied manually between runs to observe their effects.
As these functions typically form the bulk of the runtime, I'm trying to save time, since my simulation can't quite run in real time as it stands (the ultimate goal), and I lose a lot of time passing variables/parameters around functions. So I've had three ideas to try to do this:
Load the structs in the main simulation, then pass each variable in turn to the function as part of a long argument list (the current solution).
Load the structs every time the function is called.
Define the structs as global variables.
In terms of both the efficiency of the system (most relevant as the project develops) and, as I'm no expert programmer, from a "good practice" perspective, what is the best solution? Is there another option that I have not considered?
As mentioned above in the comments, the first option is the best one.
Have you used the profiler to find out where your code spends most of its time?
profile on
% run your code
profile viewer
Note: if you are modifying your input struct in your child functions, this will take more time; if you are just reading from it, that should not be a problem.
Matlab does what's known as a "lazy copy" when passing arguments between functions. This means that it passes a pointer to the data to the function, rather than creating a new instance of that data, which is very efficient memory- and speed-wise. However, if you make any alteration to that data inside the subroutine, then it has to make a new instance of that argument so as to not overwrite the argument's value in the main function. Your response to matlabgui indicates you're doing just that. So, the subroutine may be making an entire new struct every time it's called, even though it's only modifying a small part of that struct's values.
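To illustrate the difference, here is a minimal sketch with hypothetical function names; only the second function forces Matlab to copy the struct:
function out = readOnly(s)
    out = s.data(1);   % s is only read: no copy is made
end
function s = modify(s)
    s.data(1) = 0;     % s is written to: Matlab copies the whole struct first
end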
If your subroutine is altering a small part of the array, then your best bet is to just pass that small part to it, then assign your outputs. For instance,
[modified_array] = somesubroutine(struct.original_array);
struct.original_array=modified_array;
You can also do this in just one line. Conceptually, the less data you pass to the subroutine, the smaller the memory footprint is. I'd also recommend reading up on in-place operations, as it relates to this.
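For reference, the one-line form of the snippet above (note that struct shadows a built-in Matlab name, so a different variable name would be advisable):
struct.original_array = somesubroutine(struct.original_array);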
Also, as a general rule, don't use global variables in Matlab. I have never personally experienced, nor read of, an instance in which they were genuinely faster.
When my simulation (using OMNeT++) starts, a .ned file describes the initial scenario (in my case, a certain network configuration). During the simulation this scenario changes, sometimes considerably. Is there a way to get a .ned file describing the final scenario after the simulation has ended, so that I can analyze it with a script?
Thanks
Not as a NED file, because NED describes the initial network structure in an abstract way (i.e. it can contain vectors, loops, conditional statements, etc.).
What you need is a simple dump of all or selected objects. You should use the 'snapshot feature' for this. It produces a nicely formatted XML output. You can read more about it in the manual:
http://www.omnetpp.org/doc/omnetpp/manual/usman.html#sec299
I am currently writing code to run a series of time-consuming experiments using nodes on a Unix cluster. Each of these experiments takes over 3 days to run on a 12-core machine. When each experiment is done, I would like it to save some data to a common file.
I have a slight issue in that I submit all of my experiments to the cluster at the same time and so they are likely to be saving to the same file at the same time as well.
I am wondering what will happen when multiple instances of MATLAB try to save to the same file at the same time (error/crash/nothing). Whatever the outcome, could I work around it using a retry loop with try/catch, as follows:
n_tries = 0;
while n_tries < 10
    try
        save('common_file', 'data')   % variable name must be passed as a string
        n_tries = 10;
    catch
        wait_time = 60 * rand;        % back off for a random interval
        pause(wait_time);
        n_tries = n_tries + 1;
    end
end
Don't.
All Matlab functions are explicitly not safe to use in a multi-threading/processing environment.
If you write to one MAT-file simultaneously from multiple Matlab sessions, chances are good that either several variables will be missing (because, e.g., two sessions append to the same state of the file) or the whole file gets corrupted.
Save individual files and merge them in a post-processing step.
For such long simulation runs, don't aggregate your data automatically unless you have a reliable framework. There are several reasons:
An out-of-memory exception or similar error while writing can destroy all previous results; this is especially likely while writing large amounts of data.
Coding errors can destroy previous results; in case of a collision, your code will overwrite at least the most recently added data.
Undetected errors in mex functions, which randomly hit the Matlab address space instead of causing a segmentation fault, can cause Matlab to write garbage to your MAT-file and destroy previous results.
Use some unique naming pattern, e.g. PC name + current date/time, as sketched below.
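A minimal sketch of the per-job save and the later merge (the file names and the variable data are illustrative):
% in each job: build a unique file name from host name + timestamp
[~, host] = system('hostname');
fname = sprintf('result_%s_%s.mat', strtrim(host), datestr(now, 'yyyymmdd_HHMMSSFFF'));
save(fname, 'data');

% post-processing: merge all per-job files
files = dir('result_*.mat');
allData = cell(numel(files), 1);
for k = 1:numel(files)
    tmp = load(files(k).name);
    allData{k} = tmp.data;
end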
You would be best served by having a single recorder task that does the file output, and queueing the save information to that task.
Don't forget that the output "file" that you supply to Matlab only has to be file-like, i.e. support the necessary methods.
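One way the recorder-task idea could look, assuming the Parallel Computing Toolbox is available (runExperiment, nExperiments, and the file name are hypothetical):
q = parallel.pool.DataQueue;
afterEach(q, @writeResult);        % writeResult runs on the client only
parfor k = 1:nExperiments
    result = runExperiment(k);     % hypothetical per-experiment computation
    send(q, result);               % hand the result to the single recorder
end

function writeResult(result)
    persistent allResults
    if isempty(allResults), allResults = {}; end
    allResults{end+1} = result;
    save('common_file.mat', 'allResults');  % only one process ever writes
end
Because only the client session ever touches common_file.mat, the concurrent-write problem disappears.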
I recently wrote some code using Matlab's OOP. In each class object I save some measurement data as a property and define the methods for evaluating them. With an average data set one single class object uses about 32 MB of memory.
Now I am writing a GUI that should process these objects.
In the first step I load a set of objects from a saved .mat file (about 200 objects, 2 GB on the hard disk) and store them in the handles struct. When loaded, they fill the RAM and use about 6-7 GB. This is no problem.
But if I close the GUI, it seems that I can't free the used memory.
I tried different approaches with no success.
Setting the data fields to "empty" in the destructor of the class:
function delete(obj)
    obj.timeVector = [];
    obj.valueVector = [];
end
Trying to free it in the figure_CloseRequestFcn:
function figure_CloseRequestFcn(hObject, eventdata, handles)
    handles.data = [];
    handles = rmfield(handles, 'data');
    guidata(hObject, handles);
    clear handles;
    pack;  % Matlab issues a warning that pack can only be used
           % from the command line, but that did not work either
    delete(hObject);
end
Any ideas, besides closing Matlab every time after working with the GUI?
I found the answer in the Matlab Bug Report Center. The bug seems to exist since R2011b.
Summary
Storing objects in MAT-files can cause a memory leak and prevent the object class from being cleared
Description
After storing an instance of a class, 'MyClass', in a MAT-file, calling clear classes may result in the warning:
Warning: Objects of 'MyClass' class exist. Cannot clear this class or any of its superclasses.
This warning persists, even if you have cleared all instances of the class in the workspace.
The warning may occur for one MAT-file format, and not for another.
Workaround
Under some circumstances, switching to a different MAT-file format may eliminate the warning.
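For example, the format can be requested explicitly when saving (myObject stands in for the class instance being saved):
save('data.mat', 'myObject', '-v7')    % pre-7.3 binary MAT-file format
save('data.mat', 'myObject', '-v7.3')  % HDF5-based format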
http://www.mathworks.ch/support/bugreports/857319
Edit:
I tried older formats for saving, but that does not work either. I get an "Error closing file" error (http://www.mathworks.ch/matlabcentral/answers/18098-error-using-save-error-closing-file). So Matlab does not support saving class objects that well. I will have to live with the memory issues and restart Matlab after every use of the GUI.
Based on your memory screenshots, there is definitely memory that is not being cleared. There is a small chance that you have found a fundamental flaw in Matlab's garbage collection, but it is much more likely that the ~6 GB of memory-resident data is still actually reachable via some chain of references. Based on personal experience, here are a few ways that memory which you thought was cleared can still be reachable:
Timer objects: if one of the callback functions of a timer references this data (or a copy), then that data is still reachable. You need to call delete(t) on that timer.
Persistent variables in functions: I often cache data in a persistent variable within a function; this clearly allows access to that data in the future, so it will not be cleared. You need to call clear FUNCTIONNAME to clear the associated persistent variables.
In GUI objects, as either data or within callback functions: the figures and any persistent variables need to be cleared.
Any static methods or constant attributes in classes can retain data. These can either be cleared individually within the class, or by force using clear CLASSNAME.
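A consolidated sketch of the cleanup calls listed above (the function and class names are placeholders):
delete(timerfind);          % delete all visible timers, releasing their callback data
clear someCachingFunction   % drop persistent variables inside that function
clear MyClass               % force-clear class-level data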
Some tips for finding stale links to data (again, based on personal mistakes):
Look at the exact number of bytes being lost after each call, using x = memory; to get an exact count, as sketched below. Is it consistent? Is it a number that you recognize? Sometimes I can find the leak after realizing that it is exactly 238263232 bytes, and therefore a 29782904-element double array, which must be from function xyz.
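A sketch of that byte-count comparison (note that the memory function is only available on Windows):
m1 = memory;                                  % before opening the GUI
% ... open and close the GUI ...
m2 = memory;                                  % after closing it
leaked = m2.MemUsedMATLAB - m1.MemUsedMATLAB  % bytes still held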
See which classes are actually being deleted. Within your delete(obj) function, add a detailed display of which objects are being deleted and, by inference, which are not. For a given non-deleted object, where could it be referenced from? You should not need to clear data in the delete(obj) function like you are doing; Matlab should handle that for you. Use the delete function as a debugging tool instead.
Matlab has a garbage collector so you don't need to manually manage memory. After closing the GUI, all the memory will be freed except for what is in your workspace. You can clear the workspace variables using clear.
One thing I've noticed on Windows (not sure about other platforms) is that Matlab's GUI sometimes retains extra memory (maybe 100 MB, but not multiple GB like you are seeing). Simply minimizing and then restoring the GUI will free this excess memory.