How to have multiple instances of MATLAB save the same file simultaneously

I am currently writing code to run a series of time-consuming experiments using nodes on a Unix cluster. Each of these experiments takes over 3 days to run on a 12-core machine. When each experiment is done, I am hoping to have it save some data to a common file.
I have a slight issue in that I submit all of my experiments to the cluster at the same time and so they are likely to be saving to the same file at the same time as well.
I am wondering what will happen when multiple instances of MATLAB try to save to the same file at the same time (error/crash/nothing). Whatever the outcome, could I work around it with a retry loop around try/catch, as follows:
n_tries = 0;
while n_tries < 10
    try
        save('common_file', 'data');   % save expects variable names as strings
        n_tries = 10;                  % success: leave the retry loop
    catch
        wait_time = 60 * rand;         % back off for a random time before retrying
        pause(wait_time);
        n_tries = n_tries + 1;
    end
end

Don't.
All Matlab functions are explicitly not safe to use in a multi-threading/multi-processing environment.
If you write to one mat-file simultaneously from multiple Matlab sessions, chances are good that either several variables will be missing (because, e.g., two sessions append to the same state of the file) or the whole file will get corrupted.
Save individual files and merge them in a post-processing step.
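As a rough sketch of that merge step, assuming each experiment writes its own result_*.mat file (containing a variable data) into a shared results folder:
files = dir(fullfile('results', 'result_*.mat'));   % one file per finished experiment
all_data = cell(numel(files), 1);
for k = 1:numel(files)
    s = load(fullfile('results', files(k).name), 'data');
    all_data{k} = s.data;
end
save('common_file.mat', 'all_data');                 % single aggregated file, written once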

For such long simulation runs, don't aggregate your data automatically unless you have a reliable framework. There are several reasons:
Out-of-memory exceptions or similar failures while writing can destroy all previous results; this is especially likely when writing large amounts of data.
Coding errors can destroy previous results. In case of a collision, your code will overwrite at least the most recently added data.
Undetected errors in MEX functions, which randomly hit the MATLAB address space instead of causing a segmentation fault, can make MATLAB write garbage to your mat-file and destroy previous results.
Use some unique naming pattern for each result file, e.g. PC name + current date/time.
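A minimal sketch of such a naming scheme; the HOSTNAME environment variable and the results folder are assumptions (a scheduler job ID would work just as well):
host  = getenv('HOSTNAME');                               % assumes the cluster sets HOSTNAME
stamp = datestr(now, 'yyyymmdd_HHMMSS_FFF');              % current date/time down to milliseconds
result_file = fullfile('results', sprintf('result_%s_%s.mat', host, stamp));
save(result_file, 'data');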

You would be best served by having a single recorder task that does the file output, and queuing the save information to that task.
Don't forget that the output "file" you supply to MATLAB only has to be file-like, i.e. it must support the necessary methods.
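One possible sketch of that recorder idea, assuming the experiments run as workers of a single parallel pool (rather than as independent scheduler jobs); run_experiment and n_experiments are hypothetical placeholders:
q = parallel.pool.DataQueue;
afterEach(q, @appendResult);              % appendResult runs only on the client
parfor k = 1:n_experiments
    data = run_experiment(k);             % hypothetical experiment function
    send(q, data);                        % hand the result to the single recorder
end
function appendResult(d)
% Client-side recorder: the only code that ever touches common_file.mat.
results = {};
if exist('common_file.mat', 'file')
    s = load('common_file.mat', 'results');
    results = s.results;
end
results{end+1} = d;
save('common_file.mat', 'results');
end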

Related

MATLAB: Does the execution of addpath/rmpath/savepath in one MATLAB instance affect other instances?

Motivation: Imagine that you are developing a MATLAB package, which provides a group of functions to the users. You have multiple versions of this package being developed on a single laptop. You would like to test these different versions in multiple instances of MATLAB:
You open one MATLAB window, type run_test(DIRECTORY_OF_PACKAGE_VERSION1), and hit enter;
While the first test is running, you open another MATLAB window, type run_test(DIRECTORY_OF_PACKAGE_VERSION2), and hit enter.
See the pseudo-code below for a better idea about the tests.
No code or data is shared between different tests --- except for those embedded in MATLAB, as the tests are running on the same laptop, using the same installation of MATLAB. Below is a piece of pseudo-code for such a scenario.
% MATLAB instance 1
run_test(DIRECTORY_OF_PACKAGE_VERSION1);
% MATLAB instance 2
run_test(DIRECTORY_OF_PACKAGE_VERSION2);
% Code for the tests
function run_test(package_directory)
setup_package(package_directory);
RUN EXPERIMENTS TO TEST THE FUNCTIONS PROVIDED BY THE PACKAGE;
uninstall_package(package_directory);
end
% This is the setup of the package that you are developing.
% It should be called as a black box in the tests.
function setup_package(package_directory)
addpath(PATH_TO_THE_FUNCTIONS_PROVIDED_BY_THE_PACKAGE);
% Make the package available in subsequent MATLAB sessions
savepath;
end
% The function that uninstalls the package: remove the paths
% added by `setup_package` and delete the files etc.
function uninstall_package(package_directory)
rmpath(PATH_TO_THE_FUNCTIONS_PROVIDED_BY_THE_PACKAGE);
savepath;
end
You want to make sure of the following:
The tests do not interfere with each other;
Each test calls functions from the correct version of the package.
Hence here come our questions.
Questions:
1. Does the execution of addpath, rmpath, and savepath in one MATLAB instance affect the other instance, sooner or later?
2. More generally, what kind of commands executed in one MATLAB instance can affect the other instance?
3. What if I am running only one instance of MATLAB, but invoke a parfor loop with two loops running in parallel? Does the execution of addpath/rmpath/savepath in one loop affect the other loop, sooner or later? In general, what kind of commands executed in one parallel loop can affect the other loop? (As pointed out by @Edric, this can be complicated, so let us not worry about it. Thank you, @Edric.)
Thank you very much for any comments and insights. It would be much appreciated if you could direct me to relevant sections in the official documentation of MATLAB --- I did some searching in the documentation, but have not found an answer to my question.
BTW, in case you find that the test described in the pseudo code is conducted in a wrong/bad manner, I will be very grateful if you could recommend a better way of doing it.
The documentation page for the MATLAB Search Path specifies at the bottom:
When you change the search path, MATLAB uses it in the current session, but does not update pathdef.m. To use the modified search path in the current and future sessions, save the changes using savepath or the Save button in the Set Path dialog box. This updates pathdef.m.
So, standard MATLAB sessions are "isolated" in terms of their MATLAB Search Path unless you use savepath. After a call to savepath, new MATLAB sessions will read the updated pathdef.m on startup.
The situation with a parallel pool is slightly more complex. There are a couple of things that affect this. First is the parameter AutoAddClientPath that you can specify for the parpool command. When true, an attempt is made to reflect the desktop MATLAB's path on the workers. (This might not work if the workers cannot access the same folders).
When a parallel pool is running, any changes to the path on the desktop MATLAB client are sent to the workers, so they can attempt to add or remove path entries. Parallel pool workers calling addpath or rmpath do so in isolation. (I'm afraid I can't find a documentation reference for this).
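Given that, one way to keep the two test sessions isolated (addressing the last request in the question) is simply to drop the savepath calls, so each session only modifies its own in-memory search path; a minimal sketch, where the "functions" subfolder name is an assumption:
function setup_package(package_directory)
addpath(fullfile(package_directory, 'functions'));   % affects only the current session
end
function uninstall_package(package_directory)
rmpath(fullfile(package_directory, 'functions'));
end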

Saving Matlab object instance results in an infinite loop

Setup:
I have created a Matlab handle class called "Participant" for reading and operating on certain research data. I have created multiple instances of this object and saved them to the hard disk with no problem. I have also checked my problematic instance to ensure that it is functional in Matlab. There do not seem to be any bugs in the instance.
The problem
However, for certain instances, for no reason clear to me, Matlab gets stuck in an infinite loop writing to disk. This is evident from the output .mat file's modification date, which keeps changing every minute, and from the fact that my Matlab instance slows down tremendously.
The code to create the participant is
myparticipant = participant([basedir ,p_folder{p_num}]);
Methods tried
I have tried saving to disk by right-clicking in the workspace, which results in the problem above.
Using the save function, I get:
save('test.mat', 'myparticipant')
Error using save
Error closing file test.mat.
The file may be corrupt.
and of course it won't load after.
Any insight would be appreciated as I'm not sure how to start approaching this issue.
Thanks to excaza's comment I was able to spot the issue. As I explained in my comment response, the problem was that, because I was using a handle class, the size of my data shown in working memory appeared much smaller than it really was. My data was actually bigger than 2 GB. In such cases, you have to use MATLAB's '-v7.3' flag when saving to file. Adding that flag did it for me.
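For reference, passing the flag looks like this (file and variable names taken from the question above):
save('test.mat', 'myparticipant', '-v7.3')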

Adding Multiple Values to a Variable in MATLAB

I have to work with a lot of data and run the same MATLAB program more than once, and every time the program is run it will store the data in the same preset variables. The problem is, every time the program is run the values are overwritten and replaced, most likely because all the variables are type double and are not a matrix. I know how to make a variable that can store multiple values in a program, but only when the program is run once.
This is the code I am able to provide:
volED = reconstructVolume(maskAlignedED1,maskAlignedED2,maskAlignedED3,res)
volMean = (volED1+volED2+volES3)/3
strokeVol = volED-volES
EF = strokeVol/volED*100
The program I am running depends on a ton more MATLAB files that I cannot provide at this moment, however I believe the double variables strokeVol and EF are created at this instant. How do I create a variable that will store multiple values and keep adding the values every time the program is run?
The reason your variables are "overwritten" with each run is that every function (or standalone program) has its own workspace where the local variables are located, and these local variables cease to exist when the function (or standalone program) returns/terminates. In order to preserve the value of a variable, you have to return it from your function. Since MATLAB passes its variables by value (rather than reference), you have to explicitly provide a vector (or more generally, an array) as input and output from your function if you want to have a cumulative set of data in your calling workspace. But it all depends on whether you have a function or a deployed program.
Assuming your program is a function
If your function is now declared as something like
function strokefraction(inputvars)
you can change its definition to
function [EFvec]=strokefraction(inputvars,EFvec)
%... code here ...
%volES initialized somewhere
volED = reconstructVolume(maskAlignedED1,maskAlignedED2,maskAlignedED3,res);
volMean = (volED1+volED2+volES3)/3;
strokeVol = volED-volES;
EF = strokeVol/volED*100;
EFvec = [EFvec; EF]; %add EF to output (column) vector
Note that it's legal to have the same name for an input and an output variable. Now, when you call your function (from MATLAB or from another function) each time, you add the vector to its call, like this:
EFvec=[]; %initialize with empty vector
for k=1:ndata %simulate several calls
inputvar=inputvarvector(k); %meaning that the input changes
EFvec=strokefraction(inputvar,EFvec);
end
and you will see that the size of EFvec grows from call to call, saving the output from each run. If you want to save several variables or arrays, do the same (for arrays, you can always introduce an input/output array with one more dimension for this purpose, but you probably have to use explicit indexing instead of just shoving the next EF value to the bottom of your vector).
Note that if your input/output array eventually grows large, then it will cost you a lot of time to keep allocating the necessary memory by small chunks. You could then choose to allocate the EFvec (or equivalent) array instead of initializing it to [], and introduce a counter variable telling you where to overwrite the next data points.
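A rough sketch of that preallocation variant, assuming the number of runs ndata is known in advance and strokefraction is changed to return just the scalar EF for one run:
EFvec = nan(ndata, 1);     % allocate once
count = 0;                 % counter telling you where to write next
for k = 1:ndata
    count = count + 1;
    EFvec(count) = strokefraction(inputvarvector(k));
end
EFvec = EFvec(1:count);    % trim any unused entries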
Disclaimer: what I said about the workspace of functions is only true for local variables. You could also define a global EFvec in your function and in your workspace, and then you don't have to pass it in and out of the function. As I haven't yet seen a problem which actually needed the use of global variables, I would avoid this option. Then you also have persistent variables, which are basically globals with their scope limited to their own workspace (run help global and help persistent in MATLAB if you'd like to know more; these help pages are surprisingly informative compared to usual help entries).
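For completeness, a tiny sketch of the persistent alternative mentioned above (record_ef is a hypothetical helper, not part of the original code):
function EFvec = record_ef(EF)
persistent history          % kept between calls, visible only inside this function
if isempty(history)
    history = [];
end
history = [history; EF];    % append this run's value
EFvec = history;
end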
Assuming your program is a standalone (deployed) program
While I don't have any experience with standalone MATLAB programs, it seems to me that it would be hard to do what you want for that. A MathWorks Support answer suggests that you can pass variables to standalone programs, but only as you would pass to a shell script. By this I mean that you have to pass filenames or explicit numbers (but this makes sense, as there is no MATLAB workspace in the first place). This implies that in order to keep a cumulative set of output from your program you would probably have to store those in a file. This might not be so painful: opening a file to append the next set of data is straightforward (I don't know about issues such as efficiency, and anyway this all depends on how much data and how many runs of your function we're talking about).
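As an illustration of that file-based approach, a minimal sketch using matfile to append each run's result to a single file; the file name results.mat and the variable name EFvec are assumptions:
m = matfile('results.mat', 'Writable', true);
if isempty(whos(m, 'EFvec'))
    m.EFvec = EF;                 % first run: create the variable
else
    m.EFvec = [m.EFvec; EF];      % later runs: load, append and rewrite the vector
end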

Loading Variables into Functions from Structs in Matlab

Say I have a project which is comprised of:
A main script that handles all of the running of my simulation
Several smaller functions
A couple of structs containing the data
Within the script I will be accessing the functions many times within for loops (some over a thousand times within the minute-long simulation). Each function also looks up data contained in the structs as part of its calculations; these are usually parameters that are fixed over the course of the simulation but need to be varied manually between runs to observe the effects.
As these functions typically form the bulk of the runtime, I'm trying to save time; my simulation can't quite run in real time as it stands (the ultimate goal), and I lose a lot of time passing variables/parameters around functions. So I've had three ideas to try to do this:
Load the structs in the main simulation, then pass each needed variable in turn to the function as part of a long argument list (the current solution).
Load the structs every time the function is called.
Define the structs as global variables.
In terms of the efficiency of the system (most relevant as the project develops), and also, since I'm no expert programmer, from a "good practice" perspective, what is the best solution for this? Is there another option that I have not considered?
As mentioned above in the comments, the 1st option is the best one.
Have you used the profiler to find out where your code takes most of its time?
profile on
% run your code
profile viewer
Note: if you are modifying your input struct in your child functions, this will take more time; but if you are just referencing it, that should not be a problem.
Matlab does what's known as a "lazy copy" when passing arguments between functions. This means that it passes a pointer to the data to the function, rather than creating a new instance of that data, which is very efficient memory- and speed-wise. However, if you make any alteration to that data inside the subroutine, then it has to make a new instance of that argument so as to not overwrite the argument's value in the main function. Your response to matlabgui indicates you're doing just that. So, the subroutine may be making an entire new struct every time it's called, even though it's only modifying a small part of that struct's values.
If your subroutine is altering a small part of the array, then your best bet is to just pass that small part to it, then assign your outputs. For instance,
modified_array = somesubroutine(myStruct.original_array);   % "myStruct" rather than "struct", which would shadow the built-in function
myStruct.original_array = modified_array;
You can also do this in just one line. Conceptually, the less data you pass to the subroutine, the smaller the memory footprint is. I'd also recommend reading up on in-place operations, as it relates to this.
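The one-line form mentioned above would simply be:
myStruct.original_array = somesubroutine(myStruct.original_array);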
Also, as a general rule, don't use global variables in Matlab. I have not personally experienced, nor read of, an instance in which they were genuinely faster.

Omnet++: getting the .ned file after the simulation has ended

When my simulation, using Omnet++, starts, there is a .ned file describing the initial scenario (in my case it shows a certain type of network configuration). During the simulation this scenario changes, sometimes very much. Is there a way to get a .ned file describing the final scenario after the simulation has ended? So that I can analyze it with a script...
Thanks
Not the NED file, because that describes the initial network structure in an abstract way (i.e. it can contain vectors, loops, conditional statements etc.)
What you need is a simple dump of all or selected objects. You should use the 'snapshot feature' for this. It produces a nicely formatted XML output. You can read more about it in the manual:
http://www.omnetpp.org/doc/omnetpp/manual/usman.html#sec299