Matlab: Error using parallel_function: Out of Memory [duplicate]

This question already has answers here:
Saving time and memory using parfor?
(2 answers)
Closed 7 years ago.
I am using Matlab R2011b on Windows 7 64-bit, Core i7 CPU with 8 GB RAM. I am running an approximate nearest neighbor algorithm called Locality Sensitive Hashing using matlabpool. Upon starting the Matlab pool, I get the output
Starting matlabpool using the 'local' configuration ... connected to 4 labs.
When control reaches the parfor loop, Matlab throws the error
Error using parallel_function (line 598)
Out of memory. Type HELP MEMORY for your options.
Error stack:
remoteParallelFunction.m at 29
Error in Evaluate (line 19)
parfor i=1:query_num
I have no clue how to solve this problem. Please help. Thank you

That is because parfor requires a lot more memory.
All the workers/labs in a parfor loop are independent, so each of them needs its own share of memory. There is also overhead because the pool must distribute data to the workers and collect results back from them.
Try using a regular for loop, or open a pool with 2 workers instead of 4.
It also depends on how well optimized the function you call inside the loop is: try to use as few (temporary) variables as possible.
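For example, with the pre-R2013b matlabpool interface used here, a minimal sketch would be (adjust the worker count to what your RAM allows):

if matlabpool('size') > 0
    matlabpool('close');            % close the existing 4-worker pool
end
matlabpool('open', 'local', 2);     % reopen with 2 workers so fewer copies of the data are held at once

% ... run the parfor loop here ...

matlabpool('close');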

Related

Matlab out of memory error while solving ODE

I have to integrate an ODE of 8 variables in Matlab. My simulation time is 5e9 with a time step of 0.1, but it shows a memory error. I am working with an i7 core, 2.6 GHz CPU with 8 GB RAM. How can I simulate ODEs for such a large number of time samples?
Assuming you're working on the 64-bit version of MATLAB, you might want to let MATLAB squeeze the memory to the edge using Preferences -> MATLAB -> Workspace -> MATLAB Array Size Limit.
If you are getting this error because you really have maxed out the memory in the system, do the following:
Make sure you're using a 64-bit OS and a 64-bit version of MATLAB.
Before you call the ODE function, manually clear (using the clear() function) variables you don't need any more, or can recreate once the function finishes (see the sketch below).
Increase the swap file of your system. It will help with larger memory consumption, but might make things much slower.
You can find more tips and tricks in Resolve "Out of Memory" Errors and memory().
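A minimal sketch of the clean-up step, assuming ode45 is the solver; the variable and function names below are placeholders, not from the original code:

% Free large variables that are no longer needed before calling the solver.
clear rawData intermediateResults        % placeholder names for large temporaries

tspan = [0 5e9];                         % integration interval
[t, y] = ode45(@myOdeFun, tspan, y0);    % myOdeFun and y0 stand in for the real system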

MATLAB: controlling for number core / threads

Suppose that I have a program that is to be run on a linux machine with 32 cores (64 threads), of which I'm only allowed to use 10 cores (20 threads). So I would like to specify this before I run the program.
I googled and found maxNumCompThreads but it doesn't appear to work when I test it on a machine with MATLAB 2016a, core i5 (2 cores, 4 threads). That is, I get the same output for feature('numCores') when I do any of the following
maxNumCompThreads(1)
maxNumCompThreads(2)
maxNumCompThreads(4)
maxNumCompThreads('Automatic')
Then I tried parpool (each time closing the current pool with delete(gcp('nocreate'))). I got an error when running parpool(4); I think I understand why: by default the local profile allows only as many workers as there are physical cores, and the test machine has just 2 physical cores even though hyper-threading is enabled. So I tested with parpool(1) and parpool(2). Again, the output of feature('numCores') did not change.
Question: so what is the right tool for the job in the situation described in the first paragraph above? And is feature('numCores') the right monitoring tool to check whether the specification has taken effect?
The same feature('numCores') output I keep referring to above is:
MATLAB detected: 2 physical cores.
MATLAB detected: 4 logical cores.
MATLAB was assigned: 4 logical cores by the OS.
MATLAB is using: 2 logical cores.
MATLAB is not using all logical cores because hyper-threading is enabled.
Edit: when I run parpool(10) on the Linux machine I get the following error:
Starting parallel pool (parpool) using the 'local' profile ... Error using parpool (line 103)
Couldn't interpret output from psname.sh: ""
Error in parpool_test_2016_10_03 (line 3)
parpool(10);
No, it's not the right monitoring tool. Look at feature('numthreads') instead:
>> feature('numcores')
MATLAB detected: 4 physical cores.
MATLAB detected: 8 logical cores.
MATLAB was assigned: 8 logical cores by the OS.
MATLAB is using: 4 logical cores.
MATLAB is not using all logical cores because hyper-threading is enabled.
ans =
4
>> feature('numthreads')
ans =
4
>> maxNumCompThreads(1)
ans =
4
>> feature('numcores')
MATLAB detected: 4 physical cores.
MATLAB detected: 8 logical cores.
MATLAB was assigned: 8 logical cores by the OS.
MATLAB is using: 4 logical cores.
MATLAB is not using all logical cores because hyper-threading is enabled.
ans =
4
>> feature('numthreads')
ans =
1
In general, be careful using feature as it's undocumented and prone to change without warning. Take a look at this post and this StackOverflow question for more information about feature.
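To address the original question (limit MATLAB to 10 cores / 20 threads on the shared Linux machine), one hedged sketch is to cap both the implicit multithreading and the explicit pool; the numbers below are simply the allowance from the question:

prevThreads = maxNumCompThreads(10);   % cap built-in multithreaded functions at 10 threads
feature('numthreads')                  % should now report 10 (undocumented check, see above)

c = parcluster('local');               % cap explicit parallelism as well
c.NumWorkers = 10;
p = parpool(c, 10);                    % 10 workers, one per allowed core

% ... parallel work here ...

delete(p);
maxNumCompThreads(prevThreads);        % restore the previous thread limit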

lsqcurvefit fails depending on platform

I am trying to process an extremely large dataset which requires that I do several million non-linear curve fits. I have acquired a dedicated piece of code that is designed to be used for the data I have collected, which at its heart uses the MATLAB function lsqcurvefit. All works well when I run it on my laptop, except that the fitting is too slow to be useful to me right now, which is not too surprising considering that the model function is quite complicated. To put this in perspective, my laptop can only process about 8000 fits per hour, and I have on the order of tens of millions of fits to do.
Fortunately I have access to a computing cluster at my institution, which should enable me to process this data in a more reasonable time frame. The issue that has arisen is that - despite being cross-platform - there seems to be some significant difference between what the MATLAB code is doing on my Windows laptop and the cluster. Despite running the exact same code, on exactly the same data, with the same version of MATLAB, the code running on the Unix cluster fails with the following error message:
Error using eig
Input to EIG must not contain NaN or Inf.
Error in trust (line 29)
[V,D] = eig(H);
Error in trdog (line 109)
[st,qpval,po,fcnt,lambda] = trust(rhs,MM,delta);
Error in snls (line 311)
[sx,snod,qp,posdef,pcgit,Z] = trdog(x,g,A,D,delta,dv,...
Error in lsqncommon (line 156)
snls(funfcn,xC,lb,ub,flags.verbosity,options,defaultopt,initVals.F,initVals.J,caller,
...
Error in lsqcurvefit (line 254)
lsqncommon(funfcn,xCurrent,lb,ub,options,defaultopt,caller,...
I can confirm that there are no infinities or NaNs in my data, which this error message might initially seem to suggest. I can only conclude that using a different platform leads to some differing accuracy in execution, which probably leads to a divide by zero error somewhere along the way. My question is - how can I make this code run on the cluster?
For reference, my laptop is running Windows 7 Professional 64-bit, with an Intel i5 5200U 2.20GHz x4, and the cluster runs Scientific Linux 6.7 x86_64, with various Intel Xeon processors, with both running MATLAB R2015b.
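One diagnostic worth trying (a sketch only, not a guaranteed fix): wrap the model function so that non-finite outputs are reported before the trust-region code passes them to eig. The names my_model, params and xdata below are placeholders for the actual code:

function F = safe_model(params, xdata)
% Wrapper around the real model function ("my_model" is a placeholder) that
% flags NaN/Inf outputs, which would otherwise surface later as the eig error.
F = my_model(params, xdata);
bad = ~isfinite(F);
if any(bad(:))
    warning('safe_model:nonfinite', ...
        'Non-finite model output for parameters %s', mat2str(params));
    F(bad) = realmax;   % crude clamp so the solver can keep going; adjust as needed
end
end

Then call lsqcurvefit(@safe_model, p0, xdata, ydata, lb, ub) as before; if the warning fires on the cluster but not on the laptop, the platform difference is in the model evaluation rather than in lsqcurvefit itself. Switching the algorithm via optimoptions('lsqcurvefit', 'Algorithm', 'levenberg-marquardt') might also sidestep the trust-region eig call, though that algorithm does not accept bound constraints.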

Not enough RAM for parfor [duplicate]

This question already has answers here:
Saving time and memory using parfor?
(2 answers)
Closed 6 years ago.
If parfor reckons that the computer will not have enough RAM to run the code in parallel, will it automatically serialize it? That definitely seems to be the case.
I have two parfor loops that are identical except for the size of the matrices inside them. The first easily reaches 100% CPU and half my RAM; the second reaches only 12-20% CPU and all my RAM.
I have addressed the same issue in this question here.
In short, since each worker in the Matlab pool is independent of the others, each worker needs its own share of memory.
And no, Matlab does not automatically serialise your loop if it goes out of memory. If Matlab throws a proper error (as far as I know it does on Windows PCs), you can use some sort of try-catch statement. The try-catch simply tries to execute the code in the try branch and, if an error occurs, automatically executes the catch block. In your case it'll be something like
try
% parfor here
catch
% standard for here
end
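A slightly more concrete version of that skeleton (query_num is borrowed from the first question above; process_query() is a hypothetical stand-in for the loop body):

results = zeros(query_num, 1);
try
    parfor i = 1:query_num
        results(i) = process_query(i);     % hypothetical per-iteration work
    end
catch err
    warning('parfor failed (%s); falling back to a serial loop.', err.message);
    for i = 1:query_num
        results(i) = process_query(i);
    end
end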

Matlab - Out Of Memory Error

I have a problem which occurs when I run the command for the RBF (radial basis function) neural network
net = newrb(T, D);
I get the error
??? Error using ==> unknown
Out of memory. Type HELP MEMORY for your options.
Error in ==> dist>apply at 119
z = zeros(S,Q);
Error in ==> boiler_weight at 38
result = apply(a,b,c);
Error in ==> dist at 90
boiler_weight
Error in ==> newrb>designrb at 143
P = radbas(dist(p',p)*b);
Error in ==> newrb at 127
[w1,b1,w2,b2,tr] = designrb(p,t,goal,spread,mn,df);
I'm working with 2 GB RAM
Virtual Memory Initial size 4 GB & Maximum size 8 GB
I tried
Maximizing the virtual memory
Under Windows XP x32, I managed to almost double the amount of memory available to Matlab by editing boot.ini to add the switch /3GB /USERVA=3030
/fastdetect /3GB /USERVA=3030
pack (for memory defragmentation)
but all this was to no avail.
Any help please?
Thanks in advance.
I don't have a fix, but here are some debugging techniques for OOMs in Matlab that seem germane.
Pack doesn't work nearly as well as its documentation says it does. If memory is fragmented at a low level (not uncommon), you must restart Matlab to fix it. "Memory" and "feature memstats" will give some indication of low-level fragmentation. Try restarting and running from a fresh Matlab session to see whether it's fragmentation, or really peak memory usage.
Try a "dbstop if all error" so you break in to the debugger when you run out of memory. Then you can examine the stack frames with dbup and dbdown to see what's holding down memory, and see if there are any surprisingly large arrays. OOMs are sometimes from miscomputed indexes or array sizes that end up allocating extra-big arrays.
The undocumented "profile on -memory" option can tell you about memory usage during execution, which may help.
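A short sketch of that, using the undocumented (and therefore possibly release-dependent) switch named above:

profile on -memory     % undocumented memory profiling switch
net = newrb(T, D);     % run the failing call, or a smaller version of it
profile viewer         % open the profiler report; with -memory on it should include memory figures
profile off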
And your data set might just be too big. See if you can break it into smaller parts and loop over them, reducing peak memory requirements.
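A generic chunking sketch (load_chunk and process_chunk are hypothetical helpers, and the sizes are made up):

chunkSize = 1e4;
nTotal    = 1e6;
for start = 1:chunkSize:nTotal
    idx   = start:min(start + chunkSize - 1, nTotal);
    chunk = load_chunk(idx);          % read only this slice of the data set
    out   = process_chunk(chunk);     % do the heavy work on the slice
    save(sprintf('result_%07d.mat', start), 'out');
    clear chunk out                   % free memory before the next slice
end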
Good luck.
Maybe one of the solutions offered by The MathWorks solves your issue:
http://www.mathworks.com/support/tech-notes/1100/1107.html