MATLAB out of memory error while solving ODE

I have to integrate an ODE with 8 variables in MATLAB. My simulation time is 5e9 with a time step of 0.1, but MATLAB throws an out-of-memory error. I am working with an Intel Core i7, 2.6 GHz CPU and 8 GB of RAM. How can I simulate an ODE over such a large number of time samples?

Assuming you're working on a 64-bit version of MATLAB, you might want to let MATLAB use memory right up to the edge via Preferences -> MATLAB -> Workspace -> MATLAB Array Size Limit.
If you are getting this error because you have really maxed out the memory in your system, do the following:
Make sure you're using a 64-bit OS and a 64-bit version of MATLAB.
Before you call the ODE function, manually clear (using the clear() function) any variables you no longer need or can recreate once the function finishes (see the sketch after this list).
Increase the swap file of your system. It will help with larger memory consumption but might make things much slower.
You can find more tips and tricks in Resolve "Out of Memory" Errors and memory().
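For scale: 5e9 time units at a step of 0.1 is 5e10 samples, and storing 8 doubles per sample would need roughly 3.2 TB, so no setting will let MATLAB hold every step. Below is a minimal sketch of the clearing step above, combined with asking the solver for output only at a coarse grid of times (the solver still chooses its internal steps for accuracy); myODE, y0, and the cleared variable names are placeholders:

% free large intermediates before calling the solver (placeholder names)
clear bigIntermediate otherTemp

% request output at 1e6 evenly spaced times instead of every internal step
tspan = linspace(0, 5e9, 1e6);
[t, y] = ode45(@myODE, tspan, y0);   % y is 1e6-by-8 doubles, ~64 MB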

Related

How to store giant matrix in matlab

I'm using a matlab script to create and store a large matrix of floating point numbers. When I tried running this program on my personal laptop, the program ended hours later with the message 'out of memory'. Supposedly, Matlab has a limit for the maximum-sized array it can store, which makes sense.
My question is: how to store large matrix in matlab? Specifically, I'm using a 64-bit linux OS, and I need to store a 5-6 GB matrix.
I am not an expert in this, but as I understand it the simplest solution would be to get more RAM. However, you could try to check the available memory at the time of the error with
dbstop if error
memory
This should tell you how much memory is available for MATLAB, how much is currently used, and how large your biggest array can be. If you exceed this, I don't think there is a software solution besides storing the data in multiple smaller pieces (a sketch using matfile follows this answer).
If you get the "Out of memory: Java Heap Space" error, you can increase the memory available for Java under (Home -> Preferences -> General -> Java Heap Memory).
Also check whether your array size is limited to a certain percentage of your available memory under (Home -> Preferences -> Workspace -> MATLAB array size limit) and set it to 100%.
Similar question in Matlab forum
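As a sketch of the "multiple smaller pieces" idea mentioned above: matfile (available since R2011b) writes to a Version 7.3 MAT-file block by block, so the full 5-6 GB matrix never has to sit in RAM at once. The filename, sizes, and block width here are illustrative:

m = matfile('bigmatrix.mat', 'Writable', true);   % illustrative filename
nrows = 1e5; ncols = 7500; blk = 100;             % 1e5 x 7500 doubles ~ 6 GB
m.M(nrows, ncols) = 0;                            % allocate the full variable on disk
for j = 1:ncols/blk
    cols = (j-1)*blk + (1:blk);
    m.M(1:nrows, cols) = rand(nrows, blk);        % only ~80 MB in RAM per block
end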

MATLAB - codeHints taking 99.9% of runtime (R2013a)

I'm not sure what is going on. I am running my neural network simulations on my laptop, which has MATLAB R2013a on it.
The code runs fast on my desktop (R2012a though), but very, very slowly on the laptop. I ran it through the profiler because this seemed abnormal; here are the screenshots I took of the functions spending the most time doing something:
This is located in the codeHints.m file, so it isn't something I wrote. Is there any way I can disable this? I googled it but maybe I am not searching for the right things... I couldn't find anything. I can't get any work done because it is so slow :(
Would appreciate some advice!
Update: I have also attempted to run it on my desktop at work (same MATLAB version as laptop, also 8GB of RAM), and I get the same issue. I checked the resource monitor and it seems like the process is triggering a lot of memory faults (~40/sec), even though not even half of my RAM is being used.
I typed in "memory" in MATLAB and got the following information:
Maximum possible array: 11980 MB (1.256e+10 bytes) *
Memory available for all arrays: 11980 MB (1.256e+10 bytes) *
Memory used by MATLAB: 844 MB (8.849e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
So it seems like there should be sufficient room. I will try to put together a sample file.
Update #2: I ran my code on 2012a on the work computer with the following "memory" info:
Maximum possible array: 10872 MB (1.140e+10 bytes) *
Memory available for all arrays: 10872 MB (1.140e+10 bytes) *
Memory used by MATLAB: 846 MB (8.874e+08 bytes)
Physical Memory (RAM): 8098 MB (8.491e+09 bytes)
The run with more iterations than above (15000 as opposed to 10000) completed much faster and there are no extraneous calls for memory allocation:
So it seems to me that it is an issue exclusively with 2013a. For now I will use 2012a (because I need this finished), but if anyone has ideas on what to do with 2013a to stop those calls to codeHints, I would appreciate it.
Though this would scream memory problems at first sight, it seems like your tests have made a lack of memory improbable. In that case the only reasonable explanation I can think of is that the computer is actually trying to do two different things, thus taking more time.
Some possibilities:
Actually not using exactly the same inputs
Actually not using exactly the same functions
The first point can be detected by putting some breakpoints in the code while running it on the two computers and verifying that the inputs are exactly the same. (Consider using visdiff if you have a lot of variables.)
The second one could almost only be caused by an overloaded zeros. Make sure to stop at that line and see which function is actually being called.
If both these points don't solve the problem, try reducing the code as much as possible until you have only one or a few lines that create the difference. If it turns out that the difference comes from just that one line, try using the zeros function with the right size input on both computers and time the result with the timeit File Exchange submission (see the sketch below).
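A minimal version of both checks (the 5000-by-5000 size is illustrative; timeit was a File Exchange submission in the R2013a era and became built in from R2013b):

which zeros -all        % more than one entry means zeros is overloaded

f = @() zeros(5000);    % the suspect call, timed in isolation on both machines
timeit(f)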
If you find that you are using the built-in function on both computers, with plenty of memory, and there is still a huge performance difference, it is probably time to contact MathWorks support and hear what they have to say about it.

Very slow execution of Matlab code under ubuntu

I was using MATLAB 2012a under Windows 7 and executing some intense code, and by intense I mean intense in terms of memory usage and processing time; however, the code was working fine on Windows. Now I have changed my OS to Ubuntu 12.04 and installed MATLAB 2013a. The amount of memory used is considerably less than it was on Windows, but the time MATLAB takes to execute the same code is extremely high, really high.
I should mention that my code contains nothing that should take such a huge time except a sparse statement with a symbolic substitution as one of its arguments, as follows:
K = zeros(Np, Np);
for i = 1:ord
    K = K + sparse(t(1:ord,:), repmat(t(i,:), ord, 1), double(subs(Kv(:,i), Arg(Kv,1,1,6), Arg(Kv,1,2,6))), Np, Np);
end
Note that Kv is a symbolic matrix and Arg is a function providing OLD and NEW values, and it depends on a number of global variables.
I have the feeling that I failed to add something to Ubuntu that might help accelerate the execution of MATLAB code.
Any ideas?
I had a similar problem on Windows, but I believe the solution is the same on Ubuntu LTS.
If you increase MATLAB's Java heap memory, MATLAB will consume more memory from your system but it will be faster.
To do that go to:
File -> Preferences -> General -> Java Heap Memory and increase it to the maximum.
The default value is 128 MB, which is too little.
If the heap memory limit doesn't fix the issue, then try raising the priority of the MATLAB process.
First start matlab, then do
ps aux|grep MATLAB
In my case the result is:
comtom 9769 28.2 19.8 4360632 761808 tty2 S<l+ 14:00 1:50 /usr/local/MATLAB/MATLAB_Production_Server/R2015a/bin/glnxa64/MATLAB -desktop
Look at the first number (the PID). Then use it with the renice command to change the process priority:
renice -3 -p 9769
That's it. The GUI is very slow because it's built against outdated Xorg libs, so changing the priority helps. You may notice some tearing in GNOME effects, but MATLAB's interface will work a lot better.

Matlab Preallocation

I'm running a simulation of a diffusion-reaction equation in MATLAB, and I pre-allocate the memory for all of my vectors beforehand; however, during the loop, in which I solve a system of equations using bicg, the amount of memory that MATLAB uses keeps increasing.
For example:
concentration = zeros(N, iterations + 1);   % one extra column: t+1 is written below
for t = 1:iterations
    concentration(:, t+1) = bicg(matrix, concentration(:, t));
end
As the program runs, the amount of memory MATLAB is using increases, which seems to suggest that the matrix, concentration, is increasing in size as the program continues, even though I pre-allocated the space. Is this because the elements in the matrix are becoming doubles instead of zeros? Is there a better way to pre-allocate the memory for this matrix, so that all of the memory the program requires will be pre-allocated at the start? It would be easier for me that way, because then I would know from the start how much memory the program will require and if the simulation will crash the computer or not.
Thanks for all your help, guys. I did some searching around and didn't find an answer, so I hope I'm not repeating a question.
EDIT:
Thanks Amro and stardt for your help guys. I tried running 'memory' in MATLAB, but the interpreter said that command is not supported for my system type. I re-ran the simulation though with 'whos concentration' displayed every 10 iterations, and the allocation size of the matrix wasn't changing with time. However, I did notice that the size of the matrix was about 1.5 GB. Even though that was the case, system monitor was only showing MATLAB as using 300 MB (but it increased steadily to reach a little over 1 GB by the end of the simulation). So I'm guessing that MATLAB pre-allocated the memory just fine and there are no memory leaks, but system monitor doesn't count the memory as in use until MATLAB starts writing values to it in the loop. I don't know why that would be, as I would imagine that writing zeros would trigger the system monitor to see that memory as 'in use,' but I guess that's not the case here.
Anyway, I appreciate your help with this. I would vote both of your answers up as I found them both helpful, but I don't have enough reputation points to do that. Thanks guys!
I really doubt it's a memory leak, since most "objects" in MATLAB clean up after themselves once they go out of scope. AFAIK, MATLAB does not use a GC per se, but a deterministic approach to managing memory.
Therefore I suspect the issue is more likely caused by memory fragmentation: when MATLAB allocates memory for a matrix, the memory has to be contiguous, so when a function is repeatedly called, creating and deleting matrices, the fragmentation becomes a noticeable problem over time...
One thing that might help you debug is the undocumented profile on -memory option, which tracks allocations in the MATLAB profiler (sketched below). Check out the monitoring tool by Joe Conti as well. Also this page has some useful information.
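A sketch of that profiler run; run_simulation is a placeholder for the code under investigation:

profile on -memory      % undocumented flag: record memory allocation per line
run_simulation();       % placeholder entry point
profile viewer          % the report now includes memory columns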
I am assuming that you are watching the memory usage of MATLAB in, for example, the Task Manager on Windows. The memory usage is probably increasing due to the execution of bicg() and variables that have not been garbage collected after it ends. The memory allocated to the concentration matrix stays the same. You can type
whos concentration
before and after your "for" loop to see how much memory is allocated to that variable.
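Concretely, with the loop from the question (the Bytes column reported by whos is the figure to compare):

whos concentration                                   % Bytes before the loop
for t = 1:iterations
    concentration(:, t+1) = bicg(matrix, concentration(:, t));
end
whos concentration                                   % Bytes should be unchanged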

MATLAB "out of memory" error

When I run a sample script in MATLAB, it says:
Out of memory. Type HELP MEMORY for your options.
When I type "memory", it reports:
Maximum possible array: 156 MB (1.638e+008 bytes) *
Memory available for all arrays: 740 MB (7.756e+008 bytes) **
Memory used by MATLAB: 1054 MB (1.105e+009 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
Is there any way to get around this error? I'm using Windows XP x32 with MATLAB 2009a.
pack performs memory defragmentation. It might help you a bit with the contiguous memory available.
Remember, when MATLAB says it's out of memory, it means it's out of contiguous memory, so rebooting or restarting MATLAB may work.
But I'd recommend optimizing your code and identifying how you're eating up so much memory. It could be an ill-designed recursive loop or bad indexing (using doubles instead of logicals to index a huge matrix; see the sketch after this answer).
I practically lived with memory errors for a while since I was dealing with huge datasets, but there's always a workaround, ask specific questions and you'll be surprised.
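To illustrate the indexing point: find returns double subscripts at 8 bytes per element, while a logical mask costs 1 byte per element, so on big arrays the index itself can push you over the limit. A sketch with an illustrative size:

x = rand(1e7, 1);       % illustrative data, ~80 MB of doubles
mask = x > 0.5;         % logical mask, ~10 MB (1 byte per element)
y = x(mask);            % indexes directly with the mask
% versus x(find(x > 0.5)): the double index alone adds ~40 MB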
Problem fixed.
Under Windows XP x32, I managed to almost double the amount of memory available to MATLAB by editing boot.ini to add the switches /3GB /USERVA=3030:
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /3GB /USERVA=3030
Together with reducing our array sizes, this completely fixed the problem :)
I could also have fixed the problem by upgrading to Windows x64 or Windows 7 x64. That likewise doubles the amount of memory available to MATLAB, even if you stick with 32-bit MATLAB and don't upgrade to 64-bit MATLAB. Windows x64 is just far more memory efficient, even on systems with only 4 GB of physical RAM installed.
Try this, it works well for me.
Go to Home -> Preferences -> General -> Java Heap Memory and allocate as much memory as you want.
In the Preferences window, go to "Workspace" (outside the Java heap memory level) and find "MATLAB array size limit".
Make sure 'Limit the maximum array size to a percentage of RAM' is unchecked; you want to extend the memory available, so this limit is not needed.
Done.
What are you attempting to allocate when it runs out of memory (OOM)? Do you have code to reproduce? A wide range of problems can cause out of memory errors.
To diagnose, use "dbstop if all error" to set a breakpoint on errors. The out-of-memory error will trigger this, and you can use dbup, dbdown, and whos() to see what's consuming memory (sketched after the example below). Often an OOM is caused by a bad array size or index calculation, not just by big data structures. E.g. this will trigger an OOM in pretty much any 32-bit MATLAB:
>> x = 1;
>> x(2^30) = 2
??? Out of memory. Type HELP MEMORY for your options.
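A sketch of that inspection workflow once the breakpoint fires:

dbstop if all error     % break at the point of any error, including OOM
% ... run the failing script; when execution stops at the error:
whos                    % variable names, sizes, and bytes in this workspace
dbup                    % move up one stack frame to inspect the caller
whos
dbquit                  % leave debug mode when done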
I faced a similar error while running an (old) C file in MATLAB using mex.
I found my solution at this issue on GitLab.
First, uncheck the option "Limit the maximum Array Size to a % of RAM" located under Preferences -> Workspace, as also indicated in this earlier answer.
Once applied, run your C file in the command window using
mex filename.c -compatibleArrayDims