Is the memory limit affected when running multiple instances of Matlab?

I can run multiple instances of Matlab by simply opening the program multiple times. An instance of Matlab has a memory limit.
If I open two Matlab programs on my computer, will this limit be affected and how? E.g. will it be split in two?

As far as I can tell, the memory limit per instance is calculated dynamically from the RAM that is actually available. For instance, we run one instance per thread on a 12-thread CPU (with 64 GB of RAM) and have never had problems with insufficient memory.
I did a simple test:
Run the first MATLAB instance and use the memory command to get memory information:
memory
Maximum possible array: 7651 MB (8.023e+09 bytes)
Memory available for all arrays: 7651 MB (8.023e+09 bytes)
Memory used by MATLAB: 2268 MB (2.378e+09 bytes)
Physical Memory (RAM): 16263 MB (1.705e+10 bytes)
Opening a second instance and running the memory command in both shows that the available memory has decreased in the first instance and is nearly the same in both instances.
Opening some other programs that consume memory and running the memory command in both instances again shows that the available memory decreases.
Creating some huge variables in one or both instances also reduces the available memory reported in each instance.
I hope this answer helps, although it is more experimental.
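For reference, here is a hedged way to pull the same numbers programmatically in each instance (the memory function with output arguments is only available on Windows), so the comparison does not depend on reading the printed report:
[userview, systemview] = memory;
fprintf('Max possible array:        %.0f MB\n', userview.MaxPossibleArrayBytes / 2^20);
fprintf('Available for all arrays:  %.0f MB\n', userview.MemAvailableAllArrays / 2^20);
fprintf('Memory used by MATLAB:     %.0f MB\n', userview.MemUsedMATLAB / 2^20);
fprintf('Physical memory available: %.0f MB\n', systemview.PhysicalMemory.Available / 2^20);
Running this in both instances before and after opening the second one should show the "available" figures shrinking together, as described above.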

Related

How can I use only part of my total RAM during MATLAB computation?

I would like to dedicate 8 GB of RAM instead of the full 12 GB to a very long computation, in order to keep the remainder free for another operation. Is it possible?
Is there maybe a MATLAB command that forces the maximum limit of memory usage?
I would like to work with 2 separate editors.
See here for a possible solution on "limit the memory of a process on windows":
Set Windows process (or user) memory limit
MATLAB has no command to limit its memory usage; it will acquire as much memory as it needs for the computation. On some operating systems you can limit the memory usage externally, for example using ulimit on Linux. But be aware that when MATLAB needs more than 8 GB it will not just slow down when it reaches the limit, it will throw an error and stop computing.
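As a hedged illustration of that behaviour (not an official recommendation): an allocation that exceeds whatever limit is in force simply raises an error, which you can catch if you want the rest of your script to continue.
try
    A = zeros(1e5, 1e5);   % ~80 GB request; expected to fail on most machines
catch err
    fprintf('Allocation failed: %s\n', err.identifier);
    % typically 'MATLAB:nomem', or 'MATLAB:array:SizeLimitExceeded' when the
    % array size limit preference is enabled
end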

Matlab: use of memory

I have several "out of memory" problems using MATLAB. I don't understand whether MATLAB can use all of the RAM in my computer. This is the problem: my computer has 4 GB of RAM and 2 GB of swap (my OS is Linux/Ubuntu 12.10), but MATLAB only uses up to 2.6 GB and then shows the warning: "out of memory".
Is it possible to fix this and allow Matlab to use all the "available" memory?
Thanks.
It sounds like you're running 32-bit Linux and/or 32-bit MATLAB.
If you allow for enough swap, a process can use up to its entire virtual address space worth of memory.
Generally, on 32-bit Linux you're limited to 3 GB of address space for any process (the last GB is kernel address space). It's entirely possible, depending on usage patterns, that at 2.6 GB the next request for memory can't be satisfied because there isn't enough contiguous memory to fill it. This is especially common when growing large arrays.
Upgrading to a 64-bit version of Linux/Windows/macOS with 64-bit MATLAB should solve this problem, but even so, using 3 GB+ of virtual address space on a system with 4 GB of RAM is probably going to make things very slow.
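A small illustration of the "growing large arrays" point above, as a sketch rather than a benchmark: growing an array element by element forces repeated reallocations into ever larger contiguous blocks, while preallocating asks for one contiguous block up front.
n = 1e5;
grown = [];
for k = 1:n
    grown(end+1) = k;      % repeated reallocation; each growth may need a new, larger contiguous block
end
prealloc = zeros(1, n);    % one contiguous allocation up front
for k = 1:n
    prealloc(k) = k;
end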
Some googling brought up this:
MATLAB will use as much memory as your OS lets it use; the only way to increase the amount of memory MATLAB can use is to reduce the amount of memory occupied by other applications or to give the OS more memory to partition out to the applications.
So no, there's no easy way to tell MATLAB to use more memory.
You either have to buy more memory, optimize your code, run your scripts/functions so that less output is stored at once, or reduce the memory used by other running processes.
Here are some helpful links though:
memory management functions
memory allocation
related discussion on the mathworks forum
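A few hedged one-liners in the spirit of those links (the variable name bigTemp is just a placeholder):
whos                           % list workspace variables and how much memory each one uses
clear bigTemp                  % free a variable you no longer need
x = zeros(1e6, 1, 'single');   % single precision halves the storage of the default double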

Matlab: your opinion about a small memory issue working with matrix

I have a small question regarding MATLAB memory consumption.
My Architecture:
- Linux OpenSuse 12.3 64bit
- 16 GB of RAM
- Matlab 2013a 64 bit
I handle a matrix of double with size: 62 x 11969100 (called y)
When I try the following:
a = bsxfun(@minus, y, -1)
or simply
a = minus(y, -1)
I get an OUT OF MEMORY error (in both cases).
I've just computed the ram space allocated for the matrix:
62 x 11969100 x 8 = 5.53 GB
Where am I wrong?!
Thanks a lot!
I'm running on Win64, with 16GB RAM.
Starting with a fresh MATLAB, with only a couple of other inconsequential applications open, my baseline memory usage is about 3.8GB. When I create y, that increases to 9.3GB (9.3-3.8 = 5.5GB, about what you calculate). When I then run a = minus(y, -1), I don't run out of memory, but it goes up to about 14.4GB.
It wouldn't take much additional memory being used elsewhere (1.6 GB at most) for that to cause an out of memory error.
In addition, when MATLAB stores an array, it requires a contiguous block of memory to do so. If your memory was a little fragmented - perhaps you had a couple of other tiny variables that happened to be stored right in the middle of one of those 5.5GB blocks - you would also get an out of memory error (you can sometimes avoid that issue with the use of pack).
The output of memory on windows platform:
>> memory
Maximum possible array: 2046 MB (2.145e+009 bytes) *
Memory available for all arrays: 3226 MB (3.382e+009 bytes) **
Memory used by MATLAB: 598 MB (6.272e+008 bytes)
Physical Memory (RAM): 3561 MB (3.734e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
The output of computer on linux/mac:
>> [~,maxSize] = computer
maxSize =
2.814749767106550e+14 % Max. number of elements in a single array
with some hacks (found here):
>> java.lang.Runtime.getRuntime.maxMemory
ans =
188416000
>> java.lang.Runtime.getRuntime.totalMemory
ans =
65011712
>> java.lang.Runtime.getRuntime.freeMemory
ans =
57532968
As you can see, aside from the memory limitation per variable, there is also a limitation on the total storage for all variables. This is no different between Windows and Linux.
The important thing to note is that for example on my Windows machine, it is impossible to create two 1.7GB variables, even though I have enough RAM, and neither is limited by maximum variable size.
Since carrying out the minus operation will assign a result of equal size to a new variable (a in your case, or ans when not assigning to anything), there need to be at least two of these humongous things in memory.
My guess is you run into the second limit of total memory space available for all variables.
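A quick back-of-the-envelope check of that guess, using the same numbers as in the question:
bytesPerDouble = 8;
yBytes = 62 * 11969100 * bytesPerDouble;   % ~5.5 GB for y itself
needed = 2 * yBytes;                       % y plus the equally sized result a
fprintf('Roughly %.1f GB must be free to hold y and the result.\n', needed / 2^30);
That is about 11 GB, which is close to what a 16 GB machine can spare once the OS and other processes have taken their share.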
bsxfun is vectorized for efficiency, and vectorized solutions typically require more than the minimal amount of memory.
You could try using repmat, or if that does not work, a simple for loop.
In general the for loop will require the least memory, as in the sketch below.
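A minimal sketch of the low-memory route, assuming you can overwrite y in place instead of keeping both y and a; working in column blocks keeps the temporary copy small:
blockCols = 1e6;                            % columns per block, tune to the memory you have
for c = 1:blockCols:size(y, 2)
    idx = c:min(c + blockCols - 1, size(y, 2));
    y(:, idx) = y(:, idx) + 1;              % same result as minus(y, -1), written back in place
end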

How can I resolve out of memory error in MATLAB?

I want to calculate 2 covariance matrices of size (10304,1034), and MATLAB creates the first one, but when it runs the second command this error occurs:
>> j=ones(10000,10000);
>> jj=ones(10000,10000);
??? Out of memory. Type HELP MEMORY for your options.
My laptop has 2 GB of RAM, of which 1 GB is still free. I am using Windows 7 and 32-bit MATLAB 2009b.
How can I resolve this error?
A 10k-by-10k array of doubles uses 1e8*8 bytes, which corresponds to 800MB. MATLAB needs these 800MB to be contiguous. Most likely, your 1GB free memory is a little fragmented, so MATLAB cannot fit the new array into RAM.
Use the command MEMORY to find out the maximum variable size that MATLAB can handle at a given moment.
Try to use sparse matrices; in that case MATLAB does not allocate storage for every element.
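Keep in mind that sparse storage only pays off when most entries are zero, which is not the case for a matrix of all ones; a hedged illustration:
S = speye(10000);   % sparse 10000-by-10000 identity: only ~10000 nonzeros are stored
whos S              % a few hundred kilobytes instead of the 800 MB of the full array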
Either of these two options may slightly increase the memory available to the MATLAB.exe process:
1- Give higher priority to the MATLAB.exe task. You can do that by going to Task Manager, Processes tab, right-clicking the MATLAB.exe task, selecting Priority and setting it to a higher priority (say, Real Time); this tells Windows to allocate more resources to this process.
2- Increase the page file size for your applications in general. You can do this by right-clicking My Computer -> Properties -> Advanced System Settings -> Advanced -> Performance -> Virtual Memory (Change...). Then remove the tick from "Automatically manage paging file size" and set the initial and maximum page file size to, say, 10000 MB.
Go to MATLAB -> File -> Preferences -> General -> Java Heap Memory and increase the level. This solved my problem.

MATLAB "out of memory" error

When I run a sample script in MATLAB, it says:
Out of memory. Type HELP MEMORY for your options.
When I type "memory", it reports:
Maximum possible array: 156 MB (1.638e+008 bytes) *
Memory available for all arrays: 740 MB (7.756e+008 bytes) **
Memory used by MATLAB: 1054 MB (1.105e+009 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
Is there any way to get around this error? I'm using Windows XP x32 with MATLAB 2009a.
pack performs a memory defragmentation. It might help you a bit as far as contiguous memory is concerned.
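If you want to try it, a hedged sketch (pack can only be run from the command line, and it works by temporarily saving workspace variables to disk and reloading them into consolidated memory):
pack       % consolidate the base workspace
memory     % on Windows, re-check whether "Maximum possible array" has grown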
Remember, when MATLAB says it's out of memory, it means it's out of contiguous memory, so rebooting or restarting MATLAB may work.
But, I'd recommend optimizing your code and identifying how you're eating up so much memory. It could be an ill-designed recursive loop, or a bad indexing function (using doubles instead of logicals to index a huge matrix).
I practically lived with memory errors for a while since I was dealing with huge datasets, but there's always a workaround, ask specific questions and you'll be surprised.
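As a hedged example of the indexing point above (doubles versus logicals), the index vector itself can cost a surprising amount of memory:
x = rand(1e7, 1);
idxDouble  = find(x > 0.5);   % roughly 5e6 doubles, about 40 MB just for the indices
idxLogical = x > 0.5;         % 1e7 logicals, about 10 MB, and no find() temporary
y1 = x(idxDouble);
y2 = x(idxLogical);           % same result with a smaller indexing footprint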
Problem fixed.
Under Windows XP x32, I managed to almost double the amount of memory available to MATLAB by editing boot.ini to add the switch /3GB /USERVA=3030
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /3GB /USERVA=3030
Together with reducing our array sizes, this completely fixed the problem :)
I could also have fixed the problem by upgrading to Windows x64 or Windows 7 x64. That also doubles the amount of memory available to MATLAB, even if you stick with 32-bit MATLAB and don't upgrade to 64-bit MATLAB. Windows x64 is simply far more memory-efficient, even on systems that only have 4 GB of physical RAM installed.
Try this, it works well for me.
Go to Home -> Preference icon -> General -> Java Heap Memory -> Allocate what size of memory you want
In the Preferences window, go to "Workspace" (outside the Java heap memory level) and find "MATLAB array size limit".
Make sure "Limit the maximum array size to a percentage of RAM" is unchecked. Because you want to extend the memory MATLAB can use, you don't need this feature.
Done.
What are you attempting to allocate when it runs out of memory (OOM)? Do you have code to reproduce? A wide range of problems can cause out of memory errors.
To diagnose, use "dbstop if all error" to set a breakpoint on errors. The out of memory will trigger this, and you can use dbup, dbdown, and whos() to see what's consuming memory. Often an OOM is caused by a bad array size or index calculation, not just by big data structures. E.g. this will trigger an OOM in pretty much any 32-bit MATLAB.
>> x = 1;
>> x(2^30) = 2
??? Out of memory. Type HELP MEMORY for your options.
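A hedged sketch of that diagnostic workflow (my_script is a placeholder for whatever code is failing):
dbstop if error     % or "dbstop if all error", as above: break into the debugger on error
my_script           % placeholder: the script that eventually runs out of memory
% at the K>> debug prompt:
whos                % see which variables are large in the current workspace
dbup                % move up the call stack and run whos again in the caller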
I faced a similar error while running an (old) C file in MATLAB using mex.
I found my solution at this issue on GitLab.
First, uncheck the option "Limit the maximum Array Size to a % of RAM" located under Preferences -> Workspace, as also indicated in this earlier answer.
Once applied, run your C file in the command window using
mex filename.c -compatibleArrayDims