Matlab - Assignment into array causes error: "Maximum variable size allowed by the program is exceeded" - matlab

I am running a script which creates a lot of big arrays. Everything runs fine until the following lines:
%dist is a sparse matrix
inds=dist~=0;
inserts=exp(-dist(inds).^2/2*sig_dist);
dist(inds)=inserts;
The last line causes the error: ??? Maximum variable size allowed by the program is exceeded.
I don't understand how the last line could increase the variable size - notice I am inserting into the matrix dist only at places which were non-zero to begin with. So what's happening here?

I'm not sure why you are seeing that error. However, I suggest you use the MATLAB function spfun to apply a function to the nonzero elements of a sparse matrix. For example:
>>dist = sprand(10000,20000,0.001);
>>f = @(x) exp(-x.^2/2*sig_dist);
>>dist = spfun(f,dist)
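A quick self-check of this approach (a sketch with shrunken sizes so it runs fast; sig_dist is assumed to be defined as in the question, and the value 1 below is just a placeholder):

```matlab
sig_dist = 1;                       % placeholder value for illustration
dist = sprand(1000, 2000, 0.001);   % smaller than the real problem
f = @(x) exp(-x.^2/2*sig_dist);
out = spfun(f, dist);
isequal(find(out), find(dist))      % sparsity pattern is preserved,
                                    % since f(x) > 0 for every nonzero x
```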

MATLAB implements a "lazy copy-on-write" model. Let me explain with an example.
First, create a really large vector
x = ones(5*1e7,1);
Now, say we wanted to create another vector of the same size:
y = ones(5*1e7,1);
On my machine, this will fail with the following error
??? Out of memory. Type HELP MEMORY for your options.
We know that y will require 5*1e7*8 = 400000000 bytes ~ 381.47 MB (which is also confirmed by whos x), but if we check the amount of free contiguous memory left:
>> memory
Maximum possible array: 242 MB (2.540e+008 bytes) *
Memory available for all arrays: 965 MB (1.012e+009 bytes) **
Memory used by MATLAB: 820 MB (8.596e+008 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
we can see that it exceeds the 242 MB available.
On the other hand, if you assign:
y = x;
it will succeed almost instantly. This is because MATLAB is not actually allocating another memory chunk of the same size as x, instead it creates a variable y that shares the same underlying data as x (in fact, if you call memory again, you will see almost no difference).
MATLAB will only try to make another copy of the data once one of the variables changes, thus if you try this rather innocent assignment statement:
y(1) = 99;
it will throw an error, complaining that it ran out of memory, which I suspect is what is happening in your case...
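The lazy copy can also be observed directly through timing (a sketch; the vector here is deliberately small enough to fit in memory, and the sizes may need adjusting on your machine):

```matlab
x = ones(5*1e6,1);    % ~40 MB of doubles
tic; y = x;      toc  % near-instant: y just shares x's underlying data
tic; y(1) = 99;  toc  % noticeably slower: the first write forces a full copy
```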
EDIT:
I was able to reproduce the problem with the following example:
%# a large enough sparse matrix (you may need to adjust the size)
dist = sparse(1:3000000,1:3000000,1);
First, lets check the memory status:
» whos
Name Size Bytes Class Attributes
dist 3000000x3000000 48000004 double sparse
» memory
Maximum possible array: 394 MB (4.132e+008 bytes) *
Memory available for all arrays: 1468 MB (1.539e+009 bytes) **
Memory used by MATLAB: 328 MB (3.440e+008 bytes)
Physical Memory (RAM): 3070 MB (3.219e+009 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
say we want to apply a function to all non-zero elements:
f = @(X) exp(-X.^2 ./ 2);
strangely enough, if you try to slice/assign then it will fail:
» dist(dist~=0) = f( dist(dist~=0) );
??? Maximum variable size allowed by the program is exceeded.
However the following assignment does not throw an error:
[r,c,val] = find(dist);
dist = sparse(r, c, f(val));
I still don't have an explanation why the error is thrown in the first case, but maybe using the FIND function this way will solve your problem...
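For completeness, the spfun route from the answer above avoids building the large logical index vector dist~=0 entirely, and also works on this example (a sketch):

```matlab
f = @(X) exp(-X.^2 ./ 2);
dist = sparse(1:3000000, 1:3000000, 1);
dist = spfun(f, dist);   % f is applied only to the stored nonzeros
```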

In general, reassigning the nonzero elements of a sparse matrix does not change its memory footprint. Call whos before and after the assignment to check.
dist = sparse(10, 10);
dist(1,1) = 99;
dist(6,7) = exp(1);
inds = dist ~= 0;
whos
dist(inds) = 1;
whos
Without a reproducible example it is hard to determine the cause of the problem. It may be that some intermediate assignment is taking place that isn't sparse. Or you have something specific to your problem that we can't see here.

Related

Does MATLAB's implicit broadcasting optimise based on surrounding code?

A question asked today gave a surprising result with respect to implicit array creation:
array1 = 5*rand(496736,1);
array2 = 25*rand(9286,1);
output = zeros(numel(array1), numel(array2)); % Requires 34GB RAM
output = zeros(numel(array1), numel(array2),'logical'); % Requires 4.3GB RAM
output = abs(bsxfun(@minus, array1.', array2)) <= 2; % Requires 34GB RAM
output = pdist2(array1(:), array2(:)) <= 2; % Requires 34GB RAM
So far so good. An array containing 496736*9286 double values should be 34GB, and a logical array containing the same number of elements requires just 4.3GB (8 times smaller). This happens for the latter two because they use an intermediate matrix containing all the distance pairs in double precision, requiring the full 34GB, whereas a logical matrix is directly pre-allocated as just logicals, and needs 4.3GB.
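The two footprints quoted above follow directly from the per-element sizes (8 bytes for a double, 1 byte for a logical):

```matlab
n = 496736 * 9286;             % number of elements in the output matrix
gb_double  = n * 8 / 2^30      % ~34.4 GB as a double matrix
gb_logical = n * 1 / 2^30      % ~4.3 GB as a logical matrix
```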
The surprising part comes here:
output = abs(array1.' - array2); % Requires 34GB RAM
output = abs(array1.' - array2) <= 2; % Requires 4.3GB RAM ?!?
What?!? Why doesn't implicit expansion require the same 34GB RAM due to creation of the intermediate double matrix output = abs(array1.' - array2)?
This is especially strange since implicit expansion is, as I understood it, a short way of writing the old bsxfun solutions. Thus why does bsxfun create the full, 34GB, matrix, whereas implicit expansion does not?
Does MATLAB somehow recognise the output of the operation should be a logical matrix?
All tests performed on MATLAB R2018b, Ubuntu 18.04, 16GB RAM (i.e. 34GB arrays error out)

Matlab Horzcat - Out of memory

Any trick to avoid an out of memory error in matlab?
I am assuming that the reason it shows up is that MATLAB is very inefficient at horzcat and actually needs to temporarily duplicate matrices.
I have a matrix A with size 108977555 x 25. I want to merge this with three vectors d, m and y with size 108977555 x 1 each.
My machine has 32GB of RAM, and the above matrix and vectors occupy 18GB.
Now I want to run the following command:
A = [A(:,1:3), d, m, y, A(:,5:end)];
But that yields the error:
Error using horzcat
Out of memory. Type HELP MEMORY for your options.
Any trick to do this merge?
Working with Large Data Sets. If you are working with large data sets, you need to be careful when increasing the size of an array to avoid getting errors caused by insufficient memory. If you expand the array beyond the available contiguous memory of its original location, MATLAB must make a copy of the array and set this copy to the new value. During this operation, there are two copies of the original array in memory.
Restart MATLAB. I often find it doesn't fully clean up its memory, or its memory gets fragmented, leading to lower maximal array sizes.
Change your datatype (if you can). E.g. if you're only dealing with numbers 0 - 255, use uint8; the memory size will reduce by a factor of 8 compared to an array of doubles.
Start off with A already large enough (i.e. 108977555x27 instead of 108977555x25) and insert in place:
A(:, 4) = d;
clear d
A(:, 5) = m;
clear m
A(:, 6) = y;
Merge the data in one datatype to reduce total memory requirement, eg a date easily fits into one uint32.
Leave the data separated, think about why you want the data in one matrix in the first place and if that is really necessary.
Use C-code to do the data allocation yourself (only if you're really desperate)
Further reading: https://nl.mathworks.com/help/matlab/matlab_prog/memory-allocation.html
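The "merge the data in one datatype" suggestion could look like this (a sketch; the packing scheme y*10000 + m*100 + d is just one convention, assuming d, m, y hold day, month and year numbers):

```matlab
% pack three columns into one uint32 column
packed = uint32(y)*10000 + uint32(m)*100 + uint32(d);
% unpack again when needed:
yy = idivide(packed, uint32(10000));
mm = idivide(mod(packed, uint32(10000)), uint32(100));
dd = mod(packed, uint32(100));
```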
Even if you could make the merge work using Gunther's suggestions, the result will just sit there occupying memory. Right now it takes more than half of the available memory. So, what are you planning to do with it then? Even a simple B = A+1 doesn't fit. The only thing you can do is stuff like sum, or operations on parts of the array.
So, you should consider going to tall arrays and other related big data concepts, which are exactly meant to work with such large datasets.
https://www.mathworks.com/help/matlab/tall-arrays.html
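A minimal tall-array sketch (assuming the data lives in CSV files under a folder data/; someColumn is a hypothetical variable name):

```matlab
ds = datastore('data/*.csv');   % out-of-core access to the files
t  = tall(ds);                  % tall table; nothing is loaded yet
s  = sum(t.someColumn);         % deferred, lazily-evaluated operation
result = gather(s);             % triggers the actual chunked computation
```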
You can first try the efficient memory management strategies as mentioned on the official mathworks site : https://in.mathworks.com/help/matlab/matlab_prog/strategies-for-efficient-use-of-memory.html
Use Single (4 bytes) or some other smaller data type instead of Double (8 bytes) if your code can work with that.
If possible use block processing (like rows or columns) i.e. store blocks as separate mat files and load and access only those parts of the matrix which are required.
Use matfile command for loading large variables in parts. Perhaps something like this :
save('A.mat','A','-v7.3')
oldMat = matfile('A.mat');
clear A
newMat = matfile('Anew.mat','Writable',true); %Empty matfile
for i=1:27
if (i<4), newMat.A(:,i) = oldMat.A(:,i); end
if (i==4), newMat.A(:,i) = d; end
if (i==5), newMat.A(:,i) = m; end
if (i==6), newMat.A(:,i) = y; end
if (i>6), newMat.A(:,i) = oldMat.A(:,i-2); end
end

Out of memory - default matlab database and code

I'm learning the neural network toolbox with the MATLAB examples and I keep getting this error:
Out of memory. Type HELP MEMORY for your options.
Error in test2 (line 10)
xTest = zeros(inputSize,numel(xTestImages));
Here is my simple code:
% Get the number of pixels in each image
imageWidth = 28;
imageHeight = 28;
inputSize = imageWidth*imageHeight;
% Load the test images
[xTestImages, outputs] = digittest_dataset;
% Turn the test images into vectors and put them in a matrix
xTest = zeros(inputSize,numel(xTestImages));
for i = 1:numel(xTestImages)
xTest(:,i) = xTestImages{i}(:);
end
The code is written according to the MathWorks example (but I'm trying to build my own custom network). I reinstalled MATLAB, maximized the Java RAM allocation, cleaned up some disk space and deleted the rest of the neural network. Still not working. Any ideas how to fix this problem?
As written above, the line:
xTest = zeros(inputSize,numel(xTestImages)); % xTestImages is 1x5000
would yield a matrix with 28^2*5000 = 3.92e6 elements. Every element is double precision (8 bytes), hence the matrix would consume around 30 MB...
You stated, that the command memory shows the following:
Maximum possible array: 29 MB (3.054e+07 bytes) *
Memory available for all arrays: 467 MB (4.893e+08 bytes) **
Memory used by MATLAB: 624 MB (6.547e+08 bytes)
Physical Memory (RAM): 3067 MB (3.216e+09 bytes)
So the first line shows the limitation for ONE single array.
So a few things to consider:
I guess clear all or quitting some other running applications does not improve the situation!?
Do you use a 64 or 32bit OS? And/or MATLAB 32/64 bit?
Did you try to change the Java Heap Settings? https://de.mathworks.com/help/matlab/matlab_external/java-heap-memory-preferences.html
I know this won't fix the problem, but maybe it will help you to keep on working in the meanwhile: you could create the matrix with single precision, which should work for your test case. Simply pass 'single' as an additional argument when creating the matrix.
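For this test case the saving is easy to check (a sketch, using the sizes from the question):

```matlab
inputSize = 28*28;                   % 784 pixels per image
n = 5000;                            % number of test images
xd = zeros(inputSize, n);            % double: 784*5000*8 bytes ~ 30 MB
xs = zeros(inputSize, n, 'single');  % single: 784*5000*4 bytes ~ 15 MB
whos xd xs                           % compare the Bytes column
```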
The out-of-memory error was caused by the Levenberg–Marquardt algorithm - it creates a huge Jacobian matrix for its calculations when the data is big.

MATLAB Error:Out of Memory

So I'm trying to perform STFT on a piano recording using matlab, but I get the following error.
Warning: Input arguments must be scalar.
In test3 at 35
??? Error using ==> zeros
Out of memory. Type HELP MEMORY for your options.
Error in ==> test3 at 35
song = cat(1,song,zeros(n_of_padding,1));
The coding I've used is taken from a sample code found on the net.
clc;
clear all;
[song,FS] = wavread('c scale fast.wav');
song = sum(song,2);
song = song/max(abs(song));
wTime = 0.05;
ZP_exp = 1;
P_OL = 50;
% Number of STFT samples per STFT slice
N_window = floor(wTime*FS);
% Number of overlapping points
window_overlap = floor(N_window*(P_OL/100));
wTime = N_window/FS;
%size checking
%make sure there are integer number of windows if not zero pad until they are
L = size(song);
%determine the number of times-1 the overlapping window will fit the song length
N_of_windows = floor(L - N_window/(N_window - window_overlap));
%determine the remainder
N_of_points_left = L - (N_window + N_of_windows*(N_window - window_overlap));
%Calculate the number of points to zero pad
n_of_padding = (N_window - window_overlap) - N_of_points_left;
%append the zeros to the end of the song
song = cat(1,song,zeros(n_of_padding,1));
clear n_of_windows n_of_points_left n_of_padding
n_of_windows = floor((L - N_window)/(N_window - window_overlap))+1;
windowing = hamming(N_window);
N_padding = 2^(nextpow2(N_window)+ZP_exp);
parfor k = 1:N_of_windows
starting = (k-1)*(N_window -window_overlap) +1;
ending = starting+N_window-1;
%Define the Time of the window, i.e., the center of window
times(k) = (starting + ceil(N_window/2))/Fs;
%apply windowing function
frame_sample = music(starting:ending).*windowing;
%take FFT of sample and apply zero padding
F_trans = fft(frame_sample,N_padding);
%store FFT data for later
STFT_out(:,k) = F_trans;
end
Based on some assumptions I would reason that:
- n_of_padding should be smaller than N_window
- N_window is much smaller than FS
- FS is not too high (the sampling frequency of your sound, so it should not exceed a few tens of thousands?!)
- Your zeros matrix will not be huge
This should mean that the problem is not that you are creating a too large matrix, but that you already filled up the memory before this call.
How to deal with this?
First type dbstop if error
Run your code
When it stops check all variable sizes to see where the space has gone.
If you don't see anything strange (and the big storage is really needed) then you may be able to process your song in parts.
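Processing the song in parts could be sketched like this (the chunk size and the per-chunk work are assumptions; song is the vector from the question):

```matlab
chunkLen = 1e6;                     % samples per chunk (tune to taste)
nSamples = numel(song);
for s = 1:chunkLen:nSamples
    e = min(s + chunkLen - 1, nSamples);
    chunk = song(s:e);
    % ... compute the STFT frames for this chunk and save/reduce the
    % result immediately, instead of keeping every frame in memory
end
```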
In line 35 you are trying to make an array that exceeds your available memory. Note that an n-by-1 array of zeros alone is n*8 bytes in size. This means if you make such an array, call it x, and check it with whos('x'), like:
x = zeros(10000,1);
whos('x');
You will likely find that x is 80000 bytes. Maybe adding such an array to your song variable adds the last bytes that break the memory-camel's back. Using whos('variableName'), take whatever the size of song is before line 35, separately add the size of zeros(n_of_padding,1), convert that to MB, and see if it exceeds the maximum possible array size given by memory.
The most common cause of Out of memory errors in MATLAB is that it is unable to allocate memory due to the lack of a contiguous block. This article explains the various reasons that can cause an Out of memory error in MATLAB.
The Out of memory error often points to a faulty implementation of code that expands matrices on the fly (concatenating, out-of-range indexing). In such scenarios, MATLAB creates a copy in memory i.e memory twice the size of the matrix is consumed with each such occurrence.
On Windows this problem can be alleviated to some extent by passing the /3GB /USERVA=3030 switch during boot as explained here. This enables additional virtual memory to be addressed by the application(MATLAB in this case).

How to run the code for large variables

I have one code like below-
W = 3;
i = 4;
s = fullfact(ones(1,i)*(W + 1)) - 1;
p2 = unique(sort(s(sum(s,2) == i,:),2),'rows');
I can run this code only up to i = 11, but I want to run it for up to i = 25. When I run this code for i = 12 it shows the error message "Out of Memory".
I need to keep this code as it is. How can I modify it for larger values of i?
MATLAB experts, I need your valuable suggestions.
Just wanting to do silly things is not enough. You are generating arrays that are simply too large to fit into memory.
See that the size of the matrix s is a function of i here: size(s) will be (W+1)^i = 2^(2*i) rows by i columns. (By the way, some will argue it is a bad idea to use i, which normally denotes sqrt(-1), as a variable name.)
So when i = 4, s is only 256x4.
When i = 11, s is 4194304x11. This array takes 369098752 bytes of space, so 370 megabytes.
When i = 25, the array will have this many elements:
2^50*25
ans =
2.8147e+16
Multiply that by 8 to get the memory needed: something like 225 petabytes of memory! If you have that much memory, then send me a few terabytes of RAM. You won't miss them.
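The arithmetic behind that estimate:

```matlab
rows  = (3+1)^25;   % fullfact produces (W+1)^i rows; 4^25 = 2^50
elems = rows * 25;  % 2.8147e+16 elements
bytes = elems * 8   % ~2.25e17 bytes, i.e. hundreds of petabytes
```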
Yes, there are times when MATLAB runs out of memory. You can get the amount of memory available at any point of time by executing the following:
memory
However, I would suggest following one of the strategies to reduce memory usage available here. Also, you might want to clear the variables which are not required, in every iteration, by
clear variable_name