Unload matrix and free memory - MATLAB

I can load a matrix from text file:
load mydata.txt
The problem is that my matrix file is about 250 MB, and after several such loads I have no memory left to work with the next files.
How can I unload a matrix and free its resources for further use?

Use clear or clearvars. By default, MATLAB will create a variable called mydata as a result of your statement, so
clear mydata
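clearvars works the same way, and also lets you keep selected variables while clearing everything else (the results name below is just a placeholder):
clearvars mydata              % same effect as clear mydata
clearvars -except results     % clear everything except what you still need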

Find the variables in your workspace that contain the large data sets, then in your script or from the console type
clear whateverVariableName
To clear all memory use
clear all
You can even right-click individual variables in the Workspace browser and delete them from the IDE if you wish.
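To see which variables are taking the most space before you clear them, whos lists everything in the workspace along with its size:
whos                % shows each variable's dimensions and size in bytes
clear mydata        % then clear the large ones by name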

What you have to do is clear mydata and then issue pack. The first command tells MATLAB that the reference to the memory held for mydata is no longer needed. The second command instructs MATLAB to free unused memory. If you don't issue pack, the memory will be deallocated whenever the MATLAB memory manager decides to.
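A minimal sketch of that cycle (note that pack must generally be issued from the command line rather than inside a function):
load mydata.txt     % creates a large variable named mydata
% ... work with mydata ...
clear mydata        % drop the reference to the matrix
pack                % consolidate workspace memory immediately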

Related

How can I make saving code faster? - MATLAB

I'm running a short script that opens a list of files one by one and saves back only one of the variables contained in each file. The process seems much slower than I expected, and it gets slower over time; I don't fully understand why, or how I could make it run faster. I always struggle with optimization, so I'd appreciate any suggestions.
The code is the following (the ... stands in for the actual path):
main_dir=dir(strcat('\\storage2-...\Raw\DAQ5\'));
filename={};
for m=7:size(main_dir,1)
    m
    second_dir=dir([main_dir(m).folder '\' main_dir(m).name '\*.mat']);
    for mm=1:numel(second_dir)
        filename{end+1}=[second_dir(mm).folder '\' second_dir(mm).name];
        for mmm=1:numel(filename)
            namefile=sprintf(second_dir(mm,1).name);
            load(string(filename(1,mmm)));
            save(['\\storage2-...\DAQ5\Ch1_',namefile(end-18:end-4),'.mat'], 'Ch_1_y')
        end
    end
end
The original file is about 17 MB and once the single variable is saved it is about 6 MB in size.
The MATLAB load function takes an optional additional argument that specifies a selected variable to read from the input file.
s = load('path/to/file.mat', 'Ch_1_y');
That way you don't have to spend time loading in all the other variables from those input .mat files that you're just going to immediately throw away.
Also, using save to write MAT-files over SMB shares can be slow. You might want to call save to write to a temporary local file first, and then copy the completed file to its final destination. That sounds like more I/O, but it can actually be a net win, depending on your particular system and network. Measure it both ways to see whether it helps in your situation.
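A sketch combining both suggestions, reusing the question's loop variables and keeping its elided \\storage2-...\ path:
s = load(filename{mmm}, 'Ch_1_y');     % read only the variable to keep
tmp = [tempname '.mat'];               % temporary file on the local disk
save(tmp, '-struct', 's', 'Ch_1_y');   % write locally first
movefile(tmp, ['\\storage2-...\DAQ5\Ch1_', namefile(end-18:end-4), '.mat'])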

MATLAB - Force to Retain Breakpoints

I wish to know if there is a way to force MATLAB to retain all previously placed breakpoints (the red dots that enable code debugging) in the MATLAB Editor/Debugger inside functions, classes, etc. from one session to the next, without them being deleted by clear all commands.
This would make it easier to debug huge pieces of software while changes are introduced, especially since MATLAB sometimes simply shuts down because of internal errors.
Thanks fellows.
dbstop is the cleaner solution. Just insert it at the place where you want the debugging to stop, and it will not be removed until you edit or comment out the line.
You need to save the breakpoints and reload them in the next session. You can use dbstatus to get a structure that contains information about all breakpoints and save it into a file:
s = dbstatus('-completenames');
save FILENAME s
and later restore them using dbstop:
load FILENAME
dbstop(s);
You can automate this by including these commands in your startup.m and finish.m files (create them on the default user path if they don't exist).
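A minimal sketch of that automation, assuming a hypothetical breakpoints.mat on the user path:
% finish.m -- runs when MATLAB exits; save the current breakpoints
s = dbstatus('-completenames');
save('breakpoints.mat', 's');

% startup.m -- runs when MATLAB starts; restore them if present
if exist('breakpoints.mat', 'file')
    bp = load('breakpoints.mat');
    dbstop(bp.s);
end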

Not enough storage is available to complete this operation - MATLAB

I am using MATLAB 2014a under Windows 7. I am running a loop that reads very big xlsx files (~40 MB each). After I am done with a file, I use 'clear' to free the memory taken up by reading the file. The thing is, every once in a while the script stops and gives me an error message:
Error using xlsread (line 247)
Error: Not enough storage is available to complete this operation.
I want to emphasize that each time I finish with a file I clear all the variables, so in each iteration only one file is loaded. If I restart MATLAB the script may work again, which makes me believe that somehow the 'clear' command doesn't free all the memory that was allocated. Is there a way to really free the memory that was once allocated in MATLAB?
thank you very much
Ariel
If restarting MATLAB is not an option, the pack function should help. Otherwise, you could also run MATLAB without the GUI and write a shell script that starts a fresh MATLAB instance for each file.

load .mat file - run out of memory

I have a matrix cube which I load in my program to read data from. The size of this .mat file is 2.8 GB. I am not able to load it; I get an 'out of memory' error. Is there a way to fix this?
You can use the matfile class to work on ranges within variables inside MAT-files. See
Load and save parts of variables in MAT-files
Here's some additional discussion noting that this feature is new as of R2011b.
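A minimal sketch of partial loading with matfile, assuming the file was saved in -v7.3 format and contains a hypothetical variable named cube:
m = matfile('mydata.mat');     % opens the file; no data is read yet
slab = m.cube(:, :, 1:10);     % reads only the requested slice from disk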
If the size of the data exceeds the available memory on your machine, then you are in trouble; this is unavoidable. However, if you only want certain variables inside the .mat file, you can try to load just those variables using the
load(filename, variables)
version of the load function. It really depends on the contents of your .mat file. If the file is 2.8 GB, you need ALL of the variables in it, and your machine does not have enough memory to cope, your only option is to buy more RAM.
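To see what is actually inside the file before deciding what to load, whos can inspect a MAT-file without reading the data (loading the first listed variable here is just illustrative):
info = whos('-file', 'mydata.mat');    % name, size, and bytes of each variable
s = load('mydata.mat', info(1).name);  % then load only what you need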
EDIT: Apparently this answer is incorrect if you are running R2011b or above, as explained in the answer of Ben Voight.