I'm new to programming with MATLAB and am trying to do the following:
I continuously capture an image (size 1024x1024) with a camera to get a live image, using the getdata function.
To do a measurement I would like to store only 100 images in a circular buffer: more precisely, I'm thinking of storing 100 images, erasing the oldest image whenever new data is acquired, and doing a measurement on the last 100 images.
Hope my concern is understandable...
Thanks for an answer!
This question has been answered here by a MathWorks employee: Create a buffer matrix for continuous measurements. (He also made a video of it: http://blogs.mathworks.com/videos/2009/05/08/implementing-a-simple-circular-buffer/)
The code:
buffSize = 10;
circBuff = nan(1,buffSize);
for newest = 1:1000
    circBuff = [newest circBuff(1:end-1)]
end
Check the update made by gnovice which applies the circular buffer to image processing.
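For the image case, the same shifting idiom extends to a 3D array. A minimal sketch, assuming frames arrive as 1024x1024 matrices from getdata (the videoinput object vid and the acquisition loop are placeholders):
vid = videoinput('gentl', 1); % placeholder adaptor and device ID
buffSize = 100;
circBuff = nan(1024, 1024, buffSize); % one slice per frame, newest first
while isrunning(vid) % placeholder acquisition loop
    newFrame = getdata(vid, 1); % grab one 1024x1024 frame
    circBuff = cat(3, newFrame, circBuff(:,:,1:end-1)); % prepend newest, drop oldest
    % run the measurement on the last 100 frames here, e.g. mean(circBuff, 3)
end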
What you call a "circular buffer" is also known as a queue or FIFO (First In, First Out). Usually this would be stored in a linked-list data structure, where every object (a matrix, in your case) points to the next object. MATLAB, however, has no built-in linked-list structure, but MATLAB arrays (vectors/matrices) are quite flexible and efficient when it comes to manipulating them.
So you can simply store each image as a matrix inside an array of length 100, giving you a 3-dimensional matrix of dimensions 100x1024x1024. Then, when you get new data, you simply remove the last matrix from the array and insert the new matrix at the beginning of the array. Hopefully this will be fast enough for you.
Good luck!
Maybe you can create an array of 100 1024x1024 matrices, and refer to the following link for how to maintain the read and write positions:
logic of circular buffer
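A minimal sketch of that read/write-pointer logic, which overwrites the oldest slot in place instead of shifting data (all names are mine):
buffSize = 100;
circBuff = zeros(1024, 1024, buffSize);
writePos = 0; % index of the most recently written slot
for k = 1:1000 % stand-in for the acquisition loop
    writePos = mod(writePos, buffSize) + 1; % advance and wrap within 1..buffSize
    circBuff(:,:,writePos) = rand(1024); % stand-in for a newly captured frame
    % slices are not in time order; writePos marks the newest frame
end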
This is a cross-post from here:
Link to post in the Mathworks community
Currently I'm working with large data sets, which I've saved as MATLAB files; the two biggest files are 9.5GB and 5.9GB.
These files each contain a 1x8 cell array (this is done for addressability and to prevent mixing up data from each of the 8 cells; I specifically wanted to avoid eval).
Each cell then contains a 3D double matrix; for one file it's 1001x2002x201 and for the other it is 2003x1001x201 (when processing it I chop off one row at the end to get it to 2002).
I'm already running my script and processing the data on a server (64 cores and plenty of RAM; MATLAB crashed on my laptop, as I need more than 12GB of RAM on Windows). Nonetheless it still takes several hours for my script to finish, and I still need to do some extra operations on the data, which is why I'm asking for advice.
For some of the large cell arrays, I need to find the maximum value over the entire set of all 8 cells. Normally I would run a for loop to get the maximum of each cell, store each value in a temporary numeric array, and then apply max again. This will work for sure; I'm just wondering if there's a better, more efficient way.
After I find the maximum I need to do a manipulation over all this data as well; normally I would do something like this for an array:
B=A./maxvaluefound;
A(B > a) = A(B > a)*constant;
Now I could put this in a for loop, address each cell, and run it; however, I'm not sure how efficient that would be. Do you think there's a better way than a for loop that's not extremely complicated/difficult to implement?
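For reference, this is roughly what I have in mind for both steps (dataCells stands for one of my 1x8 cell arrays):
cellMax = cellfun(@(c) max(c(:)), dataCells); % one maximum per cell
maxvaluefound = max(cellMax); % global maximum over all 8 cells
for k = 1:numel(dataCells) % per-cell version of the manipulation
    A = dataCells{k};
    B = A./maxvaluefound;
    A(B > a) = A(B > a)*constant;
    dataCells{k} = A;
end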
There's one more thing I need to do which is really important: each cell, as I said before, is a slice (consider it time), while inside each slice are the values for a 3D matrix/plot. Now I need to interpolate the data so that I get more slices. The reason is that I need slices/frames/plots to create a movie/GIF. I'm planning on plotting the 3D data using scatter3, where the data is represented by color, and I plan on using alpha values to make it see-through, so that one can actually see the intensity in this 3D plot. I understand how to use griddata, but apparently it's quite slow, and some of the other methods were hard to understand. So what would be the best way to interpolate these (time) slices efficiently over the different cells in the cell array? Please explain it if you can, preferably with an example.
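To make the interpolation question concrete, this is the kind of thing I picture, stacking the 8 cells along a fourth (time) dimension and letting interp1 work along it (I don't know if it is feasible memory-wise for my sizes):
stacked = cat(4, dataCells{:}); % ny x nx x nz x 8
stacked = permute(stacked, [4 1 2 3]); % interp1 interpolates along dim 1
tNew = linspace(1, 8, 29); % e.g. turn 8 time slices into 29
interped = interp1(1:8, stacked, tNew, 'linear');
interped = permute(interped, [2 3 4 1]); % back to ny x nx x nz x time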
I've added a picture of the Linux server info I'm running it on below; note that I cannot update the MATLAB version, unfortunately: it's R2016a.
I've also attached part of my code to give a better idea of what I'm doing:
if (or(L03==2,L04==2)) % check if this section needs to be executed, based on parameters set at the top of the file
    load('../loadfilewithpathnameonmypc.mat')
    E_field_650nm_intAll = cell(1,8); % create empty cell array
    parfor ee = 1:8 % loop over the cell array; changed to parfor for an ~8x speedup
        E_field_650nm_intAll{ee} = nan(szxit(1),szxit(2),xres); % create NaN-filled matrix in cells 1-8
        for qq = 1:2:xres
            tt = (qq+1)/2; % consecutive index instead of spacing of 2
            T1 = griddata(Xsall{ee},Ysall{ee},EfieldsAll{ee}(:,:,qq)',XIT,ZIT,'natural'); % resample the non-uniform data onto a uniform grid
            E_field_650nm_intAll{ee}(:,:,tt) = T1; % fill each cell with the gridded data
        end
    end
    clear T1 qq tt ee
    save('../savelargefile.mat', 'E_field_650nm_intAll', '-v7.3')
end
if (L05==2) % check if this section needs to be executed, based on parameters set at the top of the file
    if ~exist('E_field_650nm_intAll','var') % if the variable is not in the workspace, load it
        load('../loadanotherfilewithpathnameonmypc.mat');
    end
    parfor tt = 1:8 % loop over the cell array; changed to parfor for an ~8x speedup
        CFxLight{tt} = nan(szxit(1),szxit(2),xres); % create NaN-filled matrix in cells 1 to 8
        for qq = 1:xres
            CFs = Cafluo3D{tt}(1:lxq2,:,qq)'; % get a matrix slice and transpose it for point-wise multiplication
            CFxLight{tt}(:,:,qq) = CFs.*E_field_650nm_intAll{tt}(:,:,qq); % point-wise multiply the two large matrices for each cell and store in the new cell array
        end
    end
    clear CFs qq tt
    save('../saveanotherlargefile.mat', 'CFxLight', '-v7.3')
end
I'm trying to stack different DICOM files into one multi-slice series, so as to visualize them in ITK-SNAP. However, I don't seem to be able to obtain a functioning DICOM series.
I have sorted all files by their slice positioning, and I have a number of ordered single .dcm files with their original info. I substituted all their original series instance UIDs with one single UID, and their series number with a custom series number I have set to '999' (so that they belong to one series). Image orientation is set to [1;0;0;0;1;0] for all files, and slice thickness to 8 mm for all files.
I have then created an array of info structures, with the original slice positionings [info(num)].
I have tried something like:
for i = 1:num % where num is the number of DICOM files
    k = num2str(i);
    dicomwrite(imm, k, info(i), 'CreateMode', 'Copy'); % where imm is the matrix I obtained with dicomread
end
I have obtained a new set of DICOM files, named as numbers from 1 to num. However, when I try to open the series in ITK-SNAP, it runs into an exception stating the vector is too long. I can open single DICOM files in ITK-SNAP, but when more than one image is part of the series, and the series is visualized as 256x212xnum (where num is the number of files), I run into the exception.
What am I doing wrong?
What you are trying to do is called Multi-frame in the DICOM standard. In short, besides making sure that all your image metadata is still correct, you need to specify Number of Frames (0028,0008) and Frame Increment Pointer (0028,0009). Unfortunately the wording on how the Frame Increment Pointer tag works is a bit vague:
The frames within a Multi-frame Image shall be conveyed as a logical sequence. The information that determines the sequential order of the frames shall be identified by the Data Element Tag or Tags conveyed by the Frame Increment Pointer (0028,0009). Each specific Image IOD that supports the Multi-frame Module specializes the Frame Increment Pointer (0028,0009) to identify the Attributes that may be used as sequences.
Even if only a single frame is present, Frame Increment Pointer (0028,0009) is still required to be present and have at least one value, each of which shall point to an Attribute that is also present in the Data Set and has a value.
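As a rough sketch of how this could look in MATLAB (V is the volume assembled from the sorted slices; firstFile is a placeholder, and the choice of Frame Time as the pointed-to attribute is an assumption to verify against the IOD you are targeting):
meta = dicominfo(firstFile); % metadata from one of the sorted slices
meta.NumberOfFrames = size(V,3); % (0028,0008)
meta.FrameIncrementPointer = uint16([hex2dec('0018') hex2dec('1063')]); % (0028,0009), pointing at Frame Time
meta.FrameTime = 1; % the pointed-to attribute must be present and have a value
% dicomwrite expects multi-frame grayscale data as rows x cols x 1 x frames
dicomwrite(reshape(V, size(V,1), size(V,2), 1, []), 'multiframe.dcm', meta, 'CreateMode', 'Copy');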
Hope that helps.
I have some problems with MATLAB 2015a on Win10 x64 with 16GB RAM.
There is a bunch of images (1280x960, 8-bit) that I want to load into a 3D matrix. In theory that matrix should take ~1.2GB for 1001 images.
What I have so far is:
values(:,:,:) = zeros(960, 1280, 1001, 'uint8');
for i = Start:Steps:End
    file = strcat(folderStr, filenameStr, num2str(i), '.png');
    img = imread(file);
    values(:,:,i-Start+1) = img;
end
This code works for a small number of images, but using it for all 1001 images I get an "Out of memory" error.
Another problem is the speed:
reading 50 images and storing them takes ~2s, while reading 100 images takes ~48s.
What I thought this method does is allocate the memory once and fill the "z-slices" of the matrix picture by picture. But obviously it holds more memory than needed to perform that single task.
Is there any method to store the grey values of a 2D picture sequence in a 3D matrix in MATLAB without wasting that much time and resources?
Thank you
The only possibility I can see is that your indexes are bad. But I can only guess, because the values of Start, Steps and End are not given. If Start is 1, Steps is 99999999 and End is 100000000, you are only reading 2 images, but you are accessing values(:,:,100000000), thus making the variable incredibly huge. That is, most likely, your problem.
To solve this, create a new variable:
imagenames = Start:Steps:End; % note that End is a poor variable name; something like ending is better
for ii = 1:numel(imagenames)
    file = strcat(folderStr, filenameStr, num2str(imagenames(ii)), '.png');
    img = imread(file);
    values(:,:,ii) = img;
end
As Shai suggests, have a look at fullfile for better filename handling.
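For example (same variables as above):
file = fullfile(folderStr, [filenameStr num2str(imagenames(ii)) '.png']); % handles path separators for you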
I have a large stack of 800 16bit gray scale images with 2048x2048px. They are read from a single BigTIFF file and the whole stack barely fits into my RAM (8GB).
Now I need to do a median projection. That means I want to compute the median of each pixel across all 800 frames. The MATLAB median function fails because there is not enough memory left to make a copy of the whole array for the function call. What would be an efficient way to compute the median?
I have tried using a for loop to compute the median one pixel at a time, but this is still terribly slow.
Iterating over blocks, as @Shai suggests, may be the most straightforward solution. If you run into this problem frequently, you may want to consider converting the image to a MAT-file, so that you can access the pixels as an n-d array directly from disk.
% convert to mat file
matObj = matfile('dest.mat', 'Writable', true);
matObj.data(2048, 2048, numSlices) = 0; % preallocate on disk
for t = 1:numSlices
    matObj.data(:,:,t) = imread(tiffFile, 'Index', t);
end

% load a block of the matfile to take the median (run as part of a loop)
medianOfBlock = median(matObj.data(1:128, 1:128, :), 3);
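To make the loop explicit, a sketch that assembles the full median image one 128x128 tile at a time (the tile size is arbitrary, as long as it divides 2048):
medianImage = zeros(2048, 2048);
blk = 128;
for r = 1:blk:2048
    for c = 1:blk:2048
        block = matObj.data(r:r+blk-1, c:c+blk-1, :); % only this tile is read from disk
        medianImage(r:r+blk-1, c:c+blk-1) = median(block, 3);
    end
end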
I bet that the distributions of the individual pixel values over the stack (i.e. the histograms of the pixel jets) are sparse.
If that's the case, the amount of memory needed to keep all the pixel histograms is much less than 2K x 2K x 64K: you can use a compact hash map to represent each histogram, and update the maps while loading the images one at a time. When all updates are done, you go through your histograms and compute the median of each.
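For a single pixel jet, such a compact histogram could look like this, with containers.Map standing in for the hash map (whether it pays off depends on how sparse the jets really are; all names are mine):
pixHist = containers.Map('KeyType','double','ValueType','double');
for t = 1:numel(pixVals) % pixVals: one pixel's 800 values across the stack
    v = double(pixVals(t));
    if isKey(pixHist, v)
        pixHist(v) = pixHist(v) + 1;
    else
        pixHist(v) = 1;
    end
end
k = sort(cell2mat(keys(pixHist))); % only the occupied gray levels
counts = cell2mat(values(pixHist, num2cell(k)));
medianVal = k(find(cumsum(counts) >= numel(pixVals)/2, 1, 'first')); % lower median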
If you have access to the Image Processing Toolbox, MATLAB has a tool for handling large images block by block, called blockproc.
From the docs:
To avoid these problems, you can process large images incrementally: reading, processing, and finally writing the results back to disk, one region at a time. The blockproc function helps you with this process.
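A minimal usage sketch (blockproc works per 2D image, so for the median across slices it would have to be combined with per-slice reads; the file name and block operation are placeholders):
fun = @(bs) imresize(bs.data, 0.5); % bs.data holds the current block
smallImg = blockproc('huge_image.tif', [1024 1024], fun); % one tile in memory at a time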
I will try my best to provide help (if any), even though I have neither an 800-slice TIFF stack nor an 8GB computer; I just want to see if my thoughts can form a solution.
First, 800 x 2048 x 2048 x 8 bits = 3.2GB, not including the headers. With your 8GB RAM it should not be too difficult to store it all at once; there might just be too many other programs running, fragmenting the contiguous memory. Anyway, let's treat the problem as if MATLAB can't load the whole stack into memory.
As Jonas suggests, imread supports loading a TIFF image by index. It also supports a PixelRegion parameter, so you can also access parts of each image through this parameter if you want to make use of Shai's idea.
I came up with a median algorithm that doesn't use all the data at the same time; it merely scans through a sequence of unordered data, one value at a time, but it does keep a memory of 256 counters.
data = randi([0,255], 1, 800); % stand-in for one pixel's 800 values
bins = num2cell(zeros(256, 1, 'uint16')); % one counter per gray level
for ii = 1:800
    bins{data(ii)+1} = bins{data(ii)+1} + 1;
end
% clearvars data
s = cumsum(cell2mat(bins)); % cumulative counts over the gray levels
if any(s == 400) % the 400th sample is the last one at its gray level
    med = ( find(s == 400, 1, 'first') + ...
            find(s > 400, 1, 'first') ) / 2 - 1;
else % the 400th and 401st samples share a gray level
    med = find(s > 400, 1, 'first') - 1;
end
It's not very efficient, not least because it uses a for loop. But the benefit is that instead of keeping 800 raw values in memory, only 256 counters are kept; the counters need uint16, so they are roughly equivalent to 512 raw values. If you are confident that, for any pixel, the same gray level won't occur more than 255 times among the 800 samples, you can choose uint8 instead and halve that memory.
The above code is for one pixel. I'm still thinking about how to expand it to a 2048x2048 version, such as:
for ii = 1:800
    img_data = randi([0,255], 2048, 2048); % stand-in for reading frame ii
    % (do stats stuff)
end
By doing so, for each iteration, you only need these kept in memory:
One frame of image;
A set of counters;
A few supplemental variables, with size comparable to one frame of image.
I use a cell array to store the counters. According to this post, a cell array can be pre-allocated while its elements can still be stored non-contiguously in memory. That means the 256 counters (512 x 2048 x 2048 bytes in total) can be stored separately, which is quite reasonable for your 8GB RAM. But obviously my sample code does not take advantage of this, since it uses bins = num2cell(zeros(...)).
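One way the 2048x2048 expansion could look, trading the cell array for a single counter volume updated one frame at a time (this assumes 8-bit data, as in the demo above, and roughly 2GB free for the counters; tiffFile is a placeholder):
counts = zeros(2048, 2048, 256, 'uint16'); % one 256-bin histogram per pixel
for t = 1:800
    img = imread(tiffFile, 'Index', t); % one frame in memory at a time
    for level = 0:255
        counts(:,:,level+1) = counts(:,:,level+1) + uint16(img == level);
    end
end
s = cumsum(counts, 3); % per-pixel cumulative histograms
lowerMed = sum(s < 400, 3); % 0-based gray level of the 400th sample
upperMed = sum(s < 401, 3); % 0-based gray level of the 401st sample
medianImg = (double(lowerMed) + double(upperMed))/2;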
I have a MATLAB variable that is a 3x6 cell array. One of the columns of the cell array holds at most 150-200 small RGB images of about 16x20 pixels (again, at most). The rest of the columns are:
an equal number of labels that are strings of 3 or 4 characters,
an image mask, which is about 350x200
3 integers
For some reason, saving this object is taking a very long time, at least relative to its size. It has already been 10 minutes (which isn't too bad, but I plan on expanding the object to hold several thousand of those small images) and MATLAB doesn't seem to be making any progress. In fact, when I watch the variable's file in its directory, its size cycles between 0 bytes and about 120kB (i.e. it increases to 120kB in steps of 30 or 40kB, then restarts).
Is this normal behavior? Do MATLAB variables always take so long to save? What's going on here?
Mistake: I'm saving AllData, not my SVM variable. AllData holds the same data as the SVM keeper, minus the actual SVM itself and one integer.
Which parts of the code would be helpful to show for solving this? The code itself is a few hundred lines, broken up into several functions. What is important to consider when troubleshooting this: where the variable is created? where it's saved? the way I create the smaller images?
Hate to be the noob who takes a picture of their desktop, but the machine I'm working on has problems taking screenshots. Anyway, here it is:
AllData/curData are just subsets of the 3x7 array... actually it's a 3x8, but the last column is just an int.
Interesting side point: I interrupted the saving process and the variable seemed to have saved just fine. I trained a new SVM on the saved data and it works fine. I'd rather not have to do that in the future, though.
Using whos:
Name                           Size         Bytes    Class     Attributes

AllData                        3x6           473300  cell
Image                          240x352x3     253440  uint8
RESTOREDEFAULTPATH_EXECUTED    1x1                1  logical
SVMKeeper                      3x8          2355638  cell
ans                            3x6           892410  cell
curData                        3x6           473300  cell
dirpath                        1x67             134  char
im                             240x352x3    1013760  single
s                              1x1           892586  struct
Updates:
1. Does this always happen, or did you only do it once?
   It always happens.
2. Does it take the same time when you save it to a different (local) drive?
   I will investigate this more when I get back to that computer.
3. How long does it take to save a 500kB matrix to that folder?
   Almost instantaneous.
4. And, as asked above, what is the function call that you use?
   Code added below.
(Image is an RGB image)
MaskedImage(:,:,1) = Image(:,:,1).*Mask;
MaskedImage(:,:,2) = Image(:,:,2).*Mask;
MaskedImage(:,:,3) = Image(:,:,3).*Mask;
MaskedImage = im2single(MaskedImage);
....
(I use some method to create a bounding box around my 16x20 image)
(this is in a loop that occurs about 100-200 times)
Misfire = input('is this a misfire?','s');
if (strcmpi(Misfire,'yes'))
    curImageReal = MaskedImage(j:j+Ybound, i:i+Xbound, :);
    Training{curTrainingIndex} = curImageReal; % Training is a cell array of images
    Labels{curTrainingIndex} = 'ncr';
    curTrainingIndex = curTrainingIndex+1;
end
(the loop ends)...
SaveAndUpdate = input('Would you like to save this data? (say yes,definitely)','s');
undecided = 1;
while (undecided)
    if (strcmpi(SaveAndUpdate,'yes,definitely'))
        AllData{curSVM,4} = Training;
        AllData{curSVM,5} = Labels;
        save(strcat(dirpath,'/',TrainingName),'AllData'); % <-- STUCK HERE
        undecided = 0;
    else
        DontSave = input('I''m not going to save. Say YESNOSAVE to NOT SAVE','s');
        if (strcmpi(DontSave,'yesnosave'))
            undecided = 0;
        else
            SaveAndUpdate = input('So... save? (say yes,definitely)','s');
        end
    end
end
It is a bit unclear whether you are doing some custom file saving or not. If it is the former, I'm guessing that you have a really slow save loop going on, maybe with some hardware issues. Try to save the data using MATLAB's save function:
tic
save('test.mat', 'AllData')
toc
If that works fine, try to work your way from there, e.g. saving one element at a time, etc.
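For instance, a sketch for timing each cell separately to find the slow one (the file names are throwaway):
for k = 1:numel(AllData)
    piece = AllData(k); % one cell at a time
    tic
    save(sprintf('piece_%02d.mat', k), 'piece');
    fprintf('cell %d: %.2f s\n', k, toc);
end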
You can profile your code with the profiler: open it with the command profile viewer, then type the code, script or function that you want to profile into the input text field.
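A minimal way to use it from the command line instead (the script name is a placeholder):
profile on
trainAndSaveSVM % the code path that ends in the slow save call
profile viewer % opens the report with per-line timings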
This isn't a great answer, but it seems that the problem was that I was saving the version of my image after I had converted it to single. I don't know why this would cause such a dramatic slowdown (after removing that line of code it saved instantly), so if someone could edit my answer to shed more light on the situation, that would be appreciated.
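A plausible explanation, though this is an assumption rather than something verified against this data: im2single quadruples the bytes per pixel compared with uint8, and the compression inside MAT-files tends to be far less effective on floating-point data, so save has much more, and much less compressible, data to write. The in-memory size difference is easy to check:
img8 = uint8(255*rand(240, 352, 3)); % uint8: 1 byte per element
img1 = im2single(img8); % single: 4 bytes per element
whos img8 img1 % shows the 4x difference before compression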