An error while importing an image as a variable into MATLAB

I wanted to import a .tif image into the MATLAB workspace as a variable using the File > Import Data tool, but I got the following warning: "Warning: The datatype for tag SamplesPerPixel should be TIFF_SHORT instead of TIFF_LONG. This may cause data corruption". The image is single-precision float (32-bit) and quite large (4144 x 12619 x 7). Can MATLAB read and display such an image? What does this warning mean, and how can I correct it?
Thank you so much

Read the TIFF specification.
From the warning message it appears there is some problem with the file format. Each IFD in a TIFF file has many entries, one of them being SamplesPerPixel (see page 24 of the specification). This entry should be of type SHORT (see page 15 for the list of types and what they mean). However, in your file it apparently has type LONG, and that seems to be causing the problem. Either MATLAB is identifying it incorrectly, or the software you used to save the image is not following the TIFF specification.
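Note that the message is a warning, not an error, so the data may still load. As a sketch, you could try reading the file programmatically with MATLAB's Tiff class instead of the import tool (the filename here is a placeholder for your own file):

% Sketch: read the TIFF programmatically; 'myimage.tif' is a placeholder.
t = Tiff('myimage.tif', 'r');    % open read-only via MATLAB's libtiff wrapper
data = read(t);                  % should give a 4144x12619x7 single array
close(t);

% At 4 bytes per single value the array is roughly 1.5 GB, so it is
% safer to display one band at a time:
imagesc(data(:, :, 1)); colorbar;

If the data loads cleanly, the file is usable despite the warning; if not, re-save the image from the original software with a spec-compliant SamplesPerPixel tag.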

Related

Problem in .fig to .eps conversion (MATLAB R2015b)

I am generating a MATLAB figure with the following features:
Separate background regions generated using patch command
A couple of arrows using arrow.m (https://www.mathworks.com/matlabcentral/fileexchange/278-arrow)
Annotations, labels, etc.
I have tried MATLAB's built-in .eps export as well as print2eps.m from https://www.mathworks.com/matlabcentral/fileexchange/23629-export_fig. In both cases I get the following warning:
Warning: Loading EPS file failed, so unable to perform post-processing. This is usually because the figure contains a large number of patch objects. Consider exporting to a bitmap format in this case.
Below is a screenshot of the two outputs: MATLAB on the left, LaTeX on the right.
As one can observe, the output file is nowhere near the original MATLAB figure. Any thoughts on addressing this issue would be much appreciated.
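The warning itself suggests a workaround: export to a bitmap format instead of EPS. A minimal sketch, assuming the current figure is the one to export and the output filename is a placeholder:

% Export the current figure as a 300 dpi PNG, following the warning's suggestion.
print(gcf, '-dpng', '-r300', 'myfigure.png');

A high-resolution PNG can then be included in LaTeX with \includegraphics, at the cost of losing vector scalability.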

Error using caffe Invalid input size

I tried to train my own neural net using my own image database as described in
http://caffe.berkeleyvision.org/gathered/examples/imagenet.html
However, when I check the neural net after training on some standard images using the MATLAB wrapper, I get the following output/error:
Done with init
Using GPU Mode
Done with set_mode
Elapsed time is 3.215971 seconds.
Error using caffe
Invalid input size
I used the MATLAB wrapper before to extract CNN features based on a pretrained model, and it worked. So I don't think the input size of my images is the problem (they are converted to the correct size internally by the function "prepare_image").
Has anyone an idea what could be the error?
Found the solution: I was referencing the wrong ".prototxt" file. (It's a little confusing because the files are quite similar.)
So for computing features using the MATLAB wrapper, one needs to reference the following two files in "matcaffe_demo.m":
models/bvlc_reference_caffenet/deploy.prototxt
models/bvlc_reference_caffenet/MyModel_caffenet_train_iter_450000.caffemodel
where "MyModel_caffenet_train_iter_450000.caffemodel" is the only file needed which is created during training.
In the beginning I was accidently referencing
models/bvlc_reference_caffenet/MyModel_train_val.prototxt
which was the ".prototxt" file used for training.
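For reference, a sketch of what the corrected references look like inside matcaffe_demo.m (the variable names here are illustrative; use whatever names the demo script actually expects):

% Deploy-time network definition plus the weights produced by training.
model_def_file = 'models/bvlc_reference_caffenet/deploy.prototxt';
model_file = 'models/bvlc_reference_caffenet/MyModel_caffenet_train_iter_450000.caffemodel';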

How to extract image features using caffe matlab wrapper?

Does anyone use the MATLAB wrapper for the Caffe framework? Is there a way to extract a 4096-dimensional feature vector from an image?
I was already following
https://github.com/BVLC/caffe/issues/432
and also tried removing the last lines in imagenet_deploy.prototxt to strip layers, as suggested in another thread on GitHub.
But still, when I run "matcaffe_demo(im, 1)", I only get a 1000-dimensional vector of scores (for the ImageNet classes).
Any help would be appreciated
Kind regards
It seems that you might not be calling the correct prototxt file. If the last layer defined in the prototxt has a top blob of dimension 4096, there is no way for the output to be 1000-dimensional.
To be sure, try introducing a deliberate error in the prototxt file and see whether the program crashes. If it doesn't, the program is indeed reading some other prototxt file.
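A quick way to run that check from MATLAB, assuming matcaffe_demo returns the score vector as in the question's own call:

% If the truncated deploy prototxt is really being loaded, the output
% should be 4096-dimensional; 1000 means the full classification net ran.
scores = matcaffe_demo(im, 1);   % im is your preloaded image
disp(numel(scores));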

error reading a large binary file in MATLAB

I have to read in a large binary file whose size is 92,504 KB. When I use the fread command, MATLAB gives me the error:
Error using fread Out of memory. Type HELP MEMORY for your options.
I tried restarting MATLAB so that any virtual memory in use would be cleared, but the problem persists.
How can I solve this problem of reading the data?
The problem is the code that you are using to read the data:
[data,count] = fread(fid,'uint8');
The above line tells MATLAB to read in as many uint8 values as it can and put them into a vector.
The trouble is that MATLAB will put them into a vector of doubles. So rather than a vector where each element is one byte, you have a vector where each element is eight bytes. This makes your 92 MB of data take up 92 * 8 = 736 MB, which is probably bigger than the maximum possible array size shown by the memory command.
The solution is to tell MATLAB to put the data you are reading into a vector of uint8, which can be achieved as follows:
[data,count] = fread(fid,'*uint8');
This way of reading in the data tells MATLAB that the output vector should have the same type as the input data. You can read more about it in the precision section of the fread documentation.
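To see the size difference directly, a quick illustration (the array length is arbitrary):

% One million doubles occupy 8 MB; the same count of uint8 occupies 1 MB.
d = zeros(1e6, 1);            % class double, 8 bytes per element
u = zeros(1e6, 1, 'uint8');   % class uint8, 1 byte per element
whos d u                      % compare the Bytes column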
On a 32-bit system, you may have very little memory available to MATLAB. The fread command you are using reads the entire file at once. This is probably a bad idea, since your system does not have enough memory. A better approach is to read the file part by part. See
A = fread(fileID, sizeA)
in the link below [1]. You can put this code inside a loop, as sketched after the reference. If you want to read the whole file at once, what I would recommend is using a 64-bit system with 3 GB of RAM.
[1] http://www.mathworks.in/help/matlab/ref/fread.html
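A minimal sketch of that loop, assuming the file is named bigfile.bin and with an arbitrary chunk size:

% Read the file in 1 MB chunks instead of all at once.
fid = fopen('bigfile.bin', 'r');
chunkSize = 1e6;
while ~feof(fid)
    chunk = fread(fid, chunkSize, '*uint8');   % keep the data as uint8
    % ... process this chunk before reading the next one ...
end
fclose(fid);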

How can I read a text file of image intensity values and convert to a cv::Mat?

I am working on a project that requires reading intensity values of several images from a text file that has 3 lines of file header, followed by each image. Each image consists of 15 lines of header followed by the intensity values that are arranged in 48 rows, where each row has 144 tab-delimited pixel values.
I have already created a .mat file to read these into MATLAB and create a structure array for each image. I'd like to use OpenCV to track features in the image sequence.
Would it make more sense to create a .cpp file that will read the text file or use OpenCV and Matlab mex files in order to accomplish my goal?
I'd recommend writing C++ code to read the file directly, independent of MATLAB. This way, you don't have to mess with row major vs. column major ordering and all that jazz. Also, are there specs for the image format? If it turns out to be a reasonably common format, you may be able to find an off-the-shelf reader/library for it.
If you want the visualization and/or other image-processing capabilities of MATLAB, then a mex file might be a reasonable approach.
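Whichever route you take, the parsing logic is the same. A sketch in MATLAB of reading the format described in the question (the filename is a placeholder):

% Parse: 3-line file header, then per image a 15-line header followed
% by 48 rows of 144 tab-delimited intensity values.
fid = fopen('intensities.txt', 'r');
for k = 1:3, fgetl(fid); end               % skip the 3-line file header
images = {};
while true
    headerLine = fgetl(fid);               % first line of the 15-line image header
    if ~ischar(headerLine), break; end     % end of file reached
    for k = 1:14, fgetl(fid); end          % skip the rest of the header
    block = fscanf(fid, '%f', [144, 48])'; % one image: 48 rows x 144 values
    fgetl(fid);                            % consume the rest of the last data line
    images{end+1} = block;                 %#ok<AGROW>
end
fclose(fid);

The same loop translates almost line for line to C++ with std::ifstream, after which each 48x144 block can be copied into a cv::Mat.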