Histogram matching of two Images without using imhistmatch - matlab

I know imhistmatch can be used for RGB histogram matching of two images. I am trying to perform the same operation without using histeq, imhistmatch, or any other histogram-related functions, writing everything from scratch.
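A minimal sketch of how one channel could be matched from scratch, assuming uint8 images with 256 grey levels (for RGB, call it once per colour plane). The function name matchhist and the use of accumarray for the counting are my own choices, not from the original post:

function out = matchhist(src, ref)
% Hand-rolled histogram matching for one uint8 channel (sketch;
% no imhistmatch/histeq/imhist). Call once per colour plane for RGB.
    hs = accumarray(double(src(:)) + 1, 1, [256 1]);   % source histogram
    hr = accumarray(double(ref(:)) + 1, 1, [256 1]);   % reference histogram
    cs = cumsum(hs) / numel(src);                      % source CDF
    cr = cumsum(hr) / numel(ref);                      % reference CDF
    map = zeros(256, 1, 'uint8');
    for k = 1:256
        [~, idx] = min(abs(cr - cs(k)));               % closest CDF value
        map(k) = uint8(idx - 1);                       % mapped grey level
    end
    out = map(double(src) + 1);                        % apply lookup table
end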

Related

How can I write my own MATLAB code to degrade an image with an atmospheric turbulence function?

How can I write a degradation function in MATLAB to yield a degraded image? Specifically, I want the image degraded by atmospheric turbulence, modelled by the transfer function
H(u,v) = e^(-k(u^2 + v^2)^(5/6))
How can I express this in MATLAB?
Perhaps you can try some MATLAB functions that blur the image with a user-defined filter: fspecial() to define your combination of filters and imfilter() to apply the filter to your image.
If you want to write your own script, you can use these functions to get an idea, but it will be more work; you will likely have to apply a convolution or a frequency-domain transformation, as in the sketch below.
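For completeness, a minimal frequency-domain sketch of the turbulence degradation itself (the file name and the constant k = 0.0025 are assumptions; k controls the severity of the blur):

f = im2double(imread('input.png'));                  % assumed grayscale input
[M, N] = size(f);
k = 0.0025;                                          % assumed severity constant

% Centred frequency coordinates and the turbulence transfer function
[u, v] = meshgrid(-floor(N/2):ceil(N/2)-1, -floor(M/2):ceil(M/2)-1);
H = exp(-k * (u.^2 + v.^2).^(5/6));

% Filter in the frequency domain (shift H to match fft2's layout)
F = fft2(f);
g = real(ifft2(F .* ifftshift(H)));                  % degraded image
imshow(g);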

Matlab Image Histogram Analysis: how do I test for an underlying bimodal distribution?

I am working with image processing in MATLAB. I have two different images whose histogram plots are as shown below.
Image 1: (histogram plot)
Image 2: (histogram plot)
I have multiple images like those, and the only distinguishing (separating) feature is that some have a single peak while others have two peaks.
In other words, some can be thresholded (to generate good results) while others cannot. Is there any way I can separate the two kinds of images? Are there any MATLAB functions that do so, or any reference code that would help?
The function used is imhist()
If by "separate" you mean "distinguish", then yes: the property you describe is called bimodality, i.e. you have two peaks that can be separated by one threshold. So your question is actually "how do I test for an underlying bimodal distribution?"
One option to do this programmatically is binning. It is not the most robust method, but it is the easiest; it might work, it might not.
Kernel smoothing is probably the more robust solution: you shift and scale a kernel function (e.g. a Gaussian) to fit the data. This can be done with histfit in MATLAB.
There are more solutions to this problem, which you can research yourself now that you know the relevant terms. Be aware, though, that this is not a trivial problem if you want to do it properly.
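As a rough illustration of the binning idea, one could smooth the grey-level histogram and count its local maxima; two clear maxima suggest a bimodal image that should threshold well. The bin count, smoothing width and peak-height cutoff below are assumptions to tune, not values from the answer:

img = imread('input.png');                          % assumed grayscale image
h = accumarray(double(img(:)) + 1, 1, [256 1])';    % 256-bin histogram (row)

sigma = 3;                                          % assumed smoothing width
x = -3*sigma:3*sigma;
gk = exp(-x.^2 / (2*sigma^2));  gk = gk / sum(gk);  % Gaussian kernel
hs = conv(h, gk, 'same');                           % smoothed histogram

% Count interior local maxima above 5% of the tallest peak
isPeak = hs(2:end-1) > hs(1:end-2) & hs(2:end-1) > hs(3:end) ...
         & hs(2:end-1) > 0.05 * max(hs);
isBimodal = (nnz(isPeak) == 2);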

Multiscale search for HOG+SVM in Matlab

First of all, this is my first question here, so I hope I can explain it clearly.
My goal is to detect different classes of traffic signs in images. For that purpose I have trained binary SVMs following these steps:
First I got a database of cropped traffic signs like the one at the link below. I considered different classes (prohibition, danger, etc.) and negative images. All of them were scaled to 40x40 pixels.
http://i.imgur.com/Hm9YyZT.jpg
I trained linear SVM models for each class (one-vs-all), using HOG as the feature. Each image is described by a 1728-dimensional feature vector (I concatenate the feature vectors of the three image planes). I did cross-validation to set the parameter C and tested on previously unseen 40x40 images, obtaining very accurate results (F1 score over 0.9 for all classes). I used libsvm for training and testing.
Now I want to detect signs in full road images by sliding a window over different image scales. The problem I'm facing is that I couldn't find any function that does this for me (like detectMultiScale in OpenCV), and my solution is very slow and rudimentary (a triple for loop: for each scale I crop consecutive, overlapping 40x40 windows, extract HOG features, and call svmpredict on each one).
Can someone give me a clue about a faster way to do this? I also thought about computing the HOG feature vector of the whole input image and then reordering that vector into a matrix where each row holds the features of one 40x40 window, but I couldn't find a straightforward way of doing it.
Thanks,
I would suggest using SURF feature detection; however, I don't know whether this would also be too slow for your needs.
See http://morf.lv/modules.php?name=tutorials&lasit=2 for more information on how to implement it and whether it is a viable solution for you.
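For reference, a rough sketch of the batch-scoring idea mentioned in the question: gather the HOG descriptor of every window at one scale into a matrix and score them all with a single matrix multiply against the linear SVM weights, instead of calling svmpredict per window. The hogFeature helper, the stride, and the sign convention of the recovered weights are assumptions (for a linear libsvm model the weights can be recovered from model.SVs and model.sv_coef; the sign depends on model.Label):

w = full(model.SVs' * model.sv_coef);      % 1728x1 linear SVM weights
b = -model.rho;                            % bias term

winSize = 40;  step = 8;                   % assumed stride
[h, wImg, ~] = size(img);
rows = 1:step:(h - winSize + 1);
cols = 1:step:(wImg - winSize + 1);

X = zeros(numel(rows) * numel(cols), 1728);         % one row per window
n = 0;
for r = rows
    for c = cols
        n = n + 1;
        patch = img(r:r+winSize-1, c:c+winSize-1, :);
        X(n, :) = hogFeature(patch);                % hypothetical HOG helper
    end
end

scores = X * w + b;                        % decision values for all windows
hits = find(scores > 0);                   % candidate detections at this scale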

Image cross-correlation with Matlab GPGPU, indexing into 3d array

The problem I'm encountering is writing code such that the built-in features of Matlab's GPU programming will correctly divide data for parallel execution. Specifically, I'm sending N 'particle' images to the GPU's memory, organized in a 3-d array with the third dimension representing each image, and attempting to compare each of the N images with one single image that represents the target, also in the GPU memory.
My current implementation, really more or less how I'd like to see it implemented, is with one line of code:
particle_ifft = ifft2(particle_fft.*target_fft);
Note this is after taking the fft of each of the uploaded images. Herein lies the indexing problem: This statement requires equally sized "particle_fft" and "target_fft" matrices to use the '.*' operator. It would be inefficient in terms of memory usage to have multiple copies of the same target image for the sake of comparing with each particle image. I have used this inefficient method to get good performance results but it significantly affects the number of particle images I can upload to the GPU.
Is there a way that I can tell MATLAB to compare each 2-D element of the particle images 3-D array (each image) with only the single target image?
I have tried using a for loop to index into the 3d array and access each of the particle images individually for comparison with the single target but Matlab does not parallelize this type of operation on the GPU, i.e. it runs nearly 1000 times slower than equivalent code using the memory inefficient target array.
I realize I could write my own kernel that would solve this indexing problem, but I'm interested in finding a way to leverage MATLAB's existing capabilities (specifically, to avoid rewriting the fft2 and ifft2 functions). Ideas?
In Parallel Computing Toolbox release R2012a, GPU support for bsxfun was added - I think that's what you need, i.e.
bsxfun(@times, particle_fft, target_fft);
See: http://www.mathworks.co.uk/help/toolbox/distcomp/bsxfun.html
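Put together, the pipeline might look like the sketch below (variable names are assumptions; particles is an MxNxK stack and target is MxN, so bsxfun expands the singleton third dimension of the target without replicating it). In newer MATLAB releases, implicit expansion also lets you keep the original particle_fft.*target_fft syntax directly:

particles = gpuArray(particleStack);       % assumed MxNxK image stack
target    = gpuArray(targetImage);         % assumed MxN target image

particle_fft = fft2(particles);            % fft2 transforms each 2-D slice
target_fft   = fft2(target);

particle_ifft = ifft2(bsxfun(@times, particle_fft, target_fft));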

T test in image segmentation

How do I use a t-test in image processing? I am working on image segmentation with the split-and-merge algorithm in MATLAB. To merge adjacent regions I need a t-test to compare their means.
You probably need an F-test if you want to check whether several samples have the same mean. The formulas you need depend on assumptions about your data; check the http://en.wikipedia.org/wiki/F-test article. The F distribution can be evaluated in MATLAB using fcdf.
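A minimal sketch of a merge decision for two adjacent regions using a two-sample t-test (ttest2 from the Statistics Toolbox; the region vectors and the default 5% significance level are assumptions). For comparing more than two regions at once, an F-test such as the one pointed to above (evaluated with fcdf, or via anova1) is the usual generalization:

r1 = double(region1(:));                   % assumed pixel values of region 1
r2 = double(region2(:));                   % assumed pixel values of region 2

[h, p] = ttest2(r1, r2);                   % default 5% significance level
mergeRegions = (h == 0);                   % h == 0: means not significantly different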