OpenCV and Latent SVM Detector - matlab

I was wondering if anyone has managed to use the OpenCV implementation of the Latent SVM Detector (http://docs.opencv.org/modules/objdetect/doc/latent_svm.html) successfully. There is sample code that shows how to use the library, but the problem is that the sample uses a ready-made detector model that was generated with MATLAB. Can someone guide me through the steps of generating my own detector model?

The MATLAB implementation of the latent SVM detector by the authors of the paper has a training script called pascal. The README included with the tarball explains its usage:
Using the learning code
=======================
1. Download and install the 2006-2011 PASCAL VOC devkit and dataset.
(you should set VOCopts.testset='test' in VOCinit.m)
2. Modify 'voc_config.m' according to your configuration.
3. Start matlab.
4. Run the 'compile' function to compile the helper functions.
(you may need to edit compile.m to use a different convolution
routine depending on your system)
5. Use the 'pascal' script to train and evaluate a model.
example:
>> pascal('bicycle', 3); % train and evaluate a 6 component bicycle model
The learning code saves a number of intermediate models in a model cache
directory defined in 'voc_config.m'.
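Put together, a minimal MATLAB session for those steps could look like this (a sketch only, assuming the voc-release code and an edited voc_config.m are already on your path):
>> compile;               % step 4: build the MEX helper functions
>> pascal('bicycle', 3);  % step 5: train and evaluate a 6 component bicycle model
% intermediate and final models end up in the cache directory set in voc_config.m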
For more information, visit the authors' website. The page also contains the paper describing this method.

Related

Can MATLAB's fcnLayers function be compared with MatConvNet?

To leverage fully convolutional networks (FCNs) for an image processing project, I initially planned to use MatConvNet. However, while preparing I found that MATLAB provides a new function (fcnLayers) for this purpose.
Can fcnLayers and its functionality be compared with MatConvNet? Specifically, is it possible to train models or use pre-trained ones with each?
Finally, can I achieve the same result using either of them?

How to import and export data from Matlab to Weka to predict class?

This is my proposed project:
- I have developed a classifier in Weka after some experimentation.
- Now I want to develop a project in MATLAB which will take input attributes from users.
- This input will be given to Weka after loading the saved classification model.
- Weka will predict the class for that instance.
- Finally, I want to pick up the predicted class and display it on a screen built in MATLAB.
Now I don't know how to import and export data from Matlab to Weka. Immediate help will be appreciated.
Some thoughts on your problem.
As far as I know, MATLAB has a machine learning toolbox (I haven't used it), so you could build your system entirely in MATLAB.
If you want to stick with Weka and use it programmatically through Java, you can call Java methods from MATLAB to get the classification result. For how to call Java methods from MATLAB, check here.
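If you go the Java route, a rough sketch of calling Weka from MATLAB could look like the following (the jar location, file names, and model name are assumptions; the classifier is assumed to have been saved from the Weka GUI and the new instances exported to an ARFF file):
javaaddpath('weka.jar');                                    % path to your Weka jar
cls = weka.core.SerializationHelper.read('mymodel.model');  % load the saved classifier
loader = weka.core.converters.ArffLoader();
loader.setFile(java.io.File('newdata.arff'));               % instances exported from MATLAB
data = loader.getDataSet();
data.setClassIndex(data.numAttributes() - 1);               % last attribute is the class
pred = cls.classifyInstance(data.instance(0));              % numeric index of the predicted class
disp(char(data.classAttribute().value(pred)));              % map the index back to its label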
If you use Weka from the GUI, you should be able to save the classification result to a file straight from the GUI. Then you can load this file into MATLAB, retrieve the result, and visualize it there.
Hope this helps.

libSVM outputs "Line search fails in two-class probability estimates"

When I try to train an SVM (the svmtrain function) with an RBF kernel,
the libSVM library outputs "Line search fails in two-class probability estimates" during training.
After training, the training accuracy of the model is just 20%.
I think I might be missing something, and that it is related to this message.
For more information about my project:
I'm dealing with the PASCAL VOC action classification problem and trying to follow this method:
http://www.ifp.illinois.edu/~jyang29/papers/CVPR09-ScSPM.pdf
There are 1300 training images and 11 classes.
After building the codebooks and doing the sparse coding,
the dimension of the feature vector is 2688.
The number of training examples is 1370.
You need to do a grid search, either using cross-validation or a separate validation data set, to get good values for C and gamma. Libsvm has a script called grid.py that is useful for this. I noticed you tagged this with matlab; grid.py needs command-line tools and a Python installation (in my opinion this generally works out better than doing it in MATLAB, especially if you have some big machines to run many jobs in parallel).
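If you prefer to stay in MATLAB, a coarse grid search with libsvm's MATLAB interface could look roughly like this (ytrain and xtrain stand for your labels and feature matrix, and the ranges are only a starting point):
bestcv = 0;
for log2c = -1:2:9
    for log2g = -7:2:3
        cmd = sprintf('-v 5 -c %g -g %g', 2^log2c, 2^log2g);
        cv = svmtrain(ytrain, xtrain, cmd);    % 5-fold cross-validation accuracy
        if cv > bestcv
            bestcv = cv; bestc = 2^log2c; bestg = 2^log2g;
        end
    end
end
model = svmtrain(ytrain, xtrain, sprintf('-c %g -g %g', bestc, bestg));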
I recommend that you read the libsvm guide if you haven't already done so: http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf
I also suggest that you initially use the same dataset as in the paper, since occasionally published algorithms only work well on the dataset chosen for the paper.
Lastly, you could contact the authors of the paper.
I asked the author of LIBSVM about this warning, and he replied that it can be ignored.

Creating a classifier in MATLAB to be used with classperf

I'm working on a new model and would like to use classperf to check the performance of my classifier. How do I make it use my classifier as opposed to one of the built-in ones? All the examples I found online use classifiers that are included in MATLAB. I want to use K-fold to test it.
It isn't clear from the MATLAB documentation how to do this, though you can open functions like knnclassify or svmclassify to see how they were written and try to emulate that functionality.
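As a rough illustration of that pattern, you can drive classperf yourself inside a k-fold loop (myClassifierPredict is a hypothetical function returning predicted labels for the test rows; classperf and crossvalind come from the Bioinformatics Toolbox):
cp = classperf(labels);                  % initialize the performance object
idx = crossvalind('Kfold', labels, 10);  % 10-fold partition
for k = 1:10
    testMask = (idx == k);
    pred = myClassifierPredict(data(~testMask,:), labels(~testMask), data(testMask,:));
    classperf(cp, pred, testMask);       % accumulate results for this fold
end
cp.CorrectRate                           % overall accuracy across folds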
Alternatively, there's a free MATLAB pattern recognition toolbox that uses objects to represent classifiers:
http://www.mathworks.com/matlabcentral/linkexchange/links/2947-pattern-recognition-toolbox
And you can make a new classifier by sub-classing the base classifier object: prtClass.
Then you can do:
c = myClassifier;
yGuess = c.kfolds(dataSet,10); %10 fold X-val
(Full disclosure, I'm an author of the PRT toolbox)

Feature Selection methods in MATLAB?

I am trying to do some text classification with SVMs in MATLAB and would really like to know whether MATLAB has any methods for feature selection (chi-squared, mutual information, ...). I want to try various methods and keep the best one, and I don't have time to implement them all myself. That's why I am looking for such methods in MATLAB. Does anyone know?
For SVMs specifically, MATLAB has svmtrain.
MATLAB also has other utilities for classification, such as cluster analysis, random forests, etc.
If you don't have the required toolbox for svmtrain, I recommend LIBSVM. It's free and I've used it a lot with good results.
The Statistics Toolbox has sequentialfs. See also the documentation on feature selection.
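A minimal sketch of sequentialfs, using the misclassification count of a simple LDA classifier (classify) as the criterion; here X is an observations-by-features matrix and y holds numeric class labels:
critfun = @(Xtr, ytr, Xte, yte) sum(yte ~= classify(Xte, Xtr, ytr));
opts = statset('Display', 'iter');
[selected, history] = sequentialfs(critfun, X, y, 'cv', 5, 'options', opts);
Xsel = X(:, selected);    % keep only the selected feature columns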
A similar approach is dimensionality reduction. In MATLAB you can easily perform PCA or Factor analysis.
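For instance, PCA with the Statistics Toolbox (pca in recent releases, princomp in older ones) might look like this; X is again observations-by-features and the 95% threshold is arbitrary:
[coeff, score, latent] = pca(X);
explained = cumsum(latent) / sum(latent);  % cumulative variance explained
k = find(explained >= 0.95, 1);            % components needed for 95% variance
Xreduced = score(:, 1:k);                  % use these as features for the classifier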
Alternatively, you can take a wrapper approach to feature selection. You search through the space of features by taking a subset of features each time and evaluating that subset with whatever classification algorithm you choose (LDA, decision tree, SVM, ...). You can do this exhaustively or use some kind of heuristic to guide the search (greedy, GA, SA, ...).
If you have access to the Bioinformatics Toolbox, it has a randfeatures function that does a similar thing. There are even a couple of cool demos of actual use cases.
Maybe this will help.
There are two ways of selecting features for classification:
Using fselect.py from the libsvm tools directory (http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#feature_selection_tool)
Using sequentialfs from the Statistics Toolbox.
I would recommend using fselect.py as it provides more options - like automatic grid search for optimum parameters (using grid.py). It also provides an F-score based on the discrimination ability of the features (see http://www.csie.ntu.edu.tw/~cjlin/papers/features.pdf for details of F-score).
Since fselect.py is written in Python, you can either use the Python interface or, as I prefer, have MATLAB perform a system call to Python:
system('python fselect.py <training file name>')
It's important that you have Python installed and libsvm compiled (and that you are in the tools directory of libsvm, which contains grid.py and the other scripts).
The training file must be in libsvm (sparse) format. You can produce it by using the sparse function in MATLAB and then libsvmwrite:
xtrain_sparse = sparse(xtrain)
libsvmwrite('filename.txt',ytrain,xtrain_sparse)
Hope this helps.
For sequentialfs with libsvm, you can see this post:
Features selection with sequentialfs with libsvm