I'm working on a new model and would like to use classperf to check the performance of my classifier. How do I make it use my classifier as opposed to one of the built-in ones? All the examples I found online use classifiers that are included in MATLAB. I want to use K-fold to test it.
It isn't clear from the MATLAB documentation how to do this, though you can open functions such as knnclassify or svmclassify to see how they were written and try to emulate that functionality.
Alternatively, there's a free MATLAB pattern recognition toolbox that uses objects to represent classifiers:
http://www.mathworks.com/matlabcentral/linkexchange/links/2947-pattern-recognition-toolbox
And you can make a new classifier by sub-classing the base classifier object: prtClass.
Then you can do:
c = myClassifier;
yGuess = c.kfolds(dataSet,10); %10 fold X-val
(Full disclosure, I'm an author of the PRT toolbox)
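To make the sub-classing step above concrete, here is a rough sketch of what a minimal prtClass subclass might look like. The property and method names (trainAction, runAction, etc.) follow the PRT conventions as I understand them; check the toolbox documentation and the bundled example classifiers for the exact requirements.

```matlab
classdef prtClassMyThreshold < prtClass
    % A toy classifier: thresholds the mean of the features.
    properties (SetAccess = private)
        name = 'My Threshold';          % descriptive name
        nameAbbreviation = 'MyThresh';  % short name
        isNativeMary = false;           % binary classifier
    end
    properties
        threshold = 0;  % learned during training
    end
    methods (Access = protected, Hidden = true)
        function self = trainAction(self, ds)
            % Learn a threshold from the training prtDataSet
            self.threshold = mean(mean(ds.getObservations()));
        end
        function ds = runAction(self, ds)
            % Output decision statistics for each observation
            x = mean(ds.getObservations(), 2);
            ds = ds.setObservations(double(x > self.threshold));
        end
    end
end
```

With that in place, `c = prtClassMyThreshold; yGuess = c.kfolds(dataSet,10);` works the same way as for the built-in PRT classifiers.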
In order to leverage FCN facilities for an image processing project, I initially planned to use MatConvNet. During the preparation steps, however, I found that MATLAB provides a new function (fcnLayers) for this.
Can fcnLayers and its functionality be compared with MatConvNet? Specifically, is it possible to train models or use pre-trained ones with it?
Finally, can I achieve the same result using either of them?
I was trying to use fitcsvm to train and classify my data. However, I noticed (correct me if I'm wrong) that fitcsvm can only be used with 2 classes (groups).
My data have more than 2 classes. Is there a way to classify them in MATLAB?
I did some googling and read that some people recommend fitcecoc, while others recommend the out-of-the-box multisvm code.
Moreover, others recommend using discriminant analysis.
Please advise on the best approach.
You are correct: fitcsvm is for one or two classes. For more than two classes you may use svmtrain, MATLAB's SVM classifier. There is also a well-known toolbox named LIBSVM; if you google it, it is easily found.
https://github.com/cjlin1/libsvm
Recently I saw a new method for multiclass SVM classification named DSVM. It's a nice new method, and it can be found on MATLAB's File Exchange:
http://www.mathworks.com/matlabcentral/fileexchange/48632-multiclass-svm-classifier
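For comparison, the fitcecoc route mentioned in the question is the built-in way to get a multiclass classifier out of binary SVMs (via error-correcting output codes). A minimal example using the iris data that ships with MATLAB:

```matlab
% Multiclass SVM via error-correcting output codes.
load fisheriris                    % meas: 150x4 features, species: 3 classes
Mdl = fitcecoc(meas, species);     % one-vs-one binary SVM learners by default
cvMdl = crossval(Mdl, 'KFold', 5); % 5-fold cross-validation
err = kfoldLoss(cvMdl)             % estimated misclassification rate
```

This requires the Statistics and Machine Learning Toolbox (R2014b or later for fitcecoc).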
good luck
I am trying to implement SVM for multiclass problems in MATLAB. I know that there is built-in code for SVM in MATLAB, but I don't know how to use it. I need some help getting started with MATLAB's SVM.
SVM classifies into two classes. If you want to create a multiclass SVM, you will have to hack it yourself. You could for instance do AdaBoost with SVMs as your "cheap classifiers", although they are not that cheap to train (contrary to decision trees or even decision stumps).
Speaking of AdaBoost, you'll probably end up using ensemble methods in matlab if you really don't want to program it yourself:
For classification with three or more classes:
'AdaBoostM2'
'LPBoost' (requires an Optimization Toolbox license)
'TotalBoost' (requires an Optimization Toolbox license)
'RUSBoost'
'Subspace'
'Bag'
The ensemble toolbox is really simple and there's a ton of documentation on matlab's help pages. Basically you state your X and Y, the type of learner you want (for instance SVM) and the ensemble method, which is the method you want to use to combine the different weak learners. AdaBoost is one way, but you could also just do Bagging where the majority vote of all your weak learners counts.
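The workflow described above (state X and Y, pick a weak learner and an ensemble method) looks roughly like this with fitensemble; the specific choices of 100 learners and tree stumps are just illustrative:

```matlab
% AdaBoostM2 ensemble for a 3-class problem, using decision trees
% as the weak learners (fitensemble does not accept SVM learners
% for boosting, so trees are the usual choice here).
load fisheriris                 % 3-class sample data shipped with MATLAB
ens = fitensemble(meas, species, 'AdaBoostM2', 100, 'Tree');
yhat = predict(ens, meas);      % predicted class labels
trainErr = resubLoss(ens)       % resubstitution error on the training set
```

Swapping 'AdaBoostM2' for 'Bag' gives you bagging with majority voting instead of boosting, with no other changes to the call.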
So some questions you can answer here, or at least ask yourself, are: Why do you want to do multiclass SVM? Is it a homework assignment? Do you know how SVMs and other machine learning algorithms work? Do you need help picking the right algorithm?
I have a question about CascadeObjectDetector in MATLAB. In source code of CascadeObjectDetector in MATLAB I see:
pCascadeClassifier; % OpenCV pCascadeClassifier
Then I see:
%------------------------------------------------------------------
% Constructor
%------------------------------------------------------------------
function obj = CascadeObjectDetector(varargin)
obj.pCascadeClassifier = vision.internal.CascadeClassifier;
...
end
And in stepImpl:
bbox = double(obj.pCascadeClassifier.detectMultiScale(I, ...
double(obj.ScaleFactor), ...
uint32(obj.MergeThreshold), ...
uint32(obj.MinSize), ...
uint32(obj.MaxSize)));
Do you know what vision.internal.CascadeClassifier is? Is it simply the OpenCV CascadeClassifier? And where is the source code of the detectMultiScale function?
The thing is that MATLAB provides the following object detectors:
template matching
blob analysis
viola-jones algorithm
More info here : http://www.mathworks.ch/products/computer-vision/description4.html
Now to talk about OpenCV. The OpenCV function cv.HaarDetectObjects(), which is used for face detection (and object detection in general), uses the Viola-Jones algorithm, which in turn uses Haar-like features.
My personal opinion is that the implementations may be slightly different, but they essentially use the same algorithm.
If you are still not convinced and would like to use OpenCV functions from MATLAB, you can use MEX. This way you can call cv.HaarDetectObjects() from MATLAB. More details are available at: http://www.mathworks.ch/discovery/matlab-opencv.html
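Note that you usually don't need to drop down to the internal class or to MEX at all: the public vision.CascadeObjectDetector wrapper (the class whose source is quoted in the question) already drives that internal detectMultiScale call. A minimal usage example, using a sample image that ships with the Computer Vision Toolbox:

```matlab
% Viola-Jones face detection via the public toolbox wrapper.
detector = vision.CascadeObjectDetector();  % default model: frontal faces
I = imread('visionteam.jpg');               % sample image from the toolbox
bboxes = step(detector, I);                 % internally calls detectMultiScale
J = insertObjectAnnotation(I, 'rectangle', bboxes, 'face');
figure; imshow(J);
```

Properties such as ScaleFactor, MergeThreshold, MinSize, and MaxSize (the ones passed to detectMultiScale in the quoted stepImpl) can be set on the detector object directly.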
I am trying to do some text classification with SVMs in MATLAB and would really like to know whether MATLAB has any methods for feature selection (chi-squared, mutual information, ...). I want to try various methods and keep the best one, and I don't have time to implement all of them myself. That's why I am looking for such methods in MATLAB. Does anyone know?
svmtrain
MATLAB has other utilities for classification like cluster analysis, random forests, etc.
If you don't have the required toolbox for svmtrain, I recommend LIBSVM. It's free and I've used it a lot with good results.
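For reference, LIBSVM's MATLAB interface looks like this once you have compiled it (run make.m in libsvm/matlab). The kernel and cost parameters below are purely illustrative; in practice you would tune them, e.g. with grid search:

```matlab
% Train and predict with LIBSVM's MATLAB interface.
% LIBSVM handles multiclass labels natively (one-vs-one internally).
load fisheriris
y = grp2idx(species);      % numeric labels 1..3, as a double column vector
X = sparse(meas);          % LIBSVM accepts sparse double matrices
model = svmtrain(y, X, '-s 0 -t 2 -c 1');        % C-SVC with an RBF kernel
[yhat, accuracy, dec] = svmpredict(y, X, model); % here: predict on training data
```

Beware that LIBSVM's svmtrain shadows the function of the same name in the (older) Bioinformatics/Statistics Toolbox, so your MATLAB path order matters.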
The Statistics Toolbox has sequentialfs. See also the documentation on feature selection.
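A small sketch of sequentialfs for classification: you supply a criterion function that trains on one fold and counts errors on another, and sequentialfs greedily adds the features that reduce that criterion. The use of classify (LDA) as the inner classifier here is just one choice; any classifier can be plugged in.

```matlab
% Sequential forward feature selection with an LDA-based criterion.
load fisheriris
y = grp2idx(species);
% Criterion: number of misclassifications on the held-out fold
crit = @(XT, yT, Xt, yt) sum(yt ~= classify(Xt, XT, yT));
opts = statset('Display', 'iter');
[fs, history] = sequentialfs(crit, meas, y, 'Options', opts);
selected = find(fs)   % indices of the selected feature columns
```
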
A similar approach is dimensionality reduction. In MATLAB you can easily perform PCA or Factor analysis.
Alternatively, you can take a wrapper approach to feature selection. You would search through the space of features by taking a subset of features each time and evaluating that subset using any classification algorithm you choose (LDA, decision tree, SVM, ...). You can do this exhaustively or use some kind of heuristic to guide the search (greedy, GA, SA, ...).
If you have access to the Bioinformatics Toolbox, it has a randfeatures function that does a similar thing. There's even a couple of cool demos of actual use cases.
This might help:
There are two ways of selecting the features in the classification:
Using fselect.py from libsvm tool directory (http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#feature_selection_tool)
Using sequentialfs from statistics toolbox.
I would recommend using fselect.py as it provides more options - like automatic grid search for optimum parameters (using grid.py). It also provides an F-score based on the discrimination ability of the features (see http://www.csie.ntu.edu.tw/~cjlin/papers/features.pdf for details of F-score).
Since fselect.py is written in python, either you can use python interface or as I prefer, use matlab to perform a system call to python:
system('python fselect.py <training file name>')
It is important that you have Python installed and libsvm compiled (and that you are in the tools directory of libsvm, which contains grid.py and the other files).
The training file must be in libsvm (sparse) format. You can produce it by using MATLAB's sparse function and then libsvmwrite:
xtrain_sparse = sparse(xtrain)
libsvmwrite('filename.txt',ytrain,xtrain_sparse)
Hope this helps.
For sequentialfs with libsvm, you can see this post:
Features selection with sequentialfs with libsvm