Matlab SVM training for multi-class datasets - matlab

I have a question about the SVM toolbox in MATLAB 2009b:
How can I train an SVM classifier for multi-class datasets with the MATLAB 2009b toolbox?
I only want to work with the MATLAB toolbox, so please answer only if there is a way to implement it there. For example, the code below classifies a two-class dataset:
svmStruct = svmtrain(trainingData, trainingLabels, ...
    'Kernel_Function', 'rbf', ...
    'RBF_Sigma', sigma, ...
    'Method', 'LS', ...
    'BoxConstraint', C);
I want to know whether there is a way to train an SVM on a multi-class dataset with code like the above, or whether I should write code that trains one SVM per class against all the other classes.
In other words, should I set the label of the selected class to 1 and the labels of all the other classes to 0, train an SVM with the code above, and repeat this for every class?
Thanks for your consideration :-)

I have not used SVM in Matlab, so other people can likely provide a more informed response, but I will share what I have learned.
Matlab Bioinformatics Toolbox SVM
From reading the documentation, the SVM in the Bioinformatics Toolbox appears to only support binary classification. As suggested in the question, a binary classifier can, with some effort, be used to classify into multiple classes. There is some discussion on approaches for doing this in the context of SVM here.
Alternate options
LIBSVM does support multi-class classification and comes with a Matlab interface. You could try installing and using it.
Additionally, while looking into this, I did come across several other Matlab toolboxes with SVM implementations. If LIBSVM is not a good option for you, it may be worth looking around to see if a different SVM implementation fits your needs.
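If you do go the LIBSVM route, a minimal sketch of its MATLAB interface might look like the following. Here trainData, trainLabels, testData and testLabels are placeholder variables, and the interface handles multi-class labels directly. One caveat: LIBSVM's svmtrain shares its name with the toolbox function, so make sure the right one is on your path.

% Minimal LIBSVM sketch (assumes the LIBSVM MATLAB interface is compiled and on the path).
% Labels must be a numeric column vector; data a numeric (or sparse) matrix.
model = svmtrain(trainLabels, trainData, '-s 0 -t 2 -c 1 -g 0.5');   % C-SVC with an RBF kernel
[predictedLabels, accuracy, decisionValues] = svmpredict(testLabels, testData, model);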

If you have MATLAB release R2014b or later you can use the fitcecoc function in the Statistics and Machine Learning Toolbox to train a multi-class SVM.
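For reference, a minimal sketch of that approach might look like this. X, Y, testData, sigma and C are placeholders; by default fitcecoc combines binary SVM learners in a one-vs-one error-correcting output codes scheme.

% Sketch only: assumes R2014b+ and the Statistics and Machine Learning Toolbox.
t   = templateSVM('KernelFunction', 'rbf', 'KernelScale', sigma, 'BoxConstraint', C);
Mdl = fitcecoc(X, Y, 'Learners', t);        % multi-class SVM via error-correcting output codes
predictedLabels = predict(Mdl, testData);   % classify new observations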

Yup, the way to solve your problem is to implement a one-vs-all strategy. One of SVM's shortcomings is that it has no direct multi-class formulation.
But you can build one out of binary classifiers.
I didn't see any function for multi-class SVM classification in MATLAB, but I think it is not hard to implement yourself.
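A rough sketch along those lines, using the 2009b-era svmtrain/svmclassify pair (trainingData, trainingLabels, testData, sigma and C are placeholder variables), might look like this:

% One-vs-all sketch (assumes Bioinformatics Toolbox svmtrain/svmclassify and numeric class labels).
classes = unique(trainingLabels);
models  = cell(numel(classes), 1);
for k = 1:numel(classes)
    binaryLabels = double(trainingLabels == classes(k));   % 1 = class k, 0 = everything else
    models{k} = svmtrain(trainingData, binaryLabels, ...
        'Kernel_Function', 'rbf', 'RBF_Sigma', sigma, ...
        'Method', 'LS', 'BoxConstraint', C);
end

% At test time each binary model votes "class k" or "not class k".
votes = zeros(size(testData, 1), numel(classes));
for k = 1:numel(classes)
    votes(:, k) = svmclassify(models{k}, testData);
end
% Crude decision rule: take the first class whose model said "yes"
% (a proper implementation would break ties using the decision values).
[~, winner] = max(votes, [], 2);
predictedLabels = classes(winner);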

Related

How can I train an SVM in Matlab with more than 2 classes

I was trying to use fitcsvm to train and classify my data. However, I noticed - correct me if I'm wrong - that fitcsvm can only be used with 2 classes (groups).
My data have more than 2 classes. Is there a way to classify them in MATLAB?
I did some googling and read that some people recommend fitcecoc, while others recommend the out-of-the-box multisvm code.
Moreover, others recommend using discriminant analysis.
Please advise on the best approach to take.
You are correct, fitcsvm is for one or two classes. For more than two classes you can combine binary classifiers (for example with MATLAB's older svmtrain in a one-vs-all scheme); also, there is a famous toolbox named LIBSVM, which is easy to find by googling.
https://github.com/cjlin1/libsvm
Recently I saw a new method for multiclass SVM classification named DSVM; it's a nice new method, and it can be found on MATLAB's File Exchange.
http://www.mathworks.com/matlabcentral/fileexchange/48632-multiclass-svm-classifier
good luck

SVM for multi-class in Matlab

I am trying to implement SVM for multiclass problems in MATLAB. I know that there is built-in SVM code in MATLAB, but I don't know how to use it. I need some help getting started with MATLAB's SVM.
SVM classifies into two classes. If you want to create a multiclass SVM, you will have to hack it yourself. You could, for instance, do AdaBoost with SVMs as your "cheap classifiers", although they are not that cheap to train (unlike decision trees or even decision stumps).
Speaking of AdaBoost, you'll probably end up using ensemble methods in matlab if you really don't want to program it yourself:
For classification with three or more classes:
'AdaBoostM2'
'LPBoost' (requires an Optimization Toolbox license)
'TotalBoost' (requires an Optimization Toolbox license)
'RUSBoost'
'Subspace'
'Bag'
The ensemble toolbox is really simple and there's a ton of documentation on matlab's help pages. Basically you state your X and Y, the type of learner you want (for instance SVM) and the ensemble method, which is the method you want to use to combine the different weak learners. AdaBoost is one way, but you could also just do Bagging where the majority vote of all your weak learners counts.
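As a rough sketch of that workflow (X, Y and Xtest are placeholders), it might look like the code below. Note that the sketch uses tree weak learners, since the built-in boosting methods are normally paired with trees rather than SVMs.

% Sketch only: multi-class boosting with fitensemble (Statistics Toolbox assumed).
ens  = fitensemble(X, Y, 'AdaBoostM2', 100, 'Tree');   % 100 boosting rounds with tree weak learners
yhat = predict(ens, Xtest);                            % Xtest: placeholder test data
% For bagging instead, swap 'AdaBoostM2' for 'Bag' and add 'Type', 'classification'.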
So some questions you can answer here, or at least ask yourself, are: Why do you want to do multiclass SVM? Is it a homework assignment? Do you know how SVM and other machine learning algorithms work? Do you need help picking the right algorithm?

How to perform multi-class cross-validation for LIBSVM in MatLab

I want to use LIBSVM in MATLAB to do some multi-class classification. I have read that LIBSVM uses one-vs-one by default when provided with multiple labels, and I am fine with that.
My question is about the parameter search and the model validation. When doing a two-class validation to find the parameters C and gamma (with the RBF kernel), I would use the built-in cross-validation to find the best (C, gamma) pair using a simple grid search. I have read the LIBSVM documentation, but I have no idea how validation works for multi-class SVM.
Does the built-in option return the multi-class accuracy? How can I provide the best parameters to each of the OvO models it will automatically build?
The answer is given here: http://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html#f507. I had not read the LIBSVM FAQ carefully enough.
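For reference, a common pattern is a grid search over (C, gamma) using LIBSVM's -v option, which runs k-fold cross-validation internally and reports a single overall accuracy even with multi-class labels; LIBSVM then applies the chosen parameters to all of the one-vs-one sub-models. A minimal sketch, assuming trainLabels and trainData already exist:

% Sketch of a (C, gamma) grid search with LIBSVM's built-in 5-fold CV (-v 5).
bestAcc = -Inf;
for log2c = -5:2:15
    for log2g = -15:2:3
        opts = sprintf('-t 2 -c %g -g %g -v 5 -q', 2^log2c, 2^log2g);
        acc  = svmtrain(trainLabels, trainData, opts);   % returns CV accuracy (%) when -v is set
        if acc > bestAcc
            bestAcc = acc;  bestC = 2^log2c;  bestGamma = 2^log2g;
        end
    end
end
% Retrain one final model on all the training data with the selected parameters;
% the same (C, gamma) is shared by every pairwise sub-model.
model = svmtrain(trainLabels, trainData, sprintf('-t 2 -c %g -g %g', bestC, bestGamma));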

Matlab - Create RBF Network without using Neural Network Toolbox

In the lectures we only covered how to train an RBF network with a Gaussian function and how to use the "newrb" tool in MATLAB. But in the assignment I need to create my own RBF network, and using the NN toolbox is forbidden. Basically I don't even know how to start, and our professor is not willing to provide any information.
With some tips I have written my own program, but the performance is very bad. I wonder if anyone can give me a helpful tutorial or guide on how to create an RBF network with a Gaussian function without using the NN toolbox.
I have used k-means to obtain the centers and a Gaussian function to calculate the weights; the main problem is that I have no idea how to design the method that transforms the input matrix into the RBF matrix. Hope you can help.
This is clearly homework, and it's not clear what your question is. But I think you are wondering how to create the Gram matrix. If so, see:
http://en.wikipedia.org/wiki/Gramian_matrix
You should have the math for how to do each step in your textbook and/or notes.
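As a starting point, one common way to go from the input matrix to the RBF (design) matrix and then to the output weights is sketched below. X, C, sigma and T are placeholder variables: the inputs, the k-means centers, the Gaussian width and the targets.

% Sketch: build the Gaussian RBF design matrix Phi and fit output weights by least squares.
% X is N-by-D (inputs), C is K-by-D (centers from k-means), T is N-by-M (targets).
sqDist = bsxfun(@plus, sum(X.^2, 2), sum(C.^2, 2)') - 2 * (X * C');  % N-by-K squared distances
Phi    = exp(-sqDist ./ (2 * sigma^2));                              % Gaussian activations
W      = Phi \ T;                                                    % least-squares output weights
Yhat   = Phi * W;                                                    % network output on the inputs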

Multilabel AdaBoost for MATLAB

I am currently looking for a multilabel AdaBoost implementation for MATLAB or a technique for efficiently using a two-label implementation for the multilabel case. Any help in that matter would be appreciated.
You can use the same approach used in Support Vector Machines. SVMs are originally binary classifiers; several approaches have been proposed for handling multiclass data:
one-against-all: construct one binary classifier per class, and train with instances in this class as positive cases and all other instances as negative cases (ie: 1-vs-not1, 2-vs-not2, 3-vs-not3). Finally use the posterior probability of each classifier to predict the class.
one-against-one: construct several binary classifiers for each pair of classes (ie: 1-vs-2, 1-vs-3, 2-vs-3, ..) by simply training over the instances from both classes. Then you can combine the individual results using a majority vote.
Error Correcting Output Codes: based on the theory of error correction (Hamming code and such), it relies on coding the output of several binary classifier using some redundancy to increase accuracy.
Note that these are generic methods and can be applied to any binary classifier; see the sketch below for the one-vs-one case.
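The sketch uses hypothetical trainBinary and predictBinary function handles standing in for whatever boosted binary classifier you have; X, Y and Xtest are placeholders.

% One-vs-one wrapper sketch around an arbitrary binary classifier (numeric labels assumed).
classes = unique(Y);
K = numel(classes);
votes = zeros(size(Xtest, 1), K);
for i = 1:K-1
    for j = i+1:K
        idx   = (Y == classes(i)) | (Y == classes(j));      % keep only the two classes
        model = trainBinary(X(idx, :), Y(idx));             % hypothetical binary trainer
        pred  = predictBinary(model, Xtest);                % hypothetical binary predictor
        votes(pred == classes(i), i) = votes(pred == classes(i), i) + 1;
        votes(pred == classes(j), j) = votes(pred == classes(j), j) + 1;
    end
end
[~, winner] = max(votes, [], 2);                            % majority vote across all pairs
predictedLabels = classes(winner);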
Otherwise you can search for a specific implementation of multiclass AdaBoost, of which I'm sure there are plenty out there. A quick search revealed this one: Multiclass GentleAdaboosting
You can use AdaBoost.M2, which is a multiclass AdaBoost. You can find an implementation in the Balu toolbox; the command is Bcl_adaboost. This toolbox has other useful stuff too, just remember to cite it. Hope it helps.
Theoretically speaking, the only correct multi-class boosting is the one defined in "A theory of multiclass boosting".