Multi-class SVM classification in Matlab - are graphs possible?

I want to classify a data set (which has four classes) using the SVM method. I've done it using the code below (one-against-all). It isn't terribly accurate, but I'm thankful for anything at this stage.
http://www.mathworks.co.uk/matlabcentral/fileexchange/39352-multi-class-svm
I was wondering if there is a way to plot the support vectors and the training points. I've managed this for two-class SVM classification but can't find a way of doing it with more than two classes.
Any help/advice regarding how to achieve a semi-pretty graph would be very much appreciated!
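One possible approach is a minimal sketch along these lines, assuming a 2-D feature space (or just the first two features) and the Statistics and Machine Learning Toolbox; `fisheriris` is only stand-in data with three classes, but the same code works for four:

```matlab
% Sketch: visualise a multi-class one-vs-all SVM on two features.
load fisheriris                      % stand-in data; use your own X, y
X = meas(:,1:2);  y = species;

t = templateSVM('KernelFunction','rbf','Standardize',true);
mdl = fitcecoc(X, y, 'Learners', t);

% Decision regions: predict over a grid and colour by predicted class.
[x1, x2] = meshgrid(linspace(min(X(:,1)),max(X(:,1)),200), ...
                    linspace(min(X(:,2)),max(X(:,2)),200));
labels = predict(mdl, [x1(:) x2(:)]);
gscatter(x1(:), x2(:), labels);  hold on
gscatter(X(:,1), X(:,2), y, 'k', '.os^');    % training points

% Support vectors of each binary learner, overlaid as circles.
for k = 1:numel(mdl.BinaryLearners)
    sv = mdl.BinaryLearners{k}.SupportVectors;
    % With 'Standardize', support vectors are stored on the
    % standardized scale; map them back before plotting.
    sv = sv .* mdl.BinaryLearners{k}.Sigma + mdl.BinaryLearners{k}.Mu;
    plot(sv(:,1), sv(:,2), 'ko', 'MarkerSize', 8);
end
hold off
```

With more than two features you would have to pick (or project onto) two dimensions for the plot; the decision-region trick only works in 2-D.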

Can we use a single layer perceptron on multiclass classification problem?
How can we show that those classes are non-linearly separable if we have 30 features?
Yes, you can use a single-layer perceptron (SLP) for multi-class classification. We can employ a one-vs-all or one-vs-one strategy for this. An SLP is a linear classifier (much like logistic regression), so if the dataset is not linearly separable you might want to consider a multi-layer perceptron instead.
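The one-vs-all idea can be sketched in MATLAB with `fitcecoc`, which trains one binary linear learner per class and combines them (the `fisheriris` data here is only a stand-in):

```matlab
% Sketch of one-vs-all multi-class classification with linear learners.
load fisheriris
X = meas;  y = species;                       % 3 classes, 4 features

t = templateLinear();                         % a linear model, like an SLP
mdl = fitcecoc(X, y, 'Coding', 'onevsall', 'Learners', t);

% Each binary learner answers "class k vs. the rest"; fitcecoc
% combines their scores into a single multi-class prediction.
acc = mean(strcmp(predict(mdl, X), y));
fprintf('Training accuracy: %.2f\n', acc)
```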
I am not sure exactly what you are asking, but to my understanding, if we can separate 2 classes with a straight line, they are linearly separable. This means that if in your dataset you can draw straight lines which separate the examples of each class from the others, then the problem is linearly separable. With 30 features the "line" becomes a hyperplane in 30-dimensional space, so you cannot check this visually; a practical test is to train a linear classifier and see whether it reaches (near) zero training error. It is not usually the case, though.
I really liked this blog. You might want to check it out.
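A rough way to run that separability test with 30 features is the sketch below; the two Gaussian blobs are stand-in data, and near-zero training error only suggests (approximate) linear separability, it does not prove it:

```matlab
% Sketch: rough linear-separability check via a linear classifier.
X = [randn(50,30)+1; randn(50,30)-1];      % two well-separated blobs
y = [ones(50,1); 2*ones(50,1)];

mdl = fitclinear(X, y);                    % linear SVM by default
trainErr = mean(predict(mdl, X) ~= y);
fprintf('Training error: %.3f\n', trainErr)  % near 0 suggests separable
```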

Classification for machine learning

Is there any way to classify data that looks like this with 10 different classes? Is there any preprocessing required to make the data more separable and enhance the classification accuracy? [image of the data attached]
SVM and logistic regression can handle multi-class classification well.
Data recommendation:
If I were dealing with your data: I don't see much evidence for classifying it into 10 categories. I would recommend grouping the categories of the dependent variable into at least three or four; that might help a user more than a wrong classification.
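Merging the 10 categories into fewer groups can be sketched like this in MATLAB; the labels and group boundaries here are illustrative placeholders, not a recommendation for your actual classes:

```matlab
% Sketch: merge 10 fine-grained labels into 4 broader groups
% before training (stand-in labels; group edges are placeholders).
y = randi(10, 100, 1);                 % stand-in labels 1..10
edges  = [1 4 6 8 11];                 % groups: 1-3, 4-5, 6-7, 8-10
yGroup = discretize(y, edges);         % now only 4 classes
tabulate(yGroup)                       % check the new class balance
```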

SVM Matlab classification

I'm approaching a 4-class classification problem. It's not particularly unbalanced, there are no missing features, and there are plenty of observations. Everything seems fine, but when I approach the classification with fitcecoc it classifies everything as part of the first class. I tried using fitclinear and fitcsvm on one-vs-all decomposed data but got the same results. Do you have any clue about the reason for this problem?
Here are a few recommendations:
Have you normalized your data? SVM is sensitive to features being on different scales.
Save the mean and std you obtain during training and use those values during the prediction phase to normalize the test samples.
Change the C value (box constraint) and see if that changes the results.
I hope these help.
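The recommendations above can be sketched as follows; the variable names and stand-in data are placeholders for your own:

```matlab
% Sketch: standardize with TRAINING statistics and tune the box
% constraint C (stand-in data with badly mismatched feature scales).
Xtrain = randn(100,3) .* [1 100 0.01];  ytrain = randi(4, 100, 1);
Xtest  = randn(20,3)  .* [1 100 0.01];

mu = mean(Xtrain);  sigma = std(Xtrain);
XtrainN = (Xtrain - mu) ./ sigma;
XtestN  = (Xtest  - mu) ./ sigma;       % reuse training mu/sigma here

t   = templateSVM('BoxConstraint', 1);  % try e.g. 0.1, 1, 10, 100
mdl = fitcecoc(XtrainN, ytrain, 'Learners', t);
yhat = predict(mdl, XtestN);

% Alternatively, let the template standardize for you:
% t = templateSVM('Standardize', true, 'BoxConstraint', 10);
```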

SOM toolbox + predicting missing values and outliers

I want to use the SOM toolbox (http://www.cis.hut.fi/somtoolbox/theory/somalgorithm.shtml) for predicting missing values or outliers, but I can't find any function for it.
I wrote code for visualization and for getting the BMU (best matching unit), but I don't know how to use it for prediction. Could you help me?
Thank you in advance.
If this still interests you, here is one solution.
Train your network on a training set containing all the inputs you will later analyze. After learning, you give the network new test data with only the inputs you have. The network gives you back the best matching unit for the features you do have, and from that BMU you can read off the values of the features you do not have, or compare against it to flag outliers.
This of course leads to a different learning and prediction implementation. The learning you implement straightforwardly, as suggested in many tutorials. For prediction you need to make the SOM ignore NaNs and compute the BMU based only on the available values. After that, with the BMU you can look up the corresponding features and use them to predict missing values or detect outliers.
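The NaN-ignoring BMU search can be sketched as below, assuming the trained codebook is available as a plain nUnits-by-nFeatures matrix (in the SOM Toolbox the trained map struct typically exposes it as `sMap.codebook`):

```matlab
% Sketch: find the BMU while ignoring NaN components of the query.
codebook = rand(100, 5);               % stand-in for a trained codebook
x = [0.2 NaN 0.7 NaN 0.1];             % query with missing values

known = ~isnan(x);                     % mask of available features
d2 = sum((codebook(:,known) - x(known)).^2, 2);  % partial distances
[~, bmu] = min(d2);

% Impute the missing features from the BMU's codebook vector.
xFilled = x;
xFilled(~known) = codebook(bmu, ~known);
```

The same BMU distance `d2(bmu)` can also serve as an outlier score: queries far from every unit are candidates for flagging.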

How to see which attribute (feature) contributes most to the performance of classification with PCA in Matlab?

I would like to perform classification on a small data set (65x9) using some of the machine learning classification methods (SVM, decision trees or any other).
So, before starting with the classification I would like to do attribute analysis with PCA in Matlab or Weka (preferably Matlab). I would like to find out which attributes contribute most to the performance of the classifier, so I can maybe reduce the number of attributes now and/or include more in the future. Is there any example of using PCA for this in Matlab or Weka?
Thanks
PCA is an unsupervised feature extraction method.
If your question is about selecting attributes to use with PCA: I don't know what your purpose is, but it is unnecessary to do something like that to improve classification performance. Just use all the attributes; PCA will give you derived components ordered by decreasing variance.
If your question is about selecting components after PCA: you can choose a threshold (for example 0.95 of the explained variance) and count how many components, taken from the first onward, are enough to reach that threshold. You can use the eigenvalues of the covariance matrix to calculate the explained variance.
After running PCA, the first component is the best one, the second component is the best one after the first, and so on.
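The threshold idea above can be sketched with MATLAB's `pca` from the Statistics and Machine Learning Toolbox; the random matrix is a stand-in for the 65x9 data:

```matlab
% Sketch: keep enough principal components to explain 95% variance,
% and inspect which original attributes load on each component.
X = randn(65, 9);                        % stand-in for your 65x9 data

[coeff, score, ~, ~, explained] = pca(X);    % 'explained' is in percent
k = find(cumsum(explained) >= 95, 1);        % components for 95%
fprintf('Keep %d of %d components\n', k, size(X,2))

% Columns of coeff are component loadings: a large |coeff(i,j)| means
% original attribute i contributes strongly to component j.
Xreduced = score(:, 1:k);                    % use as classifier input
```

Note that the components are linear combinations of all attributes, so this tells you which attributes drive each component, not a direct ranking of raw attributes by classifier performance.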