When I use libsvm in MATLAB for multiclass classification, the svmpredict command also takes the test labels as an input. Since I don't have labels for my test set, is it possible to predict them somehow using libsvm in MATLAB?
Yes, just provide a meaningless label vector. The only use of the labels is so the prediction function can report some statistics. They are not actually required for prediction in any way.
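For example, a minimal sketch (assuming test_data is your test matrix and model comes from libsvm's svmtrain):
dummy_labels = zeros(size(test_data, 1), 1);                        % placeholder labels, only used for the printed accuracy
[predicted_labels, ~, decision_values] = svmpredict(dummy_labels, test_data, model);
% predicted_labels holds the predicted class for each row of test_data;
% ignore the reported accuracy, since the labels were placeholders.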
I'm relatively new to using SVM and I have a question regarding how to use the results of SVM regression. I have found plenty of easy-to-understand documentation on SVM classification, and I can understand how to use the result of an SVM for binary classification (i.e. data on one side of the separating hyperplane is labeled as one class, data on the other side as the other), but I have not been able to find such hints on SVM regression, which is why I have run into the following question:
Using both the libsvm package and the fitrsvm function in MATLAB, I was able to successfully generate models that fit the abalone data set. The result of libsvm (using the svmtrain function) was used with svmpredict to successfully predict on new input parameters, as follows:
model = svmtrain(age_train, X_train, '-s 3 -t 2 -c 2 -g 2');   % -s 3: epsilon-SVR, -t 2: RBF kernel, -c: cost, -g: gamma
[prediction, accuracy, ~] = svmpredict(age_eval, X_eval, model);
Also, as I've said, I was able to achieve the same results using the fitrsvm function, as follows:
model1=fitrsvm(X_train,age_train,'OptimizeHyperparameters','auto',...
'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName',...
'expected-improvement-plus'),'KernelFunction','rbf');
age_predict1=predict(model1,X_eval);
Now my question is: how do the svmpredict function (for the libsvm package) and the predict function (for the fitrsvm model in MATLAB) take the values inside the trained model and apply them to new input data? For example, is there a mathematical equation into which I plug the parameters of the trained model (such as the 'Mu' and 'Sigma' fields of the fitrsvm result) together with the new input data to obtain the prediction?
It would be greatly appreciated if someone could help me with this or refer me to someone/somewhere who can help me, thank you very much in advance.
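For the fitrsvm case, the prediction has the form f(x) = sum_i alpha_i * k(x, sv_i) + b, where the sum runs over the support vectors. A hedged sketch of that computation for the RBF model above (assuming that Mu/Sigma are nonempty when the model standardized the predictors, and that the stored support vectors are on that standardized scale; verify the result against predict):
x = X_eval(1,:);                                 % one new observation (row vector)
if ~isempty(model1.Mu)                           % apply the same standardization predict uses
    x = (x - model1.Mu) ./ model1.Sigma;
end
sv = model1.SupportVectors;                      % support vectors, one per row
s  = model1.KernelParameters.Scale;              % kernel scale
k  = exp(-sum((sv - x).^2, 2) / s^2);            % RBF kernel k(x, sv_i)
yhat = model1.Alpha' * k + model1.Bias;          % should match predict(model1, X_eval(1,:))
The libsvm model stores the same ingredients (model.SVs, model.sv_coef, model.rho, and the -g value as the kernel width), so svmpredict evaluates the same kind of weighted kernel sum.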
I've used the MATLAB Classification Learner app to train my SVM classifier and I get 99.9% prediction accuracy (I tested it with the predict function in MATLAB). What I wanted to do now was to predict without using this function, but using the hyperplane directly. I exported the trained classifier, so I have all the weights and the bias needed to define the hyperplane. Which formula should I use to predict new data? I tried computing the sign of w'x but it works only in a few cases. Can you help me understand what I should do?
Thanks a lot!
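A likely cause is that sign(w'x) omits the bias term and any standardization the app applied. A hedged sketch for a linear-kernel ClassificationSVM (assuming the exported struct holds the model at trainedModel.ClassificationSVM, and newData is a placeholder name for your observations, one per row):
mdl = trainedModel.ClassificationSVM;            % compact SVM exported by the app
x = newData;
if ~isempty(mdl.Mu)                              % same standardization that predict applies
    x = (x - mdl.Mu) ./ mdl.Sigma;
end
score = x * mdl.Beta + mdl.Bias;                 % signed distance to the hyperplane
label = mdl.ClassNames((score > 0) + 1);         % positive score maps to the second class in ClassNames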
I have a training dataset and a test dataset, and I train an SVM with fitcsvm in MATLAB. Then I test the trained model with predict. I'm always using the same datasets, but I keep getting different AUCs for the same model, which makes me wonder where in the process there is a random component. Note that
I'm aware of the fact that, formally, there is no such thing as a ROC curve or AUC for an SVM, and
I'm not asking for the statistical background of the SVM problem. My question is about the MATLAB implementation of the training/test algorithm. I expected to get the same results because the training algorithm is, as far as I know, a deterministic process.
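If the variability comes from a random step (for example a cvpartition/crossval split or subsampling done before perfcurve) rather than from the SVM solver itself, fixing the seed should make the runs repeatable. A sketch under that assumption (Xtrain, ytrain, Xtest, ytest, and posClass are placeholder names):
rng(0, 'twister');                                               % fix the global random stream
mdl = fitcsvm(Xtrain, ytrain, 'KernelFunction', 'rbf', 'Standardize', true);
[~, score] = predict(mdl, Xtest);
[~, ~, ~, AUC] = perfcurve(ytest, score(:,2), posClass);         % posClass: label of the positive class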
I am classifying gender using a KNN classifier.
I want to use an SVM classifier instead of the KNN classifier, with the same labels of 0 and 1 (0 for women and 1 for men).
I have a matrix of test examples, sample; a matrix of training examples, training; and a vector with the labels for the training examples, group. I want class, a vector of labels for the test examples.
class = knnclassify(sample, training, group);
if class == 1
    x = 'Male';
else
    x = 'Female';
end
How can I change this code to find class using an SVM?
To train an SVM, you will need the Statistics and Machine Learning Toolbox.
The biggest difference between knnclassify and using an SVM classifier is that training and classifying new labels are two separate steps.
1. Train your SVM : fitcsvm
This step teaches the classifier how to distinguish between your two classes. It is learning a linear separator (or a weighted combination of the features) which has the largest margin between positive and negative examples. All the examples you give it need to have ground truth labels.
SVMs have many tunable parameters that you can adjust during the training step. There are several good tutorials in the MATLAB documentation that describe the differences, but for the most basic version you can just use your training examples:
model = fitcsvm(training,group);
This model will be used in the next step.
2. Classify new examples : predict
To classify your new example, run
class = predict(model, sample);
Notes:
Using your model, you can also run cross-validation, which is useful for accuracy analysis.
cvModel = crossval(model);
classError = kfoldLoss(cvModel);
You can also save your model, like any other MATLAB variable, for future use.
save('model.mat', 'model');
knnclassify comes from the Bioinformatics Toolbox. The Statistics and Machine Learning Toolbox also has a KNN model, which you train with fitcknn and classify with predict. The benefit is that you can reuse your KNN model with several sets of data, compare cross-validation results, and save it for future use.
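For comparison, a minimal sketch of that KNN workflow (the NumNeighbors value here is an arbitrary choice):
knnModel = fitcknn(training, group, 'NumNeighbors', 5);   % train on the labeled examples
class = predict(knnModel, sample);                        % same predict call as for the SVM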
I am a beginner in MATLAB and am doing my programming project in digital image processing, i.e. magnetic resonance image classification using wavelet features + SVM + PCA + ANN. I ran the example SVM classification from the MATLAB toolbox and modified it to fit my requirements. I am facing problems in storing more than one feature in an input vector and in giving new input to the SVM. Please help.
Simply feed multidimensional feature data to the svmtrain(Training, Group) function as the Training parameter (Training can be a matrix in which each row is an observation and each column is a separate feature). After that, use svmclassify(SVMStruct, Sample) to classify the test data.
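A hedged sketch of that workflow (featureA/featureB and the new-image variables are placeholder names; note that svmtrain/svmclassify are the older Statistics Toolbox functions, replaced by fitcsvm/predict in recent releases):
Training = [featureA, featureB];                  % one row per training image, one column per feature
SVMStruct = svmtrain(Training, Group);            % Group: one class label per row of Training
Sample = [newFeatureA, newFeatureB];              % same column layout for the new image(s)
predicted = svmclassify(SVMStruct, Sample);       % class label(s) for the new input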