Usage of Libsvm model - matlab

I've developed a model using LIBSVM in MATLAB. I chose the best parameters using cross-validation and obtained the final model by training on the whole dataset. I normalize the features to get better results:
maximum = max(TR) + 0.00001;
minimum = min(TR);
for i = 1:size(TR,2)
    training(:,i) = double(TR(:,i) - maximum(i)) / (maximum(i) - minimum(i));
end
Now, how can I use my model directly to obtain classifications for new data, i.e. for records that don't have a class label? Do I have to manually build the decision function from the model information?

Are you using libsvmtrain to train on your training data? If so, it returns a model structure that you can use to classify test/future data: pass that structure to svmpredict along with the test data.
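For illustration, here is a minimal sketch of that workflow, assuming the compiled LIBSVM MEX functions svmtrain and svmpredict are on the path and the unlabeled records are in a matrix TEST (a placeholder name); the new data has to be scaled with the max/min computed on the training set:
% Train on the normalized training data; 'labels' holds the class labels
% and the -c/-g values are whatever your cross-validation selected
model = svmtrain(labels, training, '-c 1 -g 0.1');

% Apply the SAME normalization (training-set maximum/minimum) to the new records
testdata = zeros(size(TEST));
for i = 1:size(TEST,2)
    testdata(:,i) = double(TEST(:,i) - maximum(i)) / (maximum(i) - minimum(i));
end

% The true labels are unknown, so pass a dummy vector; the accuracy printed
% by svmpredict is then meaningless and only predicted_label matters
dummy = zeros(size(testdata,1), 1);
predicted_label = svmpredict(dummy, testdata, model);
So no, you do not have to rebuild the decision function by hand; the model structure plus svmpredict is enough.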

Related

How can I choose the best model in cross validation in matlab?

I have two datasets and I want to train an SVM classification model (fitcsvm) on one of them and then predict labels for the other one. I use 10-fold cross-validation (crossval) to train my model, so I have 10 different models. My question is: which of these models is the best for prediction, and how can I find it?
Here is my code:
Mdl = fitcsvm(trainingData,labels);
CVMdl = crossval(Mdl);
You may have mixed something up here. The function fitcsvm trains a single model and the function crossval validates this single model; it then returns an evaluation value.
In general, you cannot train a model by cross-validation (as the name says, it is a validation technique). However, you can use cross-validation to find good models.
What you are looking for is a form of hyperparameter optimization: methods that automatically train multiple models on a given data set to find the best tuning values for the SVM. Have a look at the docs here.
You can turn it on like this:
Mdl = fitcsvm(trainingData,labels,'OptimizeHyperparameters','auto')
You may want to use cross-validation to train multiple models with the same tuning parameters, but I guess you'll have to write that yourself. Perhaps this already helps you.
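A minimal sketch of how these pieces might fit together, using the trainingData and labels from the question (the second, unlabeled dataset is called otherData here, a placeholder name); crossval only estimates the generalization error, and the model you ultimately use for prediction is the one trained on all of the data:
% Let fitcsvm search for good hyperparameters automatically
Mdl = fitcsvm(trainingData, labels, 'OptimizeHyperparameters', 'auto');

% Estimate the generalization error of that single model with 10-fold CV
CVMdl = crossval(Mdl);
cvError = kfoldLoss(CVMdl);   % average misclassification rate over the folds

% Predict labels for the second dataset with the full model
predictedLabels = predict(Mdl, otherData);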

How to predict labels for new data (test set) by the PartitionedEnsemble model in Matlab?

I trained an ensemble model (RUSBoost) for a binary classification problem with the function fitensemble() in MATLAB 2014a. The training was performed with 10-fold cross-validation via the input parameter "kfold" of fitensemble().
However, the output model trained by this function cannot be used to predict the labels of new data with predict(model, Xtest). I checked the MATLAB documentation, which says we can use the kfoldPredict() function to evaluate the trained model, but I did not find any way to pass new data to that function. Also, I found that the structure of a model trained with cross-validation is different from that of a model trained without it. So, could anyone please advise me how to use a model that was trained with cross-validation to predict labels for new data? Thanks!
kfoldPredict() needs a RegressionPartitionedModel or ClassificationPartitionedEnsemble object as input; this already contains the models and data for k-fold cross-validation.
The partitioned object has a field Trained, in which the learners trained during cross-validation are stored.
You can take any of these learners and use it like predict(learner, Xdata).
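A small sketch of that idea, assuming the training data and labels are in Xtrain/Ytrain and the new observations in Xtest (all placeholder names):
% Train the RUSBoost ensemble with 10-fold cross-validation
cvEnsemble = fitensemble(Xtrain, Ytrain, 'RUSBoost', 100, 'Tree', 'kfold', 10);

% kfoldPredict only returns out-of-fold predictions for the training data
cvPredictions = kfoldPredict(cvEnsemble);

% For genuinely new data, take one of the fold models from 'Trained' ...
learner = cvEnsemble.Trained{1};   % a CompactClassificationEnsemble
yNew = predict(learner, Xtest);

% ... or retrain on all the data without 'kfold' to get a single usable model
finalModel = fitensemble(Xtrain, Ytrain, 'RUSBoost', 100, 'Tree');
yNewFull = predict(finalModel, Xtest);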
Edit:
If k is too large, it is possible that there is too little meaningful data in one or more iterations, so the model for that iteration is less accurate.
There are no general rules for choosing k, but k = 10 (the MATLAB default) is a good starting point to play around with.
Maybe this is also interesting for you: https://stats.stackexchange.com/questions/27730/choice-of-k-in-k-fold-cross-validation

Multi-Class SVM. Binary Decision Tree. Issues with LIBSVM

So I'm trying to implement a multi-class SVM.
Matlab didn't like having more than two classes to classify data into, so I'm using a Binary Decision Tree to classify data.
I have three classes and am splitting the data into two classes versus one: I'll classify the first split using an SVM and then classify its results against the remaining, so far unclassified class.
However, when using LIBSVM, I'm getting an error when using svmpredict:
td= a{1,1};
tc = b{1,1};
td1 = a{1,2}; %data to test svm
testdatatest = td1(1:30,1:4); %data to test svm
data = td(1:80, 1:4); %split data
target = tc(1:80); %split data
model = svmtrain(data, target); %train
[predicted_label, accuracy, decision_values]=svmpredict(testdatatest,target, model);
The error I get is:
Undefined function 'svmpredict' for input arguments of type 'struct'.
Any suggestions would be great, thanks.
You must download and build LIBSVM: unzip the package, go into the folder for your language (e.g. the matlab subfolder) and run make. That compiles the svmtrain and svmpredict MEX files. Right now you are using MATLAB's own SVM functions, not LIBSVM's, which is why svmpredict is undefined for the struct returned by svmtrain.
Good luck.
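For reference, a rough sketch of the compile-and-check steps inside MATLAB (the folder name libsvm-3.xx is a placeholder for wherever you unzipped the package, and a C compiler must be configured via mex -setup):
% Compile the LIBSVM MEX files (run once)
cd('libsvm-3.xx/matlab');   % placeholder path to the unzipped package
make;                       % builds svmtrain.mex* and svmpredict.mex*

% Make sure the compiled files shadow the Statistics Toolbox svmtrain
addpath(pwd);
which svmtrain              % should point to the LIBSVM MEX file
which svmpredict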

how to save libsvm model in matlab

I'm using LIBSVM in MATLAB, and it seems there is no existing method to save the model produced by svmtrain; instead, the functions provided force me to retrain every time. Just saving the svmtrain model variable in a .mat file does not work. What should I be doing?
The nice thing about SVMs, unlike neural networks, is that training is fast and in some cases can even be done online. I have a multiclass SVM and I save the training vectors, classes and kernel parameters to a txt file.
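A minimal sketch of that approach, with hypothetical names data, labels and opts for the training matrix, label vector and the LIBSVM option string found by cross-validation; instead of persisting the model itself, you persist whatever is needed to rebuild it quickly:
% --- at training time: store everything needed to rebuild the model ---
opts = '-t 2 -c 1 -g 0.1';   % kernel/cost parameters found via CV
save('svm_training_state.mat', 'data', 'labels', 'opts');

% --- later, in a fresh session: reload and retrain (fast for SVMs) ---
S = load('svm_training_state.mat');
model = svmtrain(S.labels, S.data, S.opts);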

Libsvm vs Weka (WLSVM)

I've got to deal with an unbalanced dataset (95% of the records in the negative class and 5% in the positive class). I developed a model using a decision tree and the Weka framework. Now I'd like to try SVM and LIBSVM to get better results. I'm trying to use LIBSVM for MATLAB and the LIBSVM Weka wrapper (WLSVM), and I'd like to know how to compare the results I get from them. In Weka a model is built from the whole dataset and afterwards a 10-fold cross-validation is performed. How can I do that with LIBSVM? From the LIBSVM FAQ I learned that CV is used only to find the best kernel parameters, not during train/predict, so what is the exact sequence of actions I should perform in MATLAB to obtain comparable results?
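A sketch of the sequence the LIBSVM FAQ describes, with placeholder names data, labels, testData and testLabels: cross-validation is used only to pick the kernel parameters, then one model is trained on the whole training set and evaluated separately:
% 10-fold CV to select parameters: with '-v 10', svmtrain returns the
% CV accuracy instead of a model
bestAcc = 0;
for logc = -5:2:15
    for logg = -15:2:3
        opts = sprintf('-c %g -g %g -v 10', 2^logc, 2^logg);
        acc = svmtrain(labels, data, opts);
        if acc > bestAcc
            bestAcc = acc; bestC = 2^logc; bestG = 2^logg;
        end
    end
end

% Train the final model on the whole training set with the selected
% parameters, then evaluate it on held-out data to compare against Weka
model = svmtrain(labels, data, sprintf('-c %g -g %g', bestC, bestG));
[pred, acc, ~] = svmpredict(testLabels, testData, model);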