what "target" do i put in iris dataset nntool matlab? - matlab

I am new to using MATLAB, so this might be easy. I am trying to build an iris dataset neural network in MATLAB using nntool (a feed-forward back-propagation network), but I can't figure out what the target matrix should be. I am also trying to find (I tried to create one, but got nowhere) code that does the same thing programmatically instead of using nntool.
Can anyone help me out?

The targets are the correct class labels. However, the Fisher iris dataset in MATLAB has its target data in a cell array of strings (species), while nntool wants a numerical vector. So you'll have to convert it:
clear all;
load('fisheriris');                      % loads meas (features) and species (labels)
classnames = unique(species);            % the three species names
targets = zeros(1, numel(species));
for i = 1:numel(classnames)
    % give every sample of species i the numeric label i
    targets(strcmp(species, classnames{i})) = i;
end
You now have a vector targets that can be loaded in nntool.
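Since the question also asks for code instead of nntool, here is a minimal sketch using patternnet, reusing meas and targets from above (the hidden layer size of 10 is my arbitrary assumption, not something from the question):
inputs = meas';                        % 4 x 150: one column per sample
T = full(ind2vec(targets));            % 3 x 150 one-hot target matrix
net = patternnet(10);                  % one hidden layer, 10 neurons (assumption)
net = train(net, inputs, T);           % train with default settings
outputs = net(inputs);
[~, predicted] = max(outputs, [], 1);  % winning class per sample
accuracy = mean(predicted == targets)  % training-set accuracy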

Related

Training a neural network to predict sin(x) in MATLAB

I have been trying for 3 days to train many neural networks to predict the sin(x) function. I'm using MATLAB 2016b (I have to work with it in my assignment).
What I did:
change layers
duplicate the dataset (big, small)
add/subtract periods
shuffle the data
change the number of neurons per layer
change the learning function
change the transfer function and map the target
All of that with no good prediction. Can anyone explain what I'm doing wrong?
It would also be very helpful if you could recommend any good book on topics like "preparing the dataset before training" or "choosing the best NN structure for your project" (any book seems helpful).
My actual code (I'm using nntool for the training):
%% input and target
input = 0:pi/100:8*pi;
target = sin(input);
plot(input, target)               % plot the true function
hold on
% simulate the network (network2 is the net exported from nntool)
output = sim(network2, input);
plot(input, output, 'or')         % overlay the network's prediction
hold off
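For reference, here is a minimal sketch of fitting sin(x) programmatically with fitnet instead of nntool (the hidden layer size of 20 and the explicit split ratios are my assumptions, not values from the question):
x = 0:pi/100:8*pi;
t = sin(x);
net = fitnet(20);                    % function-fitting net, 20 hidden neurons (assumption)
net.divideParam.trainRatio = 0.70;   % the default dividerand split, stated explicitly
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net = train(net, x, t);
y = net(x);
plot(x, t, 'b', x, y, 'or')          % target vs. network output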

k-NN classification using the Fisher Iris dataset

I am working on a Pattern Recognition project and I am facing some problems. I have loaded Fisher's Iris data set into my project and I want to run the k-NN classifier (for k = 1, 3, 5) on it. But I want the following division: 80% training set and 20% test set, with the partition repeated 5 times. How can this be done?
I have some code about that, but I do not even know whether I am in the right way or not.
% Regarding the random permutation that I want
[trainInd, valInd, testInd] = dividerand(150, 0.8, 0, 0.2);  % returns index vectors, not data
% Regarding the k-NN classification
X = meas;
Y = species;
z1 = fitcknn(X, Y, 'NumNeighbors', 5, 'Standardize', 1);
I do not know if my code is on the right track, what is missing, or whether there is a better way to do this.
Could anyone help me to complete my task?
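One way to get the repeated 80/20 split is cvpartition with 'HoldOut' (a sketch; the choice of cvpartition over dividerand is mine, and it keeps the split stratified by class):
load fisheriris
ks = [1 3 5];
acc = zeros(5, numel(ks));                       % rows: repetitions, columns: values of k
for rep = 1:5
    c = cvpartition(species, 'HoldOut', 0.2);    % stratified 80/20 split, new each repetition
    for ki = 1:numel(ks)
        mdl = fitcknn(meas(training(c),:), species(training(c)), ...
            'NumNeighbors', ks(ki), 'Standardize', 1);
        pred = predict(mdl, meas(test(c),:));
        acc(rep, ki) = mean(strcmp(pred, species(test(c))));  % test accuracy
    end
end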

Implementing Naive Bayes Nearest Neighbor (NBNN) in MATLAB

I posted this question on the CV SO a few days ago, but it has gone basically unobserved by the forum. I'm trying to implement NBNN in MATLAB to do image classification on the CIFAR-10 image dataset. The algorithm is pretty simple, and I'm confident in its correctness; however, I'm getting terrible accuracy rates of 22-28%. I'm unsure why, and I'm hoping someone who has experience with image classification or the algorithm could point me in the right direction. The algorithm learns from SIFT image descriptors, which could be one of the reasons why it's under-performing. MATLAB only has a SURF feature detector, but from what I've read, SURF and SIFT are basically equivalent. I've been using the features (nDescriptors x 64) obtained from the code below to train my model.
gray = rgb2gray(image);                          % extractFeatures needs a grayscale image too
points = detectSURFFeatures(gray);
[features, valid_points] = extractFeatures(gray, points);
Another possible issue with the CIFAR dataset and this approach is the small size of the images. Each image is 32 x 32 pixels, which, I believe, makes feature detection very difficult. I've played around with the different octave settings in detectSURFFeatures(), but nothing has brought my accuracy above 28%.
The annotated code for the approach is below. It's difficult to understand, but it still might be helpful.
Hopefully someone can help me out.
predLabel = [];                                      % predicted label per test image
for i = 3001:4000 % train on the first 3000 instances and test on the remaining 1000
    closeness = [];
    for j = 1:10 % loop over the 10 categories
        classDesc = train(trainLabel==j,:);          % all descriptors that belong to class j
        descriptor = test(test_id==i,:);             % all descriptors for image i
        [idx,dist] = knnsearch(classDesc,descriptor,'K',1); % distance from each descriptor to the closest labeled descriptor in class j
        total = sum(dist);                           % sum up the distances
        closeness = [closeness,total];               % append the image-to-class distance
    end
    [val,bestClass] = min(closeness);                % choose the class with the lowest summed distance
    predLabel = [predLabel;bestClass];               % append to the vector of labels
    i                                                % print progress
end
If your images are 32x32 pixels, then trying to detect interest points is not a good idea. As you have observed, you would get very few features, if any. Upsampling the images is one option. Another option is to use a global descriptor like HOG (extractHOGFeatures).
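A sketch of both options for a single image (the 4x upsampling factor and the [4 4] cell size are my assumptions for 32x32 images, not recommendations from the answer):
img = rgb2gray(image);                   % one 32x32 CIFAR-10 image
% option 1: upsample before detecting interest points
big = imresize(img, 4);                  % now 128x128
points = detectSURFFeatures(big);
% option 2: a single global HOG descriptor for the whole image
hog = extractHOGFeatures(img, 'CellSize', [4 4]);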

Plotting from a 3D matrix in MATLAB

I have a matrix which is 1*1*10000; the slightly odd dimensions are the result of the matrix algebra used to calculate it.
I simply want to plot the 10000 data points contained in it, but MATLAB seems unable to do it.
Can someone please tell me how I can plot the data?
Seems simple but I really can't figure out how to do it!
Baz
Yes, you need to reduce the dimensions to a vector:
A = zeros(1,1,100);           % example 1x1xN array
vector = squeeze(A(1,1,:));   % now a 100x1 column vector
because just indexing the third dimension would only return a 3-D array again:
z = A(1,1,:)
would NOT give you a plottable vector. So use squeeze() ;-) Then plot as usual.
Doc-Link: http://www.mathworks.de/de/help/matlab/ref/squeeze.html
And as Ander pointed out in the comments, there is no need to give any dimensions, as squeeze() removes singleton dimensions by itself. So just use vector = squeeze(A); MATLAB figures out the right thing itself.
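Putting it together, a minimal example (the random data here just stands in for your 1*1*10000 result):
A = rand(1,1,10000);   % stand-in for the matrix-algebra result
plot(squeeze(A))       % squeeze to a 10000x1 vector, then plot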

How to use SVM in MATLAB?

I am new to MATLAB. Is there any sample code for classifying some data (with 41 features) with an SVM and then visualizing the result? I want to classify a data set (which has five classes) using the SVM method.
I read the "A Practical Guide to Support Vector Classication" article and I saw some examples. My dataset is kdd99. I wrote the following code:
%% Load Data
[data,colNames] = xlsread('TarainingDataset.xls');
groups = ismember(colNames(:,42),'normal.');   % logical labels: normal vs. attack
TrainInputs = data;
TrainTargets = groups;
%% Design SVM
C = 100;
svmstruct = svmtrain(TrainInputs,TrainTargets,...
    'boxconstraint',C,...
    'kernel_function','rbf',...
    'rbf_sigma',0.5,...
    'showplot',false);   % logical false, not the string 'false'
%% Test SVM
[dataTest,colNamesTest] = xlsread('TestDataset.xls');
TestInputs = dataTest;
groups = ismember(colNamesTest(:,42),'normal.');
TestOutputs = svmclassify(svmstruct,TestInputs,'showplot',false);
but I don't know how to get the accuracy or MSE of my classification, and when I set showplot to true in svmclassify, I get this warning:
The display option can only plot 2D training data
Could anyone please help me?
I recommend you use another SVM toolbox, libsvm. The link is as follows:
http://www.csie.ntu.edu.tw/~cjlin/libsvm/
After adding it to the MATLAB path, you can train and use your model like this:
model = svmtrain(train_label, train_feature, '-c 1 -g 0.07 -h 0');   % the parameters can be modified
[label, accuracy, probability] = svmpredict(test_label, test_feature, model);
train_label must be a vector; if there are more than two kinds of labels (not just 0/1), it will become a multi-class SVM automatically.
train_feature is an n*L matrix for n samples. You'd better preprocess the features before using them, and the test data should be preprocessed in the same way.
The accuracy you want will be shown when the test is finished, but it's only for the whole dataset.
If you need the accuracy for positive and negative samples separately, you still have to calculate it yourself using the predicted labels.
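For example, a sketch of that separate calculation, assuming 0/1 labels as above:
pos = (test_label == 1);            % indices of the positive samples
acc_pos = mean(label(pos) == 1);    % accuracy on positive samples
acc_neg = mean(label(~pos) == 0);   % accuracy on negative samples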
Hope this will help you!
Your feature space has 41 dimensions; plotting more than 3 dimensions is impossible.
A good way to better understand your data and the way SVM works is to begin with a linear SVM. This type of SVM is interpretable, which means that each of your 41 features has a weight (or 'importance') associated with it after training. You can then use plot3() with your data on 3 of the 'best' features from the linear SVM. Note how well your data separates with those features, and choose a kernel function and other parameters accordingly.
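A sketch of extracting those weights (the Alpha and SupportVectors field names come from the older built-in svmtrain struct; treat this as an illustration under that assumption, and note the weights live in the autoscaled feature space by default):
svmlin = svmtrain(TrainInputs, TrainTargets, 'kernel_function', 'linear');
w = svmlin.Alpha' * svmlin.SupportVectors;   % one weight per feature (linear kernel only)
[~, order] = sort(abs(w), 'descend');
f = order(1:3);                              % the 3 most influential features
plot3(TrainInputs(TrainTargets, f(1)), TrainInputs(TrainTargets, f(2)), TrainInputs(TrainTargets, f(3)), 'r.')
hold on
plot3(TrainInputs(~TrainTargets, f(1)), TrainInputs(~TrainTargets, f(2)), TrainInputs(~TrainTargets, f(3)), 'b.')
grid on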