How to get validation, test and training errors of a neural network? - matlab

I have created and trained a neural network using the following code. I want to know how to get the training, testing and validation errors/misclassifications the way we get them using the MATLAB GUI.
trainFcn = 'trainscg'; % Scaled conjugate gradient backpropagation.
% Create a Pattern Recognition Network
hiddenLayerSize = 25;
net = patternnet(hiddenLayerSize);
% Setup Division of Data for Training, Validation, Testing
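% (trainper, valper and testper are percentage values assumed to be defined elsewhere)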
net.divideParam.trainRatio = trainper/100;
net.divideParam.valRatio = valper/100;
net.divideParam.testRatio = testper/100;
% Train the Network
[net,tr] = train(net,x,t);
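The training record tr returned by train holds the indices of each split, so a sketch of recovering the per-split misclassification rates (assuming x and t are the inputs and one-hot targets used above) could look like this:
% Sketch: per-split misclassification rates from the training record tr.
y = net(x); % outputs for all samples
trainErr = confusion(t(:,tr.trainInd), y(:,tr.trainInd));
valErr = confusion(t(:,tr.valInd), y(:,tr.valInd));
testErr = confusion(t(:,tr.testInd), y(:,tr.testInd));
fprintf('train %.2f%%, val %.2f%%, test %.2f%%\n', ...
    100*trainErr, 100*valErr, 100*testErr);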

Related

Learning vector quantization doesn't work well in matlab

I want to use learning vector quantization (LVQ) to classify the F_CK data, which has 7 classes.
When I use an MLP, the error is about 15%, but when I use LVQ, the error is about 75% :(
I see that LVQ classifies one class very well but doesn't classify the other classes.
my code:
data = load('F_CK+');
x = data.X';
y_data = data.Y';
t = ind2vec(y_data);
net = lvqnet(4,0.1,'learnlv2');
net.divideFcn = 'dividerand';
net.divideMode = 'sample';
net.divideParam.trainRatio = 85/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 0/100;
net.trainParam.epochs = 15;
net = train(net, x, t);
y = net(x);
classes = vec2ind(y);
figure, plotconfusion(t,y);
[Figure omitted: confusion matrix of my result on F_CK]
Can anyone help me understand why this network only classifies one class, and what my mistake is?
dataset link:
https://dl.dropboxusercontent.com/u/100069389/File/Stackoverflow/F_CK.rar
https://mega.nz/#!J8ES1DRS!NZwDsD0FFojeZiI-OpORzxGLbMp9rx0XKsfOvGDOaR0
I don't know exactly what my mistake was, but I did two things that improved the classification accuracy (see the sketch below):
1. normalized the data between -1 and 1
2. increased the number of subclasses/LVQ neurons to 64 to cover all of the image classes
As far as I remember, an LVQ network should be more accurate than an MLP; with these changes my LVQ accuracy increased to 80%.
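A minimal sketch of those two changes, reusing x and t from the code above (the subclass count of 64 is the value mentioned in point 2):
% Sketch: normalize inputs to [-1, 1] and use 64 LVQ subclass neurons.
x = mapminmax(x, -1, 1); % row-wise normalization to [-1, 1]
net = lvqnet(64, 0.1, 'learnlv2'); % 64 competitive (subclass) neurons
net.trainParam.epochs = 15;
net = train(net, x, t);
y = net(x);
classes = vec2ind(y);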

Matlab: neural network time series prediction?

Background: I am trying to use MATLAB's Neural Network toolbox to predict future values of data. I run it from the GUI, but I have also included the output code below.
Problem: My predicted values lag behind the actual values by 2 time periods, and I do not know how to actually see a "t+1" (predicted) value.
Code:
% Solve an Autoregression Time-Series Problem with a NAR Neural Network
% Script generated by NTSTOOL
% Created Tue Mar 05 22:09:39 EST 2013
%
% This script assumes this variable is defined:
%
% close_data_short - feedback time series.
targetSeries = tonndata(close_data_short,false,false);
% Create a Nonlinear Autoregressive Network
feedbackDelays = 1:3;
hiddenLayerSize = 10;
net = narnet(feedbackDelays,hiddenLayerSize);
% Choose Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[inputs,inputStates,layerStates,targets] = preparets(net,{},{},targetSeries);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'time'; % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Training Function
% For a list of all training functions type: help nntrain
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotresponse(targets,outputs)
%figure, ploterrcorr(errors)
%figure, plotinerrcorr(inputs,errors)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
[xc,xic,aic,tc] = preparets(netc,{},{},targetSeries);
yc = netc(xc,xic,aic);
perfc = perform(net,tc,yc)
% Early Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is given y(t+1).
% For some applications such as decision making, it would help to have predicted
% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early by removing one delay
% so that its minimal tap delay is now 0 instead of 1. The new network returns the
% same outputs as the original network, but outputs are shifted left one timestep.
nets = removedelay(net);
[xs,xis,ais,ts] = preparets(nets,{},{},targetSeries);
ys = nets(xs,xis,ais);
earlyPredictPerformance = perform(nets,ts,ys)
Proposed Solution: I believe the answer lies in the last part of the code, the "Early Prediction Network" section. I'm just not sure how to remove 'one delay'.
Additional question: Is there a function that can be output from this so I can use it over and over? Or would I just have to keep retraining once I get the next time period of data?
To ensure that this question does not remain open while the answer is already present, I will post the comment that seems to address the issue (credit to @DanielTheRocketMan):
I believe that you should work in steps:
1. See if the data is stationary.
2. If not, deal with it (for instance, difference the data, as sketched below).
3. Test the simplest possible model, for instance an AR model.
4. Try a nonlinear model, for instance NAR.
5. Move on to a NN model.
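For step 2, a minimal sketch of differencing, using close_data_short from the question's code:
% Sketch: first-difference a non-stationary series before modelling,
% then undo the differencing when interpreting predictions.
d = diff(close_data_short); % differenced series
% fit the NAR model on d instead of the raw series, then invert with:
reconstructed = close_data_short(1) + cumsum(d); % recovers the original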
Try a simpler version. I have tested the following code and it works fine for me:
inputs = X; %define input and target
targets = y;
hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);
% Set up Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
[net,tr] = train(net,inputs,targets);
outputs = net(inputs);
errors = gsubtract(targets,outputs);
mse(errors)
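On the additional question: the trained network is an ordinary MATLAB variable, so one option (a sketch; newSeries is a hypothetical new series in the same format) is to save the network and reuse it without retraining:
% Sketch: persist the trained NAR network and apply it to later data.
save('trainedNarNet.mat','net'); % after training
% ... later, in another session ...
s = load('trainedNarNet.mat');
net = s.net;
newTargets = tonndata(newSeries,false,false); % newSeries: hypothetical data
[xs,xis,ais] = preparets(net,{},{},newTargets);
ys = net(xs,xis,ais); % one-step-ahead predictions, no retraining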

How to apply Back propagation for 3 class classification task in matlab 2012a?

I want to solve a classification problem with 3 classes using a multilayer neural network with the backpropagation algorithm. I'm using MATLAB 2012a. I'm having trouble with the newff function. I want to build a network with one hidden layer, and there should be 3 neurons in the output layer, one for each class. Please advise, with an example.
Here is my code
clc
%parameters
nodesInHL=7;
nodesInOutput=3;
iteration=1000;
HLtransfer='tansig';
outputTransfer='tansig';
trainFunc='traingd';
learnRate=0.05;
performanceFunc='mse';
%rand('seed',0);
%randn('seed',0);
rng('shuffle');
net=newff(trainX,trainY,[nodesInHL],{HLtransfer,outputTransfer},trainFunc,'learngd',performanceFunc);
net=init(net);
%setting parameters
net.trainParam.epochs=iteration;
net.trainParam.lr=learnRate;
%training
[net,tr]=train(net,trainX,trainY);
Thanks.
The newff function is obsolete. The recommended replacement is feedforwardnet, or in your case (classification), patternnet.
You could also use the GUI of nprtool, which provides a wizard-like tool that guides you step-by-step to build your network. It even allows for code generation at the end of the experiment.
Here is an example:
%# load sample dataset
%# simpleclassInputs: 2x1000 matrix (1000 points of 2-dimensions)
%# simpleclassTargets: 4x1000 matrix (4 possible classes)
load simpleclass_dataset
%# create ANN of one hidden layer with 7 nodes
net = patternnet(7);
%# set params
net.trainFcn = 'traingd'; %# training function
net.trainParam.epochs = 1000; %# max number of iterations
net.trainParam.lr = 0.05; %# learning rate
net.performFcn = 'mse'; %# mean-squared error function
net.divideFcn = 'dividerand'; %# how to divide data
net.divideParam.trainRatio = 70/100; %# training set
net.divideParam.valRatio = 15/100; %# validation set
net.divideParam.testRatio = 15/100; %# testing set
%# training
net = init(net);
[net,tr] = train(net, simpleclassInputs, simpleclassTargets);
%# testing
y_hat = net(simpleclassInputs);
perf = perform(net, simpleclassTargets, y_hat);
err = gsubtract(simpleclassTargets, y_hat);
view(net)
Note that the toolbox automatically sets the number of nodes in the output layer (based on the number of rows in the target matrix). A quick way to score the resulting outputs is sketched below.
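As a usage sketch, the one-hot outputs can be converted to class labels with vec2ind and scored (names follow the example above):
% Sketch: convert one-hot outputs to class labels and score them.
predicted = vec2ind(y_hat);
actual = vec2ind(simpleclassTargets);
errRate = sum(predicted ~= actual) / numel(actual);
fprintf('misclassification rate: %.2f%%\n', 100*errRate);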

MATLAB Neural Network pattern recognition

I've made a simple neural network for mouse gesture recognition (the inputs are angles) and I used nprtool (the patternnet function) to create it. I saved the weights and biases of the network:
W1=net.IW{1,1};
W2=net.LW{2,1};
b1=net.b{1,1};
b2=net.b{2,1};
and for calculating the result I used tansig(W2*(tansig(W1*in+b1))+b2);
where in is an input. But the result is awful (each number is approximately equal to 0.99), while the output from the command net(in) is good. What am I doing wrong? It's very important for me to understand why the first method is bad (I do the same in my C++ program). I'm asking for help :)
[edit]
Below is the code generated by the nprtool GUI. Maybe it will be helpful for someone, but I don't see any solution to my problem in it. The tansig activation function is used for both the hidden- and output-layer neurons (is there any parameter for this in a MATLAB network?).
% Solve a Pattern Recognition Problem with a Neural Network
% Script generated by NPRTOOL
% Created Tue May 22 22:05:57 CEST 2012
%
% This script assumes these variables are defined:
%
% input - input data.
% target - target data.
inputs = input;
targets = target;
% Create a Pattern Recognition Network
hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% For help on training function 'trainlm' type: help trainlm
% For a list of all training functions type: help nntrain
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotregression', 'plotfit'};
% Train the Network
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotconfusion(targets,outputs)
%figure, ploterrhist(errors)
As can be seen in your code, the network applies automated preprocessing of the input and postprocessing of the targets; look for the lines that define processFcns. This means the trained parameters are valid only for input that has been preprocessed, and that the output of the network is postprocessed (with the same parameters as the targets were). So in your line tansig(W2*(tansig(W1*in+b1))+b2); you can't use your original inputs. You have to preprocess the input, use the result as the network's input, and postprocess the output using the same parameters that were used to postprocess the targets. Only then will you get the same result as calling net(in).
You can read more here: http://www.mathworks.com/help/toolbox/nnet/rn/f0-81221.html#f0-81692
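A hedged sketch of that manual computation, assuming the default processFcns from the generated script above ({'removeconstantrows','mapminmax'}) and tansig in both layers:
% Sketch: replicate net(in) by hand, including pre/post-processing.
xp = removeconstantrows('apply', in, net.inputs{1}.processSettings{1});
xp = mapminmax('apply', xp, net.inputs{1}.processSettings{2});
a1 = tansig(net.IW{1,1}*xp + net.b{1});
a2 = tansig(net.LW{2,1}*a1 + net.b{2});
out = mapminmax('reverse', a2, net.outputs{2}.processSettings{2});
out = removeconstantrows('reverse', out, net.outputs{2}.processSettings{1});
% out should now match net(in) up to numerical precision.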

neural network on matlab performance problem

I'm using this code to build a NN in order to train my network to classify images:
net = newff(p,t,15,{},'traingd');
net.divideParam.trainRatio = 70/100; % Adjust as desired
net.divideParam.valRatio = 15/100; % Adjust as desired
net.divideParam.testRatio = 15/100; % Adjust as desired
net.trainParam.epochs = 10000;
net.trainParam.goal = 0.01;
net.trainParam.show = 25;
net.trainParam.time = inf;
net.trainParam.min_grad = 1e-10;
net.trainParam.max_fail = 10;
net.trainParam.sigma = 5.0e-5;
net.trainParam.lambda = 5.0e-7;
net.trainParam.mu_max = 1e-20;
net.trainParam.lr = 0.001;
% Train and Apply Network
[net,tr] = train(net,p,t);
outputs = sim(net,p);
% Create P.
% Plot
plotperf(tr)
plotfit(net,p,t)
plotregression(t,outputs)
But my performance never goes below 0.5. I tried to do PCA on the data, but I think something is not right in the code? Is it possible to change the initial value of the performance shown in nntraintool?
thank you
Paulo
It's hard to say without having your data, but from my experience with neural nets, only one of a few things can possibly be happening:
1. You don't have enough hidden nodes to represent your data.
2. Your time step is too high.
3. Your error space is complicated due to your data and you're reaching lots of local minima. This is a similar but slightly different way of saying 1.
4. Your data is degenerate, in that you have training samples with different labels but exactly the same features.
If 1, then increase the number of hidden nodes.
If 2, decrease the time step.
If 3, you can try initializing better, perhaps with Nguyen-Widrow initialization (this used to be in the function initnw); a sketch follows below.
If 4, figure out why your data is like this and fix it.
Thanks to @sazary for pointing out some details about initnw being the default when you create a new network with newff or newcf.
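A minimal sketch of explicitly requesting Nguyen-Widrow initialization and retraining, using the net, p and t names from the question (initnw is typically the default anyway, as noted above):
% Sketch: explicitly set Nguyen-Widrow initialization and retrain.
net.initFcn = 'initlay'; % initialize layer by layer
net.layers{1}.initFcn = 'initnw'; % Nguyen-Widrow for the hidden layer
net = init(net); % reinitialize weights and biases
[net,tr] = train(net,p,t);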