I am trying to train a neural network using the train function. The thing is that I want to do this remotely over the internet, using an SSH connection.
However, I am receiving the following error:
??? Error using ==> nntraintool at 28
NNTRAINTOOL requires Java which is not available
Error in ==> trainbr>train_network at 257
[userStop,userCancel] = nntraintool('check');
Error in ==> trainbr at 116
[net,tr] = train_network(net,tr,data,fcns,param);
Error in ==> network.train at 107
[net,tr] = feval(net.trainFcn,net,X,T,Xi,Ai,EW,net.trainParam);
Error in ==> ClassifierScript at 28
[MFLDefectSNetwork, tr] = train(MFLDefectSNetwork, TrainingInputSet, TrainingSTargets);
I think I receive this error because of the training interface that is displayed when you perform neural net training. If so, could you please tell me how I can turn that visual interface off, so that I can run this over an SSH connection?
I believe you can solve this by setting the trainParam.showWindow parameter of your network object to false before calling train. For example, if your network object is stored in the variable net, you would do this before you train:
net.trainParam.showWindow = false;
This MATLAB Newsgroup thread also suggests that you may have to comment out some lines in nntraintool, which you can open in the editor with the command edit nntraintool.
(Disclaimer: the following is untested. I currently only have access to a Windows installation of MATLAB)
Try the following sequence of commands to start MATLAB (note that you should NOT use the -nojvm option):
# on your machine
ssh -x user@host
# on the host
unset DISPLAY
matlab -nodisplay
Once in MATLAB, you can explicitly check that Java is available:
>> usejava('jvm')            % should return 1
>> java.lang.String('str')   % should create a Java string without error
Next, proceed to create and use the neural network (you just have to suppress training feedback):
% load sample dataset
load simpleclass_dataset

% create and train neural network
net = newpr(simpleclassInputs, simpleclassTargets, 20);
net.trainParam.showWindow = false;       % no GUI (as @gnovice suggested)
net.trainParam.showCommandLine = true;   % show progress in the command line
net.trainParam.show = 1;                 % display every iteration
net = train(net, simpleclassInputs, simpleclassTargets);

% predict and evaluate performance
simpleclassOutputs = sim(net, simpleclassInputs);
[c,cm] = confusion(simpleclassTargets,simpleclassOutputs)
As a side note: even though we disabled all display, we can still create plots (as invisible figures) and export them to files, as I have shown in previous related questions...
I am trying to develop a program that uses a neural network to solve a real-life problem, so I chose retinopathy detection with a probabilistic neural network in MATLAB. I got some help from my professor and developed the program. I have created the dataset and trained the neural network, but when testing the network I am not getting the expected output.
I am new to MATLAB and this is the first program I have written in it.
I have created the training dataset, and I provide an input image in order to get the affected area:
clc;
clear all;
close all;
img = imread('nor4.jpg');
m = impixel(img);   % interactively pick pixels; returns their RGB values
dlmwrite('D:\Retinopathy detection\Training.csv', m, '-append');
%figure(1),imshow(img);
---CODE FOR ACTUAL IMPLEMENTATION---
clc;
clear all;
close all;
fileID = fopen('Training.csv');
C = textscan(fileID,'%f%f%f%f','Delimiter',',');   % R,G,B values plus class label
fclose(fileID);
x = [C{1} C{2} C{3}];    % features: RGB
t = [C{4}];              % targets: class labels
s = input('Enter spread : ');
net = newpnn(x',t',s);   % probabilistic neural network
img=imread('trr.jpg');
[m,n,p]=size(img);
R=img(:,:,1);
G=img(:,:,2);
B=img(:,:,3);
RR=reshape(R,m*n,1);
GG=reshape(G,m*n,1);
BB=reshape(B,m*n,1);
Xtest = double([RR GG BB]);   % one row of RGB features per pixel
Y = sim(net,Xtest');          % classify every pixel
Im = reshape(Y,m,n);          % class map, same size as the image
for i=1:1:m
for j=1:1:n
if Im(i,j)==1
Newimg(i,j,:)=[230,166,122];
else
Newimg(i,j,:)=img(i,j,:);
end
end
end
figure(1),imshow(img);
figure(2),imshow(Newimg);
When I run this program, I should get two image windows: one with the input image, and a second one showing the detected retinopathy area.
But when I run it, the first window is correct; in the second window I only get green, or, if I change the value, a different solid color.
Can you please help me out with this? I am really stuck, and I am not getting help from my professor.
imshow with double data assumes that the data is in the range [0,1] for purposes of mapping your data to a color.
Try:
imshow(mat2gray(Newimg));
Which will rescale your data to the range [0,1] before displaying it.
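Alternatively, you can avoid the class mismatch altogether by keeping Newimg in the same integer class as the input image. This is an untested sketch, assuming img is a standard uint8 RGB image; it replaces your loop and the final imshow, and reuses the overlay color [230,166,122] from your code:

% keep the output in the same class as the input image (assumed uint8)
Newimg = img;                       % start from the original image
mask = repmat(Im == 1, [1 1 3]);    % affected pixels, replicated per channel
overlay = cat(3, 230*ones(m,n,'uint8'), ...
                 166*ones(m,n,'uint8'), ...
                 122*ones(m,n,'uint8'));
Newimg(mask) = overlay(mask);       % paint affected pixels
figure(2), imshow(Newimg);          % no rescaling needed

Since Newimg is now uint8, imshow maps [0,255] directly and the colors come out as intended.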
I have downloaded some datasets from UCI for an RVM classification task. However, I am not sure how to use them. I guess these datasets must be normalized, or need some other preprocessing, before being used for training and testing.
For example, I have downloaded the 'banknote authentication Data Set' from UCI, and I use svmtrain in MATLAB to obtain an SVM model (I will test with the SVM model first, and then move on to RVM code if the SVM classification result is OK).
>> load banknote
>> meas = banknote(:,1:4);
>> species = banknote(:,5);
>> data = [meas(:,1), meas(:,2), meas(:,3), meas(:,4)];
>> groups = ismember(species,1);
>> [train, test] = crossvalind('holdOut',groups);
>> cp = classperf(groups);
>> svmStruct = svmtrain(data(train,:),groups(train),'showplot',true);
This is what I do in MATLAB, and I get the following message:
??? Error using ==> svmtrain at 470
Unable to solve the optimization problem:
Maximum number of iterations exceeded; increase options.MaxIter.
To continue solving the problem with the current solution as the
starting point, set x0 = x before calling quadprog.
And here is a part of the dataset (1372 lines in total; some are used for training and the rest for testing):
3.6216,8.6661,-2.8073,-0.44699,0
4.5459,8.1674,-2.4586,-1.4621,0
3.866,-2.6383,1.9242,0.10645,0
3.4566,9.5228,-4.0112,-3.5944,0
0.32924,-4.4552,4.5718,-0.9888,0
4.3684,9.6718,-3.9606,-3.1625,0
3.5912,3.0129,0.72888,0.56421,0
2.0922,-6.81,8.4636,-0.60216,0
3.2032,5.7588,-0.75345,-0.61251,0
1.5356,9.1772,-2.2718,-0.73535,0
1.2247,8.7779,-2.2135,-0.80647,0
3.9899,-2.7066,2.3946,0.86291,0
1.8993,7.6625,0.15394,-3.1108,0
-1.5768,10.843,2.5462,-2.9362,0
3.404,8.7261,-2.9915,-0.57242,0
So, any good advice about this problem? Thank you all for helping.
Update (answering my own question): use a scaling function to normalize the features. And if the dataset has too many features, we can use PCA to reduce the dimensionality.
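A minimal sketch of what that could look like, reusing the banknote matrix and the train/groups variables from the snippet above (zscore and pca are Statistics Toolbox functions; older releases use princomp instead of pca):

% normalize each feature column to zero mean and unit variance
measNorm = zscore(banknote(:,1:4));
% optional: reduce dimensionality with PCA before training
[coeff, score] = pca(measNorm);   % principal component scores
dataReduced = score(:,1:2);       % keep e.g. the first two components
svmStruct = svmtrain(dataReduced(train,:), groups(train));

Scaling the features often also fixes the "Maximum number of iterations exceeded" error, because the underlying quadratic program becomes much better conditioned.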
I am using the nftool GUI to set up a regression neural network.
My database has various NaNs (missing values). When I run the GUI, everything seems to go right: it gives me the performance and the regression graph.
I read that, in code, you can add a processFcn named 'fixunknowns' to the network.
My question is: in the GUI, is the neural network applying fixunknowns? How is the GUI processing these NaNs?
When I generate the script, the fixunknowns function does not appear.
I wonder if it is only possible to treat these NaN values in code. Or perhaps the GUI applies fixunknowns automatically?
Thank you.
When you get to the end of nftool, click 'advanced script'. This script will repeat exactly what you have done in nftool. In it you will see:
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};
This would indicate that fixunknowns is not run (unless it is called by a function we can't see in this script). So you can add this line:
net.input.processFcns = {'fixunknowns'};
Note that you should not run fixunknowns on the output. If there are NaNs in the output, delete the entire row/sample.
If you make a change in the GUI, you will have to add the fixunknowns line again each time.
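Untested sketch: if you want to keep the preprocessing that the generated script already applies, you can prepend fixunknowns to the default list instead of replacing it:

% add fixunknowns in front of the default input preprocessing functions
net.input.processFcns = {'fixunknowns','removeconstantrows','mapminmax'};
% do NOT add it to net.output.processFcns; drop samples with NaN targets instead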
I have a MATLAB .m script that sets up and trains a neural network ("nn") using MATLAB's Neural Network Toolbox. The script launches a GUI that shows training progress etc. The training usually takes a long time.
I'm doing these experiments on a computer with 64 processor cores. I want to train several networks at the same time without having to run multiple MATLAB sessions.
So I want to:
Start training of neural network
Modify script that creates network to create different one
Start training of modified network
Modify script to create yet another network...
Repeat steps 1-4 several times
The problem is that when I run the script it blocks the MATLAB terminal, so I cannot do anything else until the script executes its last command - and that takes a long time. How can I run all those computations in parallel? I do have the Parallel Computing Toolbox.
EDIT: MATLAB bug??
Update: this problem seems to happen only on R2012a; it looks like it is fixed in R2012b.
A very strange error occurs when I try the command sequence recommended in Edric's answer.
Here is my code:
>> job = batch(c, @nn, 1, {A(:, 1:end -1), A(:, end)});
>> wait(job);
>> r = fetchOutputs(job)
Error using parallel.Job/fetchOutputs (line 677)
An error occurred during execution of Task with ID 1.
Caused by:
Error using nntraintool (line 35)
Java is not available.
Here are lines 27-37 of nntraintool (part of MATLAB's Neural Network Toolbox), where the error originates:
if ~usejava('swing')
if (nargin == 1) && strcmp(command,'check')
result = false;
result2 = false;
return
else
disp('java used');
error(message('nnet:Java:NotAvailable'));
end
end
So it looks like the problem is that the GUI cannot be used (because Swing is not available) when the job is executed via the batch command. The strange thing is that the nn function does not launch any GUI in its current form. The error is caused by train, which launches a GUI by default, but in nn I have switched that off:
net.trainParam.showWindow = false;
net = train(net, X, y);
More interestingly, if the same nn function is launched normally (>> nn(A(:, 1:end -1), A(:, end));) it never enters the outer if statement of nntraintool on line 27 (I have checked this with the debugger). So for the same function with the same arguments, the expression ~usejava('swing') evaluates to 0 when launched normally, but to 1 when launched via batch.
What do you think about this? It looks like an ugly MATLAB or Neural Network Toolbox bug :(((
With the Parallel Computing Toolbox, you can run up to 12 'local workers' to execute your scripts (to run more than that, you'd need to purchase additional MATLAB Distributed Computing Server licences). Given your workflow, the best thing might be to use the batch command to submit a series of non-interactive jobs. Note that you will not be able to see any GUI from the workers. You might do something like this (using R2012a+ syntax):
c = parcluster('local'); % get the 'local' cluster object
job = batch(c, 'myNNscript'); % submit script for execution
% now edit 'myNNscript'
job2 = batch(c, 'myNNscript'); % submit script for execution
...
wait(job); load(job) % get the results
Note that the BATCH command automatically attaches a copy of the script to run to the job, so that you are free to make changes to it after submission.
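For several variants, one sketch of the same idea (same R2012a+ assumptions; myNNscript is the placeholder name from above) is to keep the job handles around and collect all the results at the end:

c = parcluster('local');
jobs = {};
jobs{end+1} = batch(c, 'myNNscript');   % submit first variant
% ... edit myNNscript to build a different network ...
jobs{end+1} = batch(c, 'myNNscript');   % submit second variant
for k = 1:numel(jobs)
    wait(jobs{k});
    results{k} = load(jobs{k});         % struct with the job's workspace variables
end

Remember that the workers have no display, so the script itself must suppress the training GUI (net.trainParam.showWindow = false).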
I'm doing some cross-validation using a Matlab Weka Interface that I got from file exchange. My loop structure seems to work fine for Weka's Logistic classifier. However, when I try to do the exact same thing for AdaBoostM1, it throws the following error:
??? Java exception occurred: java.lang.ArrayIndexOutOfBoundsException
Error in ==> wekaClassify at 24
classProbs(t+1,:) = (classifier.distributionForInstance(testData.instance(t)))';
Error in ==> classifier_search at 225
[pred ~] = wekaClassify(matlab2weka('instance', featurelabels, tester), classifier);
I have determined through some testing that this only occurs when the number of instances in the training set is greater than the number of instances in the test set. I am sure you can see why that is a problem for me, since in most situations the training set is larger than the test set.
Is there something different about how I should format my inputs when using Adaboost rather than Logistic? Any information you can give regarding this problem would be so helpful.
I downloaded this code from this page: http://www.mathworks.com/matlabcentral/fileexchange/21204-matlab-weka-interface
Emails bounce from the account of the guy who made it, and he doesn't seem to respond to comments on the page - I'm hoping that maybe someone here has used this.
EDIT: Here is the code that I use to train and test the classifier:
classifier = trainWekaClassifier(matlab2weka('training', featurelabels, train), 'meta.AdaBoostM1', { strcat('-P 100 -S 1 -I ', num2str(r), '-W weka.classifiers.trees.DecisionStump')});
[pred ~] = wekaClassify(matlab2weka('instance', featurelabels, tester), classifier);
I haven't used this combination of software, so I can only take a guess at what could cause this.
Are your training/testing data matrices the right way round? They should be N-by-D (N instances, D features).
If you were passing in a D-by-N training matrix and a D-by-M testing matrix, then I would expect it to work only when M < N - which is what you describe - and even then, it wouldn't give a meaningful result.
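If that is the cause, then (a guess, reusing the variable names from your snippet) a transpose before converting should fix it:

% ensure both matrices are instances-by-features (N-by-D) before matlab2weka
train  = train';    % now N-by-D
tester = tester';   % now M-by-D

Only do this if the matrices really are feature-by-instance to begin with; checking size(train) against your instance count is a quick way to tell.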