I've been studying up on neural networks and trying to make use of MATLAB's Neural Network Toolbox and the examples MathWorks offers. I've been finding the subject very interesting thus far.
I have a basic understanding of MATLAB code. However, I've been having trouble understanding how I would feed my own test data into a neural network once it has been trained. The examples from MathWorks all seem to end once the network is trained.
For example: http://in.mathworks.com/help/nnet/examples/wine-classification.html#zmw57dd0e324
In the link above there is a section for "Testing the Neural Network" but no information is given about the test data being used.
I realize this might be a newbie question, but I'd appreciate any help understanding this.
You use the sim command from the Neural Network Toolbox. Basically, you submit an M x N matrix where M is the total number of features and N is the total number of samples. M corresponds to the number of input-layer neurons you have, so each sample occupies one column of this matrix.
Once you train your network (if you follow the example, the trained network is stored in net), simply create your input data and "simulate" the network with your desired inputs.
Given that your input data is stored in X, you would do this:
Y = sim(net, X);
However, if you want to see how your neural network performs on the same input it was trained with, use the variable x, as that is where the input data is stored in the dataset mentioned in the post:
Y = sim(net, x);
Alternatively, net is a callable object. You can achieve the same thing as sim by calling net directly:
Y = net(x);
If you want to do a small test, try loading up the housing price dataset:
[x,t] = house_dataset;
This returns a data matrix x of 13 features (rows) by 506 samples (columns). The vector t holds the target values, i.e. the desired output for each sample. You can train a neural network with one hidden layer of 10 neurons on this dataset, then see how well it does on the same input data:
net = feedforwardnet(10);
net = train(net, x, t); %// Brings up NN Train Tool and trains
y = sim(net, x); %// Or y = net(x);
You can then display the results side by side to see how they compare:
disp([y; t]);
The top row is the predicted values and the bottom row is the true values:
Columns 1 through 7
23.1000 22.5739 34.6595 32.9133 33.3979 24.0131 19.6418
24.0000 21.6000 34.7000 33.4000 36.2000 28.7000 22.9000
Columns 8 through 14
19.3123 15.3914 18.8765 19.6154 18.8690 20.0817 18.0593
27.1000 16.5000 18.9000 15.0000 18.9000 21.7000 20.4000
Columns 15 through 21
17.4043 17.7989 19.4390 17.8489 18.5007 17.3547 14.3393
18.2000 19.9000 23.1000 17.5000 20.2000 18.2000 13.6000
Columns 22 through 28
16.9866 17.5983 15.2679 16.3294 13.2075 15.6425 14.3992
19.6000 15.2000 14.5000 15.6000 13.9000 16.6000 14.8000
Columns 29 through 35
18.8453 20.4586 14.5208 15.4776 12.6204 14.5306 12.2213
18.4000 21.0000 12.7000 14.5000 13.2000 13.1000 13.5000
Columns 36 through 42
21.5945 21.7292 21.3951 22.8980 29.5480 35.4459 33.3658
18.9000 20.0000 21.0000 24.7000 30.8000 34.9000 26.6000
Columns 43 through 49
24.9201 25.2191 22.0473 20.2307 21.9648 19.1073 16.4167
25.3000 24.7000 21.2000 19.3000 20.0000 16.6000 14.4000
Columns 50 through 56
19.1333 19.4673 19.6465 26.6304 20.6860 18.0080 34.7084
19.4000 19.7000 20.5000 25.0000 23.4000 18.9000 35.4000
Columns 57 through 63
23.9052 28.8725 23.2335 21.1107 18.5425 17.7766 22.5957
24.7000 31.6000 23.3000 19.6000 18.7000 16.0000 22.2000
Columns 64 through 70
24.6137 30.3093 24.5964 20.3996 21.0408 18.2597 19.9876
25.0000 33.0000 23.5000 19.4000 22.0000 17.4000 20.9000
Columns 71 through 77
25.4133 21.4175 22.4705 24.1181 25.2216 22.2482 20.4357
24.2000 21.7000 22.8000 23.4000 24.1000 21.4000 20.0000
Columns 78 through 84
21.7641 20.8850 20.6352 27.3050 22.6721 23.7894 21.7782
20.8000 21.2000 20.3000 28.0000 23.9000 24.8000 22.9000
Columns 85 through 91
24.1746 26.2748 22.9204 23.2649 27.7500 30.4014 23.7201
23.9000 26.6000 22.5000 22.2000 23.6000 28.7000 22.6000
Columns 92 through 98
23.1654 24.7010 25.0065 20.9675 26.6610 22.2313 41.1054
22.0000 22.9000 25.0000 20.6000 28.4000 21.4000 38.7000
Columns 99 through 105
43.3199 33.8214 22.5828 23.9000 19.4996 19.0978 19.2686
43.8000 33.2000 27.5000 26.5000 18.6000 19.3000 20.1000
Columns 106 through 112
17.6835 18.1297 19.9324 19.4827 18.8635 22.3501 23.0958
19.5000 19.5000 20.4000 19.8000 19.4000 21.7000 22.8000
Columns 113 through 119
19.2355 19.4148 21.4072 18.6230 21.0624 19.4977 19.8504
18.8000 18.7000 18.5000 18.3000 21.2000 19.2000 20.4000
Columns 120 through 126
19.7301 22.3515 21.7456 19.6909 17.3642 18.6674 21.4590
19.3000 22.0000 20.3000 20.5000 17.3000 18.8000 21.4000
Columns 127 through 133
15.7843 14.9932 19.6013 14.6810 19.5595 18.3563 18.6026
15.7000 16.2000 18.0000 14.3000 19.2000 19.6000 23.0000
Columns 134 through 140
15.2750 11.8104 18.7860 16.7552 19.7692 15.9850 17.8815
18.4000 15.6000 18.1000 17.4000 17.1000 13.3000 17.8000
Columns 141 through 147
18.1044 11.2998 13.7603 16.7915 14.2739 15.8247 12.1940
14.0000 14.4000 13.4000 15.6000 11.8000 13.8000 15.6000
Columns 148 through 154
14.8796 15.9726 17.0902 20.0606 12.5238 15.0568 15.3508
14.6000 17.8000 15.4000 21.5000 19.6000 15.3000 19.4000
Columns 155 through 161
16.8791 17.1680 7.5081 38.1429 27.6726 23.4515 33.1939
17.0000 15.6000 13.1000 41.3000 24.3000 23.3000 27.0000
Columns 162 through 168
49.9039 49.8868 49.4694 20.0338 20.5124 51.1646 19.0164
50.0000 50.0000 50.0000 22.7000 25.0000 50.0000 23.8000
Columns 169 through 175
23.1614 23.9231 18.1501 19.0578 24.0764 25.6909 24.2217
23.8000 22.3000 17.4000 19.1000 23.1000 23.6000 22.6000
Columns 176 through 182
29.0459 23.7307 24.3887 28.9583 36.7788 41.2092 29.0054
29.4000 23.2000 24.6000 29.9000 37.2000 39.8000 36.2000
Columns 183 through 189
36.1596 29.3311 24.9179 26.8610 46.6416 30.5403 29.2822
37.9000 32.5000 26.4000 29.6000 50.0000 32.0000 29.8000
Columns 190 through 196
32.3537 30.5739 28.0238 33.5408 32.5201 30.3314 47.7960
34.9000 37.0000 30.5000 36.4000 31.1000 29.1000 50.0000
Columns 197 through 203
35.5760 29.9340 32.4614 23.7173 24.6185 22.3271 40.2040
33.3000 30.3000 34.6000 34.9000 32.9000 24.1000 42.3000
Columns 204 through 210
48.7640 51.9776 21.6903 23.1472 19.7713 20.8484 19.3971
48.5000 50.0000 22.6000 24.4000 22.5000 24.4000 20.0000
Columns 211 through 217
19.2595 20.7310 21.9656 24.4268 23.4818 22.6366 23.3319
21.7000 19.3000 22.4000 28.1000 23.7000 25.0000 23.3000
Columns 218 through 224
27.5792 19.9003 21.9243 27.6622 25.8529 27.0425 26.9247
28.7000 21.5000 23.0000 26.7000 21.7000 27.5000 30.1000
Columns 225 through 231
43.9926 46.3891 41.9625 31.2884 42.9222 31.0752 24.0971
44.8000 50.0000 37.6000 31.6000 46.7000 31.5000 24.3000
Columns 232 through 238
33.9499 45.8143 44.0646 27.7080 24.5834 24.8484 33.5038
31.7000 41.7000 48.3000 29.0000 24.0000 25.1000 31.5000
Columns 239 through 245
27.1918 25.1814 24.1256 18.9528 20.8431 27.5930 15.9916
23.7000 23.3000 22.0000 20.1000 22.2000 23.7000 17.6000
Columns 246 through 252
15.4457 21.3048 18.9629 22.2684 26.9249 26.6310 28.3915
18.5000 24.3000 20.5000 24.5000 26.2000 24.4000 24.8000
Columns 253 through 259
31.1707 42.3749 21.8852 20.0264 41.3473 51.4709 37.5230
29.6000 42.8000 21.9000 20.9000 44.0000 50.0000 36.0000
Columns 260 through 266
31.7275 35.0836 40.2452 48.8115 34.8549 36.0551 21.6751
30.1000 33.8000 43.1000 48.8000 31.0000 36.5000 22.8000
Columns 267 through 273
30.0784 48.2556 43.5570 21.8858 19.8884 23.6277 24.2718
30.7000 50.0000 43.5000 20.7000 21.1000 25.2000 24.4000
Columns 274 through 280
36.2902 33.7512 31.2525 32.4099 33.0711 26.3885 36.7014
35.2000 32.4000 32.0000 33.2000 33.1000 29.1000 35.1000
Columns 281 through 287
45.9504 36.7908 46.8473 51.1511 31.4501 23.5448 21.8759
45.4000 35.4000 46.0000 50.0000 32.2000 22.0000 20.1000
Columns 288 through 294
23.4287 22.6450 25.2664 32.6864 35.8508 30.2136 23.6842
23.2000 22.3000 24.8000 28.5000 37.3000 27.9000 23.9000
Columns 295 through 301
21.9361 29.8485 27.2524 20.1174 23.8767 30.0819 26.0127
21.7000 28.6000 27.1000 20.3000 22.5000 29.0000 24.8000
Columns 302 through 308
24.2473 26.5010 32.4189 34.7875 28.9162 35.5226 30.2009
22.0000 26.4000 33.1000 36.1000 28.4000 33.4000 28.2000
Columns 309 through 315
24.1533 19.4709 18.2852 22.0597 19.2040 20.6719 22.5350
22.8000 20.3000 16.1000 22.1000 19.4000 21.6000 23.8000
Columns 316 through 322
17.3670 19.0604 19.3382 22.5443 21.3041 23.1810 22.7061
16.2000 17.8000 19.8000 23.1000 21.0000 23.8000 23.1000
Columns 323 through 329
20.5965 18.3056 23.5467 24.9056 22.9716 21.5595 21.3041
20.4000 18.5000 25.0000 24.6000 23.0000 22.2000 19.3000
Columns 330 through 336
25.0149 21.6011 16.6778 20.7220 23.1595 23.1583 21.2709
22.6000 19.8000 17.1000 19.4000 22.2000 20.7000 21.1000
Columns 337 through 343
20.5881 20.0957 22.1791 21.6646 20.9052 32.2760 19.6094
19.5000 18.5000 20.6000 19.0000 18.7000 32.7000 16.5000
Columns 344 through 350
26.6120 30.8497 18.2102 17.3026 24.0279 27.7069 28.6764
23.9000 31.2000 17.5000 17.2000 23.1000 24.5000 26.6000
Columns 351 through 357
24.0160 22.8876 19.0665 27.5481 18.8396 20.1889 17.1304
22.9000 24.1000 18.6000 30.1000 18.2000 20.6000 17.8000
Columns 358 through 364
19.3401 22.0580 20.7871 25.3794 19.9338 23.0456 20.1196
21.7000 22.7000 22.6000 25.0000 19.9000 20.8000 16.8000
Columns 365 through 371
23.6761 32.1277 24.0001 21.9790 51.1021 48.6596 51.0467
21.9000 27.5000 21.9000 23.1000 50.0000 50.0000 50.0000
Columns 372 through 378
38.7150 40.0930 11.7337 11.7271 19.3180 12.2218 14.0871
50.0000 50.0000 13.8000 13.8000 15.0000 13.9000 13.3000
Columns 379 through 385
10.6245 12.3102 11.2811 12.5886 12.6262 12.7699 8.7457
13.1000 10.2000 10.4000 10.9000 11.3000 12.3000 8.8000
Columns 386 through 392
8.8422 5.2698 8.0072 8.1700 14.5322 17.5716 16.0186
7.2000 10.5000 7.4000 10.2000 11.5000 15.1000 23.2000
Columns 393 through 399
9.8385 19.5303 15.8293 16.7939 16.1594 16.0031 5.2560
9.7000 13.8000 12.7000 13.1000 12.5000 8.5000 5.0000
Columns 400 through 406
11.0778 7.5946 12.6534 13.7895 7.5276 6.5614 4.2802
6.3000 5.6000 7.2000 12.1000 8.3000 8.5000 5.0000
Columns 407 through 413
13.7780 28.4038 16.2167 15.3346 16.2930 10.6613 11.2681
11.9000 27.9000 17.2000 27.5000 15.0000 17.2000 17.9000
Columns 414 through 420
15.1122 7.6163 7.5657 8.3258 10.5912 8.7514 10.9284
16.3000 7.0000 7.2000 7.5000 10.4000 8.8000 8.4000
Columns 421 through 427
16.8908 18.9752 21.6162 9.5188 12.4425 8.3453 14.5831
16.7000 14.2000 20.8000 13.4000 11.7000 8.3000 10.2000
Columns 428 through 434
9.2716 13.8301 9.5708 14.5073 11.7783 19.1158 16.5270
10.9000 11.0000 9.5000 14.5000 14.1000 16.1000 14.3000
Columns 435 through 441
14.5899 9.5630 11.7702 7.7339 5.5916 11.7561 6.6963
11.7000 13.4000 9.6000 8.7000 8.4000 12.8000 10.5000
Columns 442 through 448
12.4933 17.3840 12.9751 10.0512 9.2991 15.4655 14.4212
17.1000 18.4000 15.4000 10.8000 11.8000 14.9000 12.6000
Columns 449 through 455
13.2374 13.2950 11.7887 14.2879 15.7591 12.5250 10.9566
14.1000 13.0000 13.4000 15.2000 16.1000 17.8000 14.9000
Columns 456 through 462
13.5464 11.8807 12.4477 16.1538 16.4501 16.3458 18.2755
14.1000 12.7000 13.5000 14.9000 20.0000 16.4000 17.7000
Columns 463 through 469
17.0225 20.7452 18.1601 20.7623 12.4873 15.6683 15.4485
19.5000 20.2000 21.4000 19.9000 19.0000 19.1000 19.1000
Columns 470 through 476
19.0019 17.4639 22.5748 20.2223 22.6186 17.8242 13.9765
20.1000 19.9000 19.6000 23.2000 29.8000 13.8000 13.3000
Columns 477 through 483
16.1047 11.9184 16.2553 22.4462 23.2027 26.7181 26.4324
16.7000 12.0000 14.6000 21.4000 23.0000 23.7000 25.0000
Columns 484 through 490
21.5557 20.0931 20.0937 16.1954 23.1288 15.7670 12.6886
21.8000 20.6000 21.2000 19.1000 20.6000 15.2000 7.0000
Columns 491 through 497
8.1229 18.0084 20.7898 20.3537 22.0706 21.7872 19.0696
8.1000 13.6000 20.1000 21.8000 24.5000 23.1000 19.7000
Columns 498 through 504
19.4128 21.0991 18.7917 20.1158 22.0507 17.6495 24.3503
18.3000 21.2000 17.5000 16.8000 22.4000 20.6000 23.9000
Columns 505 through 506
22.1611 16.1823
22.0000 11.9000
You can see that the majority of the predictions are relatively close to the true values. Some outputs, however, are far off, so it's really about tuning the neural network at this point; just don't tune too much or you will overfit the data.
Check the documentation for more details on sim: http://www.mathworks.com/help/nnet/ref/sim.html
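To address the original question of feeding genuinely unseen test data into the trained network, one option is to hold out part of the dataset before training; here is a minimal sketch (the split sizes are chosen arbitrarily):
[x, t] = house_dataset;            % 13 features x 506 samples
idx     = randperm(size(x, 2));    % shuffle the sample indices
trainId = idx(1:400);              % samples used for training
testId  = idx(401:end);            % held-out samples, never seen during training

net = feedforwardnet(10);
net = train(net, x(:, trainId), t(trainId));

yTest = net(x(:, testId));         % same as sim(net, x(:, testId))
disp([yTest; t(testId)]);          % predictions vs. true targets for the unseen samples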
I have a file named Myfile.txt with a header line and three columns of data:
Header Row1 Row2 Row3
5.10 10 15
5.20 20 25
5.30 30 35
5.40 40 45
5.50 50 55
5.60 60 65
5.70 70 75
5.80 80 85
5.90 90 95
5.95 10 20
6.00 25 30
6.05 35 40
I want to read every third line while the first column increments by 0.1, and then every line once the increment changes to 0.05, so my output looks like:
5.30 30 35
5.60 60 65
5.90 90 95
5.95 10 20
6.00 25 30
6.05 35 40
I have the following code, but I don't know how to implement the condition. Can I get some help doing this?
per_line = 3;        % number of fields per line
every_nth_line = 3;  % keep every 3rd line
fmt = [repmat('%*f',1,per_line*(every_nth_line-1)), repmat('%f',1,per_line)]; % skip two lines, read one
fid = fopen('Myfile.txt','rt');
datacell = textscan(fid,fmt,'delimiter','\n','HeaderLines',1,'CollectOutput',1);
fclose(fid);
C = datacell{1};
You can use the following code:
fileID = fopen('Myfile.txt');
mydata = textscan(fileID,'%f%f%f','HeaderLines',2);           % read the three numeric columns
% rows where the first column increased by 0.1 over the previous row
% (the prepended value makes the very first row count as a 0.1 step)
findx = find(abs(diff([mydata{1}(1)-0.1; mydata{1}]) - 0.1000) < 0.0001);
% rows where the first column increased by 0.05 over the previous row
sindx = find(abs(diff(mydata{1}) - 0.05) < 0.0001) + 1;
alldata = [mydata{:}];
C = [alldata(findx(3:3:end),:); alldata(sindx,:)];            % every 3rd 0.1-step row, then all 0.05-step rows
fclose(fileID);
diff computes the difference between consecutive entries in the first column, abs together with a small tolerance tests floating-point values for equality, and find returns the matching indices.
C contains:
5.3000 30.0000 35.0000
5.6000 60.0000 65.0000
5.9000 90.0000 95.0000
5.9500 10.0000 20.0000
6.0000 25.0000 30.0000
6.0500 35.0000 40.0000
Suppose we are given a number that we would like to compare with all of the numbers in a column of a matrix. For example:
value = 210;
A = [
0.0010 68
0.0011 277
0.0011 129
0.0012 87
0.0015 78
0.0016 248
0.0019 270
0.0019 133
0.0022 258
0.0025 264
0.0029 255
0.0030 81
0.0032 242
0.0033 27
0.0036 124];
Now, we want to compare value against all the numbers in column two under a condition: if the condition holds for every number in the second column, do some computations; otherwise do some other computations. If it fails for even one entry, skip the first branch and continue with the rest of the code.
In the example:
if abs(value - A(:,2)) > 50   % should be true for all A(:,2)
    % do something
else
    % do something else
end
How could one write it in code?
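A minimal sketch of one way this could be expressed, using all() over the second column (the computations are placeholders, as in the question):
value = 210;
if all(abs(value - A(:,2)) > 50)
    % condition holds for every number in column two
    disp('do the first computation');
else
    % at least one entry violates the condition
    disp('do the other computation');
end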
I have a p-by-p-by-n tensor. I want to extract the diagonal elements of each p-by-p slice. Does anyone know how to do this without looping?
Thank you.
Behold the ever mighty and ever powerful bsxfun for vectorizing MATLAB problems; it can do this task very efficiently using MATLAB's linear indexing -
diags = A(bsxfun(@plus,[1:p+1:p*p]',[0:n-1]*p*p))
Sample run with 4 x 4 x 3 sized input array -
A(:,:,1) =
0.7094 0.6551 0.9597 0.7513
0.7547 0.1626 0.3404 0.2551
0.2760 0.1190 0.5853 0.5060
0.6797 0.4984 0.2238 0.6991
A(:,:,2) =
0.8909 0.1493 0.8143 0.1966
0.9593 0.2575 0.2435 0.2511
0.5472 0.8407 0.9293 0.6160
0.1386 0.2543 0.3500 0.4733
A(:,:,3) =
0.3517 0.9172 0.3804 0.5308
0.8308 0.2858 0.5678 0.7792
0.5853 0.7572 0.0759 0.9340
0.5497 0.7537 0.0540 0.1299
diags =
0.7094 0.8909 0.3517
0.1626 0.2575 0.2858
0.5853 0.9293 0.0759
0.6991 0.4733 0.1299
Benchmarking
Here are a few runtime tests comparing this bsxfun-based approach against the repmat + eye based approach for big data sizes -
***** Datasize: 500 x 500 x 500 *****
----------------------- With BSXFUN
Elapsed time is 0.008383 seconds.
----------------------- With REPMAT + EYE
Elapsed time is 0.163341 seconds.
***** Datasize: 800 x 800 x 500 *****
----------------------- With BSXFUN
Elapsed time is 0.012977 seconds.
----------------------- With REPMAT + EYE
Elapsed time is 0.402111 seconds.
***** Datasize: 1000 x 1000 x 500 *****
----------------------- With BSXFUN
Elapsed time is 0.017058 seconds.
----------------------- With REPMAT + EYE
Elapsed time is 0.690199 seconds.
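The exact benchmarking harness isn't shown; here is a rough sketch of how such a comparison could be set up (sizes and variable names are assumptions):
p = 500; n = 500;
A = rand(p, p, n);

tic   % bsxfun-based linear indexing
diags_bsxfun = A(bsxfun(@plus, (1:p+1:p*p)', (0:n-1)*p*p));
toc

tic   % repmat + eye logical mask
ind = repmat(logical(eye(p)), [1 1 n]);
diags_mask = reshape(A(ind), p, n);
toc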
One suggestion I have is to create a p x p logical identity matrix, replicate this n times in the third dimension, and then use this matrix to access your tensor. Something like this, supposing that your tensor was stored in A:
ind = repmat(logical(eye(p)), [1 1 n]);
out = A(ind);
Example use:
>> p = 5; n = 3;
>> A = reshape(1:75, p, p, n)
A(:,:,1) =
1 6 11 16 21
2 7 12 17 22
3 8 13 18 23
4 9 14 19 24
5 10 15 20 25
A(:,:,2) =
26 31 36 41 46
27 32 37 42 47
28 33 38 43 48
29 34 39 44 49
30 35 40 45 50
A(:,:,3) =
51 56 61 66 71
52 57 62 67 72
53 58 63 68 73
54 59 64 69 74
55 60 65 70 75
>> ind = repmat(logical(eye(p)), [1 1 n]);
>> out = A(ind)
out =
1
7
13
19
25
26
32
38
44
50
51
57
63
69
75
You'll notice that we grab the diagonals of the first slice, followed by the diagonals of the second slice, etc. up until the last slice. These values are all concatenated into a single vector.
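If a p-by-n layout with one column per slice is preferred (matching the layout of the bsxfun answer above), the vector can simply be reshaped; a small sketch, assuming out, p and n from above:
diagsMat = reshape(out, p, n);   % column k holds the diagonal of A(:,:,k)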
While reading Divakar's answer and trying to understand why it is again roughly 10 times faster than my idea, I put together code mixing both approaches and ended up with something even faster:
A=reshape(A,[],n);
diags2 = A(1:p+1:p*p,:);
For a 500x500x500 tensor I get 0.008 s for Divakar's solution and 0.005 s for my solution using MATLAB 2013a. Probably plain indexing is the only way to beat bsxfun.
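A minimal usage sketch of the reshape-based version (the dimensions are assumed, and a copy is used so A is not overwritten):
p = 4; n = 3;
A = rand(p, p, n);
B = reshape(A, [], n);        % each p-by-p slice becomes one column of length p*p
diags2 = B(1:p+1:p*p, :);     % within each column, entries 1, p+2, 2p+3, ... are the diagonal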
How can I remove trailing decimal zeros when displaying a matrix C in MATLAB? The matrix is:
65.7500 4.7500 4.7500 64.0000 60.0000
118.9000 105.6000 92.5500 147.6000 178.2000
73.6600 84.0100 95.6900 190.0000 164.0000
147.9000 132.0000 140.0000 147.0000 116.5000
I want it displayed as:
65.75 4.75 4.75 64 60
118.9 105.6 92.55 147.6 178.2
73.66 84.01 95.69 190 164
147.9 132 140 147 116.5
>> format short g
>> C
C =
65.75 4.75 4.75 64 60
118.9 105.6 92.55 147.6 178.2
73.66 84.01 95.69 190 164
147.9 132 140 147 116.5
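Note that format only changes how values are displayed, not the numbers that are stored. If a text representation without trailing zeros is needed (for example, for writing to a file), a %g conversion is one option; this is a sketch, not part of the original answer:
str = num2str(C, '%g ');   % prints each element in compact %g form
disp(str)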
I have a bunch of points in 2D for which I know the value, and I'd like to fit a cubic spline through them to interpolate values at some other nearby points using MATLAB.
My code looks like:
fitobject = fit(x,y,'cubicinterp');
yy=feval(fitobject,xx)
with the following inputs:
Coordinates
x = [...
313 3;
313 5;
313 7;
315 3;
315 5;
317 3;
319 5];
Values
y = [...
28.0779;
28.0186;
11.6220;
16.7640;
23.7139;
-14.7882;
-20.4626];
Interpolation points
xx = [...
313 3;
313 4;
313 5;
313 6;
313 7;
313 8;
313 9;
314 3;
314 5;
314 7;
315 3;
315 4;
315 5;
315 6;
315 7;
316 3;
316 5;
317 3;
317 4;
317 5;
318 3;
319 5;
319 6;
319 7;
320 5];
In my output vector yy, I get several NaN values. To me, the input data look clean (they are all finite values with no NaN). I don't understand what would cause feval to return NaN when evaluating the fit. Why couldn't it give the best possible fit, even if it is a bad one? Is there an error in my approach?
I browsed a bit and it seems the same question has been asked a number of times on the MathWorks forums, but no one gave a clear answer.
Thanks in advance for your help.
It's because an interpolation cannot be used for extrapolation:
%xx(:,1) xx(:,2) yy
313.0000 3.0000 28.0779
313.0000 4.0000 29.5074
313.0000 5.0000 28.0186
313.0000 6.0000 22.3233
313.0000 7.0000 11.6220
313.0000 8.0000 NaN % xx exceeds bounds of original x interval
313.0000 9.0000 NaN % xx exceeds bounds of original x interval
314.0000 3.0000 24.1239
314.0000 5.0000 27.5130
314.0000 7.0000 NaN % xx exceeds bounds of original x interval
315.0000 3.0000 16.7640
315.0000 4.0000 21.7028
315.0000 5.0000 23.7139
315.0000 6.0000 11.2710
315.0000 7.0000 NaN % xx exceeds bounds of original x interval
316.0000 3.0000 1.4641
316.0000 5.0000 13.9662
317.0000 3.0000 -14.7882
317.0000 4.0000 -5.4876
317.0000 5.0000 2.7781
318.0000 3.0000 NaN % xx exceeds bounds of original x interval
319.0000 5.0000 -20.4626
319.0000 6.0000 NaN % xx exceeds bounds of original x interval
319.0000 7.0000 NaN % xx exceeds bounds of original x interval
320.0000 5.0000 NaN % xx exceeds bounds of original x interval
In other words, you're trying to get data beyond the boundaries of your original surface data (extrapolation), which is usually quite risky on its own, and fit does not even allow you to do it.
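If values outside the convex hull of the sample points are genuinely needed, one alternative (not part of the original answer) is scatteredInterpolant, which accepts an extrapolation method; a sketch, with the caveat that extrapolated values should be treated with caution:
F  = scatteredInterpolant(x(:,1), x(:,2), y, 'natural', 'linear');
yy = F(xx(:,1), xx(:,2));   % no NaNs, but values outside the data are extrapolated guesses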
It looks like the points that come up as NaN lie outside the region covered by the interpolation. You can plot it to take a look.
The code I used to play with this is as follows: (Note that I set the NaNs to -25 just so that they could be plotted)
x = [313 3;
313 5;
313 7;
315 3;
315 5;
317 3;
319 5];
y = [
28.0779
28.0186
11.6220
16.7640
23.7139
-14.7882
-20.4626];
fitobject = fit(x,y,'cubicinterp');
xx = [
313 3
313 4
313 5
313 6
313 7
313 8
313 9
314 3
314 5
314 7
315 3
315 4
315 5
315 6
315 7
316 3
316 5
317 3
317 4
317 5
318 3
319 5
319 6
319 7
320 5];
yy = fitobject(xx);
badindices = isnan(yy);
yy(badindices) = -25;
plot(fitobject, xx, yy, 'Exclude', badindices)
By the way, notice that I'm not using feval, but a direct call to fitobject.