Matlab SOM: access generated data and parameters

I'm completely new to Matlab and I need some help.
I'm running a self-organising map with the Neural Networks toolbox.
It all works fine, I use
net = selforgmap([x y]);
net = train(net,mydata);
and then I get access to the nice plots.
However I'm interested in the actual numbers generated by the som.
1) How do I access all the data underneath (is there a way to show all the vectors generated by the SOM package)? For example:
2) How do I access the nodes' weights?
3) How do I access the list of cases and their allocated Best Matching Units?
Many thanks

Unfortunately, I don't have R2012 (and thus I don't have selforgmap), so this answer is potentially too general.
That said, I suspect that the variable net is a Neural Network object, and if you type the following into the Command Window
net
then you'll get a display of the properties of that object (here's a shortened version of what I get):
net =
Neural Network object:
architecture:
numInputs: 1
numLayers: 2
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
numOutputs: 1 (read-only)
numInputDelays: 0 (read-only)
numLayerDelays: 0 (read-only)
And then you can access these properties like this:
net.numInputs
And if you want to see the methods available for that variable, you can do
methods(net)
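If that holds, the specific numbers you ask about should be reachable along these lines (a sketch, assuming net is the trained selforgmap network and mydata is the matrix you passed to train, one sample per column):
w = net.IW{1,1};    % node weight vectors, one row per map node (question 2)
out = net(mydata);  % one-hot output: the winning node for each sample
bmu = vec2ind(out); % Best Matching Unit index for each case (question 3)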

Related

Matlab State Space Model Response system

I have the following lab where I was asked to write the MATLAB command lines for these questions:
1) If the initial conditions are x(0) = [2; 0], find the response y(t).
2) Find the response y(t) due to a step input with amplitude of
3) Find the transfer function for the above state space model.
4) Derive back the state space model from (3).
a = [0,1;0,4];
b = [0;1];
c = [0 -5];
x0=[2;0];
sys = ss(a,b,c,2);
initial(sys,x0); % to get 1
[n,d]=ss2tf(a,b,c,0);
mySys_tf=tf(n,d) % to get 3
[num den] = tfdata(mySys_tf, 'v')
tf2ss(num,den) % to get 4
I have written this code, but it doesn't seem to give me any results in the response graph, so I can't solve 2 either, and I get an error in 4. Can you help me check what is wrong?
I believe the error comes from the fact that the system is unstable. If you plot the system's reaction to a step input using step(), you will see how it goes to infinity. I also don't know how far you are into your controls course and whether you've seen the root locus yet, but you can plot the root locus of the system via rlocus(sys); you'll see that the real part of a root lies in the right half of the plane, which tells you the system is unstable.
The response is 0 and will stay zero as x(2) = 0. It requires an input u to get x(2) off zero. So the graph is totally fine.
Use step(sys) and you will see the drop to -Inf. Optionally you can define the end time: call step(sys,1) to see a reasonable range.
You solved 3 & 4 yourself.
To check stability you simply need to ask MATLAB: isstable(sys) (isn't it convenient? Well, there is a danger that people will forget the theory behind it and how it is connected...)
To check observability: rank(obsv(sys)) and make sure that it equals the order of the system matrix:
assert(rank(obsv(sys)) == length(sys.A), 'System is not observable!')
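Pulling those suggestions together, a minimal sketch (same a, b, c and x0 as in the question; D is taken as 0 here, which is an assumption, not the 2 passed to ss above):
a = [0 1; 0 4]; b = [0; 1]; c = [0 -5]; d = 0;
sys = ss(a, b, c, d);
initial(sys, [2; 0])    % free response from x(0) = [2; 0]: y(t) stays at 0
figure, step(sys, 1)    % step response over 0..1 s: it dives towards -Inf
% rlocus(sys)           % root locus view of the unstable pole
isstable(sys)           % false
eig(a)                  % poles at 0 and 4; the +4 pole is the unstable one
rank(obsv(sys))         % 1 here, less than length(sys.A) = 2, so not observable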

matconvnet classification training last layer (softmax)?

I would like to retrain the vgg-imagenet-f network to do classification (rather than direct image comparison, which is what I have done with my own network).
The downloaded network however is a deployment net, and doesn't have a loss layer included. As I've not done classification training before, I'm a bit stumped as to how to design this last layer. I expect it will be something like this:
layer.name = 'loss' ;
layer.type = 'custom' ;
layer.forward = @forward ;
layer.backward = @backward ;
layer.class = [] ;
but I don't know what my @forward and @backward functions should be. Should they be softmax?
Of note, I have a imdb with about 10k images, corresponding labels, and an ID element with unique numbers running 1 - 10k.
Thanks for any help, or any links to a sample of the way one should construct this layer in matconvnet/matlab!
You could implement your own network, adjusting the filters accordingly. Since you want to 'retrain' vgg instead of initializing the weights with random numbers, you can adapt your classification network using the trained filters from the downloaded network. The last layer could be softmaxloss:
http://www.vlfeat.org/matconvnet/mfiles/vl_nnsoftmaxloss/
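In SimpleNN terms that boils down to something like the following sketch (the file name and the assumption that the downloaded deployment model ends in a plain softmax layer are mine):
net = load('imagenet-vgg-f.mat');
if strcmp(net.layers{end}.type, 'softmax')
    net.layers(end) = [];                          % drop the deployment softmax
end
net.layers{end+1} = struct('type', 'softmaxloss'); % combined softmax + log-loss for training
The cnn_train example code then hands the batch labels to this last layer, so there is usually no need for a fully custom forward/backward pair.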

Generating synthetic data of mixed type (numerical/categorical) in Matlab (or any other)

I'm trying to generate some synthetic data for experiments. When it comes to data sets with numerical features this is rather easy, I just use a Gaussian mixture (using Netlab, a package for Matlab) and that's done.
Noooww, I also need to generate some data sets with numerical and categorical features. The numerical part I can easily do using the above method, what about the categorical?
I was thinking to generate a categorical feature with (say) 3 categories with probabilities of 68.2% (+/- 1 sigma), 27.2% (between +/- 1 sigma and +/- 2 sigma), and 4.6% (the rest) within the objects with the same label.
And perhaps another categorical feature with 5 categories, with probabilities of 34.1%, 34.1%, 13.6%, 13.6%, 4.6% - again, within the objects with the same label.
Does that make sense to you guys? any thoughts?
I can easily write the code for the above, but if you know of any function that does it for me - please let me know.
Thanks!
It's easy to do in Python using numpy:
import numpy as np
np.random.multinomial(n=1, pvals=[.3,.3,.4], size=10)
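In MATLAB the same draw can be done with the Statistics Toolbox, e.g. (a sketch using the 3-category probabilities from the question; n is just an example count of objects sharing one label):
p = [0.682 0.272 0.046];                % sigma-band probabilities for 3 categories
n = 100;                                % number of objects with this label (example value)
cats = randsample(1:3, n, true, p)';    % one category index per object
onehot = mnrnd(1, p, n);                % or one-hot rows, like the numpy call above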

Thumb Recognition in matlab using SVM algo

I am working on a thumb recognition project. The following code reads the 118 images, each of size 42 x 25, and stores them in a training matrix.
training=zeros(118, 1050);
imagefiles = dir('*.png');
nfiles = length(imagefiles);
for ii=1:nfiles
currentfilename = imagefiles(ii).name;
I = imread(currentfilename);
BW=im2bw(I,graythresh(I));
temp = reshape(BW,1,1050);
training(ii,:)=temp;
end
Now I am creating a labelData matrix to assign labels to the images.
labelData = zeros(118,1);
labelData(1:50,:) = 0;
labelData(51:83,:) = 1;
labelData(84:118,:) = 2;
Here I am training my system by giving it the training data and label data.
options=optimset('MaxIter',5000);
SVMStruct = svmtrain(training,labelData,'Kernel_Function','linear','QuadProg_Opts',options);
But when I run this code it gives me errors like:
Error 1 : SVMTRAIN only supports classification into two groups. GROUP contains 3 groups.
Error 2 : SVMStruct = svmtrain(training,labelData,'Kernel_Function','linear','QuadProg_Opts',options);
Kindly help me figure out what the problem is. I used this before and it was working fine, but now I don't know what is going on. Thanks in advance.
Error 1 tells you what the problem is - the MATLAB built-in SVM only supports binary classification. You are assigning 3 classes.
Your options are:
Construct three classifiers: 0 vs. 1,2 then 1 vs. 0,2 then 2 vs. 0,1 and look at the output of each.
Construct 0 vs. not 0 and then 1 vs. 2
Use a multi-class SVM trainer from LIBSVM or svmlight or other such packages.
The error message is pretty clear. MATLAB's svmtrain does not support multiclass classification; that is, only two classes are allowed.
So you have two options: 1) write your own multiclass classifier as a wrapper around svmtrain; you can implement one-vs-all or one-vs-one strategies (a rough sketch follows below). 2) Use an SVM implementation that already supports multiclass classification, such as libsvm.
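A rough one-vs-all sketch around the same svmtrain/svmclassify interface used in the question (testData is a hypothetical matrix prepared the same way as training; in newer releases these functions are deprecated and fitcecoc does multiclass directly):
classes = unique(labelData);                        % here: 0, 1, 2
models = cell(numel(classes), 1);
for k = 1:numel(classes)
    binLabels = double(labelData == classes(k));    % class k vs. the rest
    models{k} = svmtrain(training, binLabels, 'Kernel_Function', 'linear');
end
votes = zeros(size(testData, 1), numel(classes));
for k = 1:numel(classes)
    votes(:, k) = svmclassify(models{k}, testData); % 1 where model k says "class k"
end
[~, idx] = max(votes, [], 2);                       % first class voting yes (crude tie-break)
predicted = classes(idx);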
Your problem is in the labelData vector; check it and find the error. You should use a one-against-all (OAA) architecture if the number of classes is more than two.

Create a Tree Stump Matlab

Any idea on how to create a decision tree stump for use with boosting in Matlab? I mean, is there some parameter I can send to classregtree to make sure I end up with only 1 level? I tried pruning and it doesn't always give a stump (a single cut); sometimes I was only able to get 2 cuts (an unbalanced tree).
I'm aware of the ClassificationTree.template and the fitensemble functions but I want to write my own boosting algorithm to use it with LDA or other classifiers which are not provided by fitensemble.
Thanks
I believe you can just set the minparent parameter equal to your number of observations. Using the iris example data:
>> load fisheriris;
>> t = classregtree(meas,species,...
'names',{'SL' 'SW' 'PL' 'PW'}, 'minparent', 150)
t =
Decision tree for classification
1 if PL<2.45 then node 2 elseif PL>=2.45 then node 3 else setosa
2 class = setosa
3 class = versicolor
Not sure, but it may be quicker to eventually code it manually - especially if you're incorporating other custom code anyway. Good luck!
If t1 is your tree, as returned by classregtree, I think you can create a decision stump t2 with the command
t2 = prune(t1, 'level', max(prunelist(t1)-1));
Does that do what you need?
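With either approach you can sanity-check that the result really is a stump and call it from your own boosting loop (a sketch reusing the Fisher iris data from above):
load fisheriris;
stump = classregtree(meas, species, 'minparent', size(meas, 1));
numnodes(stump)                          % 3 = one split plus two leaves, i.e. a stump
pred = eval(stump, meas);                % predicted class labels (cell array of strings)
trainErr = mean(~strcmp(pred, species))  % resubstitution error of the stump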