📄 demolvq1.html
<!--This HTML is auto-generated from an m-file. Your changes will be overwritten.--><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:#990000; font-weight:bold; font-size:x-large">Learning Vector Quantization</p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">An LVQ network is trained to classify input vectors according to given targets.</p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">Copyright 1992-2002 The MathWorks, Inc. $Revision: 1.14 $ $Date: 2002/03/29 19:36:12 $</p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:#990000; font-weight:bold; font-size:medium; page-break-before: auto;"><a name=""></a></p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">Let P be 10 2-element example input vectors and C be the classes these vectors fall into. These classes can be transformed into vectors to be used as targets, T, with IND2VEC.</p><pre xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="position: relative; left:30px">P = [-3 -2 -2 0 0 0 0 +2 +2 +3;
      0 +1 -1 +2 +1 -1 -2 +1 -1 0];
C = [1 1 1 2 2 2 2 1 1 1];
T = ind2vec(C);</pre><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:#990000; font-weight:bold; font-size:medium; page-break-before: auto;"><a name=""></a></p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">Here the data points are plotted. Red = class 1, Cyan = class 2. 
The LVQ network represents clusters of vectors with hidden neurons, and groups the clusters with output neurons to form the desired classes.</p><pre xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="position: relative; left:30px">colormap(hsv);
plotvec(P,C)
title(<span style="color:#B20000">'Input Vectors'</span>);
xlabel(<span style="color:#B20000">'P(1)'</span>);
ylabel(<span style="color:#B20000">'P(2)'</span>);</pre><img xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" src="demolvq1_img03.gif"><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:#990000; font-weight:bold; font-size:medium; page-break-before: auto;"><a name=""></a></p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">NEWLVQ creates an LVQ layer and here takes four arguments: an Rx2 matrix of min and max values for the R input elements, the number of hidden neurons, a vector of typical class percentages, and the learning rate.</p><pre xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="position: relative; left:30px">net = newlvq(minmax(P),4,[.6 .4],0.1);</pre><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:#990000; font-weight:bold; font-size:medium; page-break-before: auto;"><a name=""></a></p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">The competitive neuron weight vectors are plotted as follows.</p><pre xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="position: relative; left:30px">hold on
W1 = net.IW{1};
plot(W1(1,1),W1(1,2),<span style="color:#B20000">'ow'</span>)
title(<span style="color:#B20000">'Input/Weight Vectors'</span>);
xlabel(<span style="color:#B20000">'P(1), W(1)'</span>);
ylabel(<span style="color:#B20000">'P(2), W(3)'</span>);</pre><img xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" 
src="demolvq1_img05.gif"><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:#990000; font-weight:bold; font-size:medium; page-break-before: auto;"><a name=""></a></p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">To train the network, first override the default number of epochs, and then train the network. When it is finished, replot the input vectors '+' and the competitive neurons' weight vectors 'o'. Red = class 1, Cyan = class 2.</p><pre xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="position: relative; left:30px">net.trainParam.epochs=150;
net.trainParam.show=Inf;
net=train(net,P,T);
cla;
plotvec(P,C);
hold on;
plotvec(net.IW{1}',vec2ind(net.LW{2}),<span style="color:#B20000">'o'</span>);</pre><img xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" src="demolvq1_img06.gif"><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:#990000; font-weight:bold; font-size:medium; page-break-before: auto;"><a name=""></a></p><p xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">Now use the LVQ network as a classifier, where each neuron corresponds to a different category. Present the input vector [0.2; 1]. Red = class 1, Cyan = class 2.</p><pre xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="position: relative; left:30px">p = [0.2; 1];
a = vec2ind(sim(net,p))</pre><pre xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" style="color:gray; font-style:italic;">a = 2</pre><originalCode xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd" code="%% Learning Vector Quantization
% An LVQ network is trained to classify input vectors according to given
% targets.
%
% Copyright 1992-2002 The MathWorks, Inc.
% $Revision: 1.14 $ $Date: 2002/03/29 19:36:12 $

%%
% Let P be 10 2-element example input vectors and C be the classes these vectors
% fall into. These classes can be transformed into vectors to be used as
% targets, T, with IND2VEC.

P = [-3 -2 -2 0 0 0 0 +2 +2 +3;
 0 +1 -1 +2 +1 -1 -2 +1 -1 0];
C = [1 1 1 2 2 2 2 1 1 1];
T = ind2vec(C);
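%%
% (Added note, not part of the original demo.) IND2VEC returns a sparse
% matrix with a 1 in row C(i) of column i, i.e. one-hot target vectors.
% A quick check:
%
%   full(T)   % row 1 marks the class-1 columns, row 2 the class-2 columns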

%%
% Here the data points are plotted. Red = class 1, Cyan = class 2. The LVQ
% network represents clusters of vectors with hidden neurons, and groups the
% clusters with output neurons to form the desired classes.

colormap(hsv);
plotvec(P,C)
title('Input Vectors');
xlabel('P(1)');
ylabel('P(2)');

%%
% NEWLVQ creates an LVQ layer and here takes four arguments: an Rx2 matrix of
% min and max values for the R input elements, the number of hidden neurons, a
% vector of typical class percentages, and the learning rate.

net = newlvq(minmax(P),4,[.6 .4],0.1);
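%%
% (Added note, not part of the original demo.) MINMAX(P) pairs each row of P
% with its minimum and maximum; NEWLVQ uses these ranges to place the
% initial weights inside the input space.
%
%   minmax(P)   % -> [-3 3; -2 2] for the P above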

%%
% The competitive neuron weight vectors are plotted as follows.

hold on
W1 = net.IW{1};
plot(W1(1,1),W1(1,2),'ow')
title('Input/Weight Vectors');
xlabel('P(1), W(1)');
ylabel('P(2), W(3)');

%%
% To train the network, first override the default number of epochs, and then
% train the network. When it is finished, replot the input vectors '+' and the
% competitive neurons' weight vectors 'o'. Red = class 1, Cyan = class 2.

net.trainParam.epochs=150;
net.trainParam.show=Inf;
net=train(net,P,T);

cla;
plotvec(P,C);
hold on;
plotvec(net.IW{1}',vec2ind(net.LW{2}),'o');

%%
% Now use the LVQ network as a classifier, where each neuron corresponds to a
% different category. Present the input vector [0.2; 1]. Red = class 1, Cyan =
% class 2.

p = [0.2; 1];
a = vec2ind(sim(net,p))
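
%%
% (Added sketch, not part of the original demo.) SIM's result can be
% reproduced by hand: the competitive layer picks the prototype in net.IW{1}
% nearest to p, and net.LW{2} maps that winning neuron to its class.
%
%   W1 = net.IW{1};                                    % one prototype per row
%   [dummy,k] = min(sum((W1 - repmat(p',4,1)).^2,2));  % nearest prototype
%   vec2ind(net.LW{2}(:,k))                            % class of neuron k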
"></originalCode>