
📄 democ1.html

📁 Example programs for the neural network learning process
💻 HTML
%% Competitive Learning
% Neurons in a competitive layer learn to represent different regions of the
% input space where input vectors occur.
%
% Copyright 1992-2002 The MathWorks, Inc.
% $Revision: 1.18 $  $Date: 2002/03/29 19:36:25 $

%%
% P is a set of randomly generated but clustered test data points.  Here the
% data points are plotted.
%
% A competitive network will be used to classify these points into natural
% classes.

% Create P.
X = [0 1; 0 1];   % Cluster centers to be in these bounds.
clusters = 8;     % This many clusters.
points = 10;      % Number of points in each cluster.
std_dev = 0.05;   % Standard deviation of each cluster.
P = nngenc(X,clusters,points,std_dev);

% Plot P.
plot(P(1,:),P(2,:),'+r');
title('Input Vectors');
xlabel('p(1)');
ylabel('p(2)');

% (Figure democ1_img02.gif: the clustered input vectors.)

%%
% Here NEWC takes three input arguments: an Rx2 matrix of min and max values
% for R input elements, the number of neurons, and the learning rate.
%
% We can plot the weight vectors to see their initial attempt at
% classification.  The weight vectors (o's) will be trained so that they
% occur centered in clusters of input vectors (+'s).

net = newc([0 1;0 1],8,.1);
w = net.IW{1};
plot(P(1,:),P(2,:),'+r');
hold on;
circles = plot(w(:,1),w(:,2),'ob');

% (Figure democ1_img03.gif: initial weight vectors plotted over the inputs.)
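% Aside (not part of the original demo): during training, a competitive layer
% moves only the winning neuron's weight vector toward each input, using the
% Kohonen weight-learning rule (LEARNK in the toolbox).  Below is a minimal
% hand-written sketch of a single update step, ignoring the bias
% ("conscience") mechanism that NEWC also sets up; p1, d, winner, lr, and w2
% are illustrative names, not part of the demo.
p1 = P(:,1);                                    % one example input vector (2x1)
d  = sum((w - ones(size(w,1),1)*p1').^2, 2);    % squared distance from p1 to each weight row
[dmin, winner] = min(d);                        % the closest (winning) neuron
lr = 0.1;                                       % same learning rate passed to NEWC above
w2 = w;
w2(winner,:) = w(winner,:) + lr*(p1' - w(winner,:));  % Kohonen step: move the winner toward p1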
%%
% Set the number of epochs to train before stopping, and train this
% competitive layer (this may take several seconds).
%
% Plot the updated layer weights on the same graph.

net.trainParam.epochs = 7;
net = train(net,P);
w = net.IW{1};
delete(circles);
plot(w(:,1),w(:,2),'ob');

% Training output:
%   TRAINR, Epoch 0/7
%   TRAINR, Epoch 7/7
%   TRAINR, Maximum epoch reached.

% (Figure democ1_img04.gif: trained weight vectors centered on the clusters.)

%%
% Now we use the competitive layer as a classifier, where each neuron
% corresponds to a different category.  Here we present the input vector
% [0; 0.2].
%
% The output, a, indicates which neuron is responding, and thereby to which
% class the input belongs.  Note that SIM returns outputs in sparse matrix
% form for competitive layers.

p = [0; 0.2];
a = sim(net,p)

% Output:
%   a =
%      (7,1)        1
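% Aside (not part of the original demo): the sparse output of SIM can be
% turned into a plain class index with VEC2IND from the same toolbox, and the
% whole data set can be classified in a single call.  A minimal sketch,
% assuming the network trained above:
class_of_p = vec2ind(a)            % index of the responding neuron (7 for this run)
classes    = vec2ind(sim(net,P))   % class index for every column of P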
