📄 mperceptron.html
<html><head> <meta HTTP-EQUIV="Content-Type" CONTENT="text/html;charset=ISO-8859-1"> <title>mperceptron.m</title><link rel="stylesheet" type="text/css" href="../../../m-syntax.css"></head><body><code><span class=defun_kw>function</span> <span class=defun_out>model </span>= <span class=defun_name>mperceptron</span>(<span class=defun_in> data, options, init_model </span>)<br><span class=h1>% MPERCEPTRON Perceptron algorithm to train a linear machine.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Synopsis:</span></span><br><span class=help>% model = mperceptron(data)</span><br><span class=help>% model = mperceptron(data,options)</span><br><span class=help>% model = mperceptron(data,options,init_model)</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Description:</span></span><br><span class=help>% model = mperceptron(data) uses the Perceptron learning rule</span><br><span class=help>% to train a linear machine (multi-class linear classifier).</span><br><span class=help>% The multi-class problem is transformed into a single-class</span><br><span class=help>% one using Kessler's construction [DHS01][SH10].</span><br><span class=help>%</span><br><span class=help>% model = mperceptron(data,options) specifies the stopping condition of</span><br><span class=help>% the algorithm in the structure options:</span><br><span class=help>% .tmax [1x1]... maximal number of iterations.</span><br><span class=help>%</span><br><span class=help>% model = mperceptron(data,options,init_model) specifies an initial </span><br><span class=help>% model, which must contain:</span><br><span class=help>% .W [dim x nclass] ... Normal vectors.</span><br><span class=help>% .b [nclass x 1] ... 
Biases.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Input:</span></span><br><span class=help>% data [struct] Labeled training data:</span><br><span class=help>% .X [dim x num_data] Training vectors.</span><br><span class=help>% .y [1 x num_data] Labels (1,2,...,nclass).</span><br><span class=help>%</span><br><span class=help>% options [struct] </span><br><span class=help>% .tmax [1x1] Maximal number of iterations (default tmax=inf).</span><br><span class=help>% </span><br><span class=help>% init_model [struct] Initial model; must contain items .W, .b.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Output:</span></span><br><span class=help>% model [struct] Multi-class linear classifier:</span><br><span class=help>% .W [dim x nclass] Normal vectors.</span><br><span class=help>% .b [nclass x 1] Biases.</span><br><span class=help>%</span><br><span class=help>% .exitflag [1x1] 1 ... perceptron has converged.</span><br><span class=help>% 0 ... 
number of iterations exceeded tmax.</span><br><span class=help>% .t [1x1] Number of iterations.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Example:</span></span><br><span class=help>% data = load('pentagon');</span><br><span class=help>% model = mperceptron( data );</span><br><span class=help>% figure; ppatterns( data ); pboundary( model );</span><br><span class=help>%</span><br><span class=help>% See also </span><br><span class=help>% PERCEPTRON, LINCLASS, EKOZINEC.</span><br><span class=help>%</span><br><hr><span class=help1>% <span class=help1_field>Modifications:</span></span><br><span class=help1>% 21-may-2004, VF</span><br><span class=help1>% 18-may-2004, VF</span><br><br><hr><span class=comment>% input arguments</span><br><span class=comment>%----------------------------------------</span><br>[dim,num_data] = size(data.X);<br>nclass = max(data.y);<br><br><span class=keyword>if</span> <span class=stack>nargin</span> < 2, options = []; <span class=keyword>else</span> options = c2s(options); <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'tmax'</span>), options.tmax = inf; <span class=keyword>end</span> <br><br><span class=keyword>if</span> <span class=stack>nargin</span> == 3,<br> model = init_model;<br><span class=keyword>else</span><br> model.W = zeros(dim,nclass);<br> model.b = zeros(nclass,1);<br><span class=keyword>end</span><br>model.t = 0;<br><br><span class=comment>% main loop</span><br><span class=comment>% -----------------------------------</span><br>model.exitflag = 0;<br><span class=keyword>while</span> options.tmax > model.t & model.exitflag == 0,<br><br> model.t = model.t+1;<br><br> model.exitflag = 1;<br> <br> <span class=comment>% search for misclassified vector</span><br> <span class=keyword>for</span> i=1:nclass,<br><br> class_i = find( data.y == i);<br> dfce_i = model.W(:,i)'*data.X(:,class_i) + model.b(i);<br><br> <span class=keyword>for</span> 
j=setdiff([1:nclass], i),<br> <br> dfce_j = model.W(:,j)'*data.X(:,class_i) + model.b(j);<br> <br> [min_diff,inx] = min( dfce_i - dfce_j);<br> <br> <span class=keyword>if</span> min_diff <= 0,<br> <span class=comment>% Perceptron rule</span><br> <br> <span class=comment>% take index of misclassified vector</span><br> inx=class_i(inx(1));<br><br> model.W(:,i) = model.W(:,i) + data.X(:,inx);<br> model.b(i) = model.b(i) + 1;<br><br> model.W(:,j) = model.W(:,j) - data.X(:,inx);<br> model.b(j) = model.b(j) - 1;<br> <br> <span class=comment>% error was found</span><br> model.exitflag = 0;<br> <span class=jump>break</span>;<br> <span class=keyword>end</span><br> <br> <span class=keyword>end</span><br> <span class=keyword>if</span> model.exitflag == 0, <span class=jump>break</span>; <span class=keyword>end</span><br> <span class=keyword>end</span><br> <br><span class=keyword>end</span><br><br>model.fun = <span class=quotes>'linclass'</span>;<br><br><span class=jump>return</span>;<br><span class=comment>% EOF</span><br></code>
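The update rule above rewards the true class and penalizes the rival class whenever some training vector scores no higher under its own discriminant than under another class's. For readers without MATLAB, here is a minimal NumPy sketch of the same multi-class (Kessler-style) perceptron loop; it is not part of the STPRtool distribution, and all names (`mperceptron_np`, `tmax`, etc.) are illustrative:

```python
import numpy as np

def mperceptron_np(X, y, tmax=1000):
    """Multi-class Perceptron mirroring the MATLAB loop above.

    X : (dim, num_data) training vectors.
    y : (num_data,) labels in {0, ..., nclass-1} (MATLAB uses 1..nclass).
    Returns (W, b, converged, t).
    """
    dim, num_data = X.shape
    nclass = int(y.max()) + 1
    W = np.zeros((dim, nclass))
    b = np.zeros(nclass)

    t = 0
    converged = False
    while t < tmax and not converged:
        t += 1
        converged = True
        for i in range(nclass):                       # search for a misclassified vector
            idx_i = np.flatnonzero(y == i)
            if idx_i.size == 0:
                continue
            f_i = W[:, i] @ X[:, idx_i] + b[i]        # scores under the true class
            for j in range(nclass):
                if j == i:
                    continue
                f_j = W[:, j] @ X[:, idx_i] + b[j]    # scores under a rival class
                diff = f_i - f_j
                k = int(np.argmin(diff))
                if diff[k] <= 0:                      # Perceptron rule on the worst offender
                    x = X[:, idx_i[k]]
                    W[:, i] += x; b[i] += 1           # reward the true class
                    W[:, j] -= x; b[j] -= 1           # penalize the rival class
                    converged = False                 # an error was found; rescan
                    break
            if not converged:
                break
    return W, b, converged, t
```

A trained model classifies a new vector by `argmax` over the nclass linear discriminants, i.e. `np.argmax(W.T @ x + b)`, which is what STPRtool's `linclass` does with the returned `.W` and `.b`.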
</body></html>