<html><head> <meta HTTP-EQUIV="Content-Type" CONTENT="text/html;charset=ISO-8859-1"> <title>kfd.m</title><link rel="stylesheet" type="text/css" href="../../m-syntax.css"></head><body><code><span class=defun_kw>function</span> <span class=defun_out>model</span>=<span class=defun_name>kfd</span>(<span class=defun_in>data,options</span>)<br><span class=h1>% KFD Kernel Fisher Discriminant.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Synopsis:</span></span><br><span class=help>% model = kfd( data )</span><br><span class=help>% model = kfd( data, options )</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Description:</span></span><br><span class=help>% This function is an implementation of the Kernel Fisher</span><br><span class=help>% Discriminant (KFD) [Mika99a]. The aim is to find a binary </span><br><span class=help>% kernel classifier which is a linear decision function in a </span><br><span class=help>% feature space induced by the selected kernel function. </span><br><span class=help>% The bias of the decision function is trained by a </span><br><span class=help>% linear SVM on the data projected onto the optimal direction.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Input:</span></span><br><span class=help>% data [struct] Training binary labeled data:</span><br><span class=help>% .X [dim x num_data] Vectors.</span><br><span class=help>% .y [1 x num_data] Labels (1 or 2).</span><br><span class=help>%</span><br><span class=help>% options [struct] Control parameters:</span><br><span class=help>% .ker [string] Kernel identifier (default 'linear'). 
</span><br><span class=help>% See 'help kernel' for more info.</span><br><span class=help>% .arg [1 x nargs] Kernel argument(s).</span><br><span class=help>% .C [1x1] Regularization constant of the linear 1-D SVM </span><br><span class=help>% used to optimize the bias (default C=inf).</span><br><span class=help>% .mu [1x1] Regularization constant added to the diagonal of </span><br><span class=help>% the within scatter matrix (default 1e-4).</span><br><span class=help>% </span><br><span class=help>% <span class=help_field>Output:</span></span><br><span class=help>% model [struct] Binary SVM classifier:</span><br><span class=help>% .Alpha [num_data x 1] Weight vector.</span><br><span class=help>% .b [1x1] Bias of decision function.</span><br><span class=help>% .sv.X [dim x num_data] Training data (support vectors).</span><br><span class=help>%</span><br><span class=help>% .trnerr [1x1] Training classification error.</span><br><span class=help>% .kercnt [1x1] Number of kernel evaluations used during training.</span><br><span class=help>% .nsv [1x1] Number of support vectors.</span><br><span class=help>% .options [struct] Copy of options.</span><br><span class=help>% .cputime [1x1] Used cputime.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Example:</span></span><br><span class=help>% trn = load('riply_trn');</span><br><span class=help>% options = struct('ker','rbf','arg',1,'C',10,'mu',0.001);</span><br><span class=help>% model = kfd(trn, options)</span><br><span class=help>% figure; ppatterns(trn); psvm(model);</span><br><span class=help>%</span><br><span class=help>% See also </span><br><span class=help>% SVMCLASS, FLD, SVM.</span><br><span class=help>%</span><br><hr><span class=help1>% <span class=help1_field>Modifications:</span></span><br><span class=help1>% 17-may-2004, VF</span><br><span class=help1>% 14-may-2004, VF</span><br><span class=help1>% 7-july-2003, VF</span><br><br><hr><span class=comment>% 
timer</span><br>tic;<br><br><span class=comment>% processing inputs</span><br><span class=comment>% ======================================</span><br>[dim,num_data]=size(data.X);<br><span class=keyword>if</span> <span class=stack>nargin</span> < 2, options=[]; <span class=keyword>else</span> options=c2s(options); <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'ker'</span>), options.ker = <span class=quotes>'linear'</span>; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'arg'</span>), options.arg = 1; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'mu'</span>), options.mu = 1e-4; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'C'</span>), options.C = inf; <span class=keyword>end</span><br><br><br><span class=comment>% creates matrices M and N </span><br><span class=comment>%=================================</span><br>inx1=find(data.y==1);<br>inx2=find(data.y==2);<br>l1=length(inx1);<br>l2=length(inx2);<br><br>K = kernel(data.X,options.ker,options.arg);<br><br>M1=sum(K(:,inx1),2)/l1;<br>M2=sum(K(:,inx2),2)/l2;<br>M=(M1-M2)*(M1-M2)';<br><br>E1=eye(l1,l1);<br>E2=eye(l2,l2);<br>J1=ones(l1,l1)/l1;<br>J2=ones(l2,l2)/l2;<br><br>N = K(:,inx1)*(E1-J1)*K(:,inx1)<span class=quotes>' + K(:,inx2)*(E2-J2)*K(:,inx2)'</span>;<br><br><span class=comment>% regularization</span><br>N = N + options.mu * eye(num_data,num_data);<br><br><span class=comment>% Optimization</span><br><span class=comment>%==============================</span><br><span class=comment>%%[Alpha,V,U] = svds( inv(N)*M,1);</span><br>Alpha=inv(N)*(M1-M2); <span class=comment>% It yields the same Alpha up to scale</span><br><br><span class=comment>% project data on the found direction</span><br>projx=(K*Alpha)';<br><br><span class=comment>% training bias of decision rule</span><br>lin_model = svm1d(struct(<span 
class=quotes>'X'</span>,projx,<span class=quotes>'y'</span>,data.y),struct(<span class=quotes>'C'</span>,options.C));<br><br><span class=comment>% fill output structure</span><br><span class=comment>%===============================</span><br>model.Alpha = lin_model.W*Alpha(:);<br>model.b = lin_model.b;<br>model.sv = data;<br><span class=keyword>if</span> strcmp(options.ker,<span class=quotes>'linear'</span>),<br> <span class=comment>% in the linear case, compute the normal vector explicitly</span><br> model.W = model.sv.X*model.Alpha;<br><span class=keyword>end</span> <br>model.trnerr = lin_model.trnerr;<br>model.nsv = num_data;<br>model.kercnt = num_data*(num_data+1)/2;<br>model.options = options;<br>model.fun = <span class=quotes>'svmclass'</span>;<br>model.cputime=toc;<br><br><span class=jump>return</span>;<br></code>