svmquadprog.m
function model = svmquadprog(data,options)
% SVMQUADPROG SVM trained by Matlab Optimization Toolbox.
%
% Synopsis:
%  model = svmquadprog( data )
%  model = svmquadprog( data, options )
%
% Description:
%  This function trains the binary Support Vector Machines classifier
%  with L1 or L2-soft margin. The SVM quadratic programming task
%  is solved by 'quadprog.m' of the Matlab Optimization Toolbox.
%
%  See 'help svmclass' to see how to classify data with the found classifier.
%
% Input:
%  data [struct] Binary labeled training data:
%   .X [dim x num_data] Vectors.
%   .y [1 x num_data] Training labels.
%
%  options [struct] Control parameters:
%   .ker [string] Kernel identifier (default 'linear').
%     See 'help kernel' for more info.
%   .arg [1 x nargs] Kernel argument(s).
%   .C SVM regularization constant (default inf):
%     [1 x 1] .. the same for all training vectors,
%     [1 x 2] .. for each class separately, C=[C1,C2],
%     [1 x num_data] .. for each training vector separately.
%   .norm [1x1] 1 .. L1-soft margin penalization (default).
%               2 .. L2-soft margin penalization.
%
% Output:
%  model [struct] Binary SVM classifier:
%   .Alpha [nsv x 1] Weights.
%   .b [1x1] Bias of the decision function.
%   .sv.X [dim x nsv] Support vectors.
%   .nsv [1x1] Number of support vectors.
%   .kercnt [1x1] Number of used kernel evaluations.
%   .trnerr [1x1] Training classification error.
%   .margin [1x1] Margin of the found classifier.
%   .cputime [1x1] Used CPU time in seconds.
%   .options [struct] Copy of used options.
%   .exitflag [1x1] Exitflag of the QUADPROG function
%     (if > 0 then it has converged to the solution).
%
% Example:
%  data = load('riply_trn');
%  options = struct('ker','rbf','arg',1,'C',10);
%  model = svmquadprog(data,options)
%  figure; ppatterns(data); psvm(model);
%
% See also
%  SMO, SVMLIGHT, SVMCLASS.
%

% About: Statistical Pattern Recognition Toolbox
% (C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac
% <a href="http://www.cvut.cz">Czech Technical University Prague</a>
% <a href="http://www.feld.cvut.cz">Faculty of Electrical Engineering</a>
% <a href="http://cmp.felk.cvut.cz">Center for Machine Perception</a>

% Modifications:
% 31-may-2004, VF
% 16-may-2004, VF
% 17-Feb-2003, VF
% 28-Nov-2001, VF, used quadprog instead of qp
% 23-Oct-2001, VF
% 19-September-2001, V. Franc, renamed to svmmot.
% 8-July-2001, V.Franc, comments changed, bias mistake removed.
% 28-April-2001, V.Franc, flops counter added
% 10-April-2001, V. Franc, created

% timer
tic;

% input arguments
%-------------------------------------------
data = c2s(data);
[dim,num_data] = size(data.X);
if nargin < 2, options = []; else options = c2s(options); end
if ~isfield(options,'ker'), options.ker = 'linear'; end
if ~isfield(options,'arg'), options.arg = 1; end
if ~isfield(options,'C'), options.C = inf; end
if ~isfield(options,'norm'), options.norm = 1; end
if ~isfield(options,'mu'), options.mu = 1e-12; end
if ~isfield(options,'eps'), options.eps = 1e-12; end

% Set up QP task
%----------------------------

% labels {1,2} -> {1,-1}
y = data.y(:);
y(find(y==2)) = -1;

% compute kernel matrix
H = kernel(data.X,data.X,options.ker,options.arg).*(y*y');

% add small numbers to the diagonal
H = H + options.mu*eye(size(H));

Aeq = y';
beq = 0;

f = -ones(num_data,1);   % Alpha

LB = zeros(num_data,1);  % 0 <= Alpha
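
% The arrays above, together with the upper bound UB set below, define
% the dual SVM task in the canonical form expected by QUADPROG,
%
%   min   0.5*Alpha'*H*Alpha + f'*Alpha
%   s.t.  Aeq*Alpha = beq   (i.e. y'*Alpha = 0),
%         LB <= Alpha <= UB,
%
% where H(i,j) = y(i)*y(j)*k(x_i,x_j). For the L1-soft margin UB equals
% the regularization constant C (per training vector); for the L2-soft
% margin the box constraint is dropped (UB = inf) and 1/(2*C_i) is added
% to the diagonal of H instead.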
x0 = zeros(num_data,1);  % starting point

if options.norm == 1,
  % L1-soft margin
  %---------------------
  if length(options.C) == 1,
    UB = options.C*ones(num_data,1);
  elseif length(options.C) == 2,
    UB = zeros(num_data,1);
    UB(find(data.y==1)) = options.C(1);
    UB(find(data.y==2)) = options.C(2);
  else
    UB = options.C(:);
  end
  vectorC = zeros(num_data,1);
else
  % L2-soft margin
  %---------------------
  UB = ones(num_data,1)*inf;
  vectorC = ones(num_data,1);
  if length(options.C) == 1,
    vectorC = vectorC*options.C;
  elseif length(options.C) == 2,
    inx1 = find(data.y==1); inx2 = find(data.y==2);
    vectorC(inx1) = options.C(1);
    vectorC(inx2) = options.C(2);
  else
    vectorC = options.C(:);
  end
  vectorC = 1./(2*vectorC);
  H = H + diag(vectorC);
end

% call optimization toolbox
% ----------------------------------
qp_options = optimset('Display','off');
[Alpha,fval,exitflag] = quadprog(H, f, [], [], Aeq, beq, LB, UB, x0, qp_options);

inx_sv = find( Alpha > options.eps );

% compute bias
%--------------------------
% take boundary (f(x)=+/-1) support vectors 0 < Alpha(i) < UB(i)
inx_bound = find( Alpha > options.eps & Alpha < (UB - options.eps));

if ~isempty( inx_bound ),
  model.b = sum(y(inx_bound) - H(inx_bound,inx_sv)*...
    Alpha(inx_sv).*y(inx_bound))/length( inx_bound );
else
  disp('Bias cannot be determined.');
  model.b = 0;
end

% compute margin
%------------------------
if options.norm == 1,
  w2 = Alpha(inx_sv)'*H(inx_sv,inx_sv)*Alpha(inx_sv);
else
  w2 = Alpha(inx_sv)'*(H(inx_sv,inx_sv)-diag(vectorC(inx_sv)))*Alpha(inx_sv);
end

margin = 1/sqrt(w2);

% fill the output model and compute training classification error
%-------------------------------------------
Alpha = Alpha.*y;
model.Alpha = Alpha( inx_sv );
model.sv.X = data.X(:,inx_sv);
model.sv.y = data.y(inx_sv);
model.sv.inx = inx_sv;
model.nsv = length( inx_sv );
model.margin = margin;
model.exitflag = exitflag;
model.options = options;
model.kercnt = num_data*(num_data+1)/2;
model.trnerr = cerror(data.y, svmclass(data.X, model));
model.fun = 'svmclass';

% used CPU time
model.cputime = toc;

return;
% EOF
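
Usage note: the Example in the help above covers the default L1-soft margin; the sketch below exercises the per-class C and L2-soft margin options documented in the Input section. It is a minimal sketch and assumes the riply_trn data set and the svmclass and cerror functions referenced in the help are on the Matlab path.

% minimal usage sketch (assumes riply_trn, svmclass and cerror are available)
data = load('riply_trn');

% L2-soft margin, RBF kernel, a different C for each class
options = struct('ker','rbf','arg',1,'C',[10 5],'norm',2);
model = svmquadprog(data, options);

% classify the training data with the found classifier
ypred = svmclass(data.X, model);
fprintf('nsv = %d, training error = %.4f\n', model.nsv, cerror(data.y, ypred));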