SVM2 - Learning of a binary SVM classifier with L2-soft margin.

Synopsis:
  model = svm2(data)
  model = svm2(data,options)

Description:
  This function learns a binary Support Vector Machines classifier
  with L2-soft margin. The corresponding quadratic programming task
  is solved by one of the following algorithms:
    mdm  ... Mitchell-Demyanov-Malozemov (MDM) algorithm.
    imdm ... Improved MDM algorithm (IMDM) (default).

  For more information refer to V. Franc: Optimization Algorithms for
  Kernel Methods. Research report CTU-CMP-2005-22, CTU FEL Prague, 2005.
  ftp://cmp.felk.cvut.cz/pub/cmp/articles/franc/Franc-PhD.pdf
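  For reference, the quadratic program behind the L2-soft margin SVM can be
  written in its standard textbook dual form (a general statement of the
  problem, not text taken from the report above):

    \[
    \max_{\alpha \ge 0,\; \sum_i \alpha_i y_i = 0} \;
    \sum_{i=1}^{n} \alpha_i
    - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j
      \left( k(\mathbf{x}_i,\mathbf{x}_j) + \frac{\delta_{ij}}{2C} \right)
    \]

  where k is the kernel selected by options.ker, C is options.C, n = num_data,
  y_i in {+1,-1} stand for the two class labels (stored as 1 and 2 in data.y),
  and delta_ij is the Kronecker delta. The MDM and IMDM solvers maintain lower
  and upper bounds on the optimal value (reported below as model.stat.LB and
  model.stat.UB); the tolabs/tolrel stopping conditions are applied to the gap
  between these bounds, with tmax as an iteration cap.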
Input:
  data [struct] Training data:
    .X [dim x num_data] Training vectors.
    .y [1 x num_data] Labels; must equal 1 and/or 2.

  options [struct] Control parameters:
    .ker     [string] Kernel identifier. See 'help kernel'.
    .arg     [1 x nargs] Kernel argument(s).
    .C       [1x1] Regularization constant.
    .solver  [string] Solver to be used: 'mdm', 'imdm' (default).
    .tmax    [1x1] Maximal number of iterations (default inf).
    .tolabs  [1x1] Absolute tolerance stopping condition (default 0.0).
    .tolrel  [1x1] Relative tolerance stopping condition (default 1e-3).
    .thlb    [1x1] Threshold on the lower bound (default inf).
    .cache   [1x1] Number of columns of the kernel matrix to be cached (default 1000).
    .verb    [1x1] If > 0 then progress info is displayed (default 0).

Output:
  model [struct] Binary SVM classifier:
    .Alpha   [nsv x 1] Weights of support vectors.
    .b       [1x1] Bias of the decision function.
    .sv.X    [dim x nsv] Support vectors.
    .sv.inx  [1 x nsv] Indices of SVs (model.sv.X = data.X(:,inx)).
    .nsv     [int] Number of support vectors.
    .kercnt  [1x1] Number of kernel evaluations.
    .trnerr  [1x1] Classification error on the training data.
    .margin  [1x1] Margin.
    .options [struct] Copy of the used options.
    .cputime [1x1] Used CPU time in seconds (measured by tic-toc).
    .stat    [struct] Statistics about the optimization:
      .access     [1x1] Number of requested columns of the matrix H.
      .t          [1x1] Number of iterations.
      .UB         [1x1] Upper bound on the optimal value of the criterion.
      .LB         [1x1] Lower bound on the optimal value of the criterion.
      .LB_History [1x(t+1)] LB with respect to the iteration number.
      .UB_History [1x(t+1)] UB with respect to the iteration number.
      .NA         [1x1] Number of non-zero entries in the solution.

Example:
  data = load('riply_trn');
  options = struct('ker','rbf','arg',1,'C',1);
  model = svm2(data,options)
  figure; ppatterns(data); psvm(model);
  (An extended end-to-end sketch is given at the end of this page.)

See also:
  SVMCLASS, SVMLIGHT, SMO, GNPP.

Source: svm2.m (../svm/list/svm2.html)

About: Statistical Pattern Recognition Toolbox
  (C) 1999-2005, Written by Vojtech Franc and Vaclav Hlavac
  Czech Technical University Prague (http://www.cvut.cz)
  Faculty of Electrical Engineering (http://www.feld.cvut.cz)
  Center for Machine Perception (http://cmp.felk.cvut.cz)

Modifications:
  09-sep-2005, VF
  08-aug-2005, VF
  24-jan-2005, VF
  29-nov-2004, VF
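Extended example:
  A minimal end-to-end sketch extending the Example above: it trains on
  riply_trn and evaluates the classifier on a held-out test set. The test-set
  file name 'riply_tst' and the call signature ypred = svmclass(X,model) are
  assumptions based on typical toolbox usage and are not documented on this page.

    % Train an L2-soft margin SVM with an RBF kernel (as in the Example above).
    trn = load('riply_trn');
    options = struct('ker','rbf','arg',1,'C',1,'solver','imdm');
    model = svm2(trn, options);

    % Classify the companion test set (assumed to be available as 'riply_tst').
    tst = load('riply_tst');
    ypred = svmclass(tst.X, model);      % assumed SVMCLASS signature
    tsterr = mean(ypred(:)' ~= tst.y);   % fraction of misclassified test vectors

    fprintf('trnerr = %.4f  tsterr = %.4f  nsv = %d\n', model.trnerr, tsterr, model.nsv);

    % Plot the training data, decision boundary and support vectors.
    figure; ppatterns(trn); psvm(model);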