ADABOOST

AdaBoost algorithm.

 Synopsis:
  model = adaboost(data,options)

 Description:
  This function implements the AdaBoost algorithm, which
  produces a classifier composed of a set of weak rules.
  The weak rules are learned by a weak learner which is
  specified in options.learner. The task of the weak learner
  is to produce a rule with weighted error less than 0.5.
  In each stage the AdaBoost algorithm calls the weak learner

   rule{t} = feval(options.learner,weight_data)

  where the structure weight_data contains
   .X [dim x num_data] Training vectors.
   .y [1 x num_data] Labels of training vectors (1 or 2).
   .D [1 x num_data] Distribution (weights) over the training
     data which defines the weighted error.

  The item rule{t}.fun must contain the name of a function
  which classifies vector X by
   y = feval( rule{t}.fun, X, rule{t}).

  It is assumed that the weak rule responds with labels
  1 or 2 (not 1,-1 as used in the AdaBoost literature).
  (A sketch of a learner conforming to this interface is
  given after the Output section below.)

 Input:
  data [struct] Input training data:
   .X [dim x num_data] Training vectors.
   .y [1 x num_data] Labels of training vectors (1 or 2).

  options [struct] Parameters of the AdaBoost:
   .learner [string] Name of the weak learner.
   .max_rules [1x1] Maximal number of weak rules (default 100).
     This parameter defines a stopping condition.
   .err_bound [1x1] AdaBoost stops if the upper bound on the
     empirical error drops below err_bound (default 0.001).
   .learner_options Additional options passed to the weak learner
     when it is called.
   .verb [1x1] If 1 then progress information is displayed.

 Output:
  model [struct] AdaBoost classifier:
   .rule [cell 1 x T] Weak classification rules.
   .Alpha [1 x T] Weights of the rules.
   .WeightedErr [1 x T] Weighted errors of the weak rules.
   .Z [1 x T] Normalization constants of the distribution D.
   .ErrBound [1 x T] Upper bounds on the empirical error.
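 The fields Alpha, Z and ErrBound correspond to the usual AdaBoost
 quantities. As an illustration only (the toolbox's exact computation
 is in adaboost.m), one boosting round in the standard formulation,
 with true labels y and weak-rule outputs h both remapped from {1,2}
 to {+1,-1}, would update them as

   werr(t) = sum(D(h ~= y));                    % model.WeightedErr(t)
   alpha(t) = 0.5*log((1 - werr(t))/werr(t));   % model.Alpha(t)
   D = D .* exp(-alpha(t) * (y .* h));          % up-weight misclassified points
   Z(t) = sum(D);                               % model.Z(t)
   D = D / Z(t);                                % keep D a distribution
   ErrBound(t) = prod(Z(1:t));                  % model.ErrBound(t): bound on the
                                                % ensemble's empirical error

 Under this standard formulation boosting stops once ErrBound(t) drops
 below options.err_bound or after max_rules rounds; the resulting
 classifier is the Alpha-weighted vote over the weak rules (see ADACLASS).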
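 The example below uses the toolbox's WEAKLEARNER. For illustration, a
 minimal decision-stump learner obeying the interface described above
 might look as follows; the names stump_learner and stump_classify and
 the rule fields .dim, .thresh and .polarity are hypothetical, only
 .fun and the weight_data fields are required by adaboost. In practice
 each function would sit in its own M-file so that feval can resolve it.

   function rule = stump_learner(weight_data)
   % Hypothetical weak learner: a decision stump (threshold on a single
   % feature) minimizing the weighted error given by weight_data.D.
     dim = size(weight_data.X,1);
     t = 3 - 2*weight_data.y;            % remap labels: 1 -> +1, 2 -> -1
     best_err = inf;
     for d = 1:dim
       for thresh = unique(weight_data.X(d,:))
         for polarity = [1 -1]
           pred = polarity*sign(weight_data.X(d,:) - thresh + eps);
           err = sum(weight_data.D(pred ~= t));
           if err < best_err
             best_err = err;
             rule.dim = d; rule.thresh = thresh; rule.polarity = polarity;
           end
         end
       end
     end
     rule.fun = 'stump_classify';        % adaboost calls feval(rule.fun,X,rule)
   return;

   function y = stump_classify(X,rule)
   % Companion classification rule; must answer with labels 1 or 2.
     pred = rule.polarity*sign(X(rule.dim,:) - rule.thresh + eps);
     y = ones(1,size(X,2));
     y(pred == -1) = 2;
   return;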
 Example:
  data = load('riply_trn');
  options.learner = 'weaklearner';
  options.max_rules = 100;
  options.verb = 1;
  model = adaboost(data,options);
  figure; ppatterns(data); pboundary(model);
  figure; hold on; plot(model.ErrBound,'r');
  plot(model.WeightedErr);

 See also:
  ADACLASS, WEAKLEARNER.

Source: adaboost.m

About: Statistical Pattern Recognition Toolbox
(C) 1999-2004, Written by Vojtech Franc and Vaclav Hlavac
Czech Technical University Prague (http://www.cvut.cz)
Faculty of Electrical Engineering (http://www.feld.cvut.cz)
Center for Machine Perception (http://cmp.felk.cvut.cz)

Modifications:
11-aug-2004, VF