<html><head> <meta HTTP-EQUIV="Content-Type" CONTENT="text/html;charset=ISO-8859-1"> <title>adaboost.m</title><link rel="stylesheet" type="text/css" href="../../m-syntax.css"></head><body><code><span class=defun_kw>function</span> <span class=defun_out>model </span>= <span class=defun_name>adaboost</span>(<span class=defun_in>data,options</span>)<br><span class=h1>% ADABOOST AdaBoost algorithm.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Synopsis:</span></span><br><span class=help>% model = adaboost(data,options)</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Description:</span></span><br><span class=help>% This function implements the AdaBoost algorithm, which</span><br><span class=help>% produces a classifier composed of a set of weak rules.</span><br><span class=help>% The weak rules are learned by a weak learner which is</span><br><span class=help>% specified in options.learner. The task of the weak learner</span><br><span class=help>% is to produce a rule with weighted error less than 0.5.</span><br><span class=help>% At each stage, the AdaBoost algorithm calls the weak learner</span><br><span class=help>%</span><br><span class=help>% rule{t} = feval(options.learner,weight_data)</span><br><span class=help>% </span><br><span class=help>% where the structure weight_data contains</span><br><span class=help>% .X [dim x num_data] Training vectors.</span><br><span class=help>% .y [1 x num_data] Labels of training vectors (1 or 2).</span><br><span class=help>% .D [1 x num_data] Distribution (weights) over the training </span><br><span class=help>% data which defines the weighted error.</span><br><span class=help>% </span><br><span class=help>% The item rule{t}.fun must contain the name of the function</span><br><span class=help>% which classifies vectors X by </span><br><span class=help>% y = feval( rule{t}.fun, X, rule{t}).</span><br><span class=help>%</span><br><span class=help>% It is assumed that the weak 
rule responds with labels </span><br><span class=help>% 1 or 2 (not +1,-1 as used in the AdaBoost literature).</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Input:</span></span><br><span class=help>% data [struct] Input training data:</span><br><span class=help>% .X [dim x num_data] Training vectors.</span><br><span class=help>% .y [1 x num_data] Labels of training vectors (1 or 2).</span><br><span class=help>%</span><br><span class=help>% options [struct] Parameters of the AdaBoost:</span><br><span class=help>% .learner [string] Name of the weak learner.</span><br><span class=help>% .max_rules [1x1] Maximal number of weak rules (default 100).</span><br><span class=help>% This parameter defines a stopping condition.</span><br><span class=help>% .err_bound [1x1] AdaBoost stops if the upper bound on the </span><br><span class=help>% empirical error drops below err_bound (default 0.001).</span><br><span class=help>% .learner_options Additional options used when the weak learner</span><br><span class=help>% is called.</span><br><span class=help>% .verb [1x1] If 1, progress information is displayed.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Output:</span></span><br><span class=help>% model [struct] AdaBoost classifier:</span><br><span class=help>% .rule [cell 1 x T] Weak classification rules.</span><br><span class=help>% .Alpha [1 x T] Weights of the rules.</span><br><span class=help>% .WeightedErr [1 x T] Weighted errors of the weak rules.</span><br><span class=help>% .Z [1 x T] Normalization constants of the distribution D.</span><br><span class=help>% .ErrBound [1 x T] Upper bounds on the empirical error.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Example:</span></span><br><span class=help>% data = load('riply_trn');</span><br><span class=help>% options.learner = 'weaklearner';</span><br><span class=help>% options.max_rules = 100;</span><br><span class=help>% 
options.verb = 1;</span><br><span class=help>% model = adaboost(data,options);</span><br><span class=help>% figure; ppatterns(data); pboundary(model);</span><br><span class=help>% figure; hold on; plot(model.ErrBound,'r'); </span><br><span class=help>% plot(model.WeightedErr);</span><br><span class=help>%</span><br><span class=help>% See also: </span><br><span class=help>% ADACLASS, WEAKLEARNER.</span><br><span class=help>%</span><br><hr><span class=help1>% <span class=help1_field>About:</span> Statistical Pattern Recognition Toolbox</span><br><span class=help1>% (C) 1999-2004, Written by Vojtech Franc and Vaclav Hlavac</span><br><span class=help1>% <a href="http://www.cvut.cz">Czech Technical University Prague</a></span><br><span class=help1>% <a href="http://www.feld.cvut.cz">Faculty of Electrical Engineering</a></span><br><span class=help1>% <a href="http://cmp.felk.cvut.cz">Center for Machine Perception</a></span><br><br><span class=help1>% <span class=help1_field>Modifications:</span></span><br><span class=help1>% 11-aug-2004, VF</span><br><br><hr><span class=keyword>if</span> <span class=stack>nargin</span> < 2, options = []; <span class=keyword>else</span> options = c2s(options); <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'max_rules'</span>), options.max_rules = 100; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'err_bound'</span>), options.err_bound = 0.001; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'learner'</span>), options.learner = <span class=quotes>'weaklearner'</span>; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'learner_options'</span>), options.learner_options = []; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'verb'</span>), options.verb = 0; <span 
class=keyword>end</span><br><br><span class=comment>% take data dimensions</span><br>[dim,num_data] = size(data.X);<br><br><span class=comment>% initial distribution over training samples</span><br>data.D = ones(num_data,1)/num_data;<br><br>model.Alpha = [];<br>model.Z = [];<br>model.WeightedErr = [];<br>model.ErrBound = [];<br><br>t = 0;<br>go = 1;<br><span class=keyword>while</span> go,<br> t = t + 1;<br><br> <span class=keyword>if</span> options.verb, <span class=io>fprintf</span>(<span class=quotes>'rule %d: '</span>, t); <span class=keyword>end</span><br><br> <span class=comment>% call weak learner</span><br> <span class=keyword>if</span> ~isempty(options.learner_options),<br> rule = <span class=eval>feval</span>(options.learner,data,options.learner_options);<br> <span class=keyword>else</span><br> rule = <span class=eval>feval</span>(options.learner,data);<br> <span class=keyword>end</span> <br> <br> y = <span class=eval>feval</span>(rule.fun,data.X,rule);<br><br> werr = (y~=data.y)*data.D;<br> <span class=keyword>if</span> options.verb, <span class=io>fprintf</span>(<span class=quotes>'werr=%f'</span>, werr); <span class=keyword>end</span><br> <br> <span class=keyword>if</span> werr < 0.5,<br> <br> alpha = 0.5*log((1-werr)/werr);<br><br> <span class=comment>% yh(i) = +1 for data.y(i) == y(i)</span><br> <span class=comment>% yh(i) = -1 for data.y(i) ~= y(i)</span><br> yh = 2*(y == data.y)-1;<br> weights = data.D.*exp(-alpha*yh(:));<br><br> <span class=comment>% normalization constant</span><br> Z = sum(weights);<br> data.D = weights/Z;<br><br> <span class=comment>% upper bound on the training error after t rounds, i.e. the</span><br> <span class=comment>% product Z_1*...*Z_t including the current round's Z</span><br> err_bound = prod([model.Z; Z]);<br> <br> <span class=comment>% store variables</span><br> model.Z = [model.Z; Z];<br> model.Alpha = [model.Alpha;alpha];<br> model.rule{t} = rule;<br> model.ErrBound = [model.ErrBound; err_bound];<br> <br> <span class=comment>% stopping conditions</span><br> <span class=keyword>if</span> t >= options.max_rules,<br> go = 0;<br> 
model.exitflag = 1;<br> <span class=keyword>elseif</span> err_bound <= options.err_bound,<br> go = 0;<br> model.exitflag = 2;<br> <span class=keyword>end</span><br> <br> <span class=keyword>if</span> options.verb,<br> <span class=io>fprintf</span>(<span class=quotes>', alpha=%f, err_bound=%f\n'</span>,alpha,err_bound);<br> <span class=keyword>end</span><br> <br> <span class=keyword>else</span><br> <span class=comment>% the weighted error is at least 0.5</span><br> <span class=keyword>if</span> options.verb, <span class=io>fprintf</span>(<span class=quotes>'werr >= 0.5, stopping.\n'</span>); <span class=keyword>end</span><br> <br> go = 0;<br> model.exitflag = 0;<br> <span class=keyword>end</span><br><br> model.WeightedErr = [model.WeightedErr; werr];<br><br><span class=keyword>end</span><br><br>model.fun = <span class=quotes>'adaclass'</span>;<br><br><span class=jump>return</span>;<br><span class=comment>% EOF</span><br></code></body></html>
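The loop above can be sketched outside MATLAB as well. Below is a minimal Python version of the same AdaBoost scheme, using a decision stump in the role of `options.learner`; the helper names (`fit_stump`, `stump_predict`) are illustrative and not part of the toolbox, and labels are +1/-1 here rather than the 1/2 convention used by adaboost.m.

```python
# Minimal AdaBoost sketch (assumption: decision stumps as the weak learner;
# helper names are hypothetical, not toolbox functions).
import numpy as np

def fit_stump(X, y, D):
    """Weak learner: pick the single-feature threshold rule with the
    lowest D-weighted error. Returns (werr, feature, threshold, polarity)."""
    n_feat, _ = X.shape
    best = (1.0, 0, 0.0, 1)
    for f in range(n_feat):
        for thr in np.unique(X[f]):
            for pol in (1, -1):
                pred = np.where(pol * (X[f] - thr) >= 0, 1, -1)
                werr = D @ (pred != y)          # D-weighted error
                if werr < best[0]:
                    best = (werr, f, thr, pol)
    return best

def stump_predict(rule, X):
    _, f, thr, pol = rule
    return np.where(pol * (X[f] - thr) >= 0, 1, -1)

def adaboost(X, y, max_rules=10, err_bound=1e-3):
    n = X.shape[1]
    D = np.full(n, 1.0 / n)                     # uniform initial distribution
    rules, alphas, Zs = [], [], []
    for _ in range(max_rules):
        rule = fit_stump(X, y, D)
        werr = rule[0]
        if werr >= 0.5:                         # weak learner failed -> stop
            break
        # rule weight; small epsilon guards against werr == 0
        alpha = 0.5 * np.log((1 - werr) / max(werr, 1e-10))
        pred = stump_predict(rule, X)
        w = D * np.exp(-alpha * y * pred)       # reweight: up on mistakes
        Z = w.sum()                             # normalization constant
        D = w / Z
        rules.append(rule); alphas.append(alpha); Zs.append(Z)
        if np.prod(Zs) <= err_bound:            # bound prod(Z) on train error
            break
    return rules, alphas

def predict(rules, alphas, X):
    score = sum(a * stump_predict(r, X) for r, a in zip(rules, alphas))
    return np.sign(score)
```

The stopping logic mirrors the MATLAB code: stop when the rule budget is spent, when the bound `prod(Z)` falls below `err_bound`, or when the weak learner can no longer beat weighted error 0.5.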