bayescls.m
function [y, dfce] = bayescls( X, model )
% BAYESCLS Bayesian classifier with reject option.
%
% Synopsis:
%  [y, dfce] = bayescls(X, model)
%
% Description:
%  This function implements the classifier minimizing the Bayesian risk
%  with the 0/1-loss function, i.e., it minimizes the probability of
%  misclassification. Each input vector in X is assigned to the class
%  with the highest a posteriori probability computed from the given model.
%
%  The model contains the parameters of the class-conditional probabilities
%  in model.Pclass [cell 1 x num_classes] and the a priori probabilities
%  in model.Prior [1 x num_classes].
%
%  The function
%    p = feval(model.Pclass{i}.fun, X, model.Pclass{i})
%  is called to evaluate the i-th class-conditional probability of X.
%
%  It returns class labels y [1 x num_data] for each input vector
%  and a matrix dfce [num_classes x num_data] of unnormalized a posteriori
%  probabilities
%    dfce(y,i) = Conditional_probability(X(:,i)|y) * Prior(y).
%
%  If the field model.eps exists then the Bayesian classifier with the
%  reject option is used. The value eps is the penalty for the decision
%  "don't know", which is indicated by the label y = 0.
%
% Input:
%  X [dim x num_data] Vectors to be classified.
%
%  model [struct] Describes the probabilistic model:
%   .Pclass [cell 1 x num_classes] Class-conditional probabilities.
%   .Prior [1 x num_classes] A priori probabilities.
%   .eps [1x1] (optional) Penalty of the decision "don't know".
%
% Output:
%  y [1 x num_data] Labels (1 to num_classes); 0 for "don't know".
%  dfce [num_classes x num_data] Unnormalized a posteriori
%   probabilities (see above).
%
% Example:
%  trn = load('riply_trn');
%  tst = load('riply_tst');
%  inx1 = find(trn.y==1);
%  inx2 = find(trn.y==2);
%  model.Pclass{1} = mlcgmm(trn.X(:,inx1));
%  model.Pclass{2} = mlcgmm(trn.X(:,inx2));
%  model.Prior = [length(inx1) length(inx2)]/(length(inx1)+length(inx2));
%  ypred = bayescls(tst.X, model);
%  cerror(ypred, tst.y)
%
% See also
%  BAYESDF, BAYESERR.
%

% About: Statistical Pattern Recognition Toolbox
% (C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac
% Czech Technical University Prague, http://www.cvut.cz
% Faculty of Electrical Engineering, http://www.feld.cvut.cz
% Center for Machine Perception, http://cmp.felk.cvut.cz

% Modifications:
% 09-jun-2004, VF
% 01-may-2004, VF
% 11-mar-2004, VF, "don't know" decision added.
% 19-sep-2003, VF

[dim, num_data] = size(X);
num_classes = length(model.Pclass);

dfce = zeros(num_classes, num_data);

% compute unnormalized a posteriori probabilities
% dfce(i,:) = p(X | class i) * Prior(i)
for i = 1:num_classes,
  dfce(i,:) = model.Prior(i) * feval(model.Pclass{i}.fun, X, model.Pclass{i});
end

% assign each vector to the class with the maximal a posteriori probability
[tmp, y] = max(dfce);

% reject option: label a vector "don't know" (y = 0) if the estimated
% probability of error, 1 - max normalized posterior, exceeds model.eps
if isfield(model, 'eps'),
  perror = 1 - tmp./sum(dfce, 1);

  inx = find(perror > model.eps);
  y(inx) = 0;
end

return;
% EOF
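The help Example above runs the classifier without rejection. Below is a minimal usage sketch of the reject option described in the help text, assuming the same Riply data files and the toolbox functions mlcgmm and cerror used in that Example; the threshold model.eps = 0.1 is an arbitrary illustrative value, not one prescribed by the toolbox.

% Usage sketch (not part of bayescls.m): Bayesian classification with the
% reject option. Assumes the STPRtool data files riply_trn/riply_tst and
% the functions mlcgmm and cerror from the help Example; eps = 0.1 is an
% arbitrary illustrative threshold.
trn = load('riply_trn');
tst = load('riply_tst');
inx1 = find(trn.y==1);
inx2 = find(trn.y==2);
model.Pclass{1} = mlcgmm(trn.X(:,inx1));
model.Pclass{2} = mlcgmm(trn.X(:,inx2));
model.Prior = [length(inx1) length(inx2)]/(length(inx1)+length(inx2));

% setting model.eps turns on the reject option: a test vector gets label
% y = 0 ("don't know") if 1 - max normalized posterior exceeds model.eps
model.eps = 0.1;
[ypred, dfce] = bayescls(tst.X, model);

rejected = find(ypred == 0);
accepted = find(ypred ~= 0);
fprintf('Rejected %d of %d test vectors.\n', length(rejected), length(tst.y));
cerror(ypred(accepted), tst.y(accepted))   % error on accepted vectors only

Since each row of dfce is a class-conditional likelihood scaled by its prior, sum(dfce,1) normalizes the columns into posteriors inside bayescls, so the rejection test compares 1 - max posterior against model.eps.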