📄 classify.m
function [Class, PercentageCorrect] = Classify(Ytest, Ypred, MLKP)

[Nobj, Nvar] = size(Ytest);
% strcmpi avoids the error that upper(...) == 'MAX' throws when the
% criterion string is not exactly three characters long
if strcmpi(MLKP.OutputCriterion, 'MAX')
    OffSet = Nvar;
    Class = zeros(Nvar+1, Nvar+1);
else
    OffSet = Nvar + 2;
    Class = zeros(Nvar+1, Nvar+3);
end
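%
% Layout of Class as built below: rows 1..Nvar and columns 1..Nvar hold the
% confusion matrix (true class vs. predicted class); in threshold mode column
% Nvar+1 counts "unknown" objects (winning output below Threshold) and column
% Nvar+2 counts "undecided" objects (margin to the runner-up below Diff); the
% last column (OffSet+1) stores the alpha error per true class and row Nvar+1
% stores the beta error per predicted class, both as rounded percentages.
%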
%
% calculate confusion matrix
%
for iobj = 1:Nobj
    Ytst = Ypred(iobj,:);
    [Value, PredClass] = max(Ytst);
    if strcmpi(MLKP.OutputCriterion, 'MAX')
        DefClass = 0;
    else
        Ytst(PredClass) = -1;      % mask the winner so max(Ytst) returns the runner-up
        if (Value > MLKP.Threshold)
            if (Value - max(Ytst) > MLKP.Diff)
                DefClass = 0;      % classification unambiguous
            else
                DefClass = 2;      % classification undecided
            end
        else
            DefClass = 1;          % classification unknown
        end
    end
    [Dummy, RealClass] = max(Ytest(iobj,:));
    if (DefClass == 0)
        Class(RealClass,PredClass) = Class(RealClass,PredClass) + 1;
    else
        Class(RealClass,Nvar+DefClass) = Class(RealClass,Nvar+DefClass) + 1;
    end
end
%
% calculate alpha errors (row-wise, per true class) and beta errors
% (column-wise, per predicted class), both as rounded percentages
%
for irow = 1:Nvar
    NFalseClass = 0;
    for icol = 1:OffSet
        if (icol == irow)
            NTrueClass = Class(irow,icol);
        else
            NFalseClass = NFalseClass + Class(irow,icol);
        end
    end
    % alpha error: percentage of objects of class irow not assigned to it
    if (NTrueClass + NFalseClass > 0)
        Class(irow,OffSet+1) = round(100*(1 - NTrueClass/(NTrueClass+NFalseClass)));
    else
        Class(irow,OffSet+1) = 0;
    end
end
for icol = 1:Nvar
    NFalseClass = 0;
    for irow = 1:Nvar
        if (icol == irow)
            NTrueClass = Class(irow,icol);
        else
            NFalseClass = NFalseClass + Class(irow,icol);
        end
    end
    % beta error: percentage of objects assigned to class icol that belong elsewhere;
    % the guard matches the row loop above and avoids division by zero
    if (NTrueClass + NFalseClass > 0)
        Class(Nvar+1,icol) = round(100*(1 - NTrueClass/(NTrueClass+NFalseClass)));
    else
        Class(Nvar+1,icol) = 0;
    end
end
PercentageCorrect = 100*sum(diag(Class))/Nobj;   % overall share of unambiguously correct classifications
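
A minimal usage sketch (not part of the original file): it assumes MLKP is a struct carrying the three fields the function reads (OutputCriterion, Threshold, Diff), with Ytest holding one-hot target rows and Ypred the corresponding model outputs; all values are made up for illustration.

Ytest = [1 0 0; 0 1 0; 0 0 1; 1 0 0];   % one-hot targets: 4 objects, 3 classes
Ypred = [0.9 0.1 0.0; 0.2 0.7 0.1; 0.1 0.3 0.6; 0.4 0.5 0.1];   % model outputs
MLKP.OutputCriterion = 'THRESHOLD';     % any value other than 'MAX' selects the threshold branch
MLKP.Threshold = 0.5;                   % minimum winning output needed to accept a decision
MLKP.Diff = 0.2;                        % required margin between winner and runner-up
[Class, PercentageCorrect] = Classify(Ytest, Ypred, MLKP);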