📄 findbestthresholdfdp.m
%Builds a threshold (decision stump) classifier for the multiclass case,
%working on a Fisher/LDA projection of the data.
%Fields of the returned struct:
%classifier.a, classifier.b  - stump parameters: the weak output is a*(proj > th) + b
%classifier.th               - threshold on the projected data
%classifier.hm               - weak classifier outputs for the positive classes
%classifier.positive         - the positive class labels
%classifier.NClass           - total number of classes
%classifier.k                - constant outputs for the non-positive classes
%classifier.FisherProjection - the LDA projection vector w
function classifier = FindBestThresholdFDP(Data, labels, positiveClasses, Weights)
[Nfeatures, Nsamples] = size(Data);
NClasses = max(labels);
%Normalize the weights of each class so that every row sums to one
for i=1:NClasses
    Weights(i,:) = Weights(i,:)/sum(Weights(i,:));
end
%z is the -1/+1 membership vector: +1 for the samples whose label belongs
%to positiveClasses, -1 otherwise
z = zeros(1,Nsamples);
for i=1:size(positiveClasses,2)
    classi = positiveClasses(i);
    z = z + (labels==classi);
end
z = 2*z - 1;
%Project the data onto a single Fisher/LDA discriminant direction
%(lda and findSingleThreshold are external functions of this package)
input.X = Data;
input.y = (z>0)+1;
output = lda(input,1);
w = output.W(:,1)';
%Find the best decision stump (a, b, threshold) on the projected data
[a, b, th, err] = findSingleThreshold(w*Data, Weights, z, positiveClasses);
classifier.a = a;
classifier.b = b;
classifier.th = th;
classifier.NClass = NClasses;
classifier.positive = positiveClasses;
classifier.FisherProjection = w;
%Define the weak classifier h for this step: the decision stump applied to the
%Fisher projection of the data, for the classes that belong to positiveClasses
proj = classifier.FisherProjection * Data;
classifier.hm = zeros(NClasses, Nsamples);
for j=1:NClasses
    if ismember(j,positiveClasses)
        classifier.hm(j,:) = classifier.a * (proj > classifier.th) + classifier.b;
    end
end
%And constant outputs k for the non-positive classes: the weighted mean of the
%-1/+1 membership labels of each class
Zz = zeros(NClasses, Nsamples);
for i=1:NClasses
    Zz(i,:) = 2*(labels == i) - 1;
end
classifier.k = zeros(1,NClasses);
for i=1:NClasses
    if ~ismember(i,positiveClasses)
        classifier.k(i) = sum(Zz(i,:).*Weights(i,:))/sum(Weights(i,:));
    end
end
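
A minimal usage sketch follows. The data, labels, and weight matrix are made up for illustration, the external lda and findSingleThreshold functions from the same package must be on the MATLAB path, and how the boosting loop consumes the returned struct is an assumption based on the fields set above.

% Hypothetical toy problem: 20 features, 500 samples, 4 classes
Data    = rand(20, 500);
labels  = randi(4, 1, 500);
Weights = ones(4, 500) / 500;           % one weight row per class (renormalized inside)

% Learn a stump on the Fisher projection, treating classes 1 and 3 as positive
c = FindBestThresholdFDP(Data, labels, [1 3], Weights);

% Evaluating the weak learner on samples (assumed downstream use):
% positive classes get the stump output, the other classes get the constant k
proj = c.FisherProjection * Data;       % 1 x Nsamples projected data
h1   = c.a * (proj > c.th) + c.b;       % response stored in c.hm for positive classes
k2   = c.k(2);                          % constant response for a non-positive class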