
📄 mchierarchyclassify.m

📁 A MATLAB toolkit that includes a number of classifiers, e.g. KNN, KMEANS, SVM, NETLAB, and many more.
function  [Y_compute, Y_prob] = MCHierarchyClassify(classifier, para, X_train, Y_train, X_test, Y_test, num_class)
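% MCHierarchyClassify  Two-level binary classification with negative down-sampling.
%
% The negative training examples are split into num_group slices so that each
% slice, together with all positives, has roughly a -PosNegRatio fraction of
% positives.  One base classifier is trained per slice; the probability outputs
% of these classifiers on a development set and on the test set are then used
% as meta-features for a second-level (meta) classifier that produces the final
% prediction.
%
% The classifier string is split on '--' by ParseCmd: the part before '--'
% apparently names the meta-classifier and its parameters, the part after names
% the base classifier.  Only num_class == 2 is supported.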

class_set = GetClassSet(Y_train);
% -PosNegRatio: target fraction of positives in each training group (default 0.5)
% -DevSet:      0 = use the nearest training examples as the development set,
%               otherwise use the full training set (default 0)
p = str2num(char(ParseParameter(para, {'-PosNegRatio'; '-DevSet'}, {'0.5';'0'})));
sizefactor = p(1);
SampleDevSet = p(2);

if (num_class ~= 2), 
    fprintf('Error: The number of classes must be 2!\n');
    return;
end;

[meta_classifier, para_meta_classifier, classifier] = ParseCmd(classifier, '--');
if (isempty(classifier)), 
    error('The bottom level classifier is not provided!');
end;

% The last num_test rows of the training matrices are apparently the test
% examples appended by the caller; strip them off before training.
num_test = size(Y_test, 1);
num_data = size(Y_train, 1);
X_train = X_train(1:num_data - num_test, :);
Y_train = Y_train(1:num_data - num_test);

% Split the training data into positive (class_set(1)) and negative examples
pos_idx = (Y_train == class_set(1));
data_pos = X_train(pos_idx, :);
data_neg = X_train(~pos_idx, :);

% Build the development set: for each test example pick the most similar
% training example (by cosine similarity), unless -DevSet requests the
% full training set.
X_develop = [];
Y_develop = [];
if (SampleDevSet == 0),
    for j = 1:size(Y_test, 1)
        Distance = [];
        for k = 1:size(X_train, 1)
            %Distance = [Distance; sqrt(sum((X_train(k,:) - X_test(j,:)) .* (X_train(k,:) -  X_test(j,:))))];
            % Cosine similarity between the k-th training and the j-th test example
            distance = sum(X_train(k, :) .* X_test(j, :));
            distance = distance / sqrt(sum(X_train(k, :) .* X_train(k, :)));
            distance = distance / sqrt(sum(X_test(j, :) .* X_test(j, :)));
            Distance = [Distance; distance];
        end;
        % Higher cosine similarity means closer, hence max rather than min
        [junk, index] = max(Distance);
        X_develop = [X_develop; X_train(index, :)];
        Y_develop = [Y_develop; Y_train(index, :)];
    end;
else
    X_develop = X_train;
    Y_develop = Y_train;
end;

num_positive = sum(Y_train == class_set(1));
num_negative = sum(Y_train ~= class_set(1));

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 
num_develop_positive = sum(Y_develop == class_set(1));
num_develop_negative = sum(Y_develop ~= class_set(1));

% downsize = number of negatives per group such that each group holds roughly
% a sizefactor fraction of positives; the negatives are then split evenly
% into num_group groups.
downsize = fix((1 - sizefactor) / sizefactor * num_positive);
num_group = fix(num_negative / downsize);
downsize = fix(num_negative / num_group);
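% For example (hypothetical numbers): with 100 positives, 900 negatives and
% -PosNegRatio 0.5, downsize = fix(0.5/0.5*100) = 100, num_group = fix(900/100) = 9,
% and each group then receives downsize = fix(900/9) = 100 negatives.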
fprintf('Sizefactor:%f GroupNum:%d PosSize:%d NegSize:%d PosDev:%d NegDev:%d\n', ...
            sizefactor, num_group, num_positive, num_negative, num_develop_positive, num_develop_negative);

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% The development and test examples are classified together by every
% group-level classifier; their probability outputs become meta-features.
X_combinetest = [X_develop; X_test];
Y_combinetest = [Y_develop; Y_test];
num_develop = size(Y_develop, 1);
num_train = size(Y_train, 1);

Y_all = zeros(size(Y_combinetest, 1), num_group);
for i = 1:num_group
    % Train one base classifier on all positives plus the i-th slice of negatives
    data_additional = data_neg(floor((i-1)*num_negative/num_group)+1 : floor(i*num_negative/num_group), :);
    X_train = [data_pos; data_additional];
       
    label_pos = ones(size(data_pos, 1), 1) * class_set(1);
    label_additional = ones(size(data_additional, 1), 1) * class_set(2);
    Y_train = [label_pos; label_additional];
    
    [Y_compute, Y_prob] = Classify(classifier, X_train, Y_train, X_combinetest, Y_combinetest, num_class);
    Y_all(:, i) = Y_prob;    % probability output of the i-th group classifier, used as a meta-feature
end;

% Train the meta-classifier on the group probabilities of the development set
% and apply it to those of the test set.
D_develop = Y_all(1:num_develop, :);
D_test = Y_all(num_develop+1:num_develop+num_test, :);
% [Y_compute, Y_prob]  = Classify('SVM_LIGHT', 'Kernel 0 KernelParam 0 CostFactor 10', D_develop, Y_develop, D_test, Y_test, num_class);
% Note: strcat drops trailing blanks from char inputs, so concatenate with []
% to keep the space between the classifier name and its parameters.
[Y_compute, Y_prob] = Classify([meta_classifier, ' ', para_meta_classifier], D_develop, Y_develop, D_test, Y_test, num_class);
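% Example usage (a minimal sketch; the classifier string format and the
% SVM_LIGHT parameters below are illustrative only and are not verified
% against ParseCmd/Classify in the rest of the toolkit):
%
%   classifier = 'SVM_LIGHT Kernel 0 KernelParam 0 CostFactor 10 -- SVM_LIGHT Kernel 0';
%   para       = '-PosNegRatio 0.5 -DevSet 0';
%   % This function expects the test examples to be appended to the end of the
%   % training matrices and strips them off again internally.
%   [Y_hat, Y_prob] = MCHierarchyClassify(classifier, para, ...
%       [X_tr; X_te], [Y_tr; Y_te], X_te, Y_te, 2);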
