
📄 mcadaboostm1.m

📁 A MATLAB toolkit that includes a number of classifiers, e.g. KNN, k-means, SVM, NETLAB, and many more.
% Written by Rong Jin
% Revised by Rong Yan

function [Y_compute, Y_prob] = MCAdaBoostM1(classifier, para, X_train, Y_train, X_test, Y_test, num_class)

rand('state', 40); % fix the random seed for reproducible sampling (legacy seeding syntax)
class_set = GetClassSet(Y_train);

p = str2num(char(ParseParameter(para, {'-Iter';'-SampleRatio'}, {'10';'1'})));
Max_Iter = p(1);
Sample_Ratio = p(2);

%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Data for models
%%%%%%%%%%%%%%%%%%%%%%%%%%%%

num_data = length(Y_train);

% Initialize the data 
Dist = ones(length(Y_train), 1) ./ length(Y_train);

X_Sample = X_train;
Y_Sample = Y_train;
Y_compute_train_matrix = zeros(length(Y_train), num_class);
Y_compute_test_matrix = zeros(length(Y_test), num_class);
Y_compute = zeros(length(Y_test), 1);
Y_prob = zeros(length(Y_test), 1);

for iter = 1:Max_Iter % run Max_Iter boosting rounds (0:Max_Iter would run one extra)
    % Compute scores for training data
    [Y_compute_all junk] = Classify(classifier, X_Sample, Y_Sample, [X_train; X_test], [Y_train; Y_test], num_class);
    Y_compute_train  = Y_compute_all(1:length(Y_train), :);
    Y_compute_test  = Y_compute_all(length(Y_train)+1:length(Y_train)+length(Y_test), :);
    
    Weight_Err = sum(Dist .* (Y_compute_train ~= Y_train));
    if Weight_Err == 0, fprintf('Terminated: Training Error is Zero!\n'); break, end
    if Weight_Err >= 0.5, fprintf('Terminated: Training Error is Larger than 0.5!\n'); break, end
    beta = Weight_Err / (1 - Weight_Err);
    alpha = -log(beta); % log(1/beta)
    fprintf('%d: beta = %f, alpha = %f\n', iter, beta, alpha);
    for i = 1:num_class, 
        ind = find(Y_compute_train == class_set(i));
        Y_compute_train_matrix(ind, i) = Y_compute_train_matrix(ind, i) + alpha;
    end;
    [junk Index] = max(Y_compute_train_matrix, [], 2);
    fprintf('Training: '); CalculatePerformance(class_set(Index), Y_train, class_set, 0);

    % Accumulate the weighted votes for the test predictions
    for i = 1:num_class, 
        ind = find(Y_compute_test == class_set(i));
        Y_compute_test_matrix(ind, i) = Y_compute_test_matrix(ind, i) + alpha;
    end;
    [Y_prob Index] = max(Y_compute_test_matrix, [], 2);
    Y_compute = class_set(Index);
    fprintf('Testing: '); CalculatePerformance(Y_compute, Y_test, class_set, 0);
    
    % Compute the sampling distribution
    Dist = Dist .* ((Y_compute_train ~= Y_train) + (Y_compute_train == Y_train) .* beta);
    Dist = Dist ./ sum(Dist);

    % Sample data and retrain the model
    Y_Sample = [];
    % Resample until every class is represented in the sample
    while (length(unique(Y_Sample)) < num_class), 
        num_samples = ceil(length(Y_train) * Sample_Ratio);
        Sample_Idx = SampleDistribution(Dist, num_samples);
        X_Sample = X_train(Sample_Idx, :);
        Y_Sample = Y_train(Sample_Idx);
    end;      
end

% Squash the winning vote margin into (0, 1) with a sigmoid (a rough confidence score, not a calibrated probability)
Y_prob = 1 ./ (1 + exp(-Y_prob)); 
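The heart of the loop above is the classic AdaBoost.M1 update: beta = err / (1 - err), alpha = log(1/beta), correctly classified examples are down-weighted by beta before renormalising, and the final label is the argmax of the accumulated alpha votes. A minimal NumPy sketch of those two steps (toy data only; independent of the toolkit's Classify and SampleDistribution helpers):

```python
import numpy as np

def adaboost_m1_update(dist, y_pred, y_true):
    """One AdaBoost.M1 round: return (alpha, new_dist), or None when the
    weighted error is 0 or >= 0.5 (the same stopping rules as the MATLAB code)."""
    wrong = (y_pred != y_true)
    weight_err = np.sum(dist * wrong)
    if weight_err == 0 or weight_err >= 0.5:
        return None
    beta = weight_err / (1 - weight_err)
    alpha = -np.log(beta)  # log(1/beta): this round's vote weight
    # Correct examples are multiplied by beta, wrong ones kept as-is,
    # then the distribution is renormalised -- the Dist update above.
    new_dist = dist * np.where(wrong, 1.0, beta)
    new_dist /= new_dist.sum()
    return alpha, new_dist

def boosted_decision(vote_matrix):
    """Final decision: argmax over accumulated per-class alpha votes, with the
    winning margin squashed through a sigmoid as a rough confidence score."""
    idx = np.argmax(vote_matrix, axis=1)
    margin = vote_matrix[np.arange(len(idx)), idx]
    prob = 1.0 / (1.0 + np.exp(-margin))
    return idx, prob

# Toy round: 4 samples, uniform weights, one misclassified.
dist = np.full(4, 0.25)
y_true = np.array([0, 1, 2, 1])
y_pred = np.array([0, 1, 2, 0])       # last sample is wrong
alpha, new_dist = adaboost_m1_update(dist, y_pred, y_true)
# weight_err = 0.25, beta = 1/3, alpha = log(3); the wrong sample's
# weight grows from 1/4 to 1/2 of the distribution after renormalising.

votes = np.array([[2.0, 0.5], [0.1, 1.2]])  # hypothetical accumulated votes
idx, prob = boosted_decision(votes)
```

With a weighted error of 0.25 the distribution shifts half of its mass onto the single misclassified sample, which is exactly why the next round's resampled training set concentrates on the hard examples.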
