
iis_classify.m

A MATLAB toolkit containing a number of classifiers, e.g. KNN, K-means, SVM, NETLAB, and many more.
function [Y_compute, Y_prob] = IIS_classify(para, X_train, Y_train, X_test, Y_test, num_class)

% Trains a maximum-entropy (multinomial logistic) classifier with the
% Improved Iterative Scaling (IIS) algorithm and classifies X_test.
% Requires the helper functions GetClassSet, ParseParameter and scale_cols
% on the MATLAB path. Y_test is accepted for interface consistency but unused.
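% A minimal usage sketch (hypothetical random data, purely illustrative;
% '-Iter 100' raises the iteration cap via the para string):
%
%   X_train = rand(100, 5); Y_train = randi(3, 100, 1);
%   X_test  = rand(20, 5);  Y_test  = randi(3, 20, 1);
%   [Y_hat, P] = IIS_classify('-Iter 100', X_train, Y_train, ...
%                             X_test, Y_test, 3);
%   % Y_hat: predicted labels; P: posterior probability of each prediction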

class_set = GetClassSet(Y_train);
p = str2num(char(ParseParameter(para, {'-MinValue';'-Iter'; '-MinDiff'; '-Sigma'}, {'0.000001';'50';'1e-7';'0'})));

% constants
min_value = p(1);
max_iter = p(2);
min_diff = p(3);
% Gaussian-prior (smoothing) standard deviation; only used if the
% commented-out regularization terms in the update below are enabled
sigma = p(4);

X = X_train;
Y = Y_train;
testX = X_test;

% Load in data
[num_data, num_feature] = size(X);

% make the features strictly positive: IIS assumes nonnegative features,
% so take absolute values and replace zeros with min_value
X = abs(X);
X = (X == 0) .* min_value + X;
X = X';   % X is now num_feature x num_data

% Adjust the class labels
% class_set = unique(Y);
New_Y = zeros(num_data, 1);
for i = 1:length(class_set)
	New_Y = New_Y + (Y == class_set(i)) .* i;
end
Y = New_Y;
% num_class = max(Y);

% precompute f_j(x_i) * f#(x_i), where f#(x_i) = sum_j f_j(x_i)
% (scale_cols(A, s) scales each column of A by the corresponding entry of s)
X_sum = sum(X)';
X_feature_sum = scale_cols(X, X_sum);

% Initialize the weights
Weights = zeros(num_feature, num_class);
% empirical feature expectations: per-class sums of the feature vectors
Emp_Feature_Exp_Vec = zeros(num_feature, num_class);
for class = 1:num_class
    X_class = X(:, Y == class);
    if size(X_class, 2) > 1, X_class = sum(X_class, 2); end
    Emp_Feature_Exp_Vec(:, class) = X_class;
end
% linear indices of the true-class entries of the num_class-by-num_data
% probability matrix: entry (Y(i), i) has linear index Y(i) + (i-1)*num_class
Label_Index = Y + (0:(num_data - 1))' .* num_class;

% IIS
fprintf('Iter:%4d, LL:%11.7f', 0, 0);
Old_Avg_LL = 0;
Avg_LL = 0;

for iter = 1:max_iter
   % compute the logarithm likelihood
   Mod_Prob = exp(Weights' * X);
   Sum_Prob = sum(Mod_Prob)';
   Mod_Prob = scale_cols(Mod_Prob, 1 ./ Sum_Prob);
   % stop if exp underflow produced zero probabilities
   % (any(...) fixes the original vectorized test, which only fired
   % when every column contained a nonpositive entry)
   if any(Mod_Prob(:) <= 0), break; end
   Mod_Prob = Mod_Prob + 1e-10 * (Mod_Prob <= 0);
      
   % IIS step: (empirical - model feature expectations) divided by the
   % f#-weighted model expectation; the commented terms add a Gaussian prior
   Mod_Feature_Exp_Vec = X * Mod_Prob';
   first_der = Emp_Feature_Exp_Vec - Mod_Feature_Exp_Vec; % - 1/(sigma * sigma) * Weights;
   sec_der = X_feature_sum * Mod_Prob'; % + 1/(sigma * sigma);
   Delta = first_der ./ sec_der;
   
   Old_Avg_LL = Avg_LL;
   % line search along Delta for the step size that minimizes the average
   % negative log-likelihood (a plain IIS update would use step = 1)
   [step, Avg_LL] = fminsearch(@(s) LL(s, Weights, Delta, X, Label_Index, num_data), 0);
   Weights = Weights + step * Delta;
   
   % Avg_LL returned by fminsearch is already the average negative
   % log-likelihood at the updated weights, so it is used directly
   % in the convergence check below

   % print out the LL information
   % overwrite the previous status line in place (25 characters)
   fprintf('\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\bIter:%4d, LL:%11.7f', iter, Avg_LL);
   if (abs(Avg_LL - Old_Avg_LL) < min_diff), break; end;
       
end
fprintf('\n');

% classify the test data; note that testX is used without the abs/min_value
% preprocessing applied to the training features above
mod_prob = exp(Weights' * testX');
[Y_prob, run] = max(mod_prob);
run = run';
Y_compute = class_set(run);
% normalized posterior probability of the predicted class
Y_prob = (Y_prob ./ sum(mod_prob))';

function Avg_LL = LL(step, Weights, Delta, X, Label_Index, num_data)
% Average negative log-likelihood at Weights + step * Delta;
% objective of the fminsearch line search in the main loop.

Weights1 = Weights + step * Delta;
Mod_Prob = exp(Weights1' * X);
Sum_Prob = sum(Mod_Prob)';
Mod_Prob = scale_cols(Mod_Prob, 1 ./ Sum_Prob);
% penalize steps that underflow to zero probability
if any(Mod_Prob(:) <= 0), Avg_LL = 1e+4; return; end
Mod_Prob = Mod_Prob + 1e-10 * (Mod_Prob <= 0);
Avg_LL = -sum(log(Mod_Prob(Label_Index))) ./ num_data;
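
% A sketch of the rationale behind the Delta used in the main loop, for
% reference (standard IIS viewed as a Newton step; notation assumed here,
% not from the original author). For weight w(j,c):
%
%   Delta(j,c) = ( E_emp[f_j; c] - E_model[f_j; c] ) / E_model[f_j * f#; c]
%
% where f#(x) = sum_j f_j(x) is the total feature mass of example x.
% The exact IIS update solves g(d) = E_emp[f_j] - sum_i p(c|x_i) f_j(x_i) exp(d * f#(x_i)) = 0;
% since g(0) = first_der and g'(0) = -sec_der, the ratio above is the first
% Newton step toward that root, and the fminsearch line search then rescales it.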
