虫虫下载站

📄 basebagging.m

📁 Nearly all of the algorithms from the classic textbook 《机器学习》 (Machine Learning); the code is by two leading authors of the book. Some functions include Chinese comments. For reference only.
function test_targets = BaseBagging(train_patterns, train_targets, test_patterns, params)

% Classify using the Bagging algorithm
% Inputs:
%	train_patterns	- Train patterns
%	train_targets	- Train targets
%	test_patterns	- Test patterns
%	params	- [NumberOfIterations, WeakLearnerType, WeakLearnerParams]
%
% Outputs:
%	test_targets	- Predicted targets
%
% NOTE: Suitable for two classes only
%


[k_max, weak_learner, alg_param] = process_params(params);   % alg_param: the weak learner's parameters

[Ni, M]			= size(train_patterns);   % Ni: input dimension, M: number of training patterns

IterDisp		= 20;
full_patterns   = [train_patterns, test_patterns];
test_targets    = zeros(1, size(test_patterns,2));
pos_idx = find(train_targets > 0);
P       = length(pos_idx);   % number of positive examples; each round resamples only 2P patterns

% Do the bagging; k_max is the number of bagging rounds
for k = 1:k_max
   % Draw a sample of size 2P from the training data, with replacement
   indices = 1 + floor(rand(1, 2*P)*M);   % indices of the resampled training examples

   % The weak learner is evaluated on every pattern (train and test), as in the standard AdaBoost implementation
   Ck 	= feval(weak_learner, train_patterns(:, indices), train_targets(indices), full_patterns, alg_param);
   
   % alpha_k <- 1/2*ln((1+E_k)/(1-E_k)); this weighting formula also differs slightly from the paper,
   % where E_k plays the role of r (not used here: plain bagging gives every round an equal vote)

   % Ck holds the classifications of all M+test patterns; keep only the test-pattern
   % results (columns M+1:end) and accumulate the votes
   test_targets  = test_targets + Ck(M+1:end);
   % 2*X-1 would map {0,1} outputs to {-1,+1}; unnecessary if the weak learner already outputs -1/+1
  
   
   if mod(k, IterDisp) == 0
      disp(['        Completed ' num2str(k) ' bagging iterations'])
   end
   
end

test_targets = test_targets > 0;   % threshold the accumulated vote to 0/1 outputs; skip this step if -1/+1 outputs are wanted
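For readers outside MATLAB, the resample-train-vote loop above can be sketched in Python. This is a minimal sketch, not the original code: the `base_bagging` name, the column-major pattern layout, and the decision-stump weak learner `stump` are all illustrative assumptions.

```python
import numpy as np

def base_bagging(train_x, train_y, test_x, k_max, weak_learner, rng=None):
    """Two-class bagging: train k_max weak learners on bootstrap samples
    and combine their -1/+1 votes on the test patterns (labels in {-1, +1})."""
    rng = np.random.default_rng(rng)
    n = train_x.shape[1]                      # patterns are columns, as in the .m file
    p = int(np.sum(train_y > 0))              # number of positive examples
    votes = np.zeros(test_x.shape[1])
    for _ in range(k_max):
        idx = rng.integers(0, n, size=2 * p)  # bootstrap sample of size 2P, with replacement
        predict = weak_learner(train_x[:, idx], train_y[idx])
        votes += predict(test_x)              # accumulate the -1/+1 votes
    return (votes > 0).astype(int)            # majority vote, thresholded to 0/1

# A hypothetical weak learner: a one-dimensional decision stump on the first feature.
def stump(train_x, train_y):
    thr = train_x[0].mean()
    sign = 1.0 if train_y[train_x[0] > thr].sum() >= 0 else -1.0
    return lambda x: sign * np.where(x[0] > thr, 1.0, -1.0)
```

As in the MATLAB version, any weak learner that returns -1/+1 predictions can be plugged in, and the final labels come out thresholded to 0/1.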
