
📄 adaboostc.m

This is from a pattern recognition toolbox (PRTools) I found.
%ADABOOSTC
%
% [W,V,ALF] = ADABOOSTC(A,CLASSF,N,RULE,VERBOSE);
%
% INPUT
%   A       Dataset
%   CLASSF  Untrained weak classifier
%   N       Number of classifiers to be trained
%   RULE    Combining rule (default: weighted voting)
%   VERBOSE Suppress progress report if 0 (default 1)
%
% OUTPUT
%   W       Combined trained classifier
%   V       Cell array of all classifiers
%           Use VC = stacked(V) for combining
%   ALF     Weights
%
% DESCRIPTION
% Computation of a combined classifier according to AdaBoost.
%
% In total, N weighted versions of the training set A are generated
% iteratively and used for the training of the specified classifier.
% The weights, used as selection probabilities for the objects in the
% training set, are updated according to the AdaBoost rule.
%
% The entire set of generated classifiers is returned in V.
% The set of classifier weights, according to AdaBoost, is returned in ALF.
%
% Various aggregating possibilities can be given in the final parameter RULE:
% []:      WVOTEC, weighted voting
% VOTEC    voting
% MEANC    sum rule
% AVERAGEC averaging of coefficients (for linear combiners)
% PRODC    product rule
% MAXC     maximum rule
% MINC     minimum rule
% MEDIANC  median rule
%
% SEE ALSO
% MAPPINGS, DATASETS
%
% Copyright: R.P.W. Duin, r.p.w.duin@prtools.org
% Faculty EWI, Delft University of Technology
% P.O. Box 5031, 2600 GA Delft, The Netherlands

function [W,V,alf,U] = adaboostc(a,clasf,n,rule,verbose)

	prtrace(mfilename);

	% INITIALISATION
	if nargin < 5, verbose = 1; end
	if nargin < 4, rule = []; end
	if nargin < 3, n = 1; end
	if nargin < 2 || isempty(clasf), clasf = nmc; end
	if nargin < 1 || isempty(a)
		W = mapping(mfilename,{clasf,n,rule,verbose});
		W = setname(W,'Adaboost');
		return
	end

	[m,k,c] = getsize(a);
	V = [];
	lablist = getlablist(a);
	laba = getnlab(a);
	p = getprior(a);
	a = dataset(a,laba);          % use numeric labels for speed
	a = setprior(a,p);
	u = ones(m,1)/m;              % initialise object weights
	alf = zeros(1,n);             % space for classifier weights

	if verbose > 0 && k == 2
		figure(verbose);
		scatterd(a);
	end

	eprior = 1-max(getprior(a));  % maximum error

	% generate n classifiers
	for i = 1:n
		b = gendatw(a,u,ceil(0.2*m));  % sample training set
		b = setprior(b,getprior(a));   % use original priors
		w = b*clasf;                   % train weak classifier
		if verbose && k == 2
			plotc(w,1); drawnow
		end
		erra = a*w*testc;              % compute its error
		labc = a*w*labeld;             % find objects that ...
		diff = labc~=laba;             % ... are erroneously classified
		dd = 1-2*diff;                 % +1 if correct, -1 if wrong
		r = sum(u.*dd);                % weighted "edge" of the classifier
		if erra ~= 0 || 1              % do not stop for a zero-error classifier
			alf(i) = 0.5*log((1+r)/(1-r+realmin)); % classifier weight
			correct = find(diff==0);   % correctly classified objects
			wrong = find(diff==1);     % incorrectly classified objects
			u(correct) = u(correct)*exp(-alf(i)); % decrease their weights, ...
			u(wrong) = u(wrong)*exp(alf(i));      % ... increase the others'
			u = u./sum(u);                        % normalise weights
		else                           % if the classifier is perfect, stop
			alf = alf(1:i-1);
			break
		end
		w = setlabels(w,lablist);      % restore original labels
		if verbose
			%disp([erra r alf(i) sum(alf)])
		end
		V = [V w];                     % store all classifiers
	end

	% combine
	if isempty(rule)
		W = wvotec(V,alf);             % default is the weighted combiner
	else
		W = traincc(a,V,rule);         % otherwise, use the user-supplied combiner
	end

	if verbose > 0 && k == 2
		plotc(W,'r',3)
		ee = a*W*testc;
		title(['Error: ', num2str(ee)]);
	end

return
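The core of the training loop above is the AdaBoost re-weighting rule: from the weighted "edge" r of the weak classifier, compute the classifier weight alf = 0.5*log((1+r)/(1-r)), then multiply the weights of correctly classified objects by exp(-alf) and of misclassified ones by exp(+alf), and renormalise. A minimal NumPy sketch of one such round (illustrative only, not PRTools code; the function name `adaboost_round` is made up here):

```python
import numpy as np

def adaboost_round(dd, u):
    """One AdaBoost round.

    dd : per-object agreement, +1 if the weak classifier is correct, -1 if wrong
         (the variable `dd = 1-2*diff` in adaboostc.m)
    u  : current object weights, summing to 1
    Returns the classifier weight alf and the updated, renormalised weights.
    """
    r = np.sum(u * dd)                                   # weighted edge, as in r = sum(u.*dd)
    alf = 0.5 * np.log((1 + r) / (1 - r + np.finfo(float).tiny))
    u = u * np.exp(-alf * dd)                            # exp(-alf) if correct, exp(+alf) if wrong
    return alf, u / u.sum()                              # normalise weights

# Toy round: 4 equally weighted objects, the weak classifier gets the last one wrong.
u0 = np.full(4, 0.25)
dd = np.array([1, 1, 1, -1])
alf, u1 = adaboost_round(dd, u0)
# r = 0.5, so alf = 0.5*log(3); the misclassified object's weight rises to 0.5.
```

Note that the two-branch update in the MATLAB code (u(correct)*exp(-alf) and u(wrong)*exp(alf)) collapses into the single expression u*exp(-alf*dd), since dd is +1 on correct objects and -1 on wrong ones.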
