⭐ 虫虫下载站

📄 consistent_occ.m

📁 A data-mining toolbox (latest version); hopefully useful to people doing research in this area
💻 M
function [w1,fracout,err_thr] = consistent_occ(x,w,fracrej,range,nrbags,varargin)
%CONSISTENT_OCC
%
%     W = CONSISTENT_OCC(X,W,FRACREJ,RANGE,NRBAGS)
%
% Optimize the hyperparameters of method W. W should contain the
% (string) name of a one-class classifier. Using crossvalidation on
% dataset X (containing just target objects!), this classifier is
% trained using the target rejection rate FRACREJ and the values of
% the hyperparameter given in RANGE. The hyperparameters in RANGE
% should be ordered such that the most simple classifier comes
% first. New hyperparameters (for more complex classifiers) are used
% until the classifier becomes inconsistent. Per default
% NRBAGS-fold crossvalidation is used.
%
% An example for kmeans_dd, where k is optimized:
%    w = consistent_occ(x,'kmeans_dd',0.1,1:20)
%    w = consistent_occ(x,'svdd',0.1,scale_range(x))
%
%     W = CONSISTENT_OCC(X,W,FRACREJ,RANGE,NRBAGS,P1,P2,...)
%
% Finally, some classifiers require additional parameters; they
% should be given in P1,P2,... at the end.
%
% Default: NRBAGS = 5
%
% See also: scale_range, dd_crossval

if nargin<5 || isempty(nrbags)
	nrbags = 5;
end

% Check some things:
if ~isa(w,'char')
	error('Expecting the name (string!) of the classifier');
end
if length(fracrej)>1
	error('Fracrej should be a scalar');
end
nrrange = length(range);
if nrrange<2
	error('Expecting a range of param. values (from simple to complex classifier)');
end
% Optionally, a second entry in NRBAGS overrides the sigma threshold:
sigma_thr = 2;
if length(nrbags)>1
	sigma_thr = nrbags(2);
	nrbags = nrbags(1);
end

% Setup the consistency threshold, say the three sigma bound:
%DXD still a magic parameter!
nrx = size(x,1);
err_thr = fracrej + sigma_thr*sqrt(fracrej*(1-fracrej)/nrx);

% Train the most simple classifier:
k = 1;
I = nrbags;
err = zeros(1,nrbags);
for i=1:nrbags
	% Compute the target error on the left-out bags
	[xtr,xte,I] = dd_crossval(x,I);
	w1 = feval(w,xtr,fracrej,range(k),varargin{:});
	res = dd_error(xte,w1);
	err(i) = res(1);
end
% This one should at least satisfy the bound, else even the simplest
% model is already too complex:
if (mean(err)>err_thr)
	error('The most simple classifier is already inconsistent!');
end
fracout(1) = mean(err);

% Go through the other parameter settings until the classifier
% becomes inconsistent:
kopt = [];
while (k<nrrange)
	k = k+1;
	I = nrbags;
	for i=1:nrbags
		% Compute the target error on the left-out bags
		[xtr,xte,I] = dd_crossval(x,I);
		w1 = feval(w,xtr,fracrej,range(k),varargin{:});
		res = dd_error(xte,w1);
		err(i) = res(1);
	end
	fracout(k) = mean(err);
	% Remember the last setting that was still consistent:
	if (mean(err)>err_thr) && isempty(kopt)
		kopt = k-1;
	end
end
% If the classifier stayed consistent over the whole range, take the
% most complex setting:
if isempty(kopt)
	kopt = nrrange;
end

% Train the final classifier on all data:
w1 = feval(w,x,fracrej,range(kopt),varargin{:});

return
