
optim_auc.m

data description toolbox 1.6 (one-class classifier toolkit)
Language: MATLAB (M-file)
function [w,optval] = optim_auc(x,wname,fracrej,range,nrbags,varargin)
%OPTIM_AUC Optimize hyperparameters for an OCC
%
%    W = OPTIM_AUC(X,WNAME,FRACREJ,RANGE,NRBAGS,VARARGIN)
%
% Optimize the AUC-performance of classifier WNAME (string) on dataset
% X. This optimization is over the complexity parameter (this should be
% the third input parameter for the one-class classifier). RANGE should
% contain a vector of parameter values that are tried. For each
% parameter value, NRBAGS-fold crossvalidation is applied and the
% parameter for which the highest AUC is obtained is used to train the
% final classifier W.
%
% When dataset X does not contain example outlier objects, they
% are generated by gendatout.m.
%
% Example:
% >> a = target_class(gendatb,1);
% >> w = optim_auc(a,'mog_dd',0.1,1:5);
% >> scatterd(a); plotc(w);
%
% See also: scale_range, dd_crossval, gendatout

% Copyright: D.M.J. Tax, D.M.J.Tax@prtools.org
% Faculty EWI, Delft University of Technology
% P.O. Box 5031, 2600 GA Delft, The Netherlands

% Default values for the optional input arguments:
if nargin<5 | isempty(nrbags)
	nrbags = 5;
end
if nargin<4
	range = 1:5;
end
if nargin<3
	fracrej = 0.1;
end
if nargin<2
	wname = 'knndd';
end

if ~exist('x') | isempty(x)
	% When no inputs are given, we are expected to return an empty
	% mapping:
	if isempty(varargin)
		w = mapping(mfilename,{wname,fracrej,range,nrbags});
	else
		w = mapping(mfilename,{wname,fracrej,range,nrbags,varargin{:}});
	end
	% And give a suitable name:
	w = setname(w,'AUC optim. classifier');
	return
end

% Check some things (there are too many input parameters, and it is very
% easy to forget one, or mix them up):
if ~isa(wname,'char')
	error('Expecting the name (string!) of the classifier.');
end
if length(fracrej)>1
	error('Fracrej should be a scalar.');
end
nrrange = length(range);
if nrrange<2
	error('Expecting a range of param. values.');
end
if ~isa(nrbags,'double') | length(nrbags)>1
	error('Parameter NRBAGS should be a scalar.');
end

% First check if the dataset contains any outlier data, if not then we
% have to generate them!
[xt,xo] = target_class(x);
if size(xo,1)==0
	%fprintf('Generating artificial outliers in "optim_auc"');
	xo = gendatout(x,size(x,1));
end

% Run over all the parameter values:
meanauc = zeros(nrrange,1);
for k=1:nrrange
	fprintf('.');
	% Apply the crossvalidation:
	res = zeros(nrbags,1);
	I = nrbags;
	for i=1:nrbags
		% Generate train and test set
		[xtr,xte,I] = dd_crossval(x,I);
		if ~isempty(xo),
			% DXD: sometimes datasets are not completely well-formed, make sure
			% that the identifiers are cellarrays:
			if isa(xte.ident,'double')
				xte.ident = num2cell(xte.ident);
			end
			if isa(xo.ident,'double')
				xo.ident = num2cell(xo.ident);
			end
			xte = [xte; xo];
		else
			error('I require outlier examples for the evaluation of the AUC');
		end
		% Train a classifier:
		if ~isempty(varargin)
			w = feval(wname,xtr,fracrej,range(k),varargin{:});
		else
			w = feval(wname,xtr,fracrej,range(k));
		end
		% Compute the AUC on the leave-out bags
		res(i) = dd_auc(xte*w*dd_roc);
	end
	% Store the mean AUC:
	meanauc(k) = mean(res);
end

% Now find which parameter value is the best:
[maxval,kopt] = max(meanauc);

% So, the final classifier becomes:
w = feval(wname,x,fracrej,range(kopt),varargin{:});
w = setname(w,'AUC optim. classifier');
optval = range(kopt);

return
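For reference, below is a minimal usage sketch expanded from the Example in the header comment. It assumes the dd_tools/PRTools functions gendatb, target_class, scatterd and plotc are on the MATLAB path; the classifier 'mog_dd', the range 1:5 and the explicit 5-fold setting are illustrative choices, not requirements.

% Minimal usage sketch (assumes dd_tools and PRTools are installed):
a = target_class(gendatb,1);                   % target-class-only banana dataset
[w,optval] = optim_auc(a,'mog_dd',0.1,1:5,5);  % 5-fold CV over complexity values 1..5
scatterd(a); plotc(w);                         % inspect the optimized decision boundary
fprintf('Best complexity parameter: %d\n',optval);

The second output, optval, is the entry of RANGE that achieved the highest mean crossvalidated AUC; the returned mapping w is the classifier retrained on the full dataset with that parameter.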
