newsvdd1.m
function [W,out,J] = newsvdd(a,fracrej,fracerr,param2)
% [W,out,J] = newsvdd(a,fracrej,fracerr,param2)
%
% This is another SVDD, which uses an optimized mex function to solve
% the quadratic optimization problem (libsvm). The advantage is that
% it is very fast and that very large datasets can be attacked (the
% problem is decomposed into subproblems). The drawback is that it is
% not a 'pure' SVDD. It is based on the nu-SVC and therefore tries to
% separate the data with maximal margin from the origin. Therefore
% only for RBF kernels are the same results as in the original SVDD
% found.
%
% This also influences the classifier output. Because it is
% impossible to compute the distance to the center of the (hyper)
% sphere (only a distance to the decision boundary), there is no clear
% way to transform this distance into a probability. Combining the
% outputs of this method with other outputs is therefore still
% impossible.
%
% Oh, another drawback: it is still not possible to include outlier
% objects in the training.
%
% Finally, it is also possible to give sigma directly:
%   [W,out,J] = newsvdd(a,[],sigma);
%
% Copyright: D. Tax, R.P.W. Duin, davidt@ph.tn.tudelft.nl
% Faculty of Applied Physics, Delft University of Technology
% P.O. Box 5046, 2600 GA Delft, The Netherlands

sigma = [];
if nargin >= 3 && isempty(fracrej)
  % the third argument is sigma here, not fracerr
  sigma = fracerr;
  if nargin < 4
    fracerr = 0.0;
  else
    fracerr = param2;
  end
end
if nargin < 3 || isempty(fracerr), fracerr = 0.0; end
if nargin < 2 || isempty(fracrej), fracrej = 0.05; end
if nargin < 1 || isempty(a)
  % empty svdd: return an untrained mapping
  W = mapping(mfilename,{fracrej,fracerr});
  return
end

if isa(fracrej,'double')  % training
  % a bit scary: but I want to adapt the threshold:
  % (undocumented feature... :-))
  % if isa(a,'mapping')
  %   [W,classlist,type,k,c] = mapping(a);  % unpack
  %   W.threshold = W.threshold + fracrej;
  %   disp(['Using new threshold value: ',num2str(W.threshold)]);
  %   W = mapping('newsvdd',W,str2mat('target','outlier'),k,c);
  %   return;
  % end
  %------------------------------------------------------------
  if ~is_ocset(a)
    a = target_class(+a);
  end
  [nlab,lablist,m,k,c] = dataset(a);

  % introduce outlier label for outlier class if it is not available
  signlab = -ones(m,1);
  signlab(find_target(a)) = 1;

  % other parameters
  if (fracerr <= 0), fracerr = 1/m; end
  thiseps = 1e-4;

  if isempty(sigma)
    % optimize sigma over [0, largest interobject distance]
    %maxsigma = max(max(+a)-min(+a))
    maxsigma = sqrt(max(max(distm(a))));
    options = optimset('Display','off','TolX',0.01);
    sigma = fminbnd('new_f_svs',0,maxsigma,options,...
                    +a,signlab,fracerr,fracrej,thiseps);
    [svx,alf,b] = m_svm(+a,signlab,2,2,sigma,fracerr,thiseps);
  else
    % the user supplied a sigma
    % one-class support vector machine, RBF kernel:
    [svx,alf,b] = m_svm(+a,signlab,2,2,sigma,fracerr,thiseps);
  end

  % how is the training set mapped?
  K = exp(-distm(+a,svx)/(sigma*sigma));
  Dx = sum(K.*(ones(m,1)*alf'),2);
  % W = {sigma,svx,alf,b,mean(Dx)};
  W.s = sigma;
  W.sv = svx;
  W.a = alf;
  W.threshold = b;
  W.scale = mean(Dx);
  W = mapping(mfilename,W,str2mat('target','outlier'),k,c);
  if nargout > 1
    out = Dx;
  end
  if nargout > 2
    [dummy,J,Jb] = intersect(+a,svx,'rows');
    J = {J, alf, sigma, b, Jb};
  end
else  % testing
  [W,classlist,type,k,c] = mapping(fracrej);  % unpack
  [nlab,lablist,m,k,c,p] = dataset(a);
  % and here we go:
  K = exp(-distm(+a,W.sv)/(W.s*W.s));
  out = [sum(K.*(ones(size(+a,1),1)*W.a'),2) ones(m,1)*W.threshold];
  % Here we have the mapping of the data onto the normal of the
  % decision boundary; the larger the mapping, the better the object
  % fits the target class. It is not obvious how to map this to a
  % probability:
  newout = out;
  % newout = dist2dens(out,W.scale);
  W = dataset(newout,getlab(a),classlist,p,lablist);
end
return
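% At test time the classifier output above reduces to an alpha-weighted
% sum of RBF kernel values between each test point and the support
% vectors, compared against the threshold b. The following is a minimal
% self-contained sketch of that computation in plain MATLAB (no PRTools;
% `distm` is replaced by an explicit squared-Euclidean distance, and the
% support vectors, weights, sigma, and threshold are made-up toy values,
% not the result of a real training run):
%
%   svx   = [0 0; 1 0; 0 1];   % toy support vectors, one per row
%   alf   = [0.5; 0.3; 0.2];   % nonnegative weights (illustrative)
%   sigma = 1.0;               % RBF bandwidth
%   b     = 0.4;               % decision threshold
%
%   x = [0.2 0.1; 3 3];        % two test points: one near, one far away
%
%   % squared Euclidean distances between test points and support
%   % vectors (this is what distm computes in the function above)
%   D = sum(x.^2,2) + sum(svx.^2,2)' - 2*x*svx';
%
%   % RBF kernel and the alpha-weighted sum, as in the function body
%   K  = exp(-D/(sigma*sigma));
%   Dx = K*alf;                % larger value = deeper inside the target class
%
%   accepted = (Dx >= b);      % 1 for the nearby point, 0 for the distant one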