
📄 dlpdd.m

📁 A data-mining toolbox, latest version; hopefully useful to people doing research in this area
💻 MATLAB
function W = dlpdd(x,nu,usematlab)
%DLPDD Distance Linear Programming Data Description
%
%       W = DLPDD(X,NU)
%
% This linear one-class classifier works directly on the distances, so
% the data is not remapped. The NU parameter gives the fraction of
% error on the target set. It tries to separate the (distance) data as
% best as possible from +infinity.
%
% Although it is more or less assumed that the data is in the positive
% quadrant, you can put other data in as well and see how it may or
% may not work.
%
% See also: lpdd
%
% @inproceedings{Pekalska2002,
%	author = {Pekalska, E. and Tax, D.M.J. and Duin, R.P.W.},
%	title = {One-class {LP} classifier for dissimilarity representations},
%	booktitle = {Advances in Neural Information Processing Systems},
%	year = {2003},
%	pages = {},
%	editor = {S.~Becker and S.~Thrun and K.~Obermayer},
%	volume = {15},
%	publisher = {MIT Press: Cambridge, MA}
%}

% Copyright: D. Tax, R.P.W. Duin, davidt@ph.tn.tudelft.nl
% Faculty of Applied Physics, Delft University of Technology
% P.O. Box 5046, 2600 GA Delft, The Netherlands

% first set up the parameters
if nargin < 3, usematlab = 0; end
if nargin < 2 | isempty(nu), nu = 0.05; end
if nargin < 1 | isempty(x)  % empty mapping
	W = mapping(mfilename,{nu});
	W = setname(W,'Distance Linear Programming Distance-dd');
	return
end

% training
if ~ismapping(nu)
	% work directly on the distance matrix
	[n,d] = size(x);
	if (n~=d)
		error('I was expecting a square distance matrix!');
	end
	% maybe we have example outliers...
	if isocset(x)
		labx = getoclab(x);
	else
		labx = ones(n,1);
	end
	x = +x; % no dataset please

	% set up the LP problem:
	C = 1./(n*nu);
	f = [1 -1 zeros(1,n) repmat(C,1,n)]';
	A = [-labx labx repmat(labx,1,n).*x -eye(n)];
	b = -ones(n,1);
	Aeq = [0 0 ones(1,n) zeros(1,n)];
	beq = 1;
	N = 2*n + 2;
	lb = zeros(N,1);
	ub = repmat(inf,N,1);

	% optimize:
	if (exist('lp_solve')>0) & (usematlab==0)
		if ~exist('cplex_init')
			% we can use the lp_solve optimizer:
			e = [0; -ones(n,1)];
			[v,alf] = lp_solve(-f,sparse([Aeq;A]),[beq;b],e,lb,ub);
		else
			% the cplex optimizer:
			lpenv = cplex_init;
			disp = 0;
			[alf,y_upd_,how_upd_,p_lp] = ...
				lp_solve(lpenv,f,sparse([Aeq;A]),[beq;b],lb,ub,1,disp);
		end
	else
		% or the good old Matlab optimizer:
		alf = linprog(f,A,b,Aeq,beq,lb,ub);
	end

	% store the results
	paramalf = alf(3:2+n);
	W.I = find(paramalf>1e-6);
	W.w = paramalf(W.I);
	W.threshold = 1-(alf(1)-alf(2));
	W = mapping(mfilename,'trained',W,str2mat('target','outlier'),d,2);
	W = setname(W,'Distance Linear Programming Distance-data description');
else % testing
	% get the data:
	W = getdata(nu);
	m = size(x,1);
	% and here we go:
	D = +x(:,W.I);
	% annoying prtools:
	newout = [-D*W.w repmat(W.threshold,m,1)];
	W = setdat(x,newout,nu);
end
return
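For readers without PRTools, here is a rough sketch of the same training LP in Python with SciPy's `linprog`, for the common target-only case (all labels +1, no example outliers). The function name `dlpdd_train` and the variable layout `[rho_plus, rho_minus, w, xi]` mirror the MATLAB vector `[1 -1 zeros(1,n) repmat(C,1,n)]'` above; this is an illustration, not part of the toolbox.

```python
import numpy as np
from scipy.optimize import linprog

def dlpdd_train(D, nu=0.05):
    """Sketch of DLPDD training on a square distance matrix D (targets only).

    Decision vector: [rho_plus, rho_minus, w (n), xi (n)], all >= 0.
    minimize  rho_plus - rho_minus + C * sum(xi)
    s.t.      D @ w - (rho_plus - rho_minus) - xi <= -1
              sum(w) == 1
    """
    n = D.shape[0]
    assert D.shape[0] == D.shape[1], "expecting a square distance matrix"
    C = 1.0 / (n * nu)
    f = np.concatenate(([1.0, -1.0], np.zeros(n), np.full(n, C)))
    # inequality rows: -rho_plus + rho_minus + d_i . w - xi_i <= -1
    A_ub = np.hstack([-np.ones((n, 1)), np.ones((n, 1)), D, -np.eye(n)])
    b_ub = -np.ones(n)
    # equality: weights sum to one
    A_eq = np.concatenate(([0.0, 0.0], np.ones(n), np.zeros(n)))[None, :]
    b_eq = [1.0]
    res = linprog(f, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (2 * n + 2))
    alf = res.x
    w = alf[2:2 + n]
    I = np.flatnonzero(w > 1e-6)          # support objects, as in the MATLAB code
    threshold = 1.0 - (alf[0] - alf[1])   # W.threshold = 1-(alf(1)-alf(2))
    return I, w[I], threshold
```

At test time the MATLAB code scores an object with distances d to the training set as `-d(I)'*w` against the threshold, so an object is accepted as target when that score is at least the threshold, i.e. when its weighted distance to the support objects is small enough.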
