pcldc.m

From the "Pattern Recognition Toolbox" (PRTools: a rich collection of low-level functions and common statistical recognition tools) · M code · 64 lines

%PCLDC Linear classifier using PC expansion on the joint data.
% 
% 	W = PCLDC(A,N)
% 	W = PCLDC(A,ALF)
%
% INPUT
%  A    Dataset
%  N    Number of eigenvectors
%  ALF  Total explained variance (default: ALF = 0.9)
%
% OUTPUT
%  W    Mapping
% 
% DESCRIPTION
% Finds the linear discriminant function W for the dataset A by
% computing the LDC on a projection of the data on the first N
% eigenvectors of the total dataset (Principal Component Analysis).
% 
% When ALF is supplied, the number of eigenvectors is chosen such that
% at least a fraction ALF of the total variance is explained.
% 
% If N (ALF) is NaN it is optimised by REGOPTC.
%
% SEE ALSO
% MAPPINGS, DATASETS, KLLDC, KLM, FISHERM, REGOPTC

% Copyright: R.P.W. Duin, duin@ph.tn.tudelft.nl
% Faculty of Applied Physics, Delft University of Technology
% P.O. Box 5046, 2600 GA Delft, The Netherlands

% $Id: pcldc.m,v 1.4 2007/06/13 21:59:42 duin Exp $

function W = pcldc(a,n)

	prtrace(mfilename);

	if nargin < 2, n = []; end

	if nargin == 0 | isempty(a)
		% No data: return an untrained mapping
		W = mapping('pcldc',{n});
	elseif isnan(n)    % optimise the number of components by REGOPTC
		defs = {1};
		parmin_max = [1,size(a,2)];
		W = regoptc(a,mfilename,{n},defs,[1],parmin_max,testc([],'soft'),0);
	else
		islabtype(a,'crisp','soft');
		isvaldfile(a,2,2); % at least 2 objects per class, 2 classes
		a = testdatasize(a,'features');
		a = setprior(a,getprior(a));
		% Make a sequential classifier combining PCA and LDC:
		v = pca(a,n);
		W = v*ldc(a*v);
		W = setcost(W,a);
	end
	W = setname(W,'PC Bayes-Normal-1');

return
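The pipeline above, PCA on the pooled (unlabelled) data followed by a Bayes-normal linear discriminant in the reduced space, can be sketched outside PRTools. The following NumPy version is a minimal illustration, not the PRTools implementation; the helper name `pcldc_fit` and its interface are invented for this example, and it handles only the fixed-N case (no ALF variance threshold, no REGOPTC optimisation, no soft labels).

```python
import numpy as np

def pcldc_fit(X, y, n_components):
    """Hypothetical sketch of PCLDC: PCA on the total data, then a
    linear discriminant (shared covariance, Bayes-normal) in PC space."""
    # PCA on the joint dataset, ignoring class labels
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / (len(X) - 1)
    w, V = np.linalg.eigh(cov)
    V = V[:, np.argsort(w)[::-1]][:, :n_components]  # top eigenvectors
    Z = Xc @ V                                       # projected data

    # LDC: class means, pooled within-class covariance, class priors
    classes = np.unique(y)
    means = np.array([Z[y == c].mean(axis=0) for c in classes])
    Sw = sum((Z[y == c] - means[i]).T @ (Z[y == c] - means[i])
             for i, c in enumerate(classes)) / (len(Z) - len(classes))
    Sw_inv = np.linalg.inv(Sw)
    priors = np.array([(y == c).mean() for c in classes])

    def predict(Xnew):
        Zn = (np.asarray(Xnew) - mu) @ V
        # Linear discriminant score per class (log-posterior up to a constant)
        scores = (Zn @ Sw_inv @ means.T
                  - 0.5 * np.sum(means @ Sw_inv * means, axis=1)
                  + np.log(priors))
        return classes[np.argmax(scores, axis=1)]

    return predict
```

As in `pcldc.m`, the projection is computed from the total scatter of all objects jointly, so the discriminant is trained in a subspace chosen without reference to the labels.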
