
📄 pcaklm.m

📁 This file is part of the pattern recognition toolbox (PRTools)
💻 MATLAB (M-file)
%PCAKLM Principal Component Analysis/Karhunen-Loeve Mapping
%       (PCA or MCA of overall/mean covariance matrix)
%
% 	[W,FRAC] = PCAKLM(TYPE,A,N)
% 	[W,N]    = PCAKLM(TYPE,A,FRAC)
%
% INPUT
%  TYPE       Type of mapping: 'pca' or 'klm'. Default: 'pca'.
%  A          Dataset
%  N or FRAC  Number of dimensions (>= 1) or fraction of variance (< 1)
%             to retain; if > 0, perform PCA; otherwise MCA. Default: N = inf.
%
% OUTPUT
%  W          Affine Karhunen-Loeve mapping
%  FRAC or N  Fraction of variance or number of dimensions retained.
%
% DESCRIPTION
% Performs a principal component analysis (PCA) or minor component analysis
% (MCA) on the overall or mean class covariance matrix (weighted by the
% class prior probabilities). It finds a rotation of the dataset A to an
% N-dimensional linear subspace such that at least (for PCA) or at most (for
% MCA) a fraction FRAC of the total variance is preserved.
%
% PCA is applied when N (or FRAC) >= 0; MCA when N (or FRAC) < 0. If N is
% given (abs(N) >= 1), FRAC is optimised. If FRAC is given (abs(FRAC) < 1),
% N is optimised.
%
% Objects in a new dataset B can be mapped by B*W, W*B or by A*KLM([],N)*B.
% Default (N = inf): the features are decorrelated and ordered, but no
% feature reduction is performed.
%
% ALTERNATIVE
%
% 	V = PCAKLM(TYPE,A,0)
%
% Returns the cumulative fraction of the explained variance. V(N) is the
% cumulative fraction of the explained variance by using N eigenvectors.
%
% This function should not be called directly, only through PCA or KLM.
% Use FISHERM for optimising the linear class separability (LDA).
%
% SEE ALSO
% MAPPINGS, DATASETS, PCLDC, KLLDC, PCA, KLM, FISHERM

% Copyright: R.P.W. Duin, r.p.w.duin@prtools.org
% Faculty EWI, Delft University of Technology
% P.O. Box 5031, 2600 GA Delft, The Netherlands

% $Id: pcaklm.m,v 1.15 2005/03/04 10:36:25 duin Exp $

function [w,truefrac] = pcaklm(type,a,frac)

	prtrace(mfilename);

	truefrac = [];

	% Default: preserve all dimensions (identity mapping).
	if (nargin < 3) | (isempty(frac))
		frac = inf;
		prwarning(3,'no dimensionality given, only decorrelating and ordering dimensions');
	end

	% Default: perform PCA.
	if (nargin < 1) | (isempty(type))
		type = 'pca';
		prwarning(3,'no type given, assuming PCA');
	end

	if (strcmp(type,'pca'))
		mapname = 'Principal Component Analysis';
	elseif (strcmp(type,'klm'))
		mapname = 'Karhunen-Loeve Mapping';
	else
		error('unknown type specified');
	end

	% Empty mapping: return straightaway.
	if (nargin < 2) | (isempty(a))
		w = mapping(type,frac);
		w = setname(w,mapname);
		return
	end

	if ~isdataset(a)
		a = dataset(a,1);   % make sure we have a dataset
	end

	islabtype(a,'crisp','soft');
	isvaldset(a,1);   % at least 1 object per class
	[m,k,c] = getsize(a);
	p = getprior(a);

	% If FRAC < 0, perform minor component analysis (MCA) instead of
	% principal component analysis.
	mca = (frac < 0); frac = abs(frac);

	% Shift mean of data to origin.
	b = a*scalem(a);

	% If there are fewer samples M than features K, first perform a lossless
	% projection to the (M-1) dimensional space spanned by the samples.
	if (m <= k)
		u = reducm(b); b = b*u;
		korg = k; [m,k] = size(b);
	else
		u = [];
	end

	% Calculate overall or average class prior-weighted covariance matrix and
	% find eigenvectors F.
	if (strcmp(type,'pca'))
		if (c==0)   % we have unlabeled data!
			bb = +b; % use all
		else
			bb = [];
			for j = 1:c
				bb = [bb; +seldat(b,j)*p(j)];
			end
		end
		G = cov(bb);
	else
		[U,GG] = meancov(b,1);
		G = zeros(k,k);
		for i = 1:c
			G = G + p(i)*GG(:,:,i);
		end
	end
	[F,V] = eig(G);

	% v = V(I) contains the sorted eigenvalues:
	% descending for PCA, ascending for MCA.
	if (mca)
		[v,I] = sort(diag(V));
	else
		[v,I] = sort(-diag(V));
	end

	if (frac == inf)			% Return all dimensions, decorrelated and ordered.
		n = k; truefrac = k;
	elseif (frac == 0)			% Just return cumulative retained variance.
		w = cumsum(v)/sum(v);
		return
	elseif (frac >= 1)			% Return FRAC dimensions.
		n = abs(frac);
		if (n > k), error('illegal dimensionality requested'); end
		I = I(1:n); sv = sum(v);
		if (sv ~= 0), truefrac = cumsum(v(1:n))/sv; else, truefrac = 0; end;
	elseif (frac > 0)			% Return the N dimensions that retain at least (PCA)
								% or at most (MCA) FRAC variance.
		J = find(cumsum(v)/sum(v) > frac);
		if (mca), n = J(1)-1; else, n = J(1); end;
		truefrac = n; I = I(1:n);
	end

	% If needed, apply pre-calculated projection to (M-1) dimensional subspace.
	if (~isempty(u))
		rot = u.data.rot*F(:,I);
		off = u.data.offset*F(:,I);
	else
		rot = F(:,I);
		off = -mean(a*F(:,I));
	end

	% Construct affine mapping.
	w = affine(rot,off,a);
	w = setname(w,mapname);

return
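
For readers without PRTools installed, here is a minimal plain-MATLAB sketch of the variance-fraction selection step performed above: center the data, eigendecompose the covariance matrix, and keep the smallest number of principal directions whose cumulative eigenvalue share exceeds FRAC. The data matrix X, the threshold frac and the other variable names below are illustrative assumptions, not part of the toolbox, and the sketch covers only the PCA branch without PRTools' dataset/mapping objects.

% Minimal sketch (plain MATLAB, no PRTools objects) of the FRAC-based
% selection done by pcaklm. All variable names here are illustrative.
X = randn(200,5)*diag([3 2 1 0.5 0.1]);   % synthetic data: 200 objects, 5 features
frac = 0.9;                               % fraction of variance to retain

Xc = X - ones(size(X,1),1)*mean(X);       % shift mean to origin (cf. scalem)
G  = cov(Xc);                             % overall covariance matrix
[F,V] = eig(G);                           % eigenvectors F, eigenvalues on diag(V)
[v,I] = sort(-diag(V)); v = -v;           % sort eigenvalues descending (PCA case)

cumfrac = cumsum(v)/sum(v);               % cumulative explained variance, cf. PCAKLM(TYPE,A,0)
n = find(cumfrac > frac, 1);              % smallest N retaining at least FRAC of the variance
W = F(:,I(1:n));                          % rotation to the N-dimensional subspace
Y = Xc*W;                                 % projected data, the analogue of B*W

Within PRTools itself, the documented entry points are the PCA and KLM mappings rather than pcaklm directly; assuming the standard PRTools interface, that would look like w = pca(a,0.9) followed by b = a*w on a dataset a.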
