
📄 kernelpca.m

📁 This is an SVM toolbox; it is very useful for someone who is just learning SVM, and it is written so as to be easily understood.
💻 MATLAB
function [Z,Lambda]=kernelpca(X,T,l,ker,arg,display)
% KERNELPCA computes kernel Principal Component Analysis.
%  [Z,Lambda]=kernelpca(X,T,l,ker,arg,display)
%
%  KERNELPCA computes Principal Component Analysis (PCA)
%  from non-linearly mapped data. The non-linear feature
%  space is determined by a kernel function.
%  The PCA projection is used to reduce the dimension
%  of the non-linearly mapped data to an l-dimensional space.
%
%  The kernel-PCA projection is learnt on the data in the
%  matrix T. The data in the matrix X are processed
%  by the learnt non-linear projection and the result is
%  returned in the matrix Z.
%
% Input:
%   T [dxN] contains N training points in d-dimensional space.
%   X [dxL] contains L d-dimensional points to be processed.
%   l [1x1] is the dimension of the reduced (output) space.
%   ker [string] determines the non-linear mapping, see help kernel.
%   arg [] arguments of the used kernel.
%   display [1x1] if display==1 then progress info is displayed.
%
% Output:
%   Z [lxL] the L processed points in l-dimensional space.
%   Lambda [1xN] eigenvalues of the covariance matrix of the
%     non-linearly mapped training points.
%
% See also SPCA, KERNEL, PKERNELPCA.
%
% Statistical Pattern Recognition Toolbox, Vojtech Franc, Vaclav Hlavac
% (c) Czech Technical University Prague, http://cmp.felk.cvut.cz
% Modifications
% 11-july-2002, VF, mistake "Jt=zeros(N,L)/N" repaired
%              (reported by SH_Srinivasan@Satyam.com).
% 5-July-2001, V.Franc, comments changed
% 20-dec-2000, V.Franc, algorithm was implemented

if nargin < 6,
  display=0;
end

d=size(X,1);  % input dimension
N=size(T,2);  % number of training points
L=size(X,2);  % number of points to be processed

% Compute the kernel matrix of the training data.
if display==1,
  disp('Computing kernel matrix for training data...');
end
K=kernel(T,T,ker,arg);

% Centre the mapped training data in the feature space. The uncentred
% K is kept because it is needed again when centring the test data.
J=ones(N,N)/N;
Kc=K - J*K - K*J + J*K*J;

% Compute the eigenvectors and eigenvalues of Kc/N.
if display==1,
  disp('Computing eigenvalue decomposition...');
end
[U,D]=eig(Kc/N);
Lambda=real(diag(D));

% Rescale every eigenvector with a non-zero eigenvalue so that the
% extracted features share a common scale.
for k=1:N,
  if Lambda(k) ~= 0,
    U(:,k)=U(:,k)/sqrt(Lambda(k));
  end
end

% Sort the eigenvalues and the eigenvectors in descending order.
[Lambda,ordered]=sort(-Lambda);
Lambda=-Lambda;
U=U(:,ordered);

% Compute the kernel matrix between the test and the training data.
if display==1,
  disp('Computing kernel matrix for feature extraction...');
end
Kt=kernel(X,T,ker,arg);

% Centre the test data with respect to the training mean. The uncentred
% training kernel K must be used here: with the centred Kc, the
% correction terms Jt'*Kc and Jt'*Kc*J vanish identically.
Jt=ones(N,L)/N;
Kt=Kt - Jt'*K - Kt*J + Jt'*K*J;

% Feature extraction: project onto the first l principal components.
A=U(:,1:l);
Z=(Kt*A)';

return;
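
For reference, a minimal usage sketch. It assumes STPRtool's kernel.m helper is on the MATLAB path and that 'rbf' (with a single kernel-width argument) is among its supported kernel identifiers; the data below is synthetic and purely illustrative.

% Usage sketch: project synthetic 2-D data onto its first two
% kernel principal components (assumes kernel.m accepts 'rbf'
% with a width argument).
T = rand(2,100);                  % 100 training points (d=2, N=100)
X = rand(2,20);                   % 20 points to be projected (L=20)
l = 2;                            % dimension of the output space
[Z,Lambda] = kernelpca(X,T,l,'rbf',1,1);
disp(size(Z));                    % [l L] = [2 20]
plot(Lambda(1:10),'o-');          % leading eigenvalue spectrum

As a sanity check, running the same call with ker='linear' should reproduce ordinary PCA of the raw data, up to sign and a per-component scale.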
