
📄 kernel_pca.m

📁 The goal of SPID is to provide the user with tools capable of simulating, preprocessing, processing, and classifying …
💻 MATLAB
function [mappedX, mapping] = kernel_pca(X, no_dims, varargin)
%KERNEL_PCA Perform the kernel PCA algorithm
%
%   [mappedX, mapping] = kernel_pca(X, no_dims)
%   [mappedX, mapping] = kernel_pca(X, no_dims, kernel)
%   [mappedX, mapping] = kernel_pca(X, no_dims, kernel, param1)
%   [mappedX, mapping] = kernel_pca(X, no_dims, kernel, param1, param2)
%
% The function runs kernel PCA on a set of datapoints X. The variable
% no_dims sets the number of dimensions of the feature points in the
% embedded feature space (no_dims >= 1, default = 2).
% For no_dims, you can also specify a number between 0 and 1, determining
% the amount of variance you want to retain in the PCA step.
% The value of kernel determines the used kernel. Possible values are 'linear',
% 'gauss', 'poly', 'subsets', or 'princ_angles' (default = 'gauss'). For
% more info on setting the parameters of the kernel function, type HELP
% GRAM.
% The function returns the locations of the embedded training data in
% mappedX. Furthermore, it returns information on the mapping in mapping.
%
%
% This file is part of the Matlab Toolbox for Dimensionality Reduction v0.1b.
% The toolbox can be obtained from http://www.cs.unimaas.nl/l.vandermaaten
% You are free to use, change, or redistribute this code in any way you
% want. However, it is appreciated if you maintain the name of the original
% author.
%
% (C) Laurens van der Maaten
% Maastricht University, 2007

    if ~exist('no_dims', 'var')
        no_dims = 2;
    end
    kernel = 'gauss';
    param1 = 1;
    param2 = 3;
    if nargin > 2
        kernel = varargin{1};
        if length(varargin) > 1 && strcmp(class(varargin{2}), 'double'), param1 = varargin{2}; end
        if length(varargin) > 2 && strcmp(class(varargin{3}), 'double'), param2 = varargin{3}; end
    end

    % Store the number of training and test points
    ell = size(X, 1);

    if size(X, 1) < 3000
        % Get Gram matrix for training points
        disp('Computing kernel matrix...');
        K = gram(X, X, kernel, param1, param2);

        % Normalize kernel matrix K
        D = sum(K) / ell;                       % column sums
        E = sum(D) / ell;                       % total sum
        J = ones(ell, 1) * D;                   % column sums (in matrix)
        K = K - J - J' + E * ones(ell, ell);

        % Compute first no_dims eigenvectors and store these in V, store corresponding eigenvalues in L
        disp('Eigenanalysis of kernel matrix...');
        K(isnan(K)) = 0;
        K(isinf(K)) = 0;
        [V, L] = eig(K);
    else
        % Perform eigenanalysis of kernel matrix without explicitly
        % computing it
        disp('Eigenanalysis of kernel matrix (using slower but memory-conservative implementation)...');
        options.disp = 0;
        options.isreal = 1;
        options.issym = 1;
        [V, L] = eigs(@(v)kernel_function(v, X', 1, kernel, param1, param2), size(X, 1), no_dims, 'LM', options);
    end

    % Sort eigenvalues and eigenvectors in descending order
    [L, ind] = sort(diag(L), 'descend');
    L = L(1:no_dims);
    V = V(:,ind(1:no_dims));

    % Compute inverse of eigenvalues matrix L
    disp('Computing final embedding...');
    invL = diag(1 ./ L);

    % Compute square root of eigenvalues matrix L
    sqrtL = diag(sqrt(L));

    % Compute inverse of square root of eigenvalues matrix L
    invsqrtL = diag(1 ./ diag(sqrtL));

    % Compute the new embedded points for both K and Ktest-data
    mappedX = sqrtL * V';                     % = invsqrtL * V' * K

    % Set feature vectors in original order
    mappedX = mappedX';

    % Store information for out-of-sample extension
    mapping.V = V;
    mapping.invsqrtL = invsqrtL;
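
A minimal usage sketch (not part of the original file): the data below is random and the kernel parameters are purely illustrative; gram.m from the same toolbox must be on the MATLAB path, and the meaning of param1/param2 for each kernel is documented in HELP GRAM.

    % Embed 500 random 10-dimensional points into a 2-dimensional feature space.
    X = randn(500, 10);

    % Default Gaussian kernel, 2-D embedding.
    [mappedX, mapping] = kernel_pca(X, 2);

    % Explicit kernel choice with both kernel parameters supplied.
    [mappedY, mapping2] = kernel_pca(X, 2, 'poly', 1, 3);

    size(mappedX)   % returns [500 2]

The normalization block in the listing is the standard double centering of the Gram matrix: with J holding the column means of K in every row and E the grand mean, K - J - J' + E * ones(ell, ell) equals (I - ones(ell)/ell) * K * (I - ones(ell)/ell), i.e. the kernel of mean-centered feature vectors, so the subsequent eigendecomposition corresponds to PCA in feature space.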
