pca.m
From "A toolbox for the non-local means algorithm" · MATLAB code · 26 lines
function [Y,X1,v,Psi] = pca(X,numvecs, options)

% pca - compute the principal component analysis.
%
%   [Y,X1,v,Psi] = pca(X,numvecs)
%
%   X is a matrix of size dim x p of data points.
%   X1 is the matrix of size numvecs x p (projection on the numvecs first eigenvectors).
%   Y is the matrix of size dim x numvecs of the numvecs first eigenvectors of the correlation matrix X*X'
%   (this matrix is computed using the traditional flipping trick if p is large).
%   v is the vector of size numvecs of eigenvalues.
%   Psi is the mean.
%
%   Warning: the mean of X is subtracted before computing the covariance
%   matrix.
%
%   You can use an iterative algorithm based on expectation maximization
%   by setting
%       options.use_em = 1;
%   if you want a fast estimation of a few eigenvectors.
%   This algorithm uses the code of Sam Roweis:
%       Sam Roweis, "EM Algorithms for PCA and SPCA",
%       Neural Information Processing Systems 10 (NIPS'97), pp. 626-632.
%       http://www.cs.toronto.edu/~roweis/code.html
%
%   Copyright (c) 2006 Gabriel Peyré
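The "flipping trick" mentioned in the header works because the dim x dim covariance X*X' and the much smaller p x p Gram matrix X'*X share their nonzero eigenvalues: if w is an eigenvector of X'*X with eigenvalue lambda, then X*w is an eigenvector of X*X' with the same lambda. A minimal sketch of this idea in Python/NumPy (the function name `pca_flip` is hypothetical, not the toolbox code):

```python
import numpy as np

def pca_flip(X, numvecs):
    """PCA via the 'flipping trick' (hypothetical illustration, not the
    toolbox implementation): eigen-decompose the small p x p matrix
    Xc' @ Xc instead of the large dim x dim covariance Xc @ Xc'."""
    dim, p = X.shape
    Psi = X.mean(axis=1, keepdims=True)  # mean, subtracted before PCA
    Xc = X - Psi
    # eigen-decomposition of the small p x p Gram matrix
    vals, W = np.linalg.eigh(Xc.T @ Xc)
    idx = np.argsort(vals)[::-1][:numvecs]  # largest eigenvalues first
    v = vals[idx]
    # flip back: X @ w is an eigenvector of X @ X' with the same eigenvalue
    Y = Xc @ W[:, idx]
    Y /= np.linalg.norm(Y, axis=0)       # renormalize the columns
    X1 = Y.T @ Xc                        # projection on the eigenvectors
    return Y, X1, v, Psi
```

This matches the output conventions documented above (Y is dim x numvecs, X1 is numvecs x p, v the top eigenvalues, Psi the mean), and only ever factors a p x p matrix, which is the whole point when p is much smaller than dim.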