lda.m
function [newSampleIn, discrim_vec] = lda(sampleIn, label, discrim_vec_n)
%LDA Linear discriminant analysis
% Usage:
% [NEWSAMPLE, DISCRIM_VEC] = lda(SAMPLE, LABEL, DISCRIM_VEC_N)
% SAMPLE: Sample data; each row is a sample point
% LABEL: Class label of each sample point (one label per row of SAMPLE)
% DISCRIM_VEC_N: Number of discriminant vectors to keep
% (defaults to the feature dimension)
% NEWSAMPLE: Samples after projection onto the discriminant vectors
% DISCRIM_VEC: Discriminant vectors, one per column
%
% Reference:
% J. Duchene and S. Leclercq, "An Optimal Transformation for
% Discriminant and Principal Component Analysis," IEEE Trans. on
% Pattern Analysis and Machine Intelligence,
% Vol. 10, No. 6, November 1988
%
% Roger Jang, 990829
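%
% Example (a minimal sketch with synthetic data; the variables below are
% illustrative only and not part of this file):
%   data  = [randn(50,4); randn(50,4)+2; randn(50,4)+4];  % 3 classes, 4 features
%   label = [ones(50,1); 2*ones(50,1); 3*ones(50,1)];     % class labels
%   [proj, vec] = lda(data, label, 2);                    % keep 2 discriminant vectors
%   plot(proj(label==1,1), proj(label==1,2), 'o', ...
%        proj(label==2,1), proj(label==2,2), 'x', ...
%        proj(label==3,1), proj(label==3,2), 's');        % projected samples by class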
if nargin < 3, discrim_vec_n = size(sampleIn, 2); end
% ====== Initialization
data_n = size(sampleIn, 1);
feature_n = size(sampleIn,2);
featureMatrix = sampleIn;
class_set = unique(label);
class_n = length(class_set);
sampleMean = mean(featureMatrix);
% ====== Compute B and W
% ====== B: between-class scatter matrix
% ====== W: within-class scatter matrix
% MMM = \sum_k m_k*mu_k*mu_k^T
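% With m_k the size and mu_k the mean of class k, and mu the overall mean:
% W = \sum_k \sum_{x in class k} (x - mu_k)*(x - mu_k)^T
% B = \sum_k m_k*(mu_k - mu)*(mu_k - mu)^T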
U = zeros(class_n, data_n); % Class indicator matrix: U(i, j) = 1 iff sample j is in class i
for i = 1:class_n
U(i, :) = (label(:) == class_set(i))';
end
count = sum(U, 2); % Cardinality of each class
% Each row of MU is the mean of a class
MU = U*featureMatrix./(count*ones(1, feature_n));
MMM = MU'*diag(count)*MU;
W = featureMatrix'*featureMatrix - MMM;
B = MMM - data_n*sampleMean'*sampleMean;
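% (The two lines above use the identities W = X'*X - MMM and
% B = MMM - n*mu'*mu, with X the data matrix, n = data_n and mu = sampleMean;
% their sum W + B is the total scatter X'*X - n*mu'*mu.)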
% ====== Find the best discriminant vectors
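% Each discriminant vector v maximizes the Rayleigh quotient v'*B*v / v'*W*v;
% the unconstrained maximizer is the dominant eigenvector of inv(W)*B.
% Note: if W is close to singular, inv(W) is ill-conditioned; pinv(W) or
% W\B are common, more robust alternatives.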
invW = inv(W);
Q = invW*B;
D = [];
for i = 1:discrim_vec_n
[eigVec, eigVal] = eig(Q);
[maxEigVal, index] = max(abs(diag(eigVal)));
D = [D, eigVec(:, index)]; % Each column of D is an eigenvector
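% Deflation: restrict the next search so the new eigenvector is orthogonal
% to the columns of D already found (orthogonal discriminant vectors;
% see the projection construction in the reference above).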
Q = (eye(feature_n)-invW*D*inv(D'*invW*D)*D')*invW*B;
end
newSampleIn = featureMatrix*D(:,1:discrim_vec_n);
discrim_vec = D;