
📄 tdlda.m

📁 Face recognition 2DLDA code
💻 MATLAB
function [eigvectors, eigvalues, newTrainData,newTestData] = TDLDA(trainData,testData,height,width,gnd,numvecs)
% 2DLDA: Two-dimensional linear discriminant analysis
%
%    usage: [eigvectors, eigvalues] = TDLDA(trainData,testData,height,width,gnd,numvecs)
%
%
%             Input:
%               trainData - Training data matrix. Each row is a vectorized
%                           height-by-width image (one data point).
%               testData  - Test data matrix, same layout as trainData.
%               height    - height of the original image matrix
%               width     - width of the original image matrix
%               gnd       - Column vector holding the class label of each
%                           training data point.
%               numvecs   - number of eigenvectors (projection directions) to keep
%
%             Output:
%               eigvectors - Each column is an embedding function. For a new
%                           data point (row vector) x,
%                           y = reshape(reshape(x - mean(trainData),height,width)*eigvectors,1,height*numvecs)
%                           will be the embedding result of x (the code centers
%                           data on the training mean before projecting).
%               eigvalues  - The eigenvalues of the generalized LDA eigen-problem.
% 
% 
%           [eigvectors, eigvalues, newTrainData,newTestData] = TDLDA(trainData,testData,height,width,gnd,numvecs)		
%               
%               newTrainData & newTestData: the embedding results; each row vector is a data point.
%                   newTrainData(i,:) = reshape(reshape(trainData(i,:)-mean(trainData),height,width)*eigvectors,1,height*numvecs)
%                   newTestData(i,:)  = reshape(reshape(testData(i,:)-mean(trainData),height,width)*eigvectors,1,height*numvecs)
%
%
%	Reference paper: H. Kong, X. Li, L. Wang, E. K. Teoh, J.-G. Wang, and
%	                 R. Venkateswarlu. Two-dimensional Fisher discriminant
%	                 analysis: Forget about small sample size problem. In
%	                 Proceedings of ICASSP, 2005.
%   
%   Written by Zhonghua Shen (cnjsnt_s@yahoo.com.cn), 2006.07
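%
%   Example (a minimal sketch; faceTrain, faceTest, trainLabels and the
%   32x32 image size are hypothetical placeholders, not part of this file):
%       [vecs, vals, newTrain, newTest] = TDLDA(faceTrain, faceTest, 32, 32, trainLabels, 8);
%       % each row of newTrain / newTest is then a 32*8-dimensional feature vector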


% ====== Initialization
[nSmp,nFea] = size(trainData);
classLabel = unique(gnd);
nClass = length(classLabel);

sampleMean = mean(trainData);

B = zeros(width, width); % between-class scatter matrix (column direction)
W = zeros(width, width); % within-class scatter matrix (column direction)
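
% The loop below (together with the normalization after it) accumulates the
% column-direction 2DLDA scatter matrices, in this code's notation:
%   B = (1/nClass) * sum_i (Mi - M)' * (Mi - M)
%   W = (1/nSmp)   * sum_i sum_{X in class i} (X - Mi)' * (X - Mi)
% where M is the height-by-width overall mean image, Mi the mean image of
% class i, and X a training image reshaped to height-by-width.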
for i = 1:nClass
    index = find(gnd==classLabel(i));
    % dim argument keeps classMean a row vector even for single-sample classes
    classMean = mean(trainData(index, :), 1);

    % between-class scatter: contribution of class i's mean image
    dummyMat = reshape((classMean-sampleMean),height,width);
    B = B + dummyMat'*dummyMat;

    % within-class scatter: contribution of every sample in class i
    % (indexing through "index", so samples of a class need not be contiguous)
    for j = 1:length(index)
        dummyMat = reshape((trainData(index(j),:)-classMean),height,width);
        W = W + dummyMat'*dummyMat;
    end
end
B = B/nClass;
W = W/nSmp;

W = (W + W')/2;
B = (B + B')/2;

fprintf(1,'Calculating generalized eigenvectors and eigenvalues...\n');
[eigvectors, eigvalues] = eig(B,W);
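% eig(B,W) solves the generalized eigen-problem B*v = lambda*W*v; the leading
% generalized eigenvectors maximize the Fisher-like ratio (v'*B*v)/(v'*W*v).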

fprintf(1,'Sorting eigenvectors according to eigenvalues...\n');
eigvalues = diag(eigvalues);
[eigvalues, sortIdx] = sort(eigvalues, 'descend');  % largest eigenvalues first
eigvectors = eigvectors(:, sortIdx);

for i = 1:size(eigvectors,2)
    eigvectors(:,i) = eigvectors(:,i)./norm(eigvectors(:,i));
end
eigvectors = eigvectors(:,1:numvecs);
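% eigvectors is now width-by-numvecs, so a height-by-width image projects to a
% height-by-numvecs feature matrix (flattened into a row vector below).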

if nargout == 4
   fprintf(1,'Feature extraction and calculating newData...\n');
   newTrainData = zeros(nSmp,height*numvecs);
   for i = 1:nSmp
       dummyMat = reshape((trainData(i,:)-sampleMean),height,width);
       newTrainData(i,:) = reshape(dummyMat*eigvectors,1,height*numvecs);
   end
   nSam1 = size(testData,1); 
   newTestData = zeros(nSam1,height*numvecs);
   for i = 1:nSam1
       dummyMat = reshape((testData(i,:)-sampleMean),height,width);
       newTestData(i,:) = reshape(dummyMat*eigvectors,1,height*numvecs);
   end

end
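
% ---------------------------------------------------------------------------
% Example of using the projected features for recognition (a sketch, not part
% of the original routine; testGnd, the true test labels, is a hypothetical
% variable). A simple 1-nearest-neighbour classifier in the 2DLDA space:
%
%   nTest = size(newTestData,1);
%   predicted = zeros(nTest,1);
%   for k = 1:nTest
%       dist = sum((newTrainData - repmat(newTestData(k,:),size(newTrainData,1),1)).^2, 2);
%       [minDist, nnIdx] = min(dist);
%       predicted(k) = gnd(nnIdx);
%   end
%   accuracy = mean(predicted == testGnd);
% ---------------------------------------------------------------------------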
