
lda.m

This Bayesian classifier classifies samples drawn from two-dimensional Gaussian distributions.
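The script below relies on the STPRtool (Statistical Pattern Recognition Toolbox) for MATLAB: gsamp, ppatterns, pgauss, emgmm, bayescls, cerror, pboundary, fld, linclass, pline and perceptron are all toolbox functions, so the toolbox has to be on the MATLAB path before the script is run. A minimal setup sketch (the install location is an assumption, adjust it to your own copy):

% Hypothetical setup -- point this at wherever STPRtool is installed.
addpath(genpath('stprtool'));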
% lda.m -- Bayes, Fisher and perceptron classifiers on 2D Gaussian data
N=500;                            % number of samples per class
% Generate data from two Gaussian distributions
% Data for the first class
c1_model.Mean = [1;1]; % Mean vector
c1_model.Cov = [1 0.6; 0.6 1]; % Covariance matrix
data_c1=gsamp(c1_model,N);     % draw N samples (stored column-wise, 2 x N)
% Data for the second class
c2_model.Mean = [4;4]; % Mean vector
c2_model.Cov = [1 0.8; 0.8 1]; % Covariance matrix
data_c2=gsamp(c2_model,N);     % draw N samples (stored column-wise, 2 x N)

figure(1); hold on;

% Visualize the sampled data together with the generating Gaussians
ppatterns(data_c1); % plot the class-1 samples
pgauss(c1_model);   % plot the class-1 Gaussian contour

ppatterns(data_c2); % plot the class-2 samples
pgauss(c2_model);   % plot the class-2 Gaussian contour


% Save the data in the structure format required by the toolbox:
% first halves of each class form the training set, second halves the test set
trn_X=[data_c1(:,1:N/2) data_c2(:,1:N/2)];
tst_X=[data_c1(:,(N/2+1):N) data_c2(:,(N/2+1):N)];
labels=2*ones(1,N);    % both sets hold N/2 class-1 samples followed by N/2 class-2 samples
labels(1:N/2)=1;
trn=struct('X',trn_X,'name','Gaussian Data','y',labels,'dim',2,'num_data',N);
tst=struct('X',tst_X,'name','Gaussian Data','y',labels,'dim',2,'num_data',N);

% Classification by the Bayes classifier
inx1 = find(trn.y==1);
inx2 = find(trn.y==2);
% Estimate the class-conditional densities by EM (2-component Gaussian mixtures)
bayes_model.Pclass{1} = emgmm(trn.X(:,inx1),struct('ncomp',2));
bayes_model.Pclass{2} = emgmm(trn.X(:,inx2),struct('ncomp',2));
% Estimation of priors
n1 = length(inx1); n2 = length(inx2);
bayes_model.Prior = [n1 n2]/(n1+n2);
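% (With N/2 training samples per class, the estimated priors come out as [0.5 0.5].)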
% Evaluation on testing data
ypred1 = bayescls(tst.X,bayes_model);
cerror1=cerror(ypred1,tst.y)
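% Conceptually, bayescls assigns each sample to the class with the larger
% posterior. A hand-written sketch of the same rule (assumes the toolbox
% density function pdfgmm is available; otherwise read these lines as pseudocode):
% p1 = pdfgmm(tst.X, bayes_model.Pclass{1}) * bayes_model.Prior(1);
% p2 = pdfgmm(tst.X, bayes_model.Pclass{2}) * bayes_model.Prior(2);
% ypred_manual = 1 + double(p2 > p1);   % label 1 where p1 >= p2, label 2 otherwise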
% Visualization
figure(2); hold on;
ppatterns(trn);
bayes_model.fun = 'bayescls';
pboundary(bayes_model);
% Add a reject option: penalize the "don't know" decision
reject_model = bayes_model;
reject_model.eps = 0.1;
% Visualization of the reject-option decision boundary
pboundary(reject_model,struct('line_style','k--'));

% Classification by the Fisher Linear Discriminant from the toolbox
model_fld=fld(trn);                 % train the FLD on the training set
ypred2=linclass(tst.X,model_fld);   % classify the test set with the linear rule
cerror2=cerror(ypred2,tst.y)
figure(3)
ppatterns(tst);
pline(model_fld)

% Classification by our own Fisher Linear Discriminant

% Sample means of the two classes (class 1 in columns 1..N/2, class 2 afterwards)
u1=[0 0]';
u2=[0 0]';
for k=1:N/2
    u1=u1+trn.X(:,k);    
    u2=u2+trn.X(:,k+N/2);
end
u1=u1/(N/2);
u2=u2/(N/2);

% Within-class scatter matrices (unnormalized covariance estimates)
S1=[0 0;0 0];
S2=[0 0;0 0];
for k=1:N/2
    S1=S1+(trn.X(:,k)-u1)*(trn.X(:,k)-u1)';
    S2=S2+(trn.X(:,k+N/2)-u2)*(trn.X(:,k+N/2)-u2)';
end

Sw=S1+S2;          % total within-class scatter
w=Sw\(u1-u2);      % Fisher direction w = Sw^-1*(u1-u2); backslash avoids an explicit inv()
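% Added cross-check (not part of the original script): the same quantities can
% be computed without loops; the *_v variables below exist only for comparison.
u1_v = mean(trn.X(:,1:N/2),2);      % class-1 sample mean
u2_v = mean(trn.X(:,N/2+1:N),2);    % class-2 sample mean
Sw_v = (N/2-1)*cov(trn.X(:,1:N/2)') + (N/2-1)*cov(trn.X(:,N/2+1:N)');
w_v  = Sw_v\(u1_v-u2_v);            % should match w up to numerical error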
% Project the test samples onto the Fisher direction
y=zeros(1,N);
for k=1:N
    y(k)=w'*tst.X(:,k);
end

% Threshold: midpoint of the projected class means (implicitly assumes equal priors)
T=(w'*u1+w'*u2)/2;

% w points from class 2 towards class 1, so class-1 samples project above T
ypred=y<T;        % 0 for class 1, 1 for class 2
ypred=ypred+1;    % shift to labels 1/2
cerror3=cerror(ypred,tst.y)
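% (cerror3 should be close to cerror2 above: both classifiers use the same Fisher
%  direction and differ only in how the bias/threshold is chosen.)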
model_fls_own=struct('W', w, 'b', -T);   % wrap w and bias -T as a linear model so pline can draw the boundary
figure(4);
ppatterns(tst);
pline(model_fls_own);

% Classification by the perceptron algorithm
figure(5);
ppatterns(trn);
model_ptn=perceptron(trn);   % train a linear perceptron on the training set
pline(model_ptn);            % draw the learned decision boundary
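% Added check (not in the original script): the perceptron model is a linear
% classifier, so -- assuming it is compatible with linclass like the FLD model
% above -- its test error can be computed in the same way.
ypred4=linclass(tst.X,model_ptn);
cerror4=cerror(ypred4,tst.y)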





