exp_k_nearest.m
% KNNCLASS k-Nearest Neighbours classifier.
%
% Synopsis:
% y = knnclass(X,model)
%
% Description:
% The input feature vectors X are classified using the K-NN
% rule defined by the input model.
%
% Input:
% X [dim x num_data] Data to be classified.
% model [struct] Model of the K-NN classifier:
% .X [dim x num_prototypes] Prototypes.
% .y [1 x num_prototypes] Labels of prototypes.
% .K [1x1] Number of nearest neighbours used.
%
% Output:
% y [1 x num_data] Classified labels of testing data.
%
% Example:
trn = load('riply_trn');                 % training part of Ripley's data set
subplot(2,2,1);
ppatterns( trn );                        % raw training distribution
title('Training set distribution');
tst = load('riply_tst');                 % test part
subplot(2,2,2);
ppatterns( tst );                        % raw test distribution
title('Test set distribution');
gauss_model = mlcgmm(trn);                % ML estimate of class-conditional Gaussians
t = cputime;                              % start timing
quad_model = bayesdf(gauss_model);        % quadratic Bayes (discriminant) classifier
ypred = knnclass(tst.X, knnrule(trn,5));  % classify the test data with the 5-NN rule
err = cerror( ypred, tst.y )              % classification error on the test set
TimeCost = cputime - t;                   % CPU time spent
text1 = num2str(err);                     % error as text for display
subplot(2,2,3);
ppatterns(trn); pboundary(quad_model);    % training set with the quadratic Bayes boundary
title('Training set with Bayes boundary');
subplot(2,2,4);
ppatterns( tst ); % pgmm( gauss_model );  % alternative: plot the estimated densities
pboundary(quad_model);                    % same boundary over the test set
title('Test set with Bayes boundary');
% Show the classification error and the CPU time as static text on the figure.
hpop1 = uicontrol('Style', 'text',...
'String', 'Classification error', 'Position', [100 -20 100 50], 'FontSize',16);
hpop2 = uicontrol('Style', 'text',...
'String', text1, 'Position', [200 -20 100 50], 'FontSize',16);
hpop3 = uicontrol('Style', 'text',...
'String', 'CPU time (s)', 'Position', [300 -20 100 50], 'FontSize',16);
hpop4 = uicontrol('Style', 'text',...
'String', num2str(TimeCost), 'Position', [400 -20 100 50], 'FontSize',16);
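% The plots above show the quadratic Bayes boundary, while the reported error
% belongs to the 5-NN rule. A short sketch of how the 5-NN boundary itself
% could be visualised, assuming pboundary accepts the model returned by
% knnrule (which stores the classifier in its .fun field):
knn_model = knnrule(trn, 5);    % 5-NN model built from the training prototypes
figure;
ppatterns(trn);                 % training points
pboundary(knn_model);           % decision boundary of the 5-NN rule
title('Training set with 5-NN boundary');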