
📄 k_l_nn_rule_vc.m

📁 The latest pattern recognition classification toolbox; hope it is useful!
% Classifies input using the (k,l)-NN classifier.
% This means that it will classify the input if at least l of the k nearest
% neighbors agree on the label, and refuses to classify otherwise.
%
% NOTE: To make this comparable to other classifiers, we select the majority
%       class when we refuse to classify.
%
% Usage
%      [trainError, testError, estTrainLabels, estTestLabels] = ...
%           K_L_nn_Rule_VC(trainFeatures, trainLabels, algParam, testFeatures, testLabels)
% where
%    trainFeatures  is a matrix containing one training vector with d
%                       features in each of its n columns, where n is the
%                       training set size
%    trainLabels    is a COLUMN vector (nx1) containing the labels of
%                       the training samples
%    algParam       is a two-element vector:
%                       algParam(1) = k, the number of nearest neighbors
%                       algParam(2) = l, the minimum number of neighbors
%                          that must agree (set l = 1 for the regular
%                          k-NN classifier)
%    testFeatures   - test set, one column per vector
%    testLabels     - labels for the test set
%
% Outputs
%    trainError     - the error rate on the training set (one entry per
%                        class + total error)
%    testError      - the error rate on the test set (one entry per
%                        class + total error)
%    estTrainLabels - the labels produced by the algorithm for the
%                        training samples
%    estTestLabels  - the labels produced by the algorithm for the
%                        test samples

function [trainError, testError, estTrainLabels, estTestLabels] = ...
    K_L_nn_Rule_VC(trainFeatures, trainLabels, algParam, testFeatures, testLabels)

% Number of classes in the labels
[Nclasses, classes] = find_classes([trainLabels(:); testLabels(:)]);
[Dim, Nsam]         = size(trainFeatures);

% extract the algorithm parameters
Knn = algParam(1);
l   = algParam(2);

% find the majority class; it is used as the default label whenever
% fewer than l of the k nearest neighbors agree
defaultClass = 0;
nInBestCl    = 0;
for cl = 1:Nclasses
  nInCl = sum(trainLabels == classes(cl));
  if nInCl > nInBestCl
    nInBestCl    = nInCl;
    defaultClass = classes(cl);
  end
end

%================================================================
hm = findobj('Tag', 'Messages');
fprintf('(%d-%d)-NN Rule: Classifying Training Set\n', Knn, l);
if ~isempty(hm)
  s = sprintf('(%d-%d)-NN Rule: Classifying Training Set\n', Knn, l);
  set(hm, 'String', s);
  refresh;
end

[Dim, Nsam]    = size(trainFeatures);
OneD           = ones(1, Nsam);
estTrainLabels = zeros(size(trainLabels));
for sam = 1:Nsam
  % compute squared distances to all training vectors and sort them
  diff = trainFeatures(:,sam) * OneD;
  diff = diff - trainFeatures;
  diff = sum(diff .* diff);
  [diff, indices] = sort(diff);

  % take care of the small-sample problem
  if length(trainLabels) <= Knn
    k_nearest = trainLabels;
  else
    k_nearest = trainLabels(indices(1:Knn));
  end

  % find the class with the largest number of neighbors
  bestClass = 0;
  maxNN     = 0;
  for cl = 1:Nclasses
    nrNN = sum(k_nearest == classes(cl));
    if nrNN > maxNN
      maxNN     = nrNN;
      bestClass = cl;
    end
  end
  if maxNN >= l
    estTrainLabels(sam) = classes(bestClass);
  else
    estTrainLabels(sam) = defaultClass;
  end
end

% and now for the test set
%================================================================
hm = findobj('Tag', 'Messages');
fprintf('(%d-%d)-NN Rule: Classifying Test Set\n', Knn, l);
if ~isempty(hm)
  s = sprintf('(%d-%d)-NN Rule: Classifying Test Set\n', Knn, l);
  set(hm, 'String', s);
  refresh;
end

[Dim, Nsam]   = size(testFeatures);
estTestLabels = zeros(size(testLabels));
for sam = 1:Nsam
  % squared distances from this test vector to all TRAINING vectors
  % (OneD keeps the training-set length, matching trainFeatures)
  diff = testFeatures(:,sam) * OneD;
  diff = diff - trainFeatures;
  diff = sum(diff .* diff);
  [diff, indices] = sort(diff);

  % take care of the small-sample problem
  if length(trainLabels) <= Knn
    k_nearest = trainLabels;
  else
    k_nearest = trainLabels(indices(1:Knn));
  end

  % find the class with the largest number of neighbors
  bestClass = 0;
  maxNN     = 0;
  for cl = 1:Nclasses
    nrNN = sum(k_nearest == classes(cl));
    if nrNN > maxNN
      maxNN     = nrNN;
      bestClass = cl;
    end
  end
  if maxNN >= l
    estTestLabels(sam) = classes(bestClass);
  else
    estTestLabels(sam) = defaultClass;
  end
end

trainError = computeError(classes, trainLabels, estTrainLabels);
testError  = computeError(classes, testLabels, estTestLabels);
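For readers outside MATLAB, the same (k,l)-NN decision rule can be sketched in Python. This is a minimal illustration assuming NumPy; the function name `k_l_nn_classify` is hypothetical, samples are stored as rows rather than columns, and the per-class error bookkeeping of the MATLAB version is omitted:

```python
import numpy as np

def k_l_nn_classify(train_X, train_y, test_X, k, l):
    """(k,l)-NN rule: assign the majority label among the k nearest
    training points only if at least l of them agree; otherwise fall
    back to the overall majority class, mirroring the MATLAB code.
    train_X: (n, d) array, train_y: (n,) labels, test_X: (m, d)."""
    classes, counts = np.unique(train_y, return_counts=True)
    default_class = classes[np.argmax(counts)]  # majority-class fallback
    preds = []
    for x in test_X:
        d2 = np.sum((train_X - x) ** 2, axis=1)   # squared distances
        nearest = train_y[np.argsort(d2)[:k]]     # labels of k nearest
        votes = {c: int(np.sum(nearest == c)) for c in classes}
        best = max(votes, key=votes.get)          # class with most neighbors
        preds.append(best if votes[best] >= l else default_class)
    return np.array(preds)
```

With l = 1 this reduces to ordinary k-NN; raising l toward k makes the classifier demand stronger agreement before it commits, falling back to the majority class otherwise.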
