
📄 knn_rule.m

📁 From a recent pattern-recognition classification toolbox; I hope you find it useful!
💻 MATLAB
% Classifies input using k-NN rule
%
% Usage
%      label = Knn_Rule(inputSample, TrainingSamples, TrainingLabels, k);
% where
%    inputSample     is a ROW vector (1xd)
%                      containing the sample to be classified
%    TrainingSamples is a matrix (nxd) containing 1 training vector
%                      with d features in each of its n rows, where
%                      n is the training set size
%    TrainingLabels  is a COLUMN vector (nx1) containing the labels of
%                      the training samples
%    k               is the number of nearest neighbors.
%
%  NOTE: the above conventions on rows vs. columns are not extremely
%  important, since the routine will automatically adjust to cases
%  where they are violated.
%  HOWEVER: if the number of vectors in the training set is the same as
%  the number of dimensions, the algorithm will not be able to tell which
%  is which!
%
% A caveat:
%   if at all possible, use the set (1,...,C) as class labels.
%   this implementation will not work that well if you use very large
%   or very small (negative) numbers as class labels (like 10^23, for example)
%
%  Used by Store_Grabbag
%
function target = Knn_Rule(Sample, trainingVectors, classLabels, Knn)

% Date     Name               Change
% 03/18/02 Vittorio Castelli  Now works for any number of features,
%                             any number of classes.
%                             Added documentation

[a,b] = size(trainingVectors);
if ( a ~= length(classLabels) )
  if ( b ~= length(classLabels) )
    fprintf('Apparent inconsistency between arguments 2 and 3\n');
    fprintf('Argument 2 should contain the training set, with vectors as rows\n');
    fprintf('Argument 3 should contain the labels and have length equal to the number of rows of argument 2\n');
    target = -1;
    return;
  else
    % Rearrange the training vectors so that each row is one vector
    trainingVectors = trainingVectors';
    [a,b] = size(trainingVectors);
  end
end

% Compute the squared differences between the entries
% of each training sample and the corresponding entries
% of the sample to be classified
[c,d] = size(Sample);
if (c > d)
  % Rearrange the sample as a row vector
  Sample = Sample';
  [c,d] = size(Sample);
end
if (c > 1)
  fprintf('This algorithm only classifies one vector at a time\n');
  target = -1;
  return;
end

aux    = ones(a,1) * Sample;
sdiff  = trainingVectors - aux;
sdiff  = sdiff .* sdiff;

% Compute the squared distances; no need to take square roots,
% since sorting by squared distance gives the same order
distsq = sum(sdiff,2);
[sorted_dist, indices] = sort(distsq);

% Shift the labels so that the smallest is 1, letting them be
% used directly as indices into the counts vector
z = min(classLabels);
if (z <= 0)
  classLabels = classLabels - z + 1;
end
Z      = max(classLabels);
counts = zeros(1,Z);

if length(classLabels) <= Knn
  k_nearest = classLabels;
else
  k_nearest = classLabels(indices(1:Knn));
end

% Vote among the nearest neighbors; loop over the actual number
% found, which may be fewer than Knn for a small training set
for i = 1:length(k_nearest)
  counts(k_nearest(i)) = counts(k_nearest(i)) + 1;
end
[foo, target] = max(counts);

% Undo the label shift, if one was applied
if (z <= 0)
  target = target - 1 + z;
end
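A minimal usage sketch of the routine above (the training data here is made up purely for illustration): two well-separated 2-D classes, with a query point close to class 1.

```matlab
% Class 1 clusters near the origin, class 2 near (5,5)
TrainingSamples = [0 0; 0 1; 1 0; 5 5; 5 6; 6 5];
TrainingLabels  = [1; 1; 1; 2; 2; 2];

% Classify a query point with k = 3
label = Knn_Rule([0.5 0.5], TrainingSamples, TrainingLabels, 3)
% All three nearest neighbors belong to class 1, so label is 1
```

Note that the routine classifies a single vector per call; to label a whole test set, call it once per row in a loop.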
