
📄 knn.m

📁 Target identification and classification of acoustic signals detected by a sensor network, using feature extraction
💻 MATLAB (.m)
function [class]=knn(Pr,Tr,Pt,kN)
% Usage: [class]=knn(Pr,Tr,Pt,kN)
% kNN - k-Nearest Neighbor Classifier
% Copyright 1993-1996 by Yu Hen Hu
% Last revision 2/12/03 by Marco F. Duarte
% Pr:    training feature matrix, K x N (the prototypes that define the classifier)
% Tr:    training target matrix,  K x S (labels of the prototypes)
% Pt:    testing/validating feature matrix, Q x N
% kN:    number of nearest neighbors used
% class: classification result, Q x S (one indicator row per test sample;
%        Q x 1 when S == 1)
% An L2-norm (Euclidean) distance is assumed.

% Rescale the second feature so that attenuation and distance have
% approximately the same weight.
Pr(:,2) = Pr(:,2)/500;
Pt(:,2) = Pt(:,2)/500;

[K,S]=size(Tr);
[Q,N]=size(Pt);
rbias=sum(Pr'.*Pr')*.5;   % 1/2*||prototype||^2, 1 by K
if S==1,
  oneS=eye(S+1);
  Tr=[Tr ones(K,1)-Tr];   % if S==1, expand Tr to the 2-class indicator format
else
  oneS=eye(S);
end

% Classify each test sample by its L2 distance to the prototypes.
class=[];
for i=1:Q,
   d = -Pr*Pt(i,:)' + rbias';   % d is K by 1; same ordering as the L2 distance
   % choose the kN smallest entries (the k nearest neighbors)
   [y,idx]=sort(d);             % first kN entries of idx index the kN nearest neighbors
   if kN > 1,
      [yy,kidx]=max(sum(Tr(idx(1:kN),:)));   % majority vote among the kN neighbors
      class=[class; oneS(kidx,:)];
   else
      class=[class; Tr(idx(1:kN),:)];
   end
end % i-loop
if S == 1,
   class = class(:,1);
end
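Below is a minimal usage sketch with synthetic data. The dimensions, random features, one-hot label construction, and the choice kN = 5 are illustrative assumptions, not part of the original file; note also that knn.m divides the second feature column by 500, an application-specific normalization for the acoustic data it was written for.

% Minimal usage sketch (synthetic data; all sizes and labels are illustrative).
K = 60;  N = 4;  S = 3;              % #prototypes, #features, #classes (assumed)
labels = randi(S, K, 1);             % random class index for each prototype
Pr = randn(K, N);                    % training feature matrix, K x N
I  = eye(S);
Tr = I(labels, :);                   % K x S one-hot label matrix

Q  = 10;
Pt = randn(Q, N);                    % test feature matrix, Q x N

class = knn(Pr, Tr, Pt, 5);          % 5-nearest-neighbor vote; class is Q x S
[~, predicted] = max(class, [], 2);  % indicator rows -> predicted class indices

On the distance computation inside knn.m: rbias holds 0.5*||x_k||^2 for each prototype x_k, so d_k = 0.5*||x_k||^2 - x_k·p = 0.5*||x_k - p||^2 - 0.5*||p||^2. The 0.5*||p||^2 term is the same for every prototype, so sorting d yields the same neighbor ordering as the true Euclidean distance without computing the test point's norm.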
