
📄 perceptron.m

📁 Two-class classifiers based on the LMS, MSE, and perceptron criterion functions
💻 MATLAB
function a = perceptron(feature1, feature2,theta,eta)

%-----------------------------------------------------
% a = perceptron(feature1, feature2,theta,eta)
% Calculates the weight vector of the linear discriminant function for
% 2-category, N-dimensional data. The algorithm minimizes the Perceptron
% criterion function (batch perceptron).
% Input variables:
% - feature1 - augmented feature vector for the first class, of size
%              (Nf,Ns), where Nf is the number of features + 1 and Ns is
%              the number of samples
% - feature2 - augmented feature vector for the second class, of the same size
% - theta    - stopping threshold for the weight update
% - eta      - learning rate
% Output:
% - a - weight vector (a_0, a_1, ..., a_d)' for the linear discriminant of the form:
%       g(x) = a_0 + a_1*x_1 + a_2*x_2 + ... + a_d*x_d

% --------------------------------------------------
% Evgeny Krestyannikov
% krestyan@cs.tut.fi
% Institute of Signal Processing
% Room TE 313




Ns  = size(feature1,2);   % number of samples per class
dim = size(feature1,1);   % augmented dimension: number of features + 1


% Build the sign-normalized, augmented sample matrix: the second class is
% negated so that a correctly classified sample y satisfies a'*y > 0
feature2 = -feature2;
Y   = [feature1 feature2];
szY = size(Y,2);    % total number of samples



% Initialization
a      = [1 rand(1,dim-1)]';   % initial weight vector (a_0 = 1, remaining entries random)
k      = 0;                    % iteration counter
update = 1e3;                  % large initial value so the loop condition holds on entry


% Perceptron algorithm
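% The loop stops when the l1-norm of the weight update, i.e. eta times the
% magnitude of the summed misclassified samples, drops below theta. Note that
% for data that are not linearly separable the batch perceptron need not
% converge, so the loop may not terminate.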
while (sum(abs(update))>=theta)
    k=k+1;
    for i = 1:szY
        % y(i) = 0 if sample i is correctly classified (a'*Y(:,i) > 0),
        % 1 if it is misclassified, and 0.5 if it lies exactly on the boundary
        y(i) = 0.5*(1 - sign(a'*Y(:,i)));
    end

    misclassified(k)=size(find(y>0),2);  % number of misclassified points
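    % Batch perceptron update: y flags the misclassified samples, so Y*y'
    % sums the misclassified (sign-normalized) samples and the weight vector
    % is moved in that direction, scaled by the learning rate eta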
    
    update	= eta*(Y*y');  % update rule  
    a	    = a + update;


    disp(['Number of misclassified samples: ' num2str(misclassified(k))]);
    
end


disp(['Did ' num2str(k) ' iterations']);
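A minimal usage sketch (not part of the original file): it draws two hypothetical 2-D Gaussian clusters, augments each sample with a leading 1 as the header comment requires, and runs the batch perceptron. The variable names, cluster centers, and parameter values (theta = 1e-3, eta = 0.1) are illustrative assumptions.

% Illustrative driver script; assumes perceptron.m is on the MATLAB path
Ns = 100;                               % samples per class (arbitrary choice)
x1 = randn(2,Ns) + [3; 3]*ones(1,Ns);   % class 1: Gaussian cloud around ( 3, 3)
x2 = randn(2,Ns) + [-3;-3]*ones(1,Ns);  % class 2: Gaussian cloud around (-3,-3)

feature1 = [ones(1,Ns); x1];            % augment each sample with a leading 1
feature2 = [ones(1,Ns); x2];

a = perceptron(feature1, feature2, 1e-3, 0.1);

% Classify a hypothetical test point x = (1, 2)': assign class 1 if g(x) > 0
g = a' * [1; 1; 2];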
    
