📄 kperceptr.m
function [Alpha,bias,sol,t,nsv] = kperceptr(data,labels,ker,arg,tmax)
% KPERCEPTR Kernel Perceptron algorithm.
%
% [Alpha,bias,sol,t,nsv] = kperceptr(data,labels,ker,arg,tmax)
%
% Input:
%  data [dim x n] input patterns; dim is dimension and
%    n is number of patterns.
%  labels [1 x n] labels of patterns; 1 denotes the 1st class,
%    2 denotes the 2nd class.
%  ker [string] kernel identifier (see 'help kernel').
%  arg [...] argument of the given kernel.
%  tmax [int] maximal number of iterations.
%
% Output:
%  Alpha [1 x n] positive linear pattern multipliers.
%  bias [real] bias of the found decision function.
%  sol [int] 1 - Perceptron converged to the solution (zero
%    training classification error), 0 - Perceptron has not
%    converged in tmax iterations.
%  t [int] number of iterations.
%  nsv [int] number of non-zero multipliers Alpha.
%
% See also PERCEPTR, SVM, SVMCLASS, PSVM.
%
% Statistical Pattern Recognition Toolbox, Vojtech Franc, Vaclav Hlavac
% (c) Czech Technical University Prague, http://cmp.felk.cvut.cz
%
% Modifications:
% 21-Nov-2001, V. Franc

if nargin < 4, error('Not enough input arguments.'); end
if nargin < 5, tmax = inf; end

% get dimension and number of training patterns
[dim, n] = size( data );

% precompute the kernel matrix
K = kmatrix( data, ker, arg );

% map labels {1,2} to {+1,-1}
Y = itosgn( labels );

% initialize multipliers Alpha and bias
Alpha = zeros( 1, n );
bias = 0;
sol = 0;
t = 0;

while t < tmax && sol == 0,
  t = t + 1;

  % signed margins y_i * ( <w, x_i> + bias ) in the kernel feature space
  proj = Y'.*(K*(Y.*Alpha)' + bias);

  % find the worst classified pattern
  [dot_prod, inx] = min( proj );

  % check whether an adaptation step is needed
  if dot_prod <= 0,
    Alpha( inx ) = Alpha( inx ) + 1;
    bias = bias + Y( inx );   % standard perceptron bias update
  else
    sol = 1;
  end
end

% count the nonzero Alphas (support vectors)
nsv = length( find( Alpha ));

return;
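For readers without the Statistical Pattern Recognition Toolbox, the same update loop can be sketched in NumPy. This is a hypothetical translation, not part of the toolbox: the function name `kperceptron`, the `kernel` callable, and `linear_kernel` are all assumed names, and the `{1,2} -> {+1,-1}` label mapping mirrors what `itosgn` does above.

```python
import numpy as np

def kperceptron(data, labels, kernel, tmax=1000):
    """Kernel perceptron sketch mirroring kperceptr.m.

    data: (dim, n) array of column patterns; labels: 1 or 2 per pattern.
    Returns (alpha, bias, sol, t) with sol = 1 on convergence.
    """
    n = data.shape[1]
    K = kernel(data, data)                   # precomputed (n, n) kernel matrix
    Y = np.where(labels == 1, 1.0, -1.0)     # map classes {1,2} to {+1,-1}
    alpha = np.zeros(n)
    bias = 0.0
    for t in range(1, tmax + 1):
        # signed margins y_i * (<w, x_i> + bias) in feature space
        proj = Y * (K @ (Y * alpha) + bias)
        inx = int(np.argmin(proj))
        if proj[inx] <= 0:                   # worst pattern misclassified
            alpha[inx] += 1.0
            bias += Y[inx]                   # standard perceptron bias update
        else:
            return alpha, bias, 1, t         # zero training error reached
    return alpha, bias, 0, tmax              # did not converge in tmax steps

def linear_kernel(A, B):
    # assumed stand-in for kmatrix(data, 'linear', []) in the toolbox
    return A.T @ B
```

On linearly separable data the loop terminates with `sol = 1`; the decision function for a new column pattern `x` is `sign(kernel(data, x).T @ (Y * alpha) + bias)`.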