% svmtest.m
function [ClassRate, DecisionValue]= SVMTest(Samples, Labels, AlphaY, SVs, Bias, Parameters)
% USAGE:
% [ClassRate, DecisionValue]= SVMTest(Samples, Labels, AlphaY, SVs, Bias, Parameters)
%
% DESCRIPTION:
% Test the performance of a trained 2-class SVM classifier on a group of input
% patterns whose true class labels are given.
% This function only performs input parameter checking; the algorithm itself
% is implemented in a MEX file (mexSVMClass).
%
% INPUTS:
% Samples: all the input patterns. (a row of column vectors)
% Labels: the corresponding true class labels for the input patterns in Samples,
%         where Labels(i) is in {1, -1}. (a row vector)
% AlphaY: Alpha .* Y, where Alpha holds the non-zero Lagrange multipliers and
%         Y the corresponding labels. (a row vector)
% SVs   : the support vectors, i.e. the patterns corresponding to the non-zero
%         Alphas. (a row of column vectors)
% Bias : the bias in the decision function, which is AlphaY*Kernel(SVs',x)-Bias. (scalar)
% Parameters: the parameters required by the training algorithm.
%             (a 10-element row vector)
%   +-----------+--------+-------+-------------+---+------------+---------+
%   |Kernel Type| Degree | Gamma | Coefficient | C | Cache Size | epsilon |
%   +-----------+--------+-------+-------------+---+------------+---------+
%   +----------+-------------+----------------+
%   | SVM Type | nu (nu-SVM) | loss tolerance |
%   +----------+-------------+----------------+
% where Kernel Type:
% 0 --- Linear
% 1 --- Polynomial: (Gamma*<X(:,i),X(:,j)>+Coefficient)^Degree
% 2 --- RBF: (exp(-Gamma*|X(:,i)-X(:,j)|^2))
% 3 --- Sigmoid: tanh(Gamma*<X(:,i),X(:,j)>+Coefficient)
% Gamma: if the input value is zero, Gamma defaults to
%        1/(max_pattern_dimension) inside the function; a non-zero
%        input value is used unchanged.
% C: cost of constraint violation (for C-SVC & C-SVR)
% Cache Size: size of the buffer holding the kernel values <X(:,i),X(:,j)> (in MB)
% epsilon: tolerance of termination criterion
% SVM Type:
% 0 --- c-SVM classifier
% 1 --- nu-SVM classifier
% 2 --- 1-SVM
% 3 --- C-SVM regression (epsilon-SVR)
% nu: the nu used in the nu-SVM classifier (for 1-SVM and nu-SVM)
% loss tolerance: the epsilon in epsilon-insensitive loss function
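%
% For reference, with the RBF kernel (type 2) the decision value for one test
% pattern x has the form given above, AlphaY*Kernel(SVs',x)-Bias. An
% illustrative MATLAB sketch (not the actual MEX implementation) is:
%   f = AlphaY * exp(-Gamma * sum((SVs - repmat(x,1,size(SVs,2))).^2, 1))' - Bias;
% and the predicted label is sign(f).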
%
% OUTPUTS:
% ClassRate: the ratio of properly classified patterns to the total number of
%            input patterns.
% DecisionValue: the output of the decision function.
%
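% EXAMPLE (illustrative; assumes a matching SVMTrain routine from the same
% toolbox with the hypothetical signature shown):
%   % train a C-SVM with an RBF kernel (Gamma = 0.5, C = 1)
%   Parameters = [2 3 0.5 1 1 40 0.001 0 0.5 0.1];
%   [AlphaY, SVs, Bias] = SVMTrain(TrainSamples, TrainLabels, Parameters);
%   [ClassRate, DecisionValue] = SVMTest(TestSamples, TestLabels, ...
%                                        AlphaY, SVs, Bias, Parameters);
%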
if (nargin ~= 6)
disp(' Incorrect number of input variables.');
help SVMTest;
return
end
[spM spN]=size(Samples);
[lbM lbN]=size(Labels);
if lbM ~= 1
disp(' Error: ''Labels'' should be a row vector.');
return
end
if spN ~= lbN
disp(' Error: different number of training patterns and their labels.');
return
end
[prM prN]= size(Parameters);
if prM ~= 1
disp(' Error: ''Parameters'' should be a row vector.');
return
end
if prN ~= 10
disp(' Error: ''Parameters'' should have exactly 10 elements.');
return
end
if (Parameters(1) > 3) || (Parameters(1) < 0)
disp(' Error: this program only supports 4 types of kernel functions.');
return
end
if (Parameters(8) > 3) || (Parameters(8) < 0)
disp(' Error: this program only supports 4 types of SVMs.');
return
end
[svM svN]=size(SVs);
[ayM ayN]=size(AlphaY);
if ayM ~= 1
disp(' Error: ''AlphaY'' should be a row vector.');
return
end
if svM ~= spM
disp(' Warning: ''SVs'' should have the same feature dimension as ''Samples''.');
[Samples]=DimFit(Samples,svM);
end
[ClassRate, DecisionValue]= mexSVMClass(Samples, Labels, AlphaY, SVs, Bias, Parameters);