
📄 osusvmtest.m

📁 Support vector machine implementation programs
function [ConfMatrix, scores] = osuSVMTest(Samples, Labels, Ns, AlphaY, SVs, Bias, Parameters)
% USAGE: 
% [ConfMatrix, scores]= osuSVMTest(Samples, Labels, Ns, AlphaY, SVs, Bias, Parameters)
%
% DESCRIPTION: 
% Test the performance of a trained SVM classifier on a set of input patterns
% whose true class labels are given (handles both the multi-class and 2-class cases).
%
% INPUTS:
% Samples: all the input patterns (a matrix whose columns are the pattern vectors).
% Labels: the corresponding true class labels for the input patterns in Samples
%        (a row vector with labels 1, 2, 3, ..., M).
% Ns: number of SVs for each class (a row vector). This parameter is valid only
%     for the multi-class case, and is 0 for the 1-svm and 2-class case.
% AlphaY: Alpha * Y, where Alpha is the non-zero Lagrange Coefficients
%                    Y is the corresponding {1 -1} labels. 
%     in multi-class case:
%        [AlphaY_Class1, AlphaY_Class2, ..., AlphaY_ClassM]
%        +----Ns(1)----+----Ns(2)-----+----+---Ns(M)------+
% SVs : support vectors, i.e. the patterns corresponding to the non-zero
%       Alphas.
%     in multi-class case:
%        [SVs_Class1, SVs_Class2, ..., SVs_ClassM]
%        +--Ns(1)---+---Ns(2)---+----+---Ns(M)---+
% Bias : the bias in the decision function, which is AlphaY*Kernel(SVs',x)-Bias.
%     in multi-class case:
%        [Bias_Class1, Bias_Class2, ..., Bias_ClassM]
% Parameters: the parameters required by the training algorithm.
%             (a 10-element row vector)
%            +-----------+--------+-------+-------------+---+----------+-------+
%            |Kernel Type| Degree | Gamma | Coefficient | C |Cache Size|epsilon|
%            +-----------+--------+-------+-------------+---+----------+-------+
%            | SVM Type  | nu (nu-SVM) | loss tolerance |
%            +-----------+-------------+----------------+
%            where Kernel Type:
%                   0 --- Linear
%                   1 --- Polynomial: (Gamma*<X(:,i),X(:,j)>+Coefficient)^Degree
%                   2 --- RBF: (exp(-Gamma*|X(:,i)-X(:,j)|^2))
%                   3 --- Sigmoid: tanh(Gamma*<X(:,i),X(:,j)>+Coefficient)
%                  Gamma: if the input value is zero, Gamma defaults to
%                         1/(max_pattern_dimension) inside the function; a
%                         non-zero input value is used unchanged.
%                  C: cost of the constraint violation (for C-SVC & C-SVR)
%                  Cache Size: size of the buffer holding <X(:,i),X(:,j)> values (in MB)
%                  epsilon: tolerance of the termination criterion
%                  SVM Type: 
%                   0 --- c-SVM classifier
%                   1 --- nu-SVM classifier
%                   2 --- 1-SVM 
%                   3 --- C-SVM regressor
%                  nu: the nu used in the nu-SVM classifier (for 1-SVM and nu-SVM)
%                  loss tolerance: the epsilon in the epsilon-insensitive loss function
%
% OUTPUTS:
% ConfMatrix: the confusion matrix for the multi-class case, the classification
%             rate for the 2-class case, and the non-outlier rate for 1-class. 
% scores: in the multi-class case, the decision function output for each class
%         (an M-row matrix, each row being the decision function of one class);
%         in the 2-class case, the decision function output of the 2-class problem.
%
 
if (nargin ~= 7)
   fprintf('Incorrect number of input variables.\n');
   help osuSVMTest;
else
   [svDim, nSVs] = size(SVs);
   [sampleDim, nSamples] = size(Samples);
   % make the dimension of SVs the same as that of the input patterns
   if sampleDim ~=svDim
      [Samples]=DimFit(Samples,svDim);
   end
   
   if Ns == 0 % for the 1-SVM and 2-class cases (both C-SVM and nu-SVM)
      if (Parameters(8) == 0) || (Parameters(8) == 1) % for the 2-class case
         Labels = 2*(1.5 - Labels); % map labels {1,2} to {+1,-1}
      elseif (Parameters(8) == 2)  %for 1-svm
         Labels = ones(size(Labels));
      end
      [ConfMatrix, scores]= SVMTest(Samples, Labels, AlphaY, SVs, Bias, Parameters);
   else
      M=length(Ns);
      pos=1;
      for i=1:M % for multi-class case
         % test one class against the rest: class i is +1, all other classes are -1
         % (assigning -1 into the logical result of (Labels == i), as the original
         %  code did, errors in MATLAB, so build the numeric labels directly)
         CurrentLabels = 2*double(Labels == i) - 1;
         [ClassRate, curDecisionValue]= SVMTest(Samples, CurrentLabels, AlphaY(pos:pos+Ns(i)-1), SVs(:,pos:pos+Ns(i)-1), Bias(i), Parameters);
         if i == 1
            scores = curDecisionValue;
         else
            scores = [scores; curDecisionValue];
         end
         pos = pos + Ns(i);
      end
      % the final class is the class whose decision function output is the maximum value among all others.
      [v, EstLabels] = max(scores, [], 1);
      % construct the MxM Confusion Matrix
      ConfMatrix=zeros(M,M);
      for i=1:M
         TargetInds=find(Labels==i);
         IndsN=length(TargetInds);
         for j=1:M
            ConfMatrix(i,j)=length(find(EstLabels(TargetInds)==j))/IndsN;
         end
      end
   end
end
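
% For context, a minimal call might look like the sketch below. It is
% illustrative only: AlphaY, SVs, Bias, Ns, and Parameters are assumed to
% come from the companion OSU SVM training routine, whose exact name and
% signature vary between toolbox versions, so treat everything except the
% osuSVMTest call itself as an assumption.
%
%    % Samples: dim-by-N matrix of column-vector test patterns
%    % Labels:  1-by-N row vector with entries in {1, ..., M}
%    [ConfMatrix, scores] = osuSVMTest(Samples, Labels, Ns, AlphaY, SVs, Bias, Parameters);
%
%    % In the multi-class case, ConfMatrix(i,j) is the fraction of class-i
%    % test patterns assigned to class j; the predicted label of pattern k
%    % is the row index of the largest score in column k:
%    [maxScore, predicted] = max(scores, [], 1);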



 
