
svmdemo.m

From a collection of MATLAB 7.0 m-files on learning, support vector machines, and some utility routines:
% svmdemo.m -- demonstration of the SVM algorithm for classification
% using libsvm (originally via the svmtrain.exe C program)
% calls datasepf.m and datagen.m
% modified 11/23/2005: uses the libsvm 2.8 MATLAB interface svmtrain.dll,
% so mat2libsvm.m and readmodel.m are no longer needed

clear all; close all
nfeature=2; nclass=2;  % 2D feature space, 2 classes
eymat=eye(nclass);

n=input('Total number of samples to be generated (default = 100) = ');
if isempty(n), n=100; end
n1=ceil(n/2); n0=n-n1;
disp('Enter 0 (default) to generate linearly separable data, or ...');
chos=input('enter 1 to generate non-separable data: ');
if isempty(chos), chos=0; end

if chos==0,
   % generate linearly separable 2D data
   disp('generating linearly separable data');
   pn=1; % generate +1 -1 label
   % data is n x 3 with last col being target +1, -1
   [data,slope]=datasepf(n,n1,pn);  
   feature=data(:,1:2); label=data(:,3);
   C=1e9; % force a linearly separable solution
elseif chos==1,
   Nvec=[n0 n1];
   mean_var=[-0.5 0 0.2; 0.5 0 0.2]';
   data=datagen(Nvec,mean_var,1); % data is n x 4, training data
   feature=data(:,1:2); label=data(:,3)-data(:,4); % label contains +1s and -1s
   C=1;
end
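% For reference, a minimal sketch of what a datasepf-style generator might
% look like (hypothetical -- the real datasepf.m ships separately and may
% differ, e.g. it presumably uses n1 to balance the classes): draw n points
% uniformly in [-1,1]^2 and label them by a random line through the origin,
% returning data as an n x 3 matrix [x y label].
%   function [data,slope]=datasepf_sketch(n,n1,pn)
%   slope=tan(pi*(rand-0.5));            % random line y = slope*x
%   x=2*rand(n,2)-1;                     % n samples in the unit square
%   t=sign(x(:,2)-slope*x(:,1));         % +1 above the line, -1 below
%   if ~pn, t=(t+1)/2; end               % 0/1 labels when pn==0
%   data=[x t];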

figure(1),clf
c1idx=find(label==1);  c2idx=find(label==-1);
plot(feature(c1idx,1),feature(c1idx,2),'g.',...
    feature(c2idx,1),feature(c2idx,2),'b.')
legend('Class 1 data','Class 2 data')
axis equal
title('original data')

disp('Press any key to continue ... ');
pause

if chos==0,
    ktype=0;  % linear kernel for the linearly separable case
else
    disp('Choose kernel type:')
    disp('0. Linear kernel;')
    disp('1. Polynomial kernel;')
    disp('2. RBF kernel (default);')
    disp('3. Sigmoid kernel;')
    ktype=input('Enter kernel type: ');
    if isempty(ktype), ktype=2; end
end
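% For intuition, the RBF kernel value between two feature vectors u and v
% is exp(-gamma*norm(u-v)^2) (see the option list below). A quick check at
% the MATLAB prompt (commented out; variable names are illustrative only):
%   u=feature(1,:); v=feature(2,:); g=1;
%   k_rbf=exp(-g*norm(u-v)^2)   % in (0,1]; equals 1 iff u==v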

para=[' -t ' int2str(ktype) ' -c ' int2str(C)];
% Alternative per-kernel option strings (kept for reference):
% switch ktype
% case 0, % kernel='linear';
%    para=['-t 0 '];
% case 1, % kernel='poly';
%    degree=input('Enter order of polynomial (default = 2) = ');
%    if isempty(degree), degree=2; end
%    % set parameters, K = (u'v + 1)^degree
%    para=['-t 1 -d ' int2str(degree) ' -g 1 -r 1 '];
% case 2, % kernel='rbf';
%    para=['-t 2 -g 1 '];
% case 3, % kernel='sigmoid';
%    para=['-t 3 -g 1 -r 0 '];
% end

% call svmtrain.dll 
% syntax:
% model = svmtrain(training_label_vector, training_instance_matrix ...
%           [, 'libsvm_options']);
%         -training_label_vector:
%             An m by 1 vector of training labels.
%         -training_instance_matrix:
%             An m by n matrix of m training instances with n features.
%             It can be dense or sparse.
%         -libsvm_options:
%             A string of training options in the same format as that of LIBSVM.
% options:
% -s svm_type : set type of SVM (default 0)
% 	0 -- C-SVC
% 	1 -- nu-SVC
% 	2 -- one-class SVM
% 	3 -- epsilon-SVR
% 	4 -- nu-SVR
% -t kernel_type : set type of kernel function (default 2)
% 	0 -- linear: u'*v
% 	1 -- polynomial: (gamma*u'*v + coef0)^degree
% 	2 -- radial basis function: exp(-gamma*|u-v|^2)
% 	3 -- sigmoid: tanh(gamma*u'*v + coef0)
% -d degree : set degree in kernel function (default 3)
% -g gamma : set gamma in kernel function (default 1/k)
% -r coef0 : set coef0 in kernel function (default 0)
% -c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
% -n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
% -p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)
% -m cachesize : set cache memory size in MB (default 40)
% -e epsilon : set tolerance of termination criterion (default 0.001)
% -h shrinking: whether to use the shrinking heuristics, 0 or 1 (default 1)
% -b probability_estimates: whether to train an SVC or SVR model for probability estimates, 0 or 1 (default 0)
% -wi weight: set the parameter C of class i to weight*C in C-SVC (default 1)
% -v n: n-fold cross validation mode
% 
% The k in the -g option means the number of attributes in the input data.
% 
% option -v randomly splits the data into n parts and calculates cross
% validation accuracy/mean squared error on them.
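%
% Example calls (commented out). Note this svmtrain is the libsvm MEX
% interface, not the Statistics Toolbox function of the same name:
%   model = svmtrain(label, feature, '-t 2 -g 1 -c 1');   % RBF C-SVC
%   acc   = svmtrain(label, feature, '-t 0 -c 1 -v 5');   % 5-fold CV accuracy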

model=svmtrain(label, feature, para);
% the model structure consists of the following fields (values shown are
% from one example run):
%     Parameters: [5x1 double]: some default parameters 
%       nr_class: 2
%        totalSV: 3
%            rho: 37.4722
%          Label: [2x1 double]
%          ProbA: []
%          ProbB: []
%            nSV: [2x1 double]: number of support vectors in each class
%        sv_coef: [3x1 double]: coefficients alpha_i*y_i
%            SVs: [3x2 double]: support vectors in sparse matrix format

alpha=model.sv_coef; % signed coefficients alpha_i*y_i, totalSV x 1
svs=full(model.SVs); % rows of support vectors, converting sparse to full matrix

if ktype==0,
   % predict the labels of the support vectors themselves; the first argument
   % to svmpredict is a dummy label vector (only the predictions are used)
   svslabel=svmpredict(rand(model.totalSV,1),svs,model);
   % To find wo and bo, we use the equations wo'*x + bo = +1 or -1 for x
   % being a support vector on the margin. Hence, given the support vectors
   % and their labels, we may solve for wb = [wo; bo] in the least-squares sense.
   wb=pinv([svs ones(model.totalSV,1)])*svslabel;
   gap=1/norm(wb(1:2));  % half of the margin width
   disp('the weights are:'); % equivalently, w = sum(alpha(i)*d(i)*x(i))
   disp(['wo = ' num2str(wb(1:2)') ', bo = ' num2str(wb(3))]);
   disp(['the margin (2/||w||) = ' num2str(2*gap)]);
end
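% Equivalently, for a linear kernel libsvm exposes the primal solution
% directly: w = SVs'*sv_coef and b = -rho (up to the sign convention set by
% the model.Label ordering). A sketch (commented out) to cross-check wb above:
%   w_direct = full(model.SVs)' * model.sv_coef;   % 2 x 1 weight vector
%   b_direct = -model.rho;                         % scalar bias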

figure(1),clf

if chos==0,
    plot(feature(c1idx,1),feature(c1idx,2),'g.',...
        feature(c2idx,1),feature(c2idx,2),'b.',...
        svs(:,1),svs(:,2),'rs')
    legend('Class 1 data','Class 2 data','Support vectors')
else
    % sv_coef holds alpha_i*y_i, so bounded SVs satisfy |sv_coef| == C
    bnd=abs(alpha)==C; mar=abs(alpha)<C;
    plot(feature(c1idx,1),feature(c1idx,2),'g.',...
        feature(c2idx,1),feature(c2idx,2),'b.',...
        svs(bnd,1),svs(bnd,2),'ro',...
        svs(mar,1),svs(mar,2),'rs')
    legend('Class 1 data','Class 2 data','SV, alpha=C','SV, on margin')
end
axis equal
title(['C = ' int2str(C) ', # SV = ' int2str(model.totalSV)])
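
% To quantify the fit, the training set can be scored with svmpredict
% (libsvm's companion MEX routine) and, in the linear case, the boundary
% wo(1)*x + wo(2)*y + bo = 0 overlaid on the plot (commented out):
%   [plabel,acc] = svmpredict(label, feature, model);
%   fprintf('training accuracy = %g%%\n', acc(1));
%   if ktype==0,
%      xx=linspace(min(feature(:,1)),max(feature(:,1)));
%      hold on; plot(xx,-(wb(1)*xx+wb(3))/wb(2),'k-'); hold off
%   end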
