%SVMLSPex02.m
%Two Dimension SVM Problem, Two Class and Separable Situation
%
%%Difference from SVMLSPex01.m:
% The Lagrange dual function (16) is taken as the objective function instead of ||W||,
% so it needs more time than SVMLSPex01.m.
%
%Method from Christopher J. C. Burges:
%"A Tutorial on Support Vector Machines for Pattern Recognition", page 9
%
% Objective: min "f(A) = -sum(ai) + sum[sum(ai*yi*xi . aj*yj*xj)]/2" ,function (16)
% Subject to: sum(ai*yi) = 0 ,function (15);
% and ai >= 0 for all i, the particular set of constraints C2 (page 9, line 14).
%The optimization variables are the Lagrange multipliers A=[a1,a2,...,am], where m is the total number of samples.
%
function mm=SVMLSPex02(mm) %mm -- number of samples per class
echo off;
close all;
fclose all;
%Generate two sets of data, each with mm samples
m1=mm; % The number of class A
m2=mm; % The number of class B
n = 2; % The dimension of sample vector
al=pi/4; % Rotation angle (X1 is rotated by +al, X2 by -al)
d=0.05;  % Offset separating class B from class A
x=rand(m1,2);
x(1:m1,1)=2*x(1:m1,1);
x(1:m1,2)=x(1:m1,2);
X1(1:m1,1)=x(1:m1,1)*cos(al)-x(1:m1,2)*sin(al);
X1(1:m1,2)=x(1:m1,2)*cos(al)+x(1:m1,1)*sin(al);
x=rand(m2,2);
x(1:m2,2)=2*x(1:m2,2);
x(1:m2,1)=x(1:m2,1)+d;
X2(1:m2,1)=x(1:m2,1)*cos(al)+x(1:m2,2)*sin(al);
X2(1:m2,2)=x(1:m2,2)*cos(al)-x(1:m2,1)*sin(al);
clear x;
X=[X1;X2]; %X1 is the positive sample set and X2 is the negative sample set
m=m1+m2; %Total number of samples
figure(1);
whitebg(1,[0,0.1,0]);
plot(X(1:m1,1),X(1:m1,2),'m*');
hold on
plot(X(m1+1:m,1),X(m1+1:m,2),'c*');
Xa = min(0,min(X(1:m,1)));
Xb = max(X(1:m,1));
axis([Xa,Xb,min(X(1:m,2)),max(X(1:m,2))])
asw = 0;
while (asw~=1) && (asw~=2)
asw=input('Continue? (1=yes / 2=no) ');
end
if asw==1
%
%Note! The code below can process a sample set of any dimension, given as an (m*n) matrix, m - number of samples / n - sample dimension.
%
%Passing the samples to the objective and constraint function in SVMLSPex02FUN.m;
%see the MATLAB function "constr".
fid=fopen('a.dat','w');
fwrite(fid,[m1,m2,n],'float');
for k=1:n
fwrite(fid,X(1:m,k),'float');
end
fclose(fid);
%
%Following the MATLAB calling convention:
%x0 is the starting point of the optimization;
%setting options(13)=1 indicates that the first constraint returned by SVMLSPex02FUN.m is an equality,
%and the others are "<=0"; see the MATLAB function "CONSTR".
%
%Return parameters:
% A=[A(i)] are the optimization variables, i.e. {ai} in functions (14)/(15)/(16);
% L=[L(2),L(3),...,L(m+1)] are the Lagrange multipliers corresponding to the constraint functions;
x0 = 0.5*ones(m,1); %Initial point of the optimization
options(13) = 1;
[A,options,L] = constr('SVMLSPex02FUN', x0, options);
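%
% Note: "constr" belongs to the old Optimization Toolbox and no longer
% exists in modern MATLAB. Assuming the same dual problem, a roughly
% equivalent call with the current toolbox would be the sketch below
% (untested here; the quadratic term uses the Gram matrix X*X'):
%
% Yd = [ones(m1,1);-ones(m2,1)];          % class labels (+1/-1)
% H = (Yd*Yd').*(X*X');                   % Hessian of the dual objective
% dual = @(a) -sum(a) + 0.5*a'*H*a;       % function (16)
% [A,fval] = fmincon(dual, x0, [], [], Yd', 0, zeros(m,1), [], [], ...
%     optimoptions('fmincon','Display','off'));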
%From function (14)
Y=[ones(m1,1);-ones(m2,1)]; %Class labels (+1/-1) for the positive and negative sample sets
W = sum((diag(A)*diag(Y))*X);
% Note!!! On interpreting the "Lagrange multipliers" A=[ai] and the returned L:
% L=[L(2),...,L(m+1)] are the Lagrange multipliers of function f, i.e. function (16);
% L(i+1)=0 indicates that the constraint "a(i)>=0" is inactive (a(i)>0), so sample i is a "Support Vector".
% Note!!! a(i) in function (16) is also a "Lagrange multiplier", but with respect to the original objective ||W||;
% here the function actually optimized is f, function (16), which can be regarded as a
% transformation of the original objective ||W||, with the a(i) as the new variables.
[z,I]=min(L(2:m+1)); %Support vectors correspond to L(i)=0; skip L(1), the equality-constraint multiplier
X0=X(I,1:n)'; %Pick an arbitrary support vector to determine "b" in the classifying function
%The separating line is W(1)*x + W(2)*y + b0 = 0, i.e. y = k*x + b, where
k = -W(1)/W(2);
b = -1/W(2)+W*X0/W(2);
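%
% A sturdier alternative (a sketch, commented out): average b0 over all
% support vectors of both classes instead of relying on a single one:
%
% sv = find(A > 1e-6);            % support vectors: ai > 0
% b0 = mean(Y(sv) - X(sv,:)*W');  % from yi*(W*xi + b0) = 1 on the margin
% b = -b0/W(2);                   % intercept of the line y = k*x + b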
plot([Xa,Xb],[k*Xa+b,k*Xb+b],'y')
for i=2:m+1
if L(i)<=1e-6
plot(X(i-1,1),X(i-1,2),'wo'); %Circle the support vectors
end
end
delete('a.dat');
end