libsvm is a simple, easy-to-use, and efficient software for SVM classification and regression.


<1>. Copy all the files into your working folder. 



<2>. Given T = {Z_train, C_train}, value_lambda, and value_gamma (for the RBF kernel),

the Matlab function "Build_classifier.m" returns all the parameters needed for H(z):


          [ CA, SV, b ] = Build_classifier( Z_train, C_train, value_lambda, value_gamma )


The resulting classifier is then

          H(z) = sign { [ \sum_{i=1}^{N_SV} CA(i) * K(sv_i, z) ] + b }



All the other files are helper routines used by "Build_classifier.m", 
so you don't have to look at them.


=====================================================================================

Explanation of the inputs and outputs of "Build_classifier.m", and how to obtain H(z): 

------------>   Please read the comments in "Build_classifier.m", and 

                pay special attention to the format of Z_train and C_train.



------------>   Once you have read the comments, you are ready for the example below.
     
======================================================================================


Example:  (D=2)


            Z_train =  -1.0   0.0              C_train = -1
                        0.8   0.5                         1
                       -1.2  -0.1                        -1
                        1.0   0.0                         1
                       -0.9   0.1                        -1
                       -1.3   0.1                        -1
            
            value_lambda = 10                  value_gamma  = 1



Then, after passing these inputs to "Build_classifier.m", we get CA, SV, and b as follows: 


            CA =   -0.1005          SV =  -1.2   -0.1         b = 0.0304
                   -0.6624                -0.9    0.1
                   -0.3788                -1.3    0.1
                    0.6024                 0.8    0.5
                    0.5393                 1.0    0.0



What we learn from these three outputs is the following:


There are five support vectors 

               {sv_1, sv_2, sv_3, sv_4, sv_5}, 

associated with five original class labels 

               {c_1, c_2, c_3, c_4, c_5}

and five positive weights 

               {alpha_1, alpha_2, alpha_3, alpha_4, alpha_5} as follows: 


      sv_1= [ -1.2 ,  -0.1 ]  , c_1     =  sign(CA(1)) =  -1 

                                alpha_1 =  |CA(1)|     = 0.1005 
                              

      sv_2= [ -0.9 ,   0.1 ]  , c_2     = sign(CA(2))  =  -1

                                alpha_2 = |CA(2)|      = 0.6624


      sv_3= [ -1.3 ,   0.1 ]  , c_3     = sign(CA(3))  =  -1

                                alpha_3 = |CA(3)|      = 0.3788


      sv_4= [  0.8 ,   0.5 ]  , c_4     = sign(CA(4))  =  +1

                                alpha_4 = |CA(4)|      = 0.6024


      sv_5= [  1.0 ,   0.0 ]  , c_5     = sign(CA(5))  =  +1

                                alpha_5 = |CA(5)|      = 0.5393
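
The decomposition above can be sketched in a few lines of Python (an illustrative re-statement of the table, not part of the Matlab package): each label c_i and weight alpha_i is recovered from CA alone, since CA(i) = c_i * alpha_i.

```python
import math

# CA as returned by Build_classifier.m in the example above
CA = [-0.1005, -0.6624, -0.3788, 0.6024, 0.5393]

# c_i = sign(CA(i)),  alpha_i = |CA(i)|
c     = [1 if ca > 0 else -1 for ca in CA]
alpha = [abs(ca) for ca in CA]

# Sanity check: the decomposition reassembles CA exactly
for c_i, a_i, ca in zip(c, alpha, CA):
    assert math.isclose(c_i * a_i, ca)

print(c)      # labels of the five support vectors: [-1, -1, -1, 1, 1]
print(alpha)  # positive weights
```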



So, the classifier in this case is: 


      H(z)= sign {   [ \sum_{i=1}^{5}   c_i * alpha_i * K(sv_i, z) ]    + b  }

          = sign {   [ \sum_{i=1}^{5}   CA(i) * K(sv_i, z)  ]    + b }

where     

      K(sv_i, z) =  exp( - value_gamma  * ( || sv_i - z ||^2 ) )

                 =  exp( - 1 * ( || sv_i - z ||^2 ) ),  
      
      since value_gamma=1.
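
To make the example concrete, here is a short Python sketch (for illustration only; the actual implementation is the Matlab code) that evaluates H(z) using the CA, SV, and b obtained above. It reproduces the expected labels on two of the training points.

```python
import math

# Outputs of Build_classifier.m from the example above
CA = [-0.1005, -0.6624, -0.3788, 0.6024, 0.5393]
SV = [(-1.2, -0.1), (-0.9, 0.1), (-1.3, 0.1), (0.8, 0.5), (1.0, 0.0)]
b  = 0.0304
value_gamma = 1.0

def K(sv, z):
    """RBF kernel: exp(-value_gamma * ||sv - z||^2)."""
    dist_sq = sum((s - t) ** 2 for s, t in zip(sv, z))
    return math.exp(-value_gamma * dist_sq)

def H(z):
    """H(z) = sign( sum_i CA(i) * K(sv_i, z) + b )."""
    s = sum(ca * K(sv, z) for ca, sv in zip(CA, SV)) + b
    return 1 if s >= 0 else -1

print(H((1.0, 0.0)))   # training point from the +1 class -> 1
print(H((-1.0, 0.0)))  # training point from the -1 class -> -1
```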
