The code is modified from libsvm 2.81.
-->
<P>Author: Pei-Chin Wang
<HR>
<A name=10>
<H3>SVM Multi-class Probability Outputs</H3>This code implements different
strategies for multi-class probability estimates in the following paper:
<P>T.-F. Wu, C.-J. Lin, and R. C. Weng. <A
href="http://www.csie.ntu.edu.tw/~cjlin/papers/svmprob/svmprob.pdf">Probability
Estimates for Multi-class Classification by Pairwise Coupling.</A>
Journal of Machine Learning Research, 2004. A short version appears in NIPS 2003.
<P>Since version 2.6, libsvm already includes one of the methods studied here, so
you may directly use the standard libsvm unless you are interested in doing
comparisons. Please download the <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/svmprob/svmprob-1.1.tgz">tgz</A>
file here. The data used in the paper are available <A
href="http://www.csie.ntu.edu.tw/~cjlin/papers/svmprob/data/">here</A>. Please
then check the README for installation.
<P>Matlab programs for the synthetic data experiment in the paper can be found
in <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/svmprob/newfig">this
directory</A>. The main program is fig1a.m.
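<P>As an illustration of the coupling step, the sketch below recovers one probability per class from a matrix of pairwise estimates. It mirrors the fixed-point iteration for the second method in the paper (the variant adopted in libsvm's svm.cpp), but it is a simplified sketch, not the distributed code:

```python
def multiclass_probability(r, eps=1e-10, max_iter=1000):
    """Couple pairwise estimates r[i][j] ~ P(y=i | y in {i,j}) into one
    probability vector p by minimizing sum_{i<j} (r[j][i]*p_i - r[i][j]*p_j)^2
    subject to sum(p) = 1, using a cyclic fixed-point update."""
    k = len(r)
    Q = [[-r[j][t] * r[t][j] for j in range(k)] for t in range(k)]
    for t in range(k):
        Q[t][t] = sum(r[j][t] ** 2 for j in range(k) if j != t)
    p = [1.0 / k] * k
    for _ in range(max_iter):
        Qp = [sum(Q[t][j] * p[j] for j in range(k)) for t in range(k)]
        pQp = sum(p[t] * Qp[t] for t in range(k))
        # KKT condition: every Qp[t] equals pQp at the minimum.
        if max(abs(Qp[t] - pQp) for t in range(k)) < eps:
            break
        for t in range(k):
            diff = (pQp - Qp[t]) / Q[t][t]
            p[t] += diff
            # Incrementally renormalize so that sum(p) stays 1.
            pQp = (pQp + diff * (diff * Q[t][t] + 2 * Qp[t])) / (1 + diff) ** 2
            for j in range(k):
                Qp[j] = (Qp[j] + diff * Q[t][j]) / (1 + diff)
                p[j] /= 1 + diff
    return p
```

For consistent pairwise estimates r[i][j] = p_i / (p_i + p_j), the iteration recovers the underlying p, which makes a handy sanity check.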
<P>Author: Tingfan Wu (svm [at] future.csie.org)
<HR>
<A name=11>
<H3>An integrated development environment for libsvm</H3>This is a graphical
environment for doing experiments with libsvm. You can create and connect
components (such as a scaler, trainer, and predictor) in this environment. The
program can be extended easily by writing more "plugins". It is written in
Python and uses the wxPython library. Please download the <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/sfd/sfd.zip">zip</A> file
here. After unzipping the package, run the file wxApp1.py. You then have to give
the path of the libsvm binary files in plugin/svm/svm_interface.py.
<CENTER><IMG
src="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/sfd/sfd.png"></CENTER>
<P>Author: Chih-Chung Chang
<HR>
<A name=12>
<H3>ROC Curve for Binary SVM</H3>
<P>This tool gives the ROC (Receiver Operating Characteristic) curve and the
AUC (Area Under Curve) by ranking the decision values. Note that we assume the
labels of the two classes are +1 and -1. Multi-class is not supported yet.
<CENTER><IMG
src="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/roc/heart_scale-roc.png"></CENTER>
<P>Please download the <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/roc/plotroc.py">plotroc.py</A>
file here. You need to
<OL>
<LI>Download libsvm and build the libsvm Python interface.
<LI>Edit the path of gnuplot in plotroc.py if necessary.
<LI>Put plotroc.py into the python directory of the libsvm package.
</LI></OL>The usage is
<PRE>plotroc.py [-t kern_type] [-c Cost] [-g gamma] [-m cache_size] [-v cv_fold] [-T testing_file] training_file
</PRE>If there is no test data, "validated decision values" from
cross-validation on the training data are used. Otherwise, we consider the
decision values of the test data under the model trained on the training data
(without cross-validation).
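<P>The ranking idea can be sketched as follows: sort instances by decision value, sweep a threshold, and accumulate true and false positive rates. This is a minimal illustration rather than plotroc.py itself, and it ignores ties in decision values:

```python
def roc_points(labels, decision_values):
    """Sweep a threshold over sorted decision values; each positive label
    moves the curve up, each negative moves it right. Labels are +1 / -1."""
    ranked = sorted(zip(decision_values, labels), reverse=True)
    n_pos = sum(1 for y in labels if y == +1)
    n_neg = len(labels) - n_pos
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in ranked:
        if y == +1:
            tp += 1
        else:
            fp += 1
        pts.append((fp / n_neg, tp / n_pos))
    return pts

def auc(pts):
    """Trapezoidal area under the (FPR, TPR) curve."""
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```

A perfect ranking (all positives ahead of all negatives) yields AUC 1.0; a random ranking hovers around 0.5.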
<P>Author: Tingfan Wu (svm [at] future.csie.org)
<HR>
<A name=13>
<H3>Grid Parameter Search for Regression</H3>This <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/gridsvr/gridregression.py">file</A>
is a slight modification of grid.py in the libsvm package. In addition to the
parameters C and gamma used in classification, it searches for epsilon as well.
<P><PRE>Usage: grid.py [-log2c begin,end,step] [-log2g begin,end,step] [-log2p begin,end,step] [-v fold]
[-svmtrain pathname] [-gnuplot pathname] [-out pathname] [-png pathname]
[additional parameters for svm-train] dataset
</PRE>
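<P>As a rough sketch of what the search enumerates, the candidate grid can be generated as follows. The C and gamma ranges below follow grid.py's usual log2 defaults; the epsilon range is an assumption for illustration only. Each (C, gamma, epsilon) triple would then be scored by cross-validation mean squared error:

```python
import itertools

def log2_range(begin, end, step):
    """Yield 2**v for v = begin, begin+step, ... (inclusive of end)."""
    v = begin
    while (step > 0 and v <= end) or (step < 0 and v >= end):
        yield 2.0 ** v
        v += step

# (C, gamma, epsilon) candidates, matching the -log2c/-log2g/-log2p options.
candidates = list(itertools.product(
    log2_range(-5, 15, 2),    # -log2c: C from 2^-5 up to 2^15
    log2_range(3, -15, -2),   # -log2g: gamma from 2^3 down to 2^-15
    log2_range(-8, -1, 1),    # -log2p: epsilon (assumed range)
))
```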
<P>Author: Hsuan-Tien Lin (initial modification); Tzu-Kuo Huang (the parameter
epsilon).
<HR>
<A name=14>
<H3>Radius Margin Bounds for SVM Model Selection</H3>This is the code used in
the paper: K.-M. Chung, W.-C. Kao, T. Sun, L.-L. Wang, and C.-J. Lin. <A
href="http://www.csie.ntu.edu.tw/~cjlin/papers/rm.ps.gz">Radius Margin Bounds
for Support Vector Machines with the RBF Kernel.</A> Please download the <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/rm_model_select.tar.bz2">tar.bz2</A>
file here. Details of using this code are in the readme.txt file. Some of the
optimization subroutines written in Python are based on the module by Travis E.
Oliphant.
<P>Author: Wei-Chun Kao, with help from Leland Wang, Kai-Min Chung, and Tony
Sun
<HR>
<A name=15>
<H3>Weights for data instances</H3>Using this code you can give a weight to each
data point. Please download the following <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/libsvm-weight-2.81.zip">zip</A>
file. Weights must be stored in the first column of the input file (i.e., before
the class labels). However, we do not support weights for test data, so test
files should be in the original libsvm format. Currently C-SVC and epsilon-SVR
are supported.
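<P>For example, turning a standard libsvm-format file into the weighted format is just a matter of prepending one number per line; the helper below is a hypothetical illustration, not part of the package:

```python
def add_instance_weights(libsvm_lines, weights):
    """Prepend a per-instance weight to each libsvm-format line, producing
    'weight label index:value ...' as the modified tool expects."""
    return ["%g %s" % (w, line) for w, line in zip(weights, libsvm_lines)]
```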
<P>Author: Ming-Wei Chang and Hsuan-Tien Lin
<HR>
<A name=16>
<H3>Primal variable w of linear SVM and feature selection</H3>In the following
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/calw">directory</A> there
are two files. svm-weight.cpp calculates the primal variable w using a model
trained by libsvm (multi-class is supported). Note that this program is for
LINEAR SVM only! The output is a file containing the decision functions. If the
data has k classes, the decision functions of all 1-vs-1 sub-problems are placed
in the order 1 vs 2, ..., 1 vs k, 2 vs 3, ..., k-1 vs k.
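<P>For a linear kernel the computation for each binary sub-problem reduces to w = sum_i (alpha_i * y_i) x_i over the support vectors. The following is a sketch in Python, not the distributed C++ code, and the sparse-dict input representation is an assumption for illustration:

```python
def primal_w(sv_coef, support_vectors, n_features):
    """Accumulate w = sum_i (alpha_i * y_i) * x_i (linear kernel only).
    sv_coef[i] is alpha_i * y_i, as a libsvm model stores it; each support
    vector is a sparse dict {feature_index: value} with 1-based indices."""
    w = [0.0] * n_features
    for coef, sv in zip(sv_coef, support_vectors):
        for idx, val in sv.items():
            w[idx - 1] += coef * val  # shift to 0-based storage
    return w
```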
<P>The file linear-feasel.cpp conducts feature selection by considering indices
with larger components of w. Please use the makefile in the same directory to
build them. <B>Note that this file works for two-class problems only.</B>
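<P>The selection rule itself, keeping the indices whose components of w are largest in magnitude, amounts to something like this sketch (the helper name is made up):

```python
def select_features(w, k):
    """Return the 1-based indices of the k features with largest |w_j|."""
    order = sorted(range(len(w)), key=lambda j: -abs(w[j]))
    return sorted(j + 1 for j in order[:k])
```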
<P>Author: Tzu-Kuo Huang
<HR>
<A name=17>
<H3>Reduced Support Vector Machines Implementation</H3>This is the code used in
the paper: K.-M. Lin and C.-J. Lin. <A
href="http://www.csie.ntu.edu.tw/~cjlin/papers/rsvmTEX.pdf">A study on reduced
support vector machines.</A> <I>IEEE Transactions on Neural Networks</I>, 2003.
<P>Please download the <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/rsvm.tgz">.tgz</A> file
here. After making the binary files, type svm-train to see the usage. It
includes different methods to implement RSVM.
<P>To speed up the code, you may want to link it against an optimized
BLAS/LAPACK or ATLAS.
<P>Author: Kuan-Min Lin
<HR>
<A name=18>
<H3>Calculating the radius of the smallest sphere containing all training
data</H3>Please download the files in the following <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/calR2">directory</A> and
make the code. Then
<UL>
<LI>-s 0 trains the regular L1-SVM.
<LI>-s 1 trains the L2-SVM (i.e., the error term is quadratic).
<LI>-s 2 gives the square of the radius for L1-SVM.
<LI>-s 3 gives the square of the radius for L2-SVM.</LI></UL>
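<P>What the radius options return is the (squared) radius of the minimum enclosing sphere of the data in feature space. As a toy illustration only, with a linear kernel in input space and a simple core-set style iteration rather than the tool's actual optimization:

```python
def meb_radius(points, iters=2000):
    """Approximate the radius of the smallest sphere containing all points:
    repeatedly move the center a shrinking 1/(k+1) step toward the current
    farthest point (a Badoiu-Clarkson style iteration)."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    c = list(points[0])
    for k in range(1, iters + 1):
        far = max(points, key=lambda p: d2(p, c))
        c = [a + (b - a) / (k + 1) for a, b in zip(c, far)]
    # The max distance from the final center upper-bounds the true radius.
    return max(d2(p, c) for p in points) ** 0.5
```

For the three points (0,0), (2,0), (1,1) the smallest enclosing circle has center (1,0) and radius 1, which the iteration approaches.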
<P>Author: Leland Wang (Holger Froehlich of the University of Tuebingen extended
it from the RBF kernel only to general kernels)
<HR>
<A name=19>
<H3>DAG approach for multiclass classification</H3>In svm.cpp, please replace
the lines from <PRE>double *dec_values = Malloc(double, nr_class*(nr_class-1)/2);
</PRE>to <PRE>return model->label[vote_max_idx];
</PRE>in the subroutine svm_predict() with <A
href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/dag/dag_code">this segment
of code</A>.
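<P>For reference, DDAG prediction keeps a list of candidate classes and lets each pairwise decision eliminate one of the two classes at the ends of the list, so only k-1 comparisons are needed. A hedged sketch, not the linked code segment:

```python
def dag_predict(classes, decide):
    """DDAG prediction. decide(i, j) is the decision value of the i-vs-j
    binary classifier: positive keeps class i (drops j), otherwise drop i.
    Each test removes one candidate until a single class remains."""
    remaining = list(classes)
    while len(remaining) > 1:
        i, j = remaining[0], remaining[-1]
        if decide(i, j) > 0:
            remaining.pop()      # eliminate j
        else:
            remaining.pop(0)     # eliminate i
    return remaining[0]
```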
<P>This follows from the code used in the paper: C.-W. Hsu and C.-J. Lin. <A
href="http://www.csie.ntu.edu.tw/~cjlin/papers/multisvm.pdf">A comparison of
methods for multi-class support vector machines</A>, <I>IEEE Transactions on
Neural Networks</I>, 13(2002), 415-425.
<P>Author: Chih-Wei Hsu