<!-- Saved from: http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/ -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<HTML><HEAD><TITLE>LIBSVM Tools</TITLE>
<META http-equiv="Content-Type" content="text/html; charset=gb2312">
<META content="MSHTML 6.00.2900.3268" name="GENERATOR"></HEAD>
<BODY text=#000000 vLink=#0000ff link=#ff0000 bgColor=#ffefd5>
<H1><A href="http://www.csie.ntu.edu.tw/~cjlin/libsvm/index.html">LIBSVM</A> Tools</H1>
<!-- Created: Wed Apr 18 19:26:54 CST 2001 --><!-- hhmts start -->Last modified: Fri Jan 11 17:23:20 CST 2008 <!-- hhmts end -->
<P>This page provides some miscellaneous tools based on LIBSVM. Roughly, they include
<UL>
<LI>Things not general enough to be included in LIBSVM
<LI>Research code used in some of our past papers
<LI>Some data sets in LIBSVM format </LI></UL>These tools are less maintained than the main LIBSVM package. However, comments are still welcome. Please properly <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html#f203">cite our work</A> if you find them useful. This supports our future development. -- <A href="http://www.csie.ntu.edu.tw/~cjlin">Chih-Jen Lin</A>
<P>Disclaimer: We do not take any responsibility for damage or other problems caused by using this software and these data sets.
<HR>
<H3>Table of Contents</H3>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#0">LIBSVM for dense data</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#1">LIBSVM for string data</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#2">Multi-label classification</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#3">LIBSVM Extensions at Caltech</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#4">Feature selection tool</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#5">LIBSVM data sets</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#6">SVM-toy in 3D</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#7">Multi-class classification (and probability output) via error-correcting codes</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#8">Stratified CV (cross validation) for LIBSVM</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#9">SVM with Precomputed Kernel Matrices</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#10">SVM Multi-class Probability Outputs</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#11">An integrated development environment for libsvm</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#12">ROC Curve for Binary SVM</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#13">Grid Parameter Search for Regression</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#14">Radius Margin Bounds for SVM Model Selection</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#15">Weights for data instances</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#16">Primal variable w of linear SVM and feature selection</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#17">Reduced Support Vector Machines Implementation</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#18">Calculating the radius of the smallest sphere containing all training data</A> <BR>
<A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#19">DAG approach for multiclass classification</A> <BR>
<P>
<HR>
<A name=0>
<H3>LIBSVM for dense data</H3>LIBSVM stores instances as sparse vectors. For some applications, most feature values are nonzero, so a dense representation can significantly reduce computation time. The zip file <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/libsvm-dense/libsvm-2.85-dense.zip">here</A> is an implementation for dense data. See the README for comparisons with the standard libsvm.
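<P>To illustrate why a sparse representation is wasteful for mostly-dense data, here is a minimal sketch (a hypothetical helper, not part of the package) of LIBSVM's standard sparse text format: zeros are skipped, so for dense data nearly every index:value pair must still be written and parsed.
<PRE>
def to_libsvm_line(label, values):
    """Encode one instance in LIBSVM's sparse text format.

    Indices are 1-based and zero-valued features are omitted,
    which is what makes this format wasteful when most values
    are nonzero.
    """
    pairs = [f"{i}:{v:g}" for i, v in enumerate(values, start=1) if v != 0]
    return " ".join([str(label)] + pairs)

# A mostly dense instance: almost every feature is still written out.
print(to_libsvm_line(1, [0.5, 0.0, 1.25, 3.0]))  # -> "1 1:0.5 3:1.25 4:3"
</PRE>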
<P>Author: Ming-Fang Weng
<HR>
<A name=1>
<H3>LIBSVM for string data</H3>For some applications, data instances are strings. The SVM then trains a model using a string kernel. This experimental code (download the zip file <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvm-tools/string/libsvm-2.85-string.zip".replace("libsvm-tools","libsvmtools")>here</A>) allows string inputs and implements one string kernel. Details are in the README.
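<P>The page does not say which string kernel the package implements (see its README for that); purely as an illustration of what a string kernel computes, here is a sketch of a simple k-spectrum kernel, a common choice that takes the inner product of substring-count vectors.
<PRE>
from collections import Counter

def spectrum_kernel(s, t, k=2):
    """k-spectrum kernel: inner product of the vectors counting
    every length-k substring of s and t.  An illustration only;
    the kernel in libsvm-2.85-string may differ."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[sub] * ct[sub] for sub in cs)

# "abab" contains {ab: 2, ba: 1}; "babb" contains {ba: 1, ab: 1, bb: 1}.
print(spectrum_kernel("abab", "babb", k=2))  # -> 3
</PRE>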
<P>Author: Guo-Xun Yuan
<HR>
<A name=2>
<H3>Multi-label classification</H3>This <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/multilabel/">web page</A> contains various tools for multi-label classification.
<HR>
<A name=3>
<H3>LIBSVM Extensions at Caltech</H3>You can visit <A href="http://www.work.caltech.edu/~htlin/program/libsvm">this webpage</A>, which is independently maintained by <A href="http://www.work.caltech.edu/~htlin/">Hsuan-Tien Lin</A>, a PhD student at Caltech. The page contains programs that he has developed for related research, most of them extended from or for LIBSVM. Some of the most useful ones include confidence margin/decision value output, infinite ensemble learning with SVM, a dense format, and a MATLAB implementation for estimating posterior probabilities.
<HR>
<A name=4>
<H3>Feature selection tool</H3>This is a simple Python script (download <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/fselect/fselect.py">here</A>) that uses the F-score to select features. To run it, please put it in the sub-directory "tools" of LIBSVM. <PRE>Usage: ./fselect.py training_file [testing_file]
</PRE>Output files: .fscore shows the importance of each feature, .select gives the running log, and .pred gives the testing results.
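<P>The F-score measures, for each feature, how far apart the positive- and negative-class means are relative to the within-class variances. A minimal sketch of the formula from the Chen and Lin chapter (a hypothetical helper, not the fselect.py code itself):
<PRE>
def f_score(pos, neg):
    """F-score of one feature: between-class mean separation divided
    by the sum of within-class sample variances."""
    allv = pos + neg
    mean = sum(allv) / len(allv)
    mp = sum(pos) / len(pos)          # mean on positive instances
    mn = sum(neg) / len(neg)          # mean on negative instances
    num = (mp - mean) ** 2 + (mn - mean) ** 2
    den = (sum((x - mp) ** 2 for x in pos) / (len(pos) - 1)
           + sum((x - mn) ** 2 for x in neg) / (len(neg) - 1))
    return num / den

# A feature that separates the two classes well gets a large F-score.
print(f_score([4.0, 5.0, 6.0], [0.0, 1.0, 2.0]))  # -> 4.0
</PRE>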
<P>More information about this implementation can be found in Y.-W. Chen and C.-J. Lin, <A href="http://www.csie.ntu.edu.tw/~cjlin/papers/features.pdf">Combining SVMs with various feature selection strategies</A>, to appear in the book "Feature Extraction, Foundations and Applications," 2005. This implementation is still preliminary. More comments are very welcome.
<P>Author: Yi-Wei Chen
<HR>
<A name=5>
<H3>LIBSVM data sets</H3>We now have a nice <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets">web page</A> showing the available data sets.
<HR>
<A name=6>
<H3>SVM-toy in 3D</H3>
<P>A simple applet demonstrating SVM classification and regression in 3D. It extends the Java svm-toy in the <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvm">LIBSVM</A> package.
<P><A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/svmtoy3d">Go to the 3D SVM-toy page</A>
<P>
<HR>
<A name=7>
<H3>Multi-class classification (and probability output) via error-correcting codes</H3>
<P><B>Note: libsvm does support multi-class classification.</B> The code here implements some extensions for experimental purposes.
<P>This code implements multi-class classification and probability estimates using 4 types of error-correcting codes. Details of the 4 types of ECCs and the algorithms can be found in the following paper:
<P>T.-K. Huang, <A href="http://www3.nccu.edu.tw/~chweng">R. C. Weng</A>, and <B>C.-J. Lin</B>. <A href="http://www.csie.ntu.edu.tw/~cjlin/papers/generalBT.pdf">Generalized Bradley-Terry Models and Multi-class Probability Estimates.</A> <I><A href="http://www.jmlr.org/">Journal of Machine Learning Research</A></I>, 7(2006), 85-115. A (very) short version of this paper appears in <A href="http://www.nips.cc/">NIPS</A> 2004.
<P>The code can be downloaded <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/libsvm-errorcode/libsvm-errorcode.zip">here</A>. The installation is the same as for the standard LIBSVM package, and the different types of ECCs are specified with the "-i" option. Type "svm-train" without any arguments to see the usage. Note that both the "one-against-one" and "one-against-the-rest" multi-class strategies are part of the implementation.
<P>If you specify -b in training and testing, you get probability estimates, and the predicted label is the one with the largest value. If you do not specify -b, classification is based on decision values. We now use the "exponential-loss" method from the paper:
<P>Allwein et al.: Reducing multiclass to binary: a unifying approach for margin classifiers. Journal of Machine Learning Research, 1:113--141, 2001,
<P>to predict the class label. For one-against-the-rest (also called 1-vs-all), this is the same as the commonly used rule <BR>argmax_{i} (decision value of the ith class vs. the rest). <BR>For one-against-one, it differs from the max-win strategy used in libsvm.
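<P>Loss-based decoding as in Allwein et al. can be sketched as follows: each class has a codeword row in a {+1, -1, 0} code matrix over the binary problems, and the predicted class is the row with the smallest total exponential loss against the binary decision values (a hypothetical helper, not the packaged code).
<PRE>
import math

def exp_loss_decode(code_matrix, decision_values):
    """Pick the class whose codeword row incurs the smallest total
    exponential loss sum_b exp(-M[i][b] * f_b(x))."""
    def loss(row):
        return sum(math.exp(-m * f) for m, f in zip(row, decision_values))
    return min(range(len(code_matrix)), key=lambda i: loss(code_matrix[i]))

# 1-vs-all code matrix for 3 classes: row i is +1 in column i, -1 elsewhere.
# As the text notes, for 1-vs-all this reduces to argmax of the decision values.
M = [[1, -1, -1], [-1, 1, -1], [-1, -1, 1]]
print(exp_loss_decode(M, [0.3, 1.7, -0.8]))  # -> 1
</PRE>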
<P>MATLAB code for the experiments in our paper is available <A href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/libsvm-errorcode/generalBT.zip">here</A>.
<P>Author: Tzu-Kuo Huang
<HR>
<A name=8>
<H3>Stratified CV (cross validation) for LIBSVM</H3><B>Note: this feature has been included in the core LIBSVM</B>. <BR>This feature was suggested and initially implemented by <A href="http://www.cs.toronto.edu/~james">David James</A> from the Dept. of Computer Science, University of Toronto. <!--
<p>
In some situations it is better to have folds stratified so that each contains the same proportions of labels as the original training set. Please replace svm.cpp in LIBSVM by <a href=stratifiedcv/svm.cpp>this file</a>. Then the same option -v will do stratified CV.
<p>
If you would like to repeat CV several times, please replace svm-train.c by <a href=stratifiedcv/svm-train.c>this file</a>. Then svm-train -x will specify the number of CV runs.
<p>
Author: <a href=http://www.cs.toronto.edu/~james>David James</a> from Dept. of Computer Science, University of Toronto.
-->
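<P>The idea behind stratification, sketched below for illustration only (the real implementation is inside LIBSVM's svm.cpp and is driven by the -v option): group the instances by label and deal each group round-robin across the folds, so every fold keeps roughly the whole set's label proportions.
<PRE>
from collections import defaultdict

def stratified_folds(labels, n_folds):
    """Assign each instance to a fold so label proportions per fold
    roughly match the full training set (a sketch of the idea)."""
    fold_of = [0] * len(labels)
    by_label = defaultdict(list)
    for idx, y in enumerate(labels):
        by_label[y].append(idx)
    for indices in by_label.values():
        # deal this class's instances round-robin across the folds
        for pos, idx in enumerate(indices):
            fold_of[idx] = pos % n_folds
    return fold_of

labels = [0, 0, 0, 0, 1, 1, 1, 1]
print(stratified_folds(labels, 2))  # -> [0, 1, 0, 1, 0, 1, 0, 1]
</PRE>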
<HR>
<A name=9>
<H3>SVM with Precomputed Kernel Matrices</H3>
<P><B>Note: this feature has been included in the core LIBSVM (after version 2.82)</B> <BR>The input format is <B>slightly different</B>. Please check the README for details.
<P>Special kernels are used for some problems. Users can calculate and store the kernel matrix first. This code then uses it directly, without further kernel evaluations.
<P>Please note that this is suitable for special kernels and for data with very many features. For data with few features, direct kernel evaluations can be much faster than reading the kernel matrices. <!--
<p> Please download the <a href=precomputed/libsvm-precomputed.zip>zip</a> file and follow the README file.
-->
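<P>For reference, the precomputed-kernel input described in the LIBSVM README puts an instance serial number in "feature" 0 followed by one row of the kernel matrix. A sketch that writes this format, using a plain linear kernel purely for illustration:
<PRE>
def precomputed_lines(labels, X):
    """Format training data for LIBSVM's precomputed-kernel mode:
    each line is "<label> 0:<serial> 1:K(i,1) ... n:K(i,n)".
    K here is a linear kernel, chosen only to keep the sketch small."""
    n = len(X)
    lines = []
    for i in range(n):
        k_row = [sum(a * b for a, b in zip(X[i], X[j])) for j in range(n)]
        feats = " ".join(f"{j + 1}:{v:g}" for j, v in enumerate(k_row))
        lines.append(f"{labels[i]} 0:{i + 1} {feats}")
    return lines

for line in precomputed_lines([1, -1], [[1.0, 0.0], [0.0, 2.0]]):
    print(line)
</PRE>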