  href="http://links.cse.msu.edu:8000/members/matt_gerber/index.php/Software#SVM-Light_server_mode_classification_module">SVM-Classify 
  TCP/IP Server</A>: a server version of svm_classify that let's you classify 
  examples over a TCP/IP port, written by <A 
  href="http://www.cse.msu.edu/~csega/w/Matt_Gerber">Matthew Gerber</A> (for <A 
  href="http://www-ai.cs.uni-dortmund.de/SOFTWARE/SVM_LIGHT/svm_light_v6.01.eng.html">SVM<SUP><I>light</I></SUP> 
  V6.01</A>) </LI></UL>
<H2>Questions and Bug Reports</H2>
<P>If you find bugs or you have problems with the code you cannot solve by 
yourself, please contact me via <A 
href="mailto:thorsten@joachims.org">email</A>. </P>
<H2>Disclaimer</H2>
<P>This software is free only for non-commercial use. It must not be distributed 
without prior permission of the author. The author is not responsible for 
implications from the use of this software. </P>
<H2>History</H2>
<H4>V6.00 - V6.01</H4>
<UL>
  <LI>Small bug fixes in HIDEO optimizer. </LI></UL>
<H4>V5.00 - V6.00</H4>
<UL>
  <LI>Allows restarts from a particular vector of dual variables (option y). 
  <LI>Time out for exceeding number of iterations without progress (option #). 
  <LI>Allows the use of Kernels for learning ranking functions. 
  <LI>Support for non-vectorial data like strings. 
  <LI>Improved robustness and convergence especially for regression problems. 
  <LI>Cleaned up code, which makes it easier to integrate it into other 
  programs. 
  <LI>Interface to SVM<I><SUP>struct</SUP></I>. 
  <LI>Source code for <A 
  href="http://www.cs.cornell.edu/People/tj/svm_light/old/svm_light_v5.00.html">SVM<I><SUP>light</I></SUP> 
  V5.00</A> </LI></UL>
<H4>V4.00 - V5.00</H4>
<UL>
  <LI>Can now solve ranking problems in addition to classification and 
  regression. 
  <LI>Fixed bug in kernel cache that could lead to segmentation fault on some 
  platforms. 
  <LI>Fixed bug in transductive SVM that was introduced in version V4.00. 
  <LI>Improved robustness. 
  <LI>Source code for <A 
  href="http://www.cs.cornell.edu/People/tj/svm_light/old/svm_light_v4.00.html">SVM<I><SUP>light</I></SUP> 
  V4.00</A> </LI></UL>
<H4>V3.50 - V4.00</H4>
<UL>
  <LI>Can now solve regression problems in addition to classification. 
  <LI>Bug fixes and improved numerical stability. 
  <LI>Source code for <A 
  href="http://www.cs.cornell.edu/People/tj/svm_light/old/svm_light_v3.50.html">SVM<I><SUP>light</I></SUP> 
  V3.50</A> </LI></UL>
<H4>V3.02 - V3.50</H4>
<UL>
  <LI>Computes XiAlpha estimates of the error rate, the precision, and the 
  recall (see the note after this list). 
  <LI>Efficiently computes Leave-One-Out estimates of the error rate, the 
  precision, and the recall. 
  <LI>Improved Hildreth and D'Espo optimizer especially for low-dimensional data 
  sets. 
  <LI>Easier to link into other C and C++ code. Easier compilation under 
  Windows. 
  <LI>Faster classification of new examples for linear SVMs. </LI></UL>
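<P>For background, the XiAlpha (&xi;&alpha;) estimates mentioned above are derived in 
[Joachims, 2000b] (see the References below). Roughly, the leave-one-out error rate is 
bounded by the fraction of training examples <I>i</I> for which 
2&alpha;<SUB><I>i</I></SUB>R<SUP>2</SUP> + &xi;<SUB><I>i</I></SUB> &ge; 1, where 
&alpha;<SUB><I>i</I></SUB> and &xi;<SUB><I>i</I></SUB> are the dual variable and slack 
of example <I>i</I> and R is a bound derived from the kernel values of the training 
examples; the precision and recall estimates are computed from the same quantities. See 
the cited paper for the exact statement.</P>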
<H4>V3.01 - V3.02</H4>
<UL>
  <LI>Now examples can be read in correctly on SGIs. </LI></UL>
<H4>V3.00 - V3.01</H4>
<UL>
  <LI>Fixed convergence bug for Hildreth and D'Espo solver. </LI></UL>
<H4>V2.01 - V3.00</H4>
<UL>
  <LI>Training algorithm for transductive Support Vector Machines. 
  <LI>Integrated core QP-solver based on the method of Hildreth and D'Espo. 
  <LI>Uses folding in the linear case, which speeds up linear SVM training by an 
  order of magnitude. 
  <LI>Allows linear cost models. 
  <LI>Faster in general. </LI></UL>
<H4>V2.00 - V2.01</H4>
<UL>
  <LI>Improved interface to PR_LOQO 
  <LI>Source code for <A 
  href="http://www-ai.cs.uni-dortmund.de/SOFTWARE/SVM_LIGHT/svm_light_v2.01.eng.html">SVM<I><SUP>light</I></SUP> 
  V2.01</A> </LI></UL>
<H4>V1.00 - V2.00</H4>
<UL>
  <LI>Learning is much faster especially for large training sets. 
  <LI>Working set selection based on steepest feasible descent. 
  <LI>"Shrinking" heuristic. 
  <LI>Improved caching. 
  <LI>New solver for intermediate QPs. 
  <LI>Lets you set the size of the cache in MB. 
  <LI>Simplified output format of svm_classify. 
  <LI>Data files may contain comments (see the example after this list). </LI></UL>
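<P>For illustration of the comment and cache-size items above (file names are 
placeholders, and comment handling may differ slightly between versions): a training 
file in the sparse input format may begin with comment lines starting with '#', and 
text after '#' on an example line is not used as feature input, while the kernel cache 
size is set in MB with the -m option of svm_learn.</P>
<PRE>
# comment lines at the top of the file are ignored
+1 1:0.43 3:0.12 9284:0.2  # trailing text after '#' is not used as features
-1 2:0.27 7:1.0 1023:0.8
</PRE>
<PRE>
svm_learn -m 100 train.dat model.dat
svm_classify test.dat model.dat predictions.dat
</PRE>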
<H4>V0.91 - V1.00</H4>
<UL>
  <LI>Learning is more than 4 times faster. 
  <LI>Smarter caching and optimization. 
  <LI>You can define your own kernel function (see the sketch after this list). 
  <LI>Lets you set the size of the cache. 
  <LI>VCdim is now estimated based on the radius of the support vectors. 
  <LI>The classification module is more memory efficient. 
  <LI>The f2c library is available from <A 
  href="ftp://ftp-ai.cs.uni-dortmund.de/pub/Users/thorsten/svm_light/f2c/">here</A>. 

  <LI>Adaptive precision tuning makes optimization more robust. 
  <LI>Includes some small bug fixes and is more robust. 
  <LI>Source code for <A 
  href="http://www-ai.cs.uni-dortmund.de/SOFTWARE/SVM_LIGHT/svm_light_v1.00.eng.html">SVM<I><SUP>light</I></SUP> 
  V1.00</A> </LI></UL>
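<P>As a sketch of the user-defined kernel hook: in current distributions the custom 
kernel is placed in kernel.h and selected with the -t 4 option of svm_learn. The 
signature below matches recent versions (it has changed across versions), and the 
polynomial body is only an example:</P>
<PRE>
/* kernel.h -- user-defined kernel, selected with "svm_learn -t 4 ..."  */
/* KERNEL_PARM, SVECTOR and sprod_ss() are declared in svm_common.h.    */
/* NOTE: the exact signature differs between SVM-light versions.        */

double custom_kernel(KERNEL_PARM *kernel_parm, SVECTOR *a, SVECTOR *b)
{
  /* example: inhomogeneous polynomial kernel (1 + a.b)^2,
     built from the sparse dot product sprod_ss() */
  double dot = sprod_ss(a, b);
  return (1.0 + dot) * (1.0 + dot);
}
</PRE>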
<H4>V0.9 - V0.91</H4>
<UL>
  <LI>Fixed a bug that appeared for very small values of C and caused the optimization 
  not to converge. 
  </LI></UL><A name=References></A>
<H2>References</H2>
<TABLE cellSpacing=0 cellPadding=5 border=0>
  <TBODY>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Joachims, 2002a]</P></TD>
    <TD vAlign=top width="66%">
      <P>Thorsten Joachims, <A 
      href="http://textclassification.joachims.org/"><I>Learning to Classify 
      Text Using Support Vector Machines</I></A>. Dissertation, Kluwer, 
      2002.<BR>[<A 
      href="http://search.barnesandnoble.com/booksearch/isbninquiry.asp?isbn=079237679X">B&amp;N</A>] 
      [<A href="http://www.amazon.com/exec/obidos/ASIN/079237679X">Amazon</A>] 
      [<A href="http://www.wkap.nl/prod/b/0-7923-7679-X">Kluwer</A>] </P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">[Joachims, 2002c]</TD>
    <TD vAlign=top width="66%"><SPAN lang=EN-GB 
      style="mso-ansi-language: EN-GB">T. Joachims, <I>Optimizing Search Engines 
      Using Clickthrough Data</I>, Proceedings of the ACM Conference on 
      Knowledge Discovery and Data Mining (KDD), ACM, 2002.<BR></SPAN><A 
      href="http://www.joachims.org/publications/joachims_02c.ps.gz"><SPAN 
      lang=EN-GB style="mso-ansi-language: EN-GB">Online 
      [Postscript]</SPAN></A><SPAN lang=EN-GB style="mso-ansi-language: EN-GB"> 
      &nbsp;</SPAN><A 
      href="http://www.joachims.org/publications/joachims_02c.pdf"><SPAN 
      lang=EN-GB style="mso-ansi-language: EN-GB">[PDF]</SPAN></A><SPAN 
      lang=EN-GB style="mso-ansi-language: EN-GB"> &nbsp;</SPAN></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Klinkenberg, Joachims, 2000a]</P></TD>
    <TD vAlign=top width="66%">
      <P>R. Klinkenberg and T. Joachims, <I>Detecting Concept Drift with Support 
      Vector Machines</I>. Proceedings of the Seventeenth International 
      Conference on Machine Learning (ICML), Morgan Kaufmann, 2000. <BR><A 
      href="http://www.joachims.org/publications/klinkenberg_joachims_2000a.ps.gz" 
      target=_top>Online [Postscript (gz)]</A> <A 
      href="http://www.joachims.org/publications/klinkenberg_joachims_2000a.pdf.gz" 
      target=_top>[PDF (gz)]</A></P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Joachims, 2000b]</P></TD>
    <TD vAlign=top width="66%">
      <P>T. Joachims, <I>Estimating the Generalization Performance of an SVM 
      Efficiently</I>. Proceedings of the International Conference on Machine 
      Learning (ICML), Morgan Kaufmann, 2000. <BR><A 
      href="http://www.joachims.org/publications/joachims_00a.ps.gz" 
      target=_top>Online [Postscript (gz)]</A> <A 
      href="http://www.joachims.org/publications/joachims_00a.pdf" 
      target=_top>[PDF]</A></P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Joachims, 1999a]</P></TD>
    <TD vAlign=top width="66%">
      <P>T. Joachims, <I>Making Large-Scale SVM Learning Practical</I>. Chapter 11 in 
      Advances in Kernel Methods - Support Vector Learning, B. Sch&ouml;lkopf, C. 
      Burges and A. Smola (ed.), MIT Press, 1999. <BR><A 
      href="http://www.joachims.org/publications/joachims_99a.ps.gz" 
      target=_top>Online [Postscript (gz)]</A> <A 
      href="http://www.joachims.org/publications/joachims_99a.pdf" 
      target=_top>[PDF]</A></P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Joachims, 1999c]</P></TD>
    <TD vAlign=top width="66%">
      <P>Thorsten Joachims, <I>Transductive Inference for Text Classification 
      using Support Vector Machines</I>. International Conference on Machine 
      Learning (ICML), 1999. <BR><A 
      href="http://www.joachims.org/publications/joachims_99c.ps.gz" 
      target=_top>Online [Postscript (gz)]</A> <A 
      href="http://www.joachims.org/publications/joachims_99c.pdf" 
      target=_top>[PDF]</A></P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Morik et al., 1999a]</P></TD>
    <TD vAlign=top width="66%">
      <P>K. Morik, P. Brockhausen, and T. Joachims, <I>Combining statistical 
      learning with a knowledge-based approach - A case study in intensive care 
      monitoring</I>. Proc. 16th Int'l Conf. on Machine Learning (ICML-99), 
      1999. <BR><A 
      href="http://www.joachims.org/publications/morik_etal_99a.ps.gz" 
      target=_top>Online [Postscript (gz)]</A> <A 
      href="http://www.joachims.org/publications/morik_etal_99a.pdf" 
      target=_top>[PDF]</A></P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Joachims, 1998a]</P></TD>
    <TD vAlign=top width="66%">
      <P>T. Joachims, <I>Text Categorization with Support Vector Machines: 
      Learning with Many Relevant Features</I>. Proceedings of the European 
      Conference on Machine Learning, Springer, 1998. <BR><A 
      href="http://www.joachims.org/publications/joachims_98a.ps.gz" 
      target=_top>Online [Postscript (gz)]</A> <A 
      href="http://www.joachims.org/publications/joachims_98a.pdf" 
      target=_top>[PDF]</A></P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Joachims, 1998c]</P></TD>
    <TD vAlign=top width="66%">
      <P>Thorsten Joachims, <I>Making Large-Scale SVM Learning Practical</I>. 
      LS8-Report 24, Universit&auml;t Dortmund, 1998. <BR><A 
      href="http://www.joachims.org/publications/joachims_98c.ps.gz" 
      target=_top>Online [Postscript (gz)]</A> <A 
      href="http://www.joachims.org/publications/joachims_98c.pdf" 
      target=_top>[PDF]</A></P></TD></TR>
  <TR>
    <TD vAlign=top width="34%">
      <P>[Vapnik, 1995a]</P></TD>
    <TD vAlign=top width="66%">
      <P>Vladimir N. Vapnik, <I>The Nature of Statistical Learning Theory</I>. 
      Springer, 1995.</P></TD></TR></TBODY></TABLE>
<H2>Other SVM Resources</H2>
<UL>
  <LI><A href="http://www.first.gmd.de/" target=_top>GMD-First Berlin</A> 
  <LI><A href="http://www.kernel-machines.org/" target=_top>Kernel-Machines Web 
  Site</A> 
  <LI><A href="http://svm.research.bell-labs.com/" target=_top>Bell Labs</A> 
  <LI><A href="http://www.research.microsoft.com/~jplatt/svm.html" 
  target=_top>Microsoft Research</A> 
  <LI><A 
  href="http://www.dcs.rhbnc.ac.uk/research/compint/areas/comp_learn/sv/index.shtml" 
  target=_top>Royal Holloway College</A> 
  <LI><A href="http://wwwsyseng.anu.edu.au/lsg/" target=_top>ANU Canberra</A> 
  <LI><A 
  href="http://www.ai.mit.edu/projects/cbcl/res-area/theory/index-theory-learning.html" 
  target=_top>MIT</A> 
  <LI><A href="http://lara.enm.bris.ac.uk/cig/" target=_top>Bristol CI-Group</A> 
  </LI></UL>
<P>Last modified November 7th, 2007 by <A href="http://www.joachims.org/" 
target=_top>Thorsten Joachims</A> &lt;<A 
href="mailto:thorsten@joachims.org">thorsten@joachims.org</A>&gt;</P></BODY></HTML>
