From: nope (明朗朗的生活), Board: DataMining
Subject: Re: svm and bayes rule in Classification
Posted from: 南京大学小百合站 (Thu Jul  4 15:05:01 2002), on-site mail

Is the full text available?


【 Quoted from the post by GzLi (笑梨): 】
: Title: Support Vector Machines and the Bayes Rule in Classification
: Journal: Data Mining and Knowledge Discovery
: ISSN: 1384-5810
: Volume 6, Issue 3; published July 2002
: Pages: 259-275 (17 pages)
: Author: Lin Yi, Department of Statistics, University of Wisconsin, Madison,
: 1210 West Dayton Street, Madison, WI 53706-1685, USA. yilin@stat.wisc.edu
: Abstract:
: The Bayes rule is the optimal classification rule if the underlying distribution
: of the data is known. In practice we do not know the underlying distribution,
: and need to "learn" classification rules from the data. One way to derive
: classification rules in practice is to implement the Bayes rule approximately
: by estimating an appropriate classification function. Traditional statistical
: methods use the estimated log odds ratio as the classification function. Support
: vector machines (SVMs) are one type of large margin classifier, and the
: relationship between SVMs and the Bayes rule was not clear. In this paper,
: it is shown that the asymptotic targets of SVMs are some interesting
: classification functions that are directly related to the Bayes rule. The rate
: of convergence of the solutions of SVMs to their corresponding target functions
: is explicitly ...
: (remainder of the quotation omitted ... ...)
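The contrast the abstract draws between the Bayes rule (classify by the sign of the log odds ratio) and an SVM's decision rule can be illustrated with a small numerical toy. This is only a sketch under assumed conditions, not anything from the paper itself: two unit-variance Gaussian classes (where the log odds ratio, and hence the Bayes rule, is known in closed form) and a hand-rolled linear SVM trained by subgradient descent on the regularized hinge loss. All function names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two known class distributions: N(-1, 1) labeled -1 and N(+1, 1) labeled +1,
# with equal priors.
n = 500
x = np.concatenate([rng.normal(-1.0, 1.0, n), rng.normal(+1.0, 1.0, n)])
y = np.concatenate([-np.ones(n), np.ones(n)])

def bayes_rule(x):
    """Bayes rule: sign of the log odds ratio log p(+1|x)/p(-1|x).

    For these two unit-variance Gaussians with equal priors the log odds
    works out to exactly 2x, so the Bayes rule is simply sign(x).
    """
    log_odds = 2.0 * x
    return np.where(log_odds >= 0, 1.0, -1.0)

def train_linear_svm(x, y, lam=0.01, lr=0.01, epochs=200):
    """Minimal linear SVM f(x) = w*x + b via subgradient descent on the
    regularized hinge loss (a toy stand-in for a real SVM solver)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        margin = y * (w * x + b)
        active = margin < 1.0              # points violating the margin
        grad_w = lam * w - np.mean(active * y * x)
        grad_b = -np.mean(active * y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

w, b = train_linear_svm(x, y)

# The trained SVM's decision sign(w*x + b) should closely agree with the
# Bayes rule on fresh points.
x_test = rng.normal(0.0, 2.0, 2000)
agreement = np.mean(np.sign(w * x_test + b) == bayes_rule(x_test))
print(f"agreement with Bayes rule: {agreement:.3f}")
```

In this symmetric setup the SVM boundary lands near the Bayes boundary at 0, which is the kind of relationship the paper analyzes rigorously (the asymptotic target of the SVM and its link to the Bayes rule).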

--
http://bbs.nju.edu.cn/cgi-bin/bbs/showfile?name=photo.gif 

※ Source: 南京大学小百合站 bbs.nju.edu.cn [FROM: 202.119.94.20]
