Sender: GzLi (笑梨), Board: DataMining
Title: Machine Learning 47(2/3) <1>
Posted from: Nanjing University Lily BBS (Thu Jul 18 00:44:54 2002), internal post
Article title: Boosting Methods for Regression
Journal: Machine Learning
ISSN: 0885-6125
Volume/Issue: Vol. 47, No. 2/3   Publication date: May/June 2002
Pages: 153-200 (48 pages)
Authors: Duffy, Nigel. Computer Science Department, University of California,
Santa Cruz, Santa Cruz, CA 95064, USA. nigeduff@cse.ucsc.edu
Helmbold, David. Computer Science Department, University of California,
Santa Cruz, Santa Cruz, CA 95064, USA. dph@cse.ucsc.edu
Abstract:
In this paper we examine ensemble methods for regression that leverage or
"boost" base regressors by iteratively calling them on modified samples.
The most successful leveraging algorithm for classification is AdaBoost,
an algorithm that requires only modest assumptions on the base learning
method for its strong theoretical guarantees. We present several gradient
descent leveraging algorithms for regression and prove AdaBoost-style bounds
on their sample errors using intuitive assumptions on the base learners.
We bound the complexity of the regression functions produced in order to
derive PAC-style bounds on their generalization errors. Experiments validate
our theoretical results.
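The leveraging scheme the abstract describes — repeatedly calling a base regressor on modified samples and adding its output to the ensemble — can be sketched as generic squared-loss gradient boosting with regression stumps. This is an illustrative reconstruction, not the paper's specific algorithms: the stump base learner, the learning rate, and the round count are all assumptions made for the sketch.

```python
import numpy as np

def fit_stump(X, r):
    """Base learner: fit a one-split regression stump to targets r
    by exhaustive least-squares search over features and thresholds."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue  # split must put points on both sides
            lv, rv = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lv) ** 2).sum() + ((r[~left] - rv) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, t, lv, rv)
    _, j, t, lv, rv = best
    return lambda Z, j=j, t=t, lv=lv, rv=rv: np.where(Z[:, j] <= t, lv, rv)

def boost_regression(X, y, rounds=50, lr=0.1):
    """Gradient-descent leveraging for squared loss: each round fits the
    base learner to the current residuals y - F(X), which are the negative
    gradient of 0.5 * (y - F)^2, then takes a small step along it."""
    pred = np.full(len(y), y.mean())
    ensemble = [lambda Z, c=y.mean(): np.full(len(Z), c)]
    for _ in range(rounds):
        h = fit_stump(X, y - pred)       # "modified sample": residual targets
        pred = pred + lr * h(X)
        ensemble.append(lambda Z, h=h: lr * h(Z))
    return lambda Z: sum(f(Z) for f in ensemble)
```

The "modification" of the sample here is the replacement of the original targets by residuals each round, which is the squared-loss analogue of AdaBoost's reweighting in classification.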
--
*** Dignified and steady, humble and tolerant; see matters through, keep others in mind ***
Have you mined today? DataMining http://DataMining.bbs.lilybbs.net
MathTools http://bbs.sjtu.edu.cn/cgi-bin/bbsdoc?board=MathTools
※ Modified: by GzLi on Jul 18 00:46:31. [FROM: 211.80.38.29]
※ Source: Nanjing University Lily BBS bbs.nju.edu.cn. [FROM: 211.80.38.29]