From: GzLi (笑梨), Board: DataMining
Title: [Repost] Boosting paper reading
Posted at: Nanjing University Lily BBS (Sat Dec 21 21:47:20 2002)
[ The following text is reposted from the AI board ]
[ Originally posted by cloud ]
I have sorted the main boosting work of the last few years into categories.
Owing to BBS limitations the formatting is not very polished; please bear with me.
Overview
Robert E. Schapire. The boosting approach to machine learning: An
overview. In MSRI Workshop on Nonlinear Estimation and Classification,
2002. (PS)
See also many talk slides under \\msrcn\root\share\szli\Boosting
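Before the section lists, a minimal sketch of binary AdaBoost with decision
stumps, in the spirit of Schapire's overview paper above. This is my own
illustration under stated assumptions (NumPy only, labels in {-1, +1}); the
function and variable names are mine, not from any paper listed here.

  import numpy as np

  def train_stump(X, y, w):
      # Exhaustive search for the (feature, threshold, sign) stump
      # with the lowest weighted training error.
      best = None
      for j in range(X.shape[1]):
          for thr in np.unique(X[:, j]):
              for sign in (1, -1):
                  pred = np.where(X[:, j] <= thr, sign, -sign)
                  err = np.sum(w * (pred != y))
                  if best is None or err < best[0]:
                      best = (err, j, thr, sign)
      return best

  def adaboost(X, y, T=50):
      # y in {-1, +1}; returns a list of (alpha, stump) pairs.
      n = len(y)
      w = np.full(n, 1.0 / n)
      ensemble = []
      for _ in range(T):
          err, j, thr, sign = train_stump(X, y, w)
          err = max(err, 1e-12)              # guard: a perfect stump
          alpha = 0.5 * np.log((1 - err) / err)
          pred = np.where(X[:, j] <= thr, sign, -sign)
          w *= np.exp(-alpha * y * pred)     # up-weight the mistakes
          w /= w.sum()
          ensemble.append((alpha, (j, thr, sign)))
      return ensemble

  def predict(ensemble, X):
      # Weighted majority vote of the stumps.
      F = np.zeros(len(X))
      for alpha, (j, thr, sign) in ensemble:
          F += alpha * np.where(X[:, j] <= thr, sign, -sign)
      return np.sign(F)

Each round reweights the sample toward the examples the current stump
misclassifies; the final classifier is the weighted vote. Later sketches in
this post reuse this ensemble format.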
Additive Logistic Models
Friedman, J. H., Hastie, T. and Tibshirani, R. "Additive Logistic
Regression: a Statistical View of Boosting." (Aug. 1998)
Y. Freund and R.E. Schapire. Discussion of the paper "Additive logistic
regression: a statistical view of boosting" by J. Friedman, T. Hastie and
R. Tibshirani. The Annals of Statistics, 28(2):391-393, 2000. (PDF)
Peter Bühlmann and Bin Yu (2000a). Discussion of "Additive logistic
regression: a statistical view of boosting," by Friedman, J., Hastie, T.
and Tibshirani, R. Invited discussion, The Annals of Statistics (to appear).
M. Collins, R.E. Schapire, and Y. Singer. Logistic regression,
AdaBoost and Bregman distances. In Proceedings of the Thirteenth
Annual Conference on Computational Learning Theory, 2000. (PDF)
J.H. Friedman. Greedy function approximation: A gradient boosting
machine. Technical report, Department of Statistics, Stanford University,
February 1999; revised 2001. (PDF)
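The identity that anchors this whole group, restated in LaTeX: AdaBoost
performs stagewise minimization of an exponential criterion whose population
minimizer is half the log-odds, which is why boosting can be read as fitting
an additive logistic model.

  J(F) = \mathbb{E}\left[ e^{-y F(x)} \right],
  \qquad
  \arg\min_F J(F) = \frac{1}{2} \log \frac{P(y = +1 \mid x)}{P(y = -1 \mid x)} .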
Multiple Randomized Classifiers and Trees
Amit, Y. and Blanchard, G., Multiple randomized classifiers: MRCL
(2001).
Amit, Y. and Geman, D. Shape quantization and recognition with
randomized trees, Neural Computation (1997).
Amit, Y., Geman, D. and Wilder, K., Joint induction of shape features and
tree classifiers, IEEE PAMI (1997).
L. Breiman. Random forests-random features. Technical Report 567,
Statistics Department, University of California, Berkeley,
September 1999 (updated for Version 3).
T.G. Dietterich. An experimental comparison of three methods for
constructing ensembles of decision trees: Bagging, boosting, and
randomization. Machine Learning, 1999. (PDF)
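For Breiman's entry above, a compact sketch of the bagging-plus-random-
features recipe, leaning on scikit-learn's decision tree as the base
learner. Names and defaults are mine; it assumes nonnegative integer class
labels.

  import numpy as np
  from sklearn.tree import DecisionTreeClassifier

  def random_forest(X, y, n_trees=100, n_feats='sqrt', seed=None):
      # Each tree sees a bootstrap sample and considers only a random
      # subset of features at every split (max_features).
      rng = np.random.default_rng(seed)
      trees = []
      for _ in range(n_trees):
          idx = rng.integers(0, len(y), size=len(y))   # bootstrap
          t = DecisionTreeClassifier(
              max_features=n_feats,
              random_state=int(rng.integers(2**31)))
          trees.append(t.fit(X[idx], y[idx]))
      return trees

  def forest_predict(trees, X):
      # Unweighted majority vote over the trees.
      votes = np.stack([t.predict(X) for t in trees])
      return np.apply_along_axis(
          lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)

Unlike boosting, the trees are trained independently, so the gain comes
purely from averaging de-correlated classifiers.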
Theoretical Analysis and Results on Boosting
E.L. Allwein, R.E. Schapire, and Y. Singer. Reducing multiclass to
binary: A unifying approach for margin classifiers. Journal of Machine
Learning Research, 1:113-141, 2000. (PDF)
R.E. Schapire. Theoretical views of boosting. In Computational
Learning Theory: Fourth European Conference, EuroCOLT'99, 1999. (PDF)
R.E. Schapire. Theoretical views of boosting and applications. In
Tenth International Conference on Algorithmic Learning Theory, 1999.
(PDF)
Yoav Freund, Yishay Mansour and Robert E. Schapire. Why averaging
classifiers can protect against overfitting. Preliminary version in
Proceedings of the Eighth International Workshop on Artificial
Intelligence and Statistics, 2001. Journal submission 9/4/01. (PS)
R.E. Schapire, Y. Freund, P. Bartlett, and W.S. Lee. Boosting the
margin: A new explanation for the effectiveness of voting methods. The
Annals of Statistics, 26(5):1651-1686, October 1998. (PDF)
S. Kutin and P. Niyogi. The interaction of stability and weakness in
AdaBoost. Technical Report TR-2001-30, Department of Computer Science,
University of Chicago, 2001. (PDF)
S. Mannor, R. Meir, and S. Mendelson. On the consistency of boosting
algorithms. Submitted to Advances in Neural Information Processing
Systems 14, June 2001. (PDF)
W. Jiang. Does boosting overfit: Views from an exact solution. Technical
Report 00-04, Department of Statistics, Northwestern University,
September 2000.
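The margin explanation of Schapire, Freund, Bartlett and Lee above turns on
the distribution of normalized voting margins. A few lines (my helper,
reusing the ensemble format from the sketch under Overview) compute them:

  import numpy as np

  def margins(ensemble, X, y):
      # Normalized margin y * F(x) / sum |alpha|, always in [-1, 1];
      # positive means the weighted vote classifies (x, y) correctly.
      F = np.zeros(len(X))
      total = 0.0
      for alpha, (j, thr, sign) in ensemble:
          F += alpha * np.where(X[:, j] <= thr, sign, -sign)
          total += abs(alpha)
      return y * F / total

The paper bounds the generalization error by the fraction of training points
with small margin, independently of the number of rounds, which is one
explanation of why test error can keep falling after training error reaches
zero.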
Weak Learners that Boost
S. Mannor and R. Meir. Weak learners and improved convergence rate in
boosting. In Advances in Neural Information Processing Systems 13:
Proc. NIPS'2000, 2001. (PDF)
N. Duffy and D. Helmbold. Potential boosters? In S.A. Solla, T.K. Leen,
and K.-R. Müller, editors, Advances in Neural Information Processing
Systems 12, pages 258-264. MIT Press, 2000. (PDF)
W. Jiang. On weak base hypotheses and their implications for boosting
regression and classification. Technical Report 00-01, Department of
Statistics, Northwestern University, October 2000. Former title: "Large
Time Behavior of Boosting Algorithms for Regression and
Classification".
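The quantitative backdrop for this section, in LaTeX: if every base
hypothesis h_t has weighted error \varepsilon_t = 1/2 - \gamma_t with edge
\gamma_t > 0, AdaBoost's training error decays exponentially,

  \Pr_i\left[ H(x_i) \ne y_i \right]
  \le \prod_{t=1}^{T} 2\sqrt{\varepsilon_t (1 - \varepsilon_t)}
  \le \exp\left( -2 \sum_{t=1}^{T} \gamma_t^2 \right),

and the papers above ask what survives when this edge condition is weakened
or holds only for part of the data.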
Dealing with Noise and Outliers
Y. Freund. An adaptive version of the boost by majority algorithm. In
Proceedings of the Annual Conference on Computational Learning Theory
(COLT'99), 1999.
G. Rätsch. Robust Boosting via Convex Optimization. PhD thesis,
University of Potsdam, October 2001. (PDF)
G. Rätsch and M.K. Warmuth. Marginal boosting. In Proceedings of the
Annual Conference on Computational Learning Theory, February 2002. In
press. (PDF)
Wenxin Jiang. Some theoretical aspects of boosting in the presence of
noisy data. Technical Report 01-01, Department of Statistics,
Northwestern University, 2001. To appear in Proceedings: The
Eighteenth International Conference on Machine Learning (ICML-2001),
June 2001, Morgan Kaufmann. (PDF)
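The intuition shared across this section: AdaBoost's exponential reweighting
lets a handful of noisy or mislabeled examples dominate the distribution. As
a crude illustration only (not BrownBoost and not Rätsch's convex-
optimization formulation, both of which are principled versions of this
idea), one can cap the weights after each update; the cap value below is
hypothetical.

  import numpy as np

  def capped_update(w, alpha, y, pred, cap_mult=5.0):
      # Standard AdaBoost exponential reweighting, followed by a cap
      # so that no single (possibly noisy) example can dominate.
      w = w * np.exp(-alpha * y * pred)
      w = np.minimum(w, cap_mult / len(w))   # hypothetical cap
      return w / w.sum()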
Boosting for Regression
Friedman, J. H., Hastie, T. and Tibshirani, R. "Additive Logistic
Regression: a Statistical View of Boosting." (Aug. 1998). See also
discussions on this paper in Additive Logistic Models above.
M. Collins, R.E. Schapire, and Y. Singer. Logistic regression,
AdaBoost and Bregman distances. In Proceedings of the Thirteenth
Annual Conference on Computational Learning Theory, 2000. (PDF) See
also discussions on this paper in Additive Logistic Models above.
N. Duffy and D. Helmbold. Leveraging for regression. In Proceedings
of the Thirteenth Annual Conference on Computational Learning Theory,
2000. (PDF)
R. Avnimelech and N. Intrator. Boosting regression estimators. Neural
Computation, 11:491-513, 1999. (PDF)
R.S. Zemel and T. Pitassi. A gradient-based boosting algorithm for
regression problems. In Advances in Neural Information Processing
Systems 13, Cambridge, MA, 2001. MIT Press. In press. (PDF)
G. Rätsch, A. Demiriz, and K. Bennett. Sparse regression ensembles in
infinite and finite hypothesis spaces. Machine Learning, 48(1-3):
193-221, 2002. Special issue on New Methods for Model Selection and
Model Combination. Also NeuroCOLT2 Technical Report NC-TR-2000-085.
(PDF)
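To close the regression section: with squared loss, Friedman's gradient
boosting reduces to repeatedly fitting the base learner to the current
residuals, since the residuals are exactly the negative gradient. A minimal
sketch under my own naming, with scikit-learn regression trees and a
hypothetical shrinkage value:

  import numpy as np
  from sklearn.tree import DecisionTreeRegressor

  def ls_boost(X, y, T=200, nu=0.1, depth=3):
      # Squared-loss gradient boosting ("LS_Boost" style).
      F = np.full(len(y), y.mean())      # constant initial model
      models = []
      for _ in range(T):
          r = y - F                      # residuals = -dL/dF for L2
          t = DecisionTreeRegressor(max_depth=depth).fit(X, r)
          F += nu * t.predict(X)         # shrunken stagewise update
          models.append(t)
      return y.mean(), models

  def gb_predict(init, models, X, nu=0.1):
      F = np.full(len(X), init)
      for t in models:
          F += nu * t.predict(X)
      return F

The shrinkage nu trades training speed for smoother fits; across the
gradient-boosting variants cited above it is the loss, not the base learner,
that changes.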
--
※ Source: Nanjing University Lily BBS bbs.nju.edu.cn [FROM: 61.132.74.239]
--
※ Reposted via: Nanjing University Lily BBS bbs.nju.edu.cn [FROM: 211.80.38.17]