<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<!-- saved from url=(0040)http://www.cmpe.boun.edu.tr/~ethem/i2ml/ -->
<HTML><HEAD><TITLE>Machine Learning Textbook: Introduction to Machine Learning (Ethem ALPAYDIN)</TITLE>
<META http-equiv=Content-Type content="text/html; charset=gb2312">
<META content="MSHTML 6.00.2900.2963" name=GENERATOR></HEAD>
<BODY><A title="View large cover" 
href="http://mitpress.mit.edu/images/products/books/0262012111-f30.jpg"><IMG 
alt=[IMG] 
src="Machine Learning Textbook Introduction to Machine Learning (Ethem ALPAYDIN).files/0262012111-medium.jpg" 
align=left>
<H1>Introduction to Machine Learning</H1>
<H3><A href="http://www.cmpe.boun.edu.tr/~ethem/">Ethem ALPAYDIN</A></H3><BR>The 
MIT Press, October 2004, ISBN 0-262-01211-1
<P>The book can be ordered through <A title="MIT Press" 
href="http://mitpress.mit.edu/0262012111/">The MIT Press</A>, Amazon (<A 
title=Canada href="http://www.amazon.ca/exec/obidos/ASIN/0262012111">CA</A>, <A 
title=Germany href="http://www.amazon.de/exec/obidos/ASIN/0262012111">DE</A>, <A 
title=France href="http://www.amazon.fr/exec/obidos/ASIN/0262012111">FR</A>, <A 
title=Japan href="http://www.amazon.co.jp/exec/obidos/ASIN/0262012111">JP</A>, 
<A title="United Kingdom" 
href="http://www.amazon.co.uk/exec/obidos/ASIN/0262012111">UK</A>, <A 
title=" United States" 
href="http://www.amazon.com/exec/obidos/tg/detail/-/0262012111?v=glance">US</A>), 
<A title="Barnes and Noble" 
href="http://search.barnesandnoble.com/booksearch/isbnInquiry.asp?isbn=0262012111">Barnes&amp;Noble 
(US)</A>, <A title=Turkey 
href="http://www.pandora.com.tr/urun.asp?id=117299">Pandora (TR)</A>, <A 
title="Prentice-Hall India" 
href="http://www.phindia.com/bookdetail.php?isbn=81-203-2791-8">Prentice-Hall of 
India (IN)</A>. 
<P>A German-language edition is in preparation and will be published by 
Oldenbourg Verlag, Munich.
<P>
<HR>
<A title="jump to" 
href="http://www.cmpe.boun.edu.tr/~ethem/i2ml/#desc">Description</A>, <A 
title="jump to" href="http://www.cmpe.boun.edu.tr/~ethem/i2ml/#toc">Table of 
Contents</A>, <A title="jump to" 
href="http://www.cmpe.boun.edu.tr/~ethem/i2ml/#revs">Reviews</A>, <A 
title="jump to" 
href="http://www.cmpe.boun.edu.tr/~ethem/i2ml/#courses">Courses</A>, <A 
title="jump to" 
href="http://www.cmpe.boun.edu.tr/~ethem/i2ml/#figs">Figures</A>, <A 
title="jump to" href="http://www.cmpe.boun.edu.tr/~ethem/i2ml/#lecs">Lecture 
Slides</A>, <A title="jump to" 
href="http://www.cmpe.boun.edu.tr/~ethem/i2ml/#err">Errata</A>
<P><A name=desc><B>Description:</B></A> The goal of machine learning is to 
program computers to use example data or past experience to solve a given 
problem. Many successful applications of machine learning exist already, 
including systems that analyze past sales data to predict customer behavior, 
recognize faces or speech, optimize robot behavior so that a task can be 
completed using minimum resources, and extract knowledge from bioinformatics 
data. <I>Introduction to Machine Learning</I> is a comprehensive textbook on the 
subject, covering a broad array of topics not usually included in introductory 
machine learning texts. It discusses many methods rooted in different fields, 
including statistics, pattern recognition, neural networks, artificial 
intelligence, signal processing, control, and data mining, in order to present a 
unified treatment of machine learning problems and solutions. All learning 
algorithms are explained so that the student can easily move from the equations 
in the book to a computer program. The book can be used by advanced 
undergraduates and graduate students who have completed courses in computer 
programming, probability, calculus, and linear algebra. It will also be of 
interest to engineers in the field who are concerned with the application of 
machine learning methods.
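<P>As an illustration of moving from the equations to a computer program, here 
is a minimal sketch of <I>k</I>-means clustering, the algorithm of section 7.3. 
It is not code from the book: the function name <CODE>kmeans</CODE>, the use of 
Python with NumPy, the random initialization, and the convergence test are this 
page's own illustrative choices. 
<PRE>
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Cluster the rows of X into k groups by alternating two steps."""
    rng = np.random.default_rng(seed)
    # Initialize centers with k distinct data points chosen at random.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each center to the mean of its cluster
        # (keep the old center if a cluster received no points).
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):  # centers stopped moving
            break
        centers = new_centers
    return centers, labels
</PRE>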
<P>After an introduction that defines machine learning and gives examples of 
machine learning applications, the book covers supervised learning, Bayesian 
decision theory, parametric methods, multivariate methods, dimensionality 
reduction, clustering, nonparametric methods, decision trees, linear 
discrimination, multilayer perceptrons, local models, hidden Markov models, 
assessing and comparing classification algorithms, combining multiple learners, 
and reinforcement learning. 
<P>
<HR>
<A name=toc><B>Table of Contents</B></A> 
<UL>
  <LI><A 
  href="http://mitpress.mit.edu/books/chapters/0262012111forw1.pdf">Series 
  Foreword</A> xiii 
  <LI>Figures xv 
  <LI>Tables xxiii 
  <LI><A 
  href="http://mitpress.mit.edu/books/chapters/0262012111pref1.pdf">Preface</A> 
  xxv 
  <LI>Acknowledgments xxvii 
  <LI>Notations xxix 
  <LI><B><A href="http://mitpress.mit.edu/books/chapters/0262012111chap1.pdf">1 
  Introduction</A> 1</B> 
  <UL>
    <LI>1.1 What Is Machine Learning? 1 
    <LI>1.2 Examples of Machine Learning Applications 3 
    <UL>
      <LI>1.2.1 Learning Associations 3 
      <LI>1.2.2 Classification 4 
      <LI>1.2.3 Regression 8 
      <LI>1.2.4 Unsupervised Learning 10 
      <LI>1.2.5 Reinforcement Learning 11 </LI></UL>
    <LI>1.3 Notes 12 
    <LI>1.4 Relevant Resources 14 
    <LI>1.5 Exercises 15 
    <LI>1.6 References 16 </LI></UL>
  <LI><B>2 Supervised Learning 17</B> 
  <UL>
    <LI>2.1 Learning a Class from Examples 17 
    <LI>2.2 Vapnik-Chervonenkis (VC) Dimension 22 
    <LI>2.3 Probably Approximately Correct (PAC) Learning 24 
    <LI>2.4 Noise 25 
    <LI>2.5 Learning Multiple Classes 27 
    <LI>2.6 Regression 29 
    <LI>2.7 Model Selection and Generalization 32 
    <LI>2.8 Dimensions of a Supervised Machine Learning Algorithm 35 
    <LI>2.9 Notes 36 
    <LI>2.10 Exercises 37 
    <LI>2.11 References 38 </LI></UL>
  <LI><B>3 Bayesian Decision Theory 39</B> 
  <UL>
    <LI>3.1 Introduction 39 
    <LI>3.2 Classification 41 
    <LI>3.3 Losses and Risks 43 
    <LI>3.4 Discriminant Functions 45 
    <LI>3.5 Utility Theory 46 
    <LI>3.6 Value of Information 47 
    <LI>3.7 Bayesian Networks 48 
    <LI>3.8 Influence Diagrams 55 
    <LI>3.9 Association Rules 56 
    <LI>3.10 Notes 57 
    <LI>3.11 Exercises 57 
    <LI>3.12 References 58 </LI></UL>
  <LI><B>4 Parametric Methods 61</B> 
  <UL>
    <LI>4.1 Introduction 61 
    <LI>4.2 Maximum Likelihood Estimation 62 
    <UL>
      <LI>4.2.1 Bernoulli Density 62 
      <LI>4.2.2 Multinomial Density 63 
      <LI>4.2.3 Gaussian (Normal) Density 64 </LI></UL>
    <LI>4.3 Evaluating an Estimator: Bias and Variance 64 
    <LI>4.4 The Bayes' Estimator 67 
    <LI>4.5 Parametric Classification 69 
    <LI>4.6 Regression 73 
    <LI>4.7 Tuning Model Complexity: Bias/Variance Dilemma 76 
    <LI>4.8 Model Selection Procedures 79 
    <LI>4.9 Notes 82 
    <LI>4.10 Exercises 82 
    <LI>4.11 References 83 </LI></UL>
  <LI><B>5 Multivariate Methods 85</B> 
  <UL>
    <LI>5.1 Multivariate Data 85 
    <LI>5.2 Parameter Estimation 86 
    <LI>5.3 Estimation of Missing Values 87 
    <LI>5.4 Multivariate Normal Distribution 88 
    <LI>5.5 Multivariate Classification 92 
    <LI>5.6 Tuning Complexity 98 
    <LI>5.7 Discrete Features 99 
    <LI>5.8 Multivariate Regression 100 
    <LI>5.9 Notes 102 
    <LI>5.10 Exercises 102 
    <LI>5.11 References 103 </LI></UL>
  <LI><B>6 Dimensionality Reduction 105</B> 
  <UL>
    <LI>6.1 Introduction 105 
    <LI>6.2 Subset Selection 106 
    <LI>6.3 Principal Components Analysis 108 
    <LI>6.4 Factor Analysis 116 
    <LI>6.5 Multidimensional Scaling 121 
    <LI>6.6 Linear Discriminant Analysis 124 
    <LI>6.7 Notes 127 
    <LI>6.8 Exercises 130 
    <LI>6.9 References 130 </LI></UL>
  <LI><B>7 Clustering 133</B> 
  <UL>
    <LI>7.1 Introduction 133 
    <LI>7.2 Mixture Densities 134 
    <LI>7.3 <I>k</I>-Means Clustering 135 
    <LI>7.4 Expectation-Maximization Algorithm 139 
    <LI>7.5 Mixtures of Latent Variable Models 144 
    <LI>7.6 Supervised Learning after Clustering 145 
    <LI>7.7 Hierarchical Clustering 146 
    <LI>7.8 Choosing the Number of Clusters 149 
    <LI>7.9 Notes 149 
    <LI>7.10 Exercises 150 
    <LI>7.11 References 150 </LI></UL>
  <LI><B>8 Nonparametric Methods 153</B> 
  <UL>
    <LI>8.1 Introduction 153 
    <LI>8.2 Nonparametric Density Estimation 154 
    <UL>
      <LI>8.2.1 Histogram Estimator 155 
      <LI>8.2.2 Kernel Estimator 157 
      <LI>8.2.3 <I>k</I>-Nearest Neighbor Estimator 158 </LI></UL>
    <LI>8.3 Generalization to Multivariate Data 159 
    <LI>8.4 Nonparametric Classification 161 
    <LI>8.5 Condensed Nearest Neighbor 162 
    <LI>8.6 Nonparametric Regression: Smoothing Models 164 
    <UL>
      <LI>8.6.1 Running Mean Smoother 165 
      <LI>8.6.2 Kernel Smoother 166 
      <LI>8.6.3 Running Line Smoother 167 </LI></UL>
    <LI>8.7 How to Choose the Smoothing Parameter 168 
    <LI>8.8 Notes 169 
    <LI>8.9 Exercises 170 
    <LI>8.10 References 170 </LI></UL>
  <LI><B>9 Decision Trees 173</B> 
  <UL>
    <LI>9.1 Introduction 173 
    <LI>9.2 Univariate Trees 175 
    <UL>
      <LI>9.2.1 Classification Trees 176 
      <LI>9.2.2 Regression Trees 180 </LI></UL>
    <LI>9.3 Pruning 182 
    <LI>9.4 Rule Extraction from Trees 185 
    <LI>9.5 Learning Rules from Data 186 
    <LI>9.6 Multivariate Trees 190 
    <LI>9.7 Notes 192 
    <LI>9.8 Exercises 195 
    <LI>9.9 References 195 </LI></UL>
  <LI><B>10 Linear Discrimination 197</B> 
  <UL>
    <LI>10.1 Introduction 197 
    <LI>10.2 Generalizing the Linear Model 199 
    <LI>10.3 Geometry of the Linear Discriminant 200 
    <UL>
      <LI>10.3.1 Two Classes 200 
      <LI>10.3.2 Multiple Classes 202 </LI></UL>
    <LI>10.4 Pairwise Separation 204 
    <LI>10.5 Parametric Discrimination Revisited 205 
    <LI>10.6 Gradient Descent 206 
    <LI>10.7 Logistic Discrimination 208 
    <UL>
      <LI>10.7.1 Two Classes 208 
      <LI>10.7.2 Multiple Classes 211 </LI></UL>
    <LI>10.8 Discrimination by Regression 216 
    <LI>10.9 Support Vector Machines 218 
    <UL>
