
http://www.cs.hmc.edu/~keller/cs152f96.html

This data set contains WWW pages collected from computer science departments of various universities.
Date: Tue, 26 Nov 1996 18:41:25 GMT
Server: NCSA/1.5.1
Last-modified: Tue, 26 Nov 1996 18:36:13 GMT
Content-type: text/html
Content-length: 16806

<html>
<HEAD><TITLE>CS 152: Neural Networks</TITLE></HEAD>
<BODY>
URL http://www.cs.hmc.edu/~keller/cs152f96.html

<center>
<H3><a href="http://www.hmc.edu/">Harvey Mudd College</a> Fall 1996</H3>
<H3><A HREF="http://www.cs.hmc.edu/index.html">Computer Science</A> 152: Neural Networks</H3>
</center>

<h3>Trailer:</h3>

Can a computer be <em>taught</em> to read words aloud, recognize faces,
perform a medical diagnosis, drive a car, play a game, balance a pole,
predict physical phenomena?
<p>
The answer to all these is <em>yes</em>. All these applications and others
have been demonstrated using varieties of the computational model known as
"neural networks", the subject of this course.
<p>
The course will develop the theory of a number of neural network models.
Participants will exercise the theory through both pre-developed computer
programs and ones of their own design.
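<p>
As a minimal sketch of what such "teaching" looks like, the following
fragment trains a single perceptron to compute logical AND using the
perceptron learning rule covered in Week 1 of the outline below. It is
written in Python rather than the MATLAB/SNNS software used in the course,
and the function name and task are illustrative only, not course code:
<p>
<pre>
# Sketch of the perceptron learning rule (illustrative only; not course code).
# Trains a single hard-limit neuron to compute logical AND.

def train_perceptron(samples, epochs=20, lr=1.0):
    w = [0.0, 0.0]  # weights, one per input
    b = 0.0         # bias
    for _ in range(epochs):
        for x, target in samples:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - y          # 0 if correct, otherwise +1 or -1
            w[0] += lr * err * x[0]   # perceptron rule: nudge weights by err*x
            w[1] += lr * err * x[1]
            b += lr * err             # and the bias by err
    return w, b

samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND truth table
w, b = train_perceptron(samples)
print("weights:", w, "bias:", b)  # prints: weights: [2.0, 1.0] bias: -2.0
</pre>
<p>
Because AND is linearly separable, the perceptron convergence theorem (also
Week 1) guarantees the loop stops making corrections after finitely many
updates; here the weights settle within the first few epochs.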
<h3>Course Personnel:</h3>
<ul>
  <li>Instructor: <a href="http://www.cs.hmc.edu/~keller">Robert Keller</a>
      242 Olin (4-5 p.m. MTuW or by appt.), keller@muddcs, x 18483
  <br><br>
  <li>Tutor/Grader: <a href="http://www.cs.hmc.edu/~tkelly/">T.J. Kelly</a>
      tkelly@muddcs, x 74860
  <br><br>
  <li>Secretary: <a href="http://www.cs.hmc.edu/~nancy">Nancy Mandala</a>
      240 Olin (1-5 M-F), nancy@muddcs, x 18225
  <br><br>
  <li>System administrator: <a href="http://www.cs.hmc.edu/~quay">Quay Ly</a>
      101 Beckman, quay@muddcs, x 73474
  <br>
</ul>

<h3>Catalog Description</h3>
<P>
Modeling, simulation, and analysis of artificial neural networks.
Relationship to biological neural networks. Design and optimization of
discrete and continuous neural networks. Backpropagation and other gradient
descent methods. Hopfield and Boltzmann networks. Unsupervised learning.
Self-organizing feature maps. Applications chosen from function
approximation, signal processing, control, computer graphics, pattern
recognition, time-series analysis. Relationship to fuzzy logic, genetic
algorithms, and artificial life.
<br><br>
Prerequisites: Biology 52 and Mathematics 73 and 82, or permission of the
instructor. 3 credit hours.

<h3>Texts</h3>
<ul>
  <li>Main Textbook:
  <blockquote>
  <a href="http://www.okstate.edu/elec-engr/faculty/hagan/hagan-mt.html">Martin T. Hagan</a>,
  <a href="http://www.ee.uidaho.edu/ee/digital/hdemuth/hdemuth.html">Howard B. Demuth</a>,
  and Mark Beale,
  <a href="http://www.thomson.com/pws/ee/nnd.html"><em>Neural Network Design</em></a>,
  PWS Publishing Company, Boston, 1996, ISBN 0-534-94332-2,
  </blockquote>
  which I will call <b>NND</b> below.
  <p>
  <li>Supplementary references, which will be provided as necessary.
  <p>
  <li>Related <a href="#www_links">WWW links</a> (for self-study and research) appear below.
  <p>
  <li>Software, which is installed on muddcs.cs.hmc.edu:
    <ul>
    <li><a href="http://www-math.cc.utexas.edu/math/Matlab/Manual/faq.html">MATLAB</a> Neural Network Toolbox, The MathWorks, Inc.
    <li><a href="http://www.cray.com/PUBLIC/APPS/DAS/CODES/MATLAB_Neural_Network.html">Matlab Neural Network Stuff</a>
    <li><a href="http://www.ii.uib.no/~hans/matlab/">Matlab cross referenced</a>
    <li><a href="http://www.informatik.uni-stuttgart.de/ipvr/bv/projekte/snns/snns.html">SNNS</a>: Stuttgart Neural Network Simulator.
    </ul>
</ul>

<h3>Course Requirements</h3>
<P>
There will be some homework and programming assignments, but no exams.
These assignments will constitute about 50% of the grade. The other 50% of
the grade is from a substantial final project involving either a working
neural network application or a research paper. The grade on the project
will be determined by the comprehensiveness and degree to which you
explored competing approaches. The projects will be presented orally.
<p>
Optional voluntary oral presentations on textbook material can also be made
during the term. These can act to cushion your grade. They are very much
encouraged, as they really help you learn the material at a higher level
than you would otherwise. Please see me if you are interested in making a
presentation.
<P>

<h3>CS 152 Topic Outline</h3>
<UL>
  <li>Week 1 (read NND chapters 1 to 4; you may skip 3-8 to 3-12 for now)
  <P>Contexts for Neural Networks
    <ul>
    <li>Artificial Intelligence
    <li>Biological
    <li>Physics
    </ul>
  <P>Artificial Neural Network overview
    <ul>
    <li>Perceptrons
    <li>Perceptron learning rule
    <li>Perceptron convergence theorem
    </ul>
  <P>
  <li>Week 2 (read NND chapters 5 to 7)
  <P>
    <ul>
    <li>Linear transformations for neural networks
    <li>Supervised Hebbian learning
    <li>Pseudoinverse rule
    <li>Filtered learning rule
    <li>Delta rule
    <li>Unsupervised Hebbian learning
    </ul>
  <p>
  <li>Week 3 (read NND chapters 8 and 9)
  <p>
    <ul>
    <li>Performance surfaces
    <li>Performance optimization
      <ul>
      <li>Steepest descent algorithm
      <li>Newton's method
      <li>Conjugate gradient
      </ul>
    </ul>
  <p>
  <li>Week 4 (read NND chapter 10)
  <p>
    <ul>
    <li><a href="http://ee.stanford.edu/ee/faculty/Widrow_Bernard.html">Widrow</a>-Hoff Learning
      <ul>
      <li>Adaline
      <li>LMS rule
      <li>Adaptive filtering
      </ul>
    </ul>
  <p>
  <li>Week 5 (read NND chapters 11 and 12)
  <p>
    <ul>
    <li>Backpropagation in Multilayer Perceptrons (MLP)
    <li>Variations on backpropagation
      <ul>
      <li>Batching
      <li>Momentum
      <li>Variable learning rate
      <li>Levenberg-Marquardt (LMBP)
      <li>Quickprop
      </ul>
    </ul>
  <p>
  <li>Week 6 (supplementary material)
  <p>
    <ul>
    <li>Radial basis function networks (RBF)
    </ul>
  <p>
  <li>Week 7 (read NND chapter 13)
  <p>
    <ul>
    <li>Associative learning
      <ul>
      <li>Unsupervised Hebb rule
      <li>Hebb rule with decay
      <li>Instar rule
      <li>Kohonen rule
      <li>Outstar rule
      </ul>
    </ul>
  <p>
  <li>Week 8 (read NND chapter 14)
  <p>
    <ul>
    <li>Competitive networks
      <ul>
      <li>Hamming network
      <li>Self-Organizing feature maps (SOM)
      <li>Counterpropagation networks (CPN)
      <li>Learning vector quantization (LVQ)
      </ul>
    </ul>
  <p>
  <li>Week 9 (read NND chapters 15 and 16)
  <p>
    <ul>
