http://www.cs.byu.edu/courses/cs578/syllabus.html
Date: Wed, 20 Nov 1996 21:53:40 GMT
Server: Apache/1.0.5
Content-type: text/html
Content-length: 5971
Last-modified: Fri, 02 Aug 1996 06:55:49 GMT
<HTML><HEAD><TITLE>CS 578 Syllabus - Fall '94</TITLE><!-- describe the document, avoid context sensitive descriptions --><meta name="description" value="CS 578 Syllabus - Fall '94"><!-- keywords for the document --><meta name="keywords" value="CS578,Neural Networks,Syllabus"><!-- should be "document" unless providing a search, then "service" --><meta name="resource-type" value="document"><!-- use global for documents to be indexed outside BYU --><meta name="distribution" value="local"></HEAD><BODY><h1>Neural Networks and Connectionist Computing</h1>TuTh 9:35-10:50am, 2241 SFLC <P>Tony Martinez, 3334 TMCB, Office Hours: TuTh 3:00 - 4:00pm or by appointment <P><STRONG>Goals:</STRONG> Introduce and study the philosophy, utility, and models of connectionist computing, such that students are able to propose original research with potential follow-up in a graduate research program. Expand the creativity of the students in all aspects of computing. <P><STRONG>Text:</STRONG> J. McClelland and D. Rumelhart, Explorations in Parallel Distributed Processing: A Handbook of Models, Programs, and Exercises. Prepared packet of papers at the end of each section of the notes. You will be expected to read the assigned literature before, and optionally after, the scheduled lecture. <P><STRONG>Prerequisites:</STRONG> Senior or Graduate standing, computer architecture, Calculus, Creativity. <P><STRONG>Lab (3346 TMCB):</STRONG> 4 Mac II's with 5MB RAM and 40MB hard drives, 2 DS5000 workstations with 32MB RAM and 1GB disks, and 3 high-speed HP workstations. (These may be used when available, but the researching graduate students have priority on these machines.) Software for simulations, projects, etc. will be made available. <P><STRONG>Literature:</STRONG> I have placed interesting and representative papers for reference in the periodicals room of the HBLL library. There are two separate packets (2 copies of the first) and both are under my name. As needed, I will place more packets in the library. 
I also have more papers in my office which can be looked over and copied under the constraint of the 15-minute rule. I can also send for most any paper you wish through interlibrary loan (and will do so), but it usually takes 2-3 weeks, so plan ahead. <P><STRONG>Grading (~):</STRONG> Simulations and Homeworks: 30%, Midterm: 22.5%, Project: 22.5%, Final: 25% (Tue., Dec. 14, 7am-10am). Grading is on a curve, and some amount of subjectivity is allowed for attendance, participation, perceived effort, etc. If you think, you'll be all right. <P><STRONG>Late assignments:</STRONG> Assignments are expected on time (beginning of class on the due date). Late papers will be marked off at 5% per school day late. However, if you have any unusual circumstances (sick, out of town, unique from what all the other students have, etc.) which you inform me of, then I will not take off any late points. Nothing will be accepted after the last day of class instruction. <P><STRONG>Project:</STRONG> An in-depth effort on a particular aspect of neural networks. A relatively extensive literature search in the area is expected, with a subsequent bibliography. Good projects are typically as follows: Best: Some of your own original thinking and a proposal of a network, learning paradigm, system, etc. This (and other projects) typically benefits from some computer simulation to bear out potential. Very Good: Starting from an in-depth study of some current model, strive to extend it through some new mechanisms. Not Bad: A study of a current model with an in-depth analysis of its strengths, weaknesses, potential, and suggested research. Not Good: A description of a current model. The earlier you start the better. Note that in a semester course like this, you will have to choose a topic when we have only covered half of the material. That does not mean your project must cover items related to the first half of the semester. You should use your own initiative and the resources available (library literature, texts, me, etc.) 
to peruse and find any topic of interest to you, regardless of whether we have or will cover it in class. Interesting models which we will probably not have time to cover in depth in class include: Feldman nets, Genetic algorithms, Kohonen maps, HOTLU's, BAMs, CMAC, ASN, Cognitron, Neo-Cognitron, BoltzCONS, Michie Boxes, Cauchy Machines, Counterpropagation, Madaline II, Associative Networks, RCE, etc. <P><DL><DT> <STRONG>Topics and Reading Assignments</STRONG><DD> 1. Intro to Neural Networks (1) *<DD> 2. Brain and Nervous System (3) Your Neural Network<DD> 3. Computation, VN Bottleneck, and NN Goals (1) <DD> 4. Definitions, Theory, Learning, Applications, and General Mechanisms of Neural Networks (2) <DD> 5. Delta Rule Models - Linear Associators, Perceptron, Adaline, Quadric Machines, Higher Order Networks, Committee Machines, Delta Rule Simulation and separability issues (4)<DD> 6. Back-Propagation (2) Backpropagation Sim.<DD> 7. ASOCS (Adaptive Self-Organizing Concurrent Systems) (6) <DD> 8. Midterm (1) Project Abstract<DD> 9. Hopfield Networks (2) <DD> 10. Boltzmann Machine (1) <DD> 11. Competitive Learning, Adaptive Resonance Theory (2) CL Simulation<DD> 12. Survey of other models, implementation, future research (2) <DD> 13. Oral Presentations (2) Final Project Paper</DL>*As a general rule, read all of the papers at the end of a section of notes before the lecture.<hr>Go to <a href="http://www.cs.byu.edu/byu.html"><img align=MIDDLE src="http://www.cs.byu.edu/buttons/button-to-cs.gif"></a><a href="http://www.cs.byu.edu/byu-home.html"><img align=MIDDLE src="http://www.cs.byu.edu/buttons/thumb-cougar.gif"></a><hr><ADDRESS>Comments to <A HREF="http://www.cs.byu.edu/courses/cs578//webmaster.html">webmaster</a></ADDRESS></BODY></HTML>