
📄 page_88.html

📁 How to mine your website's content. The only book in this field
💻 HTML
<HTML>  <HEAD>    <!--SCRIPT LANGUAGE="JavaScript" SRC="http://a1835.g.akamai.net/f/1835/276/3h/www.netlibrary.com/include/js/dictionary_library.js"></SCRIPT>    <SCRIPT LANGUAGE="JavaScript">      if (!opener){document.onkeyup=parent.turnBookPage;}    </SCRIPT!-->    <META HTTP-EQUIV="Cache-Control" CONTENT="no-cache">    <META HTTP-EQUIV="Pragma" CONTENT="no-cache">    <META HTTP-EQUIV="Expires" CONTENT="-1"><META http-equiv="Content-Type" content="text/html; charset=windows-1252"><SCRIPT>var PrevPage="Page_87";var NextPage="Page_89";var CurPage="Page_88";var PageOrder="99";</SCRIPT>  <TITLE>Document</TITLE>  </HEAD>  <BODY BGCOLOR="#FFFFFF"><CENTER><TABLE BORDER=0 WIDTH=100% CELLPADDING=0><TR><TD ALIGN=CENTER>  <TABLE BORDER=0 CELLPADDING=2 CELLSPACING=0 WIDTH=100%>  <TR>  <TD ALIGN=LEFT><A HREF='Page_87.html'>Previous</A></TD>  <TD ALIGN=RIGHT><A HREF='Page_89.html'>Next</A></TD>  </TR>  </TABLE></TD></TR><TR><TD ALIGN=LEFT><P><A NAME='JUMPDEST_Page_88'/><A NAME='{304}'/><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=0 WIDTH='100%'><TR><TD ALIGN=RIGHT><FONT FACE='Times New Roman, Times, Serif' SIZE=2 COLOR=#FF0000>Page 88</FONT></TD></TR></TABLE><A NAME='{305}'/><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=0><TR>  <TD ROWSPAN=5></TD>  <TD COLSPAN=3 HEIGHT=12></TD>  <TD ROWSPAN=5></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR><TD></TD>  <TD><FONT FACE='Times New Roman, Times, Serif' SIZE=3>also opened many sources of funding necessary for the pursuit of neural computing research and the inductive approach to AI. His work led to an explosion in neural network papers, conferences, and software companies, which continues to this day. Today, neural networks are successful in applications requiring prediction, data classification, and pattern matching. Examples of successful applications include mortgage risk evaluation, production control, handwriting recognition, and credit card fraud detection. 
The credit card you carry is being monitored every hour of every day by a neural network for potential patterns of irregularity; the system being used is most likely from HNC, a neural network company from San Diego, which monitors over 150 million credit cards.</FONT></TD><TD></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR>  <TD COLSPAN=3 HEIGHT=1></TD></TR></TABLE><A NAME='{306}'/><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=0><TR>  <TD ROWSPAN=5></TD>  <TD COLSPAN=3 HEIGHT=17></TD>  <TD ROWSPAN=5></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR><TD></TD>  <TD><FONT FACE='Times New Roman, Times, Serif' SIZE=3><B>Back-Propagation Networks</B></FONT></TD><TD></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR>  <TD COLSPAN=3 HEIGHT=1></TD></TR></TABLE><A NAME='{307}'/><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=0><TR>  <TD ROWSPAN=5></TD>  <TD COLSPAN=3 HEIGHT=12></TD>  <TD ROWSPAN=5></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR><TD></TD>  <TD><FONT FACE='Times New Roman, Times, Serif' SIZE=3>Neural networks were not seriously considered until Paul Werbos's groundbreaking 1974 Harvard doctoral thesis, <I>Beyond Regression,</I> which laid the statistical foundation for the work on back-propagation by several researchers from varied fields. In 1985 the back-propagation neural network architecture was simultaneously discovered by three groups of researchers: D. E. Rumelhart, G. E. Hinton, and R. J. Williams; Y. Le Cun; and D. Parker.</FONT></TD><TD></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR>  <TD COLSPAN=3 HEIGHT=1></TD></TR></TABLE><A NAME='{308}'/><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=0><TR>  <TD ROWSPAN=5></TD>  <TD COLSPAN=3 HEIGHT=12></TD>  <TD ROWSPAN=5></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR><TD></TD>  <TD><FONT FACE='Times New Roman, Times, Serif' SIZE=3>A back-propagation neural network consists of several input nodes, some hidden nodes, and one or more output nodes. Data cycles through the nodes as the net trains and adjusts (Figure 3-3). 
Back-propagation is a learning method in which an error signal from an output node is fed back through the network, altering weights as it goes so that the same error is less likely to recur. As with the other types of networks, training is a gradual process of incremental improvement over time, until the network learns a pattern. This pattern can be anything, including the behavior of website visitors and customers. The back-propagation architecture is the dominant neural network architecture, and it is the most popular in the current crop of data mining tools for supervised learning.</FONT></TD><TD></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR>  <TD COLSPAN=3 HEIGHT=1></TD></TR></TABLE><A NAME='{309}'/><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=0><TR>  <TD ROWSPAN=5></TD>  <TD COLSPAN=3 HEIGHT=12></TD>  <TD ROWSPAN=5></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR><TD></TD>  <TD><FONT FACE='Times New Roman, Times, Serif' SIZE=3>To demonstrate how a back-propagation network works, we can apply it to the exclusive-or (XOR) function, whose output is logically true (has the value &quot;1&quot;) when the two inputs are opposite (Figure 3-4). The numbers between the neurons are the strengths of the connections between them, or the weights. The weight values are modified as the network is trained. For training, the network starts with a random set of weights (Figure 3-5).</FONT></TD><TD></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR>  <TD COLSPAN=3 HEIGHT=1></TD></TR></TABLE><A NAME='{30A}'/><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=0><TR>  <TD ROWSPAN=5></TD>  <TD COLSPAN=3 HEIGHT=12></TD>  <TD ROWSPAN=5></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR><TD></TD>  <TD><FONT FACE='Times New Roman, Times, Serif' SIZE=3>We assume the threshold for the neurons to be .01. 
This means that if the sum of the inputs is greater than .01,</FONT><FONT FACE='Times New Roman, Times, Serif' SIZE=3 COLOR=#FFFF00><!-- continue --></FONT></TD><TD></TD></TR><TR>  <TD COLSPAN=3></TD></TR><TR>  <TD COLSPAN=3 HEIGHT=1></TD></TR></TABLE><A NAME='{30B}'/></FORM></P></TD></TR></TABLE><P><FONT SIZE=0 COLOR=WHITE></CENTER><A NAME="bottom">&nbsp;</A><!-- netLibrary.com Copyright Notice -->  </BODY></HTML>
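The training procedure described on this page (an error signal fed back from the output node, weights adjusted incrementally from a random start until the network learns a pattern such as XOR) can be sketched in plain Python. This is a minimal sketch, not the book's implementation: the 2-2-1 node layout follows the text and Figure 3-3, but the sigmoid activation, learning rate, and epoch count are assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR truth table: the output is 1 exactly when the two inputs are opposite
DATA = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

# 2 input nodes -> 2 hidden nodes -> 1 output node; training starts from
# a random set of weights, as in the text (last entry of each row is a bias)
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, o

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in DATA)

initial_err = total_error()

LR = 0.5  # learning rate: an assumption, not given in the text
for _ in range(20000):
    for x, target in DATA:
        h, o = forward(x)
        # the error signal at the output node...
        delta_o = (target - o) * o * (1 - o)
        # ...is fed back through the network, altering weights as it goes
        delta_h = [delta_o * w_o[i] * h[i] * (1 - h[i]) for i in range(2)]
        for i in range(2):
            w_o[i] += LR * delta_o * h[i]
        w_o[2] += LR * delta_o
        for i in range(2):
            for j in range(2):
                w_h[i][j] += LR * delta_h[i] * x[j]
            w_h[i][2] += LR * delta_h[i]

final_err = total_error()
print("error before training:", round(initial_err, 3))
print("error after training: ", round(final_err, 3))
```

On most random initializations the error shrinks toward zero; back-propagation can occasionally settle in a local minimum, which is why practitioners retry a stuck network with a different random start.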
