<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd"><HTML><HEAD><TITLE>Training an ANN</TITLE><link href="../style.css" rel="stylesheet" type="text/css"><META NAME="GENERATOR" CONTENT="Modular DocBook HTML Stylesheet Version 1.79"><LINK REL="HOME" TITLE="Fast Artificial Neural Network Library" HREF="index.html"><LINK REL="UP" TITLE="Neural Network Theory" HREF="c225.html"><LINK REL="PREVIOUS" TITLE="Artificial Neural Networks" HREF="x241.html"><LINK REL="NEXT" TITLE="API Reference" HREF="c253.html"></HEAD><BODY CLASS="section" BGCOLOR="#FFFFFF" TEXT="#000000" LINK="#0000FF" VLINK="#840084" ALINK="#0000FF"><DIV CLASS="NAVHEADER"><TABLE SUMMARY="Header navigation table" WIDTH="100%" BORDER="0" CELLPADDING="0" CELLSPACING="0"><TR><TH COLSPAN="3" ALIGN="center">Fast Artificial Neural Network Library</TH></TR><TR><TD WIDTH="10%" ALIGN="left" VALIGN="bottom"><A HREF="x241.html" ACCESSKEY="P">Prev</A></TD><TD WIDTH="80%" ALIGN="center" VALIGN="bottom">Chapter 4. Neural Network Theory</TD><TD WIDTH="10%" ALIGN="right" VALIGN="bottom"><A HREF="c253.html" ACCESSKEY="N">Next</A></TD></TR></TABLE><HR ALIGN="LEFT" WIDTH="100%"></DIV><DIV CLASS="section"><H1 CLASS="section"><A NAME="theory.training">4.3. Training an ANN</A></H1><P>When training an ANN with a set of input and output data, we wish to adjust the weights in the ANN so that it produces the same outputs as those in the training data. At the same time, we do not want the ANN to become too specific, giving precise results for the training data but incorrect results for all other data. When this happens, we say that the ANN has been over-fitted.</P><P>The training process can be seen as an optimization problem, where we wish to minimize the mean square error over the entire set of training data.
This problem can be solved in many different ways, ranging from standard optimization heuristics such as simulated annealing, through more specialized optimization techniques such as genetic algorithms, to dedicated gradient descent algorithms such as backpropagation.</P><P>The most widely used algorithm is backpropagation, but it has limitations concerning the size of the adjustment made to the weights in each iteration. This problem has been addressed by more advanced algorithms such as RPROP [<A HREF="b3048.html#bib.riedmiller_1993"><I>Riedmiller and Braun, 1993</I></A>] and quickprop [<A HREF="b3048.html#bib.fahlman_1988"><I>Fahlman, 1988</I></A>].</P></DIV><DIV CLASS="NAVFOOTER"><HR ALIGN="LEFT" WIDTH="100%"><TABLE SUMMARY="Footer navigation table" WIDTH="100%" BORDER="0" CELLPADDING="0" CELLSPACING="0"><TR><TD WIDTH="33%" ALIGN="left" VALIGN="top"><A HREF="x241.html" ACCESSKEY="P">Prev</A></TD><TD WIDTH="34%" ALIGN="center" VALIGN="top"><A HREF="index.html" ACCESSKEY="H">Home</A></TD><TD WIDTH="33%" ALIGN="right" VALIGN="top"><A HREF="c253.html" ACCESSKEY="N">Next</A></TD></TR><TR><TD WIDTH="33%" ALIGN="left" VALIGN="top">Artificial Neural Networks</TD><TD WIDTH="34%" ALIGN="center" VALIGN="top"><A HREF="c225.html" ACCESSKEY="U">Up</A></TD><TD WIDTH="33%" ALIGN="right" VALIGN="top">API Reference</TD></TR></TABLE></DIV></BODY></HTML>
