
📄 r1996.html

📁 A powerful neural network analysis program
💻 HTML
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<HTML>
<HEAD>
<TITLE>Training algorithms</TITLE>
<link href="../style.css" rel="stylesheet" type="text/css">
<META NAME="GENERATOR" CONTENT="Modular DocBook HTML Stylesheet Version 1.79">
<LINK REL="HOME" TITLE="Fast Artificial Neural Network Library" HREF="index.html">
<LINK REL="UP" TITLE="Constants" HREF="x1994.html">
<LINK REL="PREVIOUS" TITLE="Constants" HREF="x1994.html">
<LINK REL="NEXT" TITLE="Activation Functions" HREF="r2030.html">
</HEAD>
<BODY CLASS="refentry" BGCOLOR="#FFFFFF" TEXT="#000000" LINK="#0000FF" VLINK="#840084" ALINK="#0000FF">
<DIV CLASS="NAVHEADER">
<TABLE SUMMARY="Header navigation table" WIDTH="100%" BORDER="0" CELLPADDING="0" CELLSPACING="0">
<TR><TH COLSPAN="3" ALIGN="center">Fast Artificial Neural Network Library</TH></TR>
<TR>
<TD WIDTH="10%" ALIGN="left" VALIGN="bottom"><A HREF="x1994.html" ACCESSKEY="P">Prev</A></TD>
<TD WIDTH="80%" ALIGN="center" VALIGN="bottom"></TD>
<TD WIDTH="10%" ALIGN="right" VALIGN="bottom"><A HREF="r2030.html" ACCESSKEY="N">Next</A></TD>
</TR>
</TABLE>
<HR ALIGN="LEFT" WIDTH="100%">
</DIV>
<H1><A NAME="api.sec.constants.training"></A>Training algorithms</H1>
<DIV CLASS="refnamediv">
<A NAME="AEN1997"></A>
<H2>Name</H2>
Training algorithms&nbsp;--&nbsp;Constants representing training algorithms.
</DIV>
<DIV CLASS="refsect1">
<A NAME="AEN2000"></A>
<H2>Description</H2>
<P>These constants represent the training algorithms available within the fann library. The list will grow over time, but will probably not shrink.</P>
<P>The training algorithm is chosen with the <A HREF="r972.html"><CODE CLASS="function">fann_set_training_algorithm</CODE></A> function. The default training algorithm is <CODE CLASS="constant">FANN_TRAIN_RPROP</CODE>.</P>
<DIV CLASS="variablelist">
<P><B>Constants</B></P>
<DL>
<DT>FANN_TRAIN_INCREMENTAL</DT>
<DD><P>Standard backpropagation algorithm, where the weights are updated after each training pattern. This means that the weights are updated many times during a single epoch. For this reason, some problems will train very fast with this algorithm, while other, more advanced problems will not train very well.</P></DD>
<DT>FANN_TRAIN_BATCH</DT>
<DD><P>Standard backpropagation algorithm, where the weights are updated after calculating the mean square error for the whole training set. This means that the weights are only updated once per epoch. For this reason, some problems will train more slowly with this algorithm. But since the mean square error is calculated more accurately than in incremental training, some problems will reach a better solution with this algorithm.</P></DD>
<DT>FANN_TRAIN_RPROP</DT>
<DD>
<P>A more advanced batch training algorithm which achieves good results for many problems. The RPROP training algorithm is adaptive and therefore does not use the learning_rate. Some other parameters can, however, be set to change the way the RPROP algorithm works, but this is only recommended for users with insight into how the RPROP training algorithm works.</P>
<P>The RPROP training algorithm is described in [<A HREF="b3048.html#bib.riedmiller_1993"><I>Riedmiller and Braun, 1993</I></A>], but the actual learning algorithm used here is the iRPROP- training algorithm [<A HREF="b3048.html#bib.igel_2000"><I>Igel and Hüsken, 2000</I></A>], which is a variant of the standard RPROP training algorithm.</P>
</DD>
<DT>FANN_TRAIN_QUICKPROP</DT>
<DD>
<P>A more advanced batch training algorithm which achieves good results for many problems. The quickprop training algorithm uses the learning_rate parameter along with other, more advanced parameters, but changing these advanced parameters is only recommended for users with insight into how the quickprop training algorithm works.</P>
<P>The quickprop training algorithm is described in [<A HREF="b3048.html#bib.fahlman_1988"><I>Fahlman, 1988</I></A>].</P>
</DD>
</DL>
</DIV>
</DIV>
<DIV CLASS="NAVFOOTER">
<HR ALIGN="LEFT" WIDTH="100%">
<TABLE SUMMARY="Footer navigation table" WIDTH="100%" BORDER="0" CELLPADDING="0" CELLSPACING="0">
<TR>
<TD WIDTH="33%" ALIGN="left" VALIGN="top"><A HREF="x1994.html" ACCESSKEY="P">Prev</A></TD>
<TD WIDTH="34%" ALIGN="center" VALIGN="top"><A HREF="index.html" ACCESSKEY="H">Home</A></TD>
<TD WIDTH="33%" ALIGN="right" VALIGN="top"><A HREF="r2030.html" ACCESSKEY="N">Next</A></TD>
</TR>
<TR>
<TD WIDTH="33%" ALIGN="left" VALIGN="top">Constants</TD>
<TD WIDTH="34%" ALIGN="center" VALIGN="top"><A HREF="x1994.html" ACCESSKEY="U">Up</A></TD>
<TD WIDTH="33%" ALIGN="right" VALIGN="top">Activation Functions</TD>
</TR>
</TABLE>
</DIV>
</BODY>
</HTML>
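
The constants documented above are passed to fann_set_training_algorithm, as mentioned in the description. The following is a minimal C sketch of how one might select a training algorithm before training; only fann_set_training_algorithm and the FANN_TRAIN_* constants come from this page, while fann_create_standard, fann_set_learning_rate, fann_train_on_file, fann_destroy, and the training file name "xor.data" are assumptions about the wider FANN C API and example data, not something this page documents.

/* Minimal sketch, assuming the standard FANN C API; "xor.data" is a
 * hypothetical training file in FANN's training-data format. */
#include <fann.h>

int main(void)
{
    /* 3-layer network: 2 inputs, 3 hidden neurons, 1 output. */
    struct fann *ann = fann_create_standard(3, 2, 3, 1);

    /* The default is FANN_TRAIN_RPROP; switch to quickprop, which
     * (unlike RPROP) uses the learning_rate parameter. */
    fann_set_training_algorithm(ann, FANN_TRAIN_QUICKPROP);
    fann_set_learning_rate(ann, 0.7f);

    /* Train for at most 1000 epochs, reporting every 100 epochs,
     * stopping early if the mean square error drops below 0.001. */
    fann_train_on_file(ann, "xor.data", 1000, 100, 0.001f);

    fann_destroy(ann);
    return 0;
}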
