<HTML>
<HEAD>
<META name=vsisbn content="0849398010">
<META name=vstitle content="Industrial Applications of Genetic Algorithms">
<META name=vsauthor content="Charles Karr; L. Michael Freeman">
<META name=vsimprint content="CRC Press">
<META name=vspublisher content="CRC Press LLC">
<META name=vspubdate content="12/01/98">
<META name=vscategory content="Web and Software Development: Artificial Intelligence: Other">




<TITLE>Industrial Applications of Genetic Algorithms:Tuning Bama Optimized Recurrent Neural Networks Using Genetic Algorithms</TITLE>

<!-- HEADER -->

<STYLE type="text/css"> 
 <!--
 A:hover  {
 	color : Red;
 }
 -->
</STYLE>

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

<!--ISBN=0849398010//-->
<!--TITLE=Industrial Applications of Genetic Algorithms//-->
<!--AUTHOR=Charles Karr//-->
<!--AUTHOR=L. Michael Freeman//-->
<!--PUBLISHER=CRC Press LLC//-->
<!--IMPRINT=CRC Press//-->
<!--CHAPTER=11//-->
<!--PAGES=234-242//-->
<!--UNASSIGNED1//-->
<!--UNASSIGNED2//-->
</HEAD>
<BODY>

<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="231-234.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="../ch12/243-245.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>
<P><BR></P>
<P>The data available for model development consist of input-output pairs. The inputs are the variables shown entering the model in Figure 11.13, and the output is the D<SUB><SMALL>50</SMALL></SUB>. The data are segregated into two distinct groups: (1) a training set and (2) a test set. In the training set, the desired output data are provided to the system so that its parameters can be updated; the desired outputs are not given in the test set. The author of this chapter applied an NN with BORN to model this hydrocyclone separation problem. However, because the ranges of the system inputs and outputs differ between the training and test sets, the NN performance was found to be unacceptable. To adapt the NN previously optimized by BORN, the GA-BORN adaptive unit was used.</P>
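<P>The training/test segregation described above can be sketched as follows. This is an illustrative Python sketch, not code from the chapter; the data values, the 80/20 split fraction, and the function name are all assumptions made for the example.</P>

```python
import random

def split_dataset(pairs, train_fraction=0.8, seed=0):
    """Split input-output pairs into a training set and a test set.

    The desired outputs in the training set drive parameter updates;
    the outputs of the test set are withheld and used only to score
    the model's predictions.
    """
    rng = random.Random(seed)
    shuffled = list(pairs)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical data: 8 input variables per sample, one output (the D50).
data = [([float(i)] * 8, 0.5 * i) for i in range(100)]
train_set, test_set = split_dataset(data)
```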
<P>The NN to be optimized by BORN has eight inputs, ten hidden neurons, and one output. This NN is trained with a learning rate of 0.01 for 75 epochs, after which the BORN algorithm is executed for 25 epochs. The NN optimized by the GA has the same structure as the one optimized using BORN. To optimize the NN connections with the GA, the GA parameters are set to a population size of 30, a string length of 19, a crossover probability of 0.8, and a mutation probability of 0.01, run for 100 generations.</P>
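<P>Using the parameter values quoted above (population size 30, string length 19, crossover probability 0.8, mutation probability 0.01, 100 generations), a simple binary GA of the kind described might look like the following Python sketch. The fitness function here is a stand-in: in the chapter the string would encode NN connections and fitness would reflect the modeling error. Tournament selection is also an assumption; the chapter's simple GA may use a different selection scheme.</P>

```python
import random

rng = random.Random(42)

POP_SIZE, STRING_LEN = 30, 19        # values reported in the text
P_CROSS, P_MUT, GENERATIONS = 0.8, 0.01, 100

def fitness(bits):
    # Stand-in objective (count of 1-bits); in the chapter this would
    # be derived from the NN modeling error for the decoded string.
    return sum(bits)

def select(pop):
    # Binary tournament selection (an assumption for this sketch).
    a, b = rng.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover applied with probability P_CROSS.
    if rng.random() < P_CROSS:
        cut = rng.randrange(1, STRING_LEN)
        return p1[:cut] + p2[cut:]
    return p1[:]

def mutate(bits):
    # Flip each bit independently with probability P_MUT.
    return [b ^ 1 if rng.random() < P_MUT else b for b in bits]

pop = [[rng.randrange(2) for _ in range(STRING_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
```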
<P>Figure 11.14 shows the training result for BORN. Despite the high nonlinearity of the hydrocyclone model, the training performance of BORN was fairly good. Figures 11.15 to 11.19 show BORN test results before and after the GA adaptive unit is executed. These plots indicate that the BORN algorithm was improved by the addition of a GA adaptive unit, though the improvement was not large. Figures 11.20 and 11.21 show the results of an NN in which the number of hidden neurons is increased to 50, and Figures 11.22 and 11.23 show the results with 100 hidden neurons. The performance improved dramatically because the GA has more connections to select from when adjusting the NN connectivities.</P>
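<P>The "connectivities" the GA adjusts can be pictured as a binary mask over a fixed set of trained weights: each bit of a GA string enables or disables one connection, so a larger hidden layer gives the GA more connections to choose from. The sketch below is illustrative Python (using NumPy); the 8-10-1 shape matches the text, but the random weights and mask are stand-ins for values a BORN training pass would produce.</P>

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in pre-trained weights for the 8-input, 10-hidden, 1-output NN.
W1 = rng.normal(size=(10, 8))   # input -> hidden
W2 = rng.normal(size=(1, 10))   # hidden -> output

def forward(x, mask1, mask2):
    """Evaluate the NN with GA-chosen connectivity masks applied.

    The trained weights are untouched; the GA decides only which
    connections remain active, so adaptation changes connectivity
    rather than retraining the weights.
    """
    h = np.tanh((W1 * mask1) @ x)
    return (W2 * mask2) @ h

# One GA individual decoded into binary masks, one bit per connection.
mask1 = rng.integers(0, 2, size=W1.shape)
mask2 = rng.integers(0, 2, size=W2.shape)
y = forward(np.ones(8), mask1, mask2)
```

With 100 hidden neurons instead of 10, the masks cover 100 &times; 8 + 100 = 900 connections, which is why the larger networks of Figures 11.20 through 11.23 give the GA far more room to reduce the error.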
<P><A NAME="Fig14"></A><A HREF="javascript:displayWindow('images/11-14.jpg',500,472)"><IMG SRC="images/11-14t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-14.jpg',500,472)"><FONT COLOR="#000077"><B>Figure 11.14</B></FONT></A>&nbsp;&nbsp;BORN training result.</P>
<P><A NAME="Fig15"></A><A HREF="javascript:displayWindow('images/11-15.jpg',500,470)"><IMG SRC="images/11-15t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-15.jpg',500,470)"><FONT COLOR="#000077"><B>Figure 11.15</B></FONT></A>&nbsp;&nbsp;BORN test result.</P>
<P><A NAME="Fig16"></A><A HREF="javascript:displayWindow('images/11-16.jpg',500,461)"><IMG SRC="images/11-16t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-16.jpg',500,461)"><FONT COLOR="#000077"><B>Figure 11.16</B></FONT></A>&nbsp;&nbsp;Before GA-NN.</P>
<P><A NAME="Fig17"></A><A HREF="javascript:displayWindow('images/11-17.jpg',500,467)"><IMG SRC="images/11-17t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-17.jpg',500,467)"><FONT COLOR="#000077"><B>Figure 11.17</B></FONT></A>&nbsp;&nbsp;After GA.</P>
<P><A NAME="Fig18"></A><A HREF="javascript:displayWindow('images/11-18.jpg',500,381)"><IMG SRC="images/11-18t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-18.jpg',500,381)"><FONT COLOR="#000077"><B>Figure 11.18</B></FONT></A>&nbsp;&nbsp;Before GA.</P>
<P><A NAME="Fig19"></A><A HREF="javascript:displayWindow('images/11-19.jpg',500,379)"><IMG SRC="images/11-19t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-19.jpg',500,379)"><FONT COLOR="#000077"><B>Figure 11.19</B></FONT></A>&nbsp;&nbsp;After GA.</P>
<P><A NAME="Fig20"></A><A HREF="javascript:displayWindow('images/11-20.jpg',500,516)"><IMG SRC="images/11-20t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-20.jpg',500,516)"><FONT COLOR="#000077"><B>Figure 11.20</B></FONT></A>&nbsp;&nbsp;GA-NN opt. (50 hidden).</P>
<P><A NAME="Fig21"></A><A HREF="javascript:displayWindow('images/11-21.jpg',500,507)"><IMG SRC="images/11-21t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-21.jpg',500,507)"><FONT COLOR="#000077"><B>Figure 11.21</B></FONT></A>&nbsp;&nbsp;GA-NN opt. (50 hidden).</P>
<P><A NAME="Fig22"></A><A HREF="javascript:displayWindow('images/11-22.jpg',500,456)"><IMG SRC="images/11-22t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-22.jpg',500,456)"><FONT COLOR="#000077"><B>Figure 11.22</B></FONT></A>&nbsp;&nbsp;GA-NN opt. (100 hidden).</P>
<P><A NAME="Fig23"></A><A HREF="javascript:displayWindow('images/11-23.jpg',500,466)"><IMG SRC="images/11-23t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/11-23.jpg',500,466)"><FONT COLOR="#000077"><B>Figure 11.23</B></FONT></A>&nbsp;&nbsp;GA-NN opt. (100 hidden).</P>
<P><FONT SIZE="+1"><B>CONCLUSION</B></FONT></P>
<P>GAs are highly efficient and robust optimization algorithms that have been used effectively in a variety of disciplines. In this chapter, a simple GA was applied to enhance the performance of the Bama Optimized Recurrent Neural Network (BORN) algorithm. To measure the performance of BORN with a GA, two different studies were considered.</P>
<P>First, the NN was optimized with a GA and the result was compared with that generated by the BORN algorithm alone. Although the GA-NN found a near-optimal solution, the BORN algorithm performed better in the training case.</P>
<P>Second, a GA was implemented as an adaptive unit for the BORN connectivities to improve the BORN algorithm. The GA improved BORN's performance on the hydrocyclone modeling problem, although it required a fairly large NN structure to minimize the error. The advantage of this method is that the GA can improve the performance of BORN by changing only the NN connections.</P>
<P><FONT SIZE="+1"><B>REFERENCES</B></FONT></P>
<DL>
<DD><B>1</B>&nbsp;&nbsp;Montana, D. J. and Davis, L. (1989) <I>Training feedforward neural networks using genetic algorithms</I>, Proceedings of the 11<SUP><SMALL>th</SMALL></SUP> International Joint Conference on Artificial Intelligence, pp. 762-767, San Mateo, CA.
<DD><B>2</B>&nbsp;&nbsp;Goldberg, D. E., (1989) <I>Genetic algorithms in search, optimization, and machine learning</I>. Reading, MA: Addison Wesley.
<DD><B>3</B>&nbsp;&nbsp;Belew, R. K., McInerney, J., and Schraudolph, N. N. (1990) <I>Evolving networks: Using genetic algorithms with connectionist learning</I>. CSE technical report CS90-174, La Jolla, CA: University of California at San Diego, June.
<DD><B>4</B>&nbsp;&nbsp;Anderson, C. W. (1989) <I>Learning to control an inverted pendulum using neural networks</I>, IEEE Control Systems Magazine, 9, 31-37.
<DD><B>5</B>&nbsp;&nbsp;Wieland, A. P. (1990) <I>Evolving neural network controllers for unstable systems</I>, IEEE International Joint Conference on Neural Networks, pp. II-667 - II-672, Seattle, WA.
<DD><B>6</B>&nbsp;&nbsp;Werbos, P. J., (1992) <I>Neural Networks, System Identification, and Control in the Chemical Process Industries</I>, Handbook of Intelligent Control, Ed. White and Sofge, Van Nostrand Reinhold, New York, NY.
<DD><B>7</B>&nbsp;&nbsp;Narendra, K. S., (1992) <I>Adaptive Control of Dynamical Systems Using Neural Networks</I>, Handbook of Intelligent Control, Ed. White and Sofge, Van Nostrand Reinhold, New York, NY.
<DD><B>8</B>&nbsp;&nbsp;KrishnaKumar, K., (1993) <I>Optimization of the neural net connectivity pattern using a back-propagation algorithm</I>, Neurocomputing 5, pp. 273-286.
<DD><B>9</B>&nbsp;&nbsp;KrishnaKumar, K. and Nishita, K., <I>Robustness of Recurrent Neural Networks</I>, WCNN'96, San Diego, CA, June 1996.
<DD><B>10</B>&nbsp;&nbsp;KrishnaKumar, K. and Nishita, K., (1995) <I>BORN &#151; Bama Optimized Recurrent Neural Networks</I>, WCNN'95, San Diego, CA.
<DD><B>11</B>&nbsp;&nbsp;Wills, B. A., (1979) <I>Mineral Processing Technology</I>, Toronto: Pergamon Press.
<DD><B>12</B>&nbsp;&nbsp;Plitt, L.R., (1976) <I>A mathematical model of the hydrocyclone classifier</I>, CIM Bulletin, 69, pp. 114-123.
</DL>
<P><BR></P>
<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="231-234.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="../ch12/243-245.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>

<hr width="90%" size="1" noshade>
<div align="center">
<font face="Verdana,sans-serif" size="1">Copyright &copy; <a href="/reference/crc00001.html">CRC Press LLC</a></font>
</div>

</BODY>
</HTML>

<!-- END FOOTER -->
