<HTML>
<HEAD>
<META name=vsisbn content="0849398010">
<META name=vstitle content="Industrial Applications of Genetic Algorithms">
<META name=vsauthor content="Charles Karr; L. Michael Freeman">
<META name=vsimprint content="CRC Press">
<META name=vspublisher content="CRC Press LLC">
<META name=vspubdate content="12/01/98">
<META name=vscategory content="Web and Software Development: Artificial Intelligence: Other">
<TITLE>Industrial Applications of Genetic Algorithms:Tuning Bama Optimized Recurrent Neural Networks Using Genetic Algorithms</TITLE>
<!-- HEADER -->
<STYLE type="text/css">
<!--
A:hover {
color : Red;
}
-->
</STYLE>
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
<!--ISBN=0849398010//-->
<!--TITLE=Industrial Applications of Genetic Algorithms//-->
<!--AUTHOR=Charles Karr//-->
<!--AUTHOR=L. Michael Freeman//-->
<!--PUBLISHER=CRC Press LLC//-->
<!--IMPRINT=CRC Press//-->
<!--CHAPTER=11//-->
<!--PAGES=215-217//-->
<!--UNASSIGNED1//-->
<!--UNASSIGNED2//-->
</HEAD>
<BODY>
<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="../ch10/211-214.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="217-221.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>
<P><BR></P>
<H2><A NAME="Heading1"></A><FONT COLOR="#000077">Chapter 11<BR>Tuning Bama Optimized Recurrent Neural Networks Using Genetic Algorithms
</FONT></H2>
<P><I>K. Nishita</I></P>
<P>Graduate Student<BR>Department of Aerospace Engineering and Mechanics<BR>University of Alabama<BR>Box 870280<BR>Tuscaloosa, AL 35487-0280<BR>e-mail: knishita@eng.ua.edu</P>
<P><FONT SIZE="+1"><B>ABSTRACT</B></FONT></P>
<P>Genetic Algorithms (GAs) have demonstrated highly efficient and robust optimization capabilities in various fields. This chapter discusses the use of a simple GA to enhance the performance of a Bama Optimized Recurrent Neural Network (BORN). To measure the performance of a BORN combined with a GA, two studies are considered. First, the results produced by the GA-optimized BORN are compared to results produced using the BORN algorithm alone. Second, a GA is used to alter the connectivities in a BORN adaptive unit to enhance the performance of the BORN algorithm. The advantage of this method is that a GA can improve the performance of the BORN by changing only the NN connections. In other words, since previously trained BORN weights are not changed, the BORN retains its memory. Results show that a GA enhanced the BORN&rsquo;s adaptive capabilities with a high degree of accuracy.
</P>
<P><FONT SIZE="+1"><B>INTRODUCTION</B></FONT></P>
<P><FONT SIZE="+1"><B><I>Background of GA Application in NN</I></B></FONT></P>
<P>Genetic algorithms (GAs) have demonstrated highly efficient and robust optimization capabilities in various fields. In recent years, many researchers have sought to apply GAs in the area of neural networks (NNs). NNs have been successfully adapted to a number of practical engineering problems, such as controllers and classifiers. NNs have great potential for representing arbitrary surfaces and can be easily implemented. Various schemes for combining GAs and NNs have been proposed and tested to optimize NN performance in recent years. Montana and Davis [1] reported the first successful optimization of a relatively large NN (500 connections) by a GA. However, the GA used by Montana and Davis differed in several ways from the more traditional GA described by Goldberg [2]. Belew [3] used a GA to select initial NN weights and then trained the NN using a gradient descent method. His results showed that the combined GA and NN method performed better than instances in which only a GA or only NN gradient descent was used. Anderson [4] applied a GA to NN reinforcement learning problems in which desired outputs are not available for training purposes. He used his system to successfully control an inverted pendulum. Anderson also compared these results with his adaptive critic controller and found that the GA approach to NN reinforcement learning was competitive with the NN training method.
</P>
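<P>As a minimal illustration (not the authors' code) of Belew's scheme described above, the sketch below uses a GA to select promising initial weights for a toy linear model and then refines them by gradient descent. The model, data, and GA parameters are illustrative assumptions.</P>

```python
import random

random.seed(0)

# Toy regression target: y = 2*x - 1
DATA = [(x / 10.0, 2 * (x / 10.0) - 1) for x in range(-10, 11)]

def loss(w, b):
    return sum((w * x + b - y) ** 2 for x, y in DATA) / len(DATA)

def ga_select_initial(pop_size=20, gens=15):
    """Evolve (w, b) pairs; return the fittest as initial weights."""
    pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: loss(*ind))
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)  # arithmetic crossover
            if random.random() < 0.2:                            # mutation
                child = (child[0] + random.gauss(0, 0.3),
                         child[1] + random.gauss(0, 0.3))
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda ind: loss(*ind))

def gradient_descent(w, b, lr=0.1, steps=200):
    """Refine the GA-selected starting point by batch gradient descent."""
    for _ in range(steps):
        gw = sum(2 * (w * x + b - y) * x for x, y in DATA) / len(DATA)
        gb = sum(2 * (w * x + b - y) for x, y in DATA) / len(DATA)
        w, b = w - lr * gw, b - lr * gb
    return w, b

w0, b0 = ga_select_initial()
w, b = gradient_descent(w0, b0)
print(round(w, 2), round(b, 2))
```

<P>In this hybrid arrangement the GA performs a coarse global search, while gradient descent handles the fine local convergence it is good at.</P>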
<P>Recently, considerable attention has been focused on recurrent neural networks. The effectiveness of recurrent neural networks for the identification and control of nonlinear dynamic systems has been demonstrated by Werbos [6] and Narendra [7] et al. Wieland [5] also used a GA to optimize a recurrent NN for a two-pole jointed inverted pendulum. He used a simple GA [2] to optimize a recurrent NN controller that successfully controlled this difficult, highly nonlinear problem. For nonlinear system controllers such as an inverted pendulum, the effectiveness of the control system must be gauged not only by the speed with which the goal is achieved, but also by its stability and robustness. KrishnaKumar et al. [8,9,10] demonstrated recurrent neural network stability and robustness by using a method of network connection optimization, the Bama Optimized Recurrent Neural Network. The Bama Optimized Recurrent Neural Network (BORN) is a back-propagation neural network (BPNN) with the capability of network connectivity optimization. This optimization algorithm was first introduced by KrishnaKumar [8] for static neural networks and adapted for recurrent neural networks in [9]. BORN has demonstrated high robustness and generalization capability in various system control areas [10]. However, a disadvantage of this algorithm is that once the BORN algorithm has been performed, connectivities cannot be altered without re-initializing its weights and connections. This chapter explores the control capabilities of the BORN algorithm and considers a mechanism to address this disadvantage by using a GA.</P>
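<P>A minimal sketch, under illustrative assumptions, of the connectivity idea above: the pretrained weights stay fixed (so memory is retained) and a GA searches only over a binary connection mask, with the effective strength of connection <I>i</I> being <CODE>weights[i] * mask[i]</CODE>. The weight values, target, and GA parameters here are toy assumptions, not the BORN implementation.</P>

```python
import random

random.seed(1)

WEIGHTS = [0.9, -2.0, 1.1, 0.5, -0.3, 2.2]   # pretrained, never changed
TARGET = 2.5                                  # desired summed output (toy fitness)

def fitness(mask):
    out = sum(w * m for w, m in zip(WEIGHTS, mask))
    return -abs(out - TARGET)                 # higher is better

def evolve_mask(pop_size=30, gens=40, p_mut=0.1):
    """Evolve binary connection masks; weights themselves are untouched."""
    n = len(WEIGHTS)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)      # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve_mask()
print(best, sum(w * m for w, m in zip(WEIGHTS, best)))
```

<P>Because only the mask evolves, the search never disturbs the trained weight values, which mirrors the memory-retention property claimed for the GA-BORN combination.</P>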
<P><FONT SIZE="+1"><B><I>Study Objective</I></B></FONT></P>
<P>The objective of the work presented in this chapter is to employ a GA in a BORN algorithm to alter the NN connections after the BORN algorithm has executed. To fulfill this objective, the following studies will be undertaken:
</P>
<DL>
<DD><B>&bull;</B> Perform BPNN weight selection and connectivity optimization using a GA and compare the results with classic BORN algorithm results.
<DD><B>&bull;</B> Once the BORN algorithm has executed, the network connectivities cannot be adapted in BORN. To enhance a BORN, a GA will be used as a BORN connectivity-adaptive unit for further optimal weight and connection selection. The GA adaptive unit will be executed only when the BORN detects a failure to achieve its goals due to sudden changes in input patterns.
</DL>
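<P>A hedged sketch of the trigger logic described in the second bullet: the GA adaptive unit runs only when the network's error signals a failure. The function names and the failure threshold are illustrative assumptions, not the chapter's implementation.</P>

```python
FAIL_THRESHOLD = 0.5          # assumed tolerance on tracking error

def run_with_adaptive_unit(errors, reoptimize):
    """Scan a stream of per-step errors; call the GA unit on each failure."""
    invocations = 0
    for e in errors:
        if abs(e) > FAIL_THRESHOLD:   # BORN fails to achieve its goal
            reoptimize()              # GA adapts connectivity; weights are kept
            invocations += 1
    return invocations

# Usage: normal operation, then a sudden change in input patterns at step 4.
calls = run_with_adaptive_unit([0.1, 0.05, 0.2, 0.9, 0.7], lambda: None)
print(calls)  # the unit fires twice, once for each failing step
```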
<P><FONT SIZE="+1"><B><I>Chapter Outline</I></B></FONT></P>
<P>The remainder of this chapter is organized into four sections. First, the BORN algorithm is introduced and the details of the mathematical model for a BORN are emphasized. Second, the details of BP recurrent neural network weight selection and connectivity optimization by a GA are presented, followed by the results of the GA optimization. Third, the concept of the GA-BORN connectivity-adaptive unit is presented. Finally, the GA-BORN results are compared to those of a non-adaptive BORN.
</P>
<P><FONT SIZE="+1"><B>SIMPLE GA AND BORN ALGORITHM</B></FONT></P>
<P>This section introduces the simple GA and the BORN algorithm. For the sake of completeness, the conventional back-propagation recurrent neural network algorithm is also presented here.
</P><P><BR></P>
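<P>Before the formal presentation, a compact sketch of the "simple GA" of Goldberg [2]: fitness-proportionate (roulette) selection, one-point crossover, and bitwise mutation on binary strings. The problem and parameters are illustrative assumptions (maximize the number of 1 bits in a 20-bit string).</P>

```python
import random

random.seed(2)

N_BITS, POP, GENS = 20, 40, 60
P_CROSS, P_MUT = 0.7, 0.01

def fitness(s):
    return sum(s)                                   # "one-max" toy objective

def roulette(pop, fits):
    """Fitness-proportionate selection: pick with probability f_i / sum(f)."""
    r = random.uniform(0, sum(fits))
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    fits = [fitness(s) for s in pop]
    nxt = []
    while len(nxt) < POP:
        a, b = roulette(pop, fits)[:], roulette(pop, fits)[:]
        if random.random() < P_CROSS:               # one-point crossover
            cut = random.randrange(1, N_BITS)
            a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
        for s in (a, b):                            # bitwise mutation
            for i in range(N_BITS):
                if random.random() < P_MUT:
                    s[i] = 1 - s[i]
        nxt += [a, b]
    pop = nxt[:POP]

best = max(pop, key=fitness)
print(fitness(best))
```

<P>This three-operator loop (selection, crossover, mutation) is the skeleton on which the GA-BORN combinations in this chapter are built.</P>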
<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="../ch10/211-214.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="217-221.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>
<hr width="90%" size="1" noshade>
<div align="center">
<font face="Verdana,sans-serif" size="1">Copyright © <a href="/reference/crc00001.html">CRC Press LLC</a></font>
</div>
<!-- all of the reference materials (books) have the footer and subfoot reveresed -->
<!-- reference_subfoot = footer -->
<!-- reference_footer = subfoot -->
</BODY>
</HTML>
<!-- END FOOTER -->