<HTML>
<HEAD>
<META name=vsisbn content="0849398010">
<META name=vstitle content="Industrial Applications of Genetic Algorithms">
<META name=vsauthor content="Charles Karr; L. Michael Freeman">
<META name=vsimprint content="CRC Press">
<META name=vspublisher content="CRC Press LLC">
<META name=vspubdate content="12/01/98">
<META name=vscategory content="Web and Software Development: Artificial Intelligence: Other">




<TITLE>Industrial Applications of Genetic Algorithms: Space Shuttle Main Engine Condition Monitoring Using Genetic Algorithms and Radial Basis Function Neural Network</TITLE>

<!-- HEADER -->

<STYLE type="text/css"> 
 <!--
 A:hover  {
 	color : Red;
 }
 -->
</STYLE>

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

<!--ISBN=0849398010//-->
<!--TITLE=Industrial Applications of Genetic Algorithms//-->
<!--AUTHOR=Charles Karr//-->
<!--AUTHOR=L. Michael Freeman//-->
<!--PUBLISHER=CRC Press LLC//-->
<!--IMPRINT=CRC Press//-->
<!--CHAPTER=10//-->
<!--PAGES=197-198//-->
<!--UNASSIGNED1//-->
<!--UNASSIGNED2//-->

</HEAD>
<BODY>

<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="../ch09/192-196.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="199-201.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>
<P><BR></P>
<H2><A NAME="Heading1"></A><FONT COLOR="#000077">Chapter 10<BR>Space Shuttle Main Engine Condition Monitoring Using Genetic Algorithms and Radial Basis Function Neural Network
</FONT></H2>
<P><I>Daniel A. Benzing</I></P>
<P>Department of Aerospace Engineering and Mechanics<BR>The University of Alabama, Tuscaloosa, Alabama</P>
<P><FONT SIZE="+1"><B>ABSTRACT</B></FONT></P>
<P>The Radial Basis Function Neural Network (RBFNN) has been utilized, with varying degrees of success, as a function approximator in many engineering applications. The principal discriminant that ultimately determines an RBFNN&rsquo;s viability for a particular case is its ability to construct higher order surfaces that can sufficiently perform the desired functional mapping. This can be difficult if the underlying relationship between the input and output variables is extremely nonlinear. The general architecture of the RBFNN offers many parameters for its construction. While numerous methods for the optimization of these parameters have been documented, most have attempted to establish only bits and pieces of the RBF structure, leaving some of the parameters to pure random choice because of the immense computational cost incurred by optimizing all parts. The genetic algorithm (GA) supplies an attractive means for accomplishing such a global parameter optimization. The purpose of this chapter is to establish a GA procedure that allows all the neurons of the RBF architecture to evolve simultaneously for the creation of higher order response surfaces that are accurate predictors and have adequate generalization ability. The hybrid architecture thus developed will be applied to a neural health monitoring system for the Space Shuttle Main Engine (SSME). In this case, the GA/RBFNN component will be required to quantify the metallic species content of the SSME plume, using the plume&rsquo;s electromagnetic emission. The underlying function that maps a plume&rsquo;s spectral signature to its metallic state is very nonlinear, and it will provide an excellent measure of the GA/RBFNN&rsquo;s ability.
</P>
<P><FONT SIZE="+1"><B>INTRODUCTION</B></FONT></P>
<P>There are a multitude of problems in physics and engineering which involve the classic statistics problem of estimating a function from a set of input-output pairings. This task has typically been managed by traditional nonparametric methods; that is, there is no prior knowledge of the true function being approximated, and the free parameters in the model have no physical meaning connected with the problem at hand. There are, however, neural techniques incorporating supervised learning schemes that are capable of effectively learning extremely nonlinear mappings. Supervised learning means that the function is learned from a set of training patterns in conjunction with feedback from a teacher or critic. These training patterns are a sequence of input (independent variable) to output (dependent variable) pairings:
</P>
<P ALIGN="CENTER"><IMG SRC="images/10-01d.jpg"></P>
<P>For a given input, the teacher or critic rewards or penalizes the approximator based on a comparison of the predicted outputs with the known correct outputs. Two established neural methodologies have met with success in this arena, namely: 1) traditional backpropagation and 2) RBFNN classifiers. In general, both can be considered nonparametric models. Traditional backpropagation networks have multiple layers of activation functions and use costly nonlinear gradient search methods for free-parameter optimization. Conversely, RBFNNs have a single layer accompanied by a training methodology involving only basic linear algebraic concepts. This removes the burden of lengthy calculations and provides an attractive, computationally inexpensive means of function approximation.
</P>
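<P>To make that single-layer training concrete, the following is a minimal sketch of least-squares RBFNN training, assuming Gaussian basis functions with fixed, pre-chosen centers and widths; the helper names and the Gaussian choice are illustrative assumptions, not taken from the chapter:</P>
<PRE>
# Minimal sketch: RBFNN output weights found by linear least squares.
# Assumes Gaussian basis functions; centers and widths are given, not
# optimized here.  Names are illustrative, not from the chapter.
import numpy as np

def rbf_design_matrix(X, centers, widths):
    """Phi[i, j] = exp(-||X[i] - centers[j]||^2 / (2 * widths[j]^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def train_rbfnn(X, y, centers, widths):
    # The whole "training" step is one linear solve -- no gradient search.
    Phi = rbf_design_matrix(X, centers, widths)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centers, widths, w):
    return rbf_design_matrix(X, centers, widths) @ w
</PRE>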
<P>As shown in [1], there are times when the performance of an RBFNN can be significantly affected by perturbations of its internal parameters, with their establishment ultimately determining an application&rsquo;s success. For the RBFNN, the basic internal parameters are the neuron activation functions, neuron placement in the functional space, and the extent of each neuron&rsquo;s coverage. The optimization of all these parameters at once incurs prohibitive computational costs, and the job is typically reduced to only local optimizations of a few parameters. The overall task of global parameter optimization, however, is suitable for a GA application.</P>
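<P>As a hedged illustration of the three parameter families just listed, one candidate per-neuron encoding might look as follows; the layout is an assumption for exposition, not the chromosome used later in the chapter:</P>
<PRE>
# Illustrative gene for one RBF neuron; the three fields mirror the
# three parameter families named above.  This layout is an assumption
# for exposition, not the chapter's actual encoding.
from dataclasses import dataclass
import numpy as np

@dataclass
class RBFNeuronGene:
    center: np.ndarray   # neuron placement in the functional space
    width: float         # extent of the neuron's coverage
    func_id: int         # which radial response function this neuron uses,
                         # e.g., 0 = Gaussian, 1 = inverse multiquadric
</PRE>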
<P>The focus of the discussion herein will be the extension of the seminal work submitted by Whitehead and Choate [2]. That paper establishes a cooperative-competitive GA that allows an RBFNN to evolve both its neurons&rsquo; positions and widths. The procedure was tested on the benchmark Mackey-Glass differential equation [3], with the results showing the evolution of RBFNNs that had a 50-70% lower prediction error than those obtained using traditional k-means clustering [4]. The efforts of this chapter will involve not only the optimization of a neuron&rsquo;s position and width, but also the freedom for each neuron to take on a different form of radial response function. Moreover, the chosen radial function will be given more freedom and control over its area of coverage. The optimization of all Radial Basis Function (RBF) neurons will proceed simultaneously throughout the genetic evolution.</P><P><BR></P>
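<P>The sketch below shows the general shape of such a simultaneous evolution, with every neuron&rsquo;s center and width mutating at once and fitness scored by prediction error. It is a plain elitist GA written for exposition, not Whitehead and Choate&rsquo;s cooperative-competitive scheme; extending the gene with a response-function index would follow the same pattern:</P>
<PRE>
# Bare-bones GA evolving all RBF centers and widths simultaneously.
# Assumptions: Gaussian basis functions, elitist truncation selection,
# Gaussian mutation only (no crossover).  Illustrative sketch, not the
# chapter's algorithm.
import numpy as np

rng = np.random.default_rng(0)

def fitness(net, X, y):
    # Fit output weights by least squares, score by negative RMS error.
    d2 = ((X[:, None, :] - net["centers"][None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2.0 * net["widths"] ** 2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return -np.sqrt(np.mean((Phi @ w - y) ** 2))

def evolve(X, y, pop_size=20, n_neurons=10, generations=50):
    dim = X.shape[1]
    lo, hi = X.min(axis=0), X.max(axis=0)
    pop = [{"centers": rng.uniform(lo, hi, size=(n_neurons, dim)),
            "widths": rng.uniform(0.1, 1.0, size=n_neurons)}
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda net: fitness(net, X, y), reverse=True)
        survivors = pop[: pop_size // 2]      # elitist truncation selection
        children = []
        for net in survivors:                 # mutate every neuron at once
            children.append({
                "centers": net["centers"]
                           + rng.normal(0, 0.05, (n_neurons, dim)),
                "widths": np.abs(net["widths"]
                                 + rng.normal(0, 0.05, n_neurons)),
            })
        pop = survivors + children
    return pop[0]
</PRE>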
<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="../ch09/192-196.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="199-201.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>

<hr width="90%" size="1" noshade>
<div align="center">
<font face="Verdana,sans-serif" size="1">Copyright &copy; <a href="/reference/crc00001.html">CRC Press LLC</a></font>
</div>
<!-- all of the reference materials (books) have the footer and subfoot reversed -->
<!-- reference_subfoot = footer -->
<!-- reference_footer = subfoot -->

</BODY>
</HTML>

<!-- END FOOTER -->
