<HTML>
<HEAD>
<META name=vsisbn content="0849398010">
<META name=vstitle content="Industrial Applications of Genetic Algorithms">
<META name=vsauthor content="Charles Karr; L. Michael Freeman">
<META name=vsimprint content="CRC Press">
<META name=vspublisher content="CRC Press LLC">
<META name=vspubdate content="12/01/98">
<META name=vscategory content="Web and Software Development: Artificial Intelligence: Other">
<TITLE>Industrial Applications of Genetic Algorithms: Space Shuttle Main Engine Condition Monitoring Using Genetic Algorithms and Radial Basis Function Neural Network</TITLE>
<!-- HEADER -->
<STYLE type="text/css">
<!--
A:hover {
color : Red;
}
-->
</STYLE>
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
<!--ISBN=0849398010//-->
<!--TITLE=Industrial Applications of Genetic Algorithms//-->
<!--AUTHOR=Charles Karr//-->
<!--AUTHOR=L. Michael Freeman//-->
<!--PUBLISHER=CRC Press LLC//-->
<!--IMPRINT=CRC Press//-->
<!--CHAPTER=10//-->
<!--PAGES=204-206//-->
<!--UNASSIGNED1//-->
<!--UNASSIGNED2//-->
</HEAD>
<BODY>
<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="202-204.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="206-208.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>
<P><BR></P>
<P><FONT SIZE="+1"><B><I>Genetic Approaches</I></B></FONT></P>
<P>The first major incursion of genetic methods into neural network optimization occurred in the traditional backpropagation realm. The standard backprop algorithm can incur long training times and, subsequently, poor generalization. To counter these effects, GA methods were developed that encoded individual layers of weights and optimized the network by de-linking (pruning) those weights considered redundant. One such method was used successfully in [1].
</P>
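<P>As a concrete illustration of the pruning idea (a minimal sketch, not the actual method of [1]), the following Python fragment treats each GA individual as a binary mask over a layer's trained weights, with a 0 bit de-linking the corresponding weight. The stand-in linear layer, the fitness definition, and all constants are assumptions made for the example.</P>
<PRE>
import numpy as np

# Hedged sketch of GA weight pruning: each individual is a binary mask over
# a layer's trained weights; a 0 bit de-links (prunes) that weight.  A real
# application would score masks on a trained backpropagation network; a
# single linear layer stands in here so the example is self-contained.
rng = np.random.default_rng(0)
n_in, n_out, n_pop = 8, 3, 30
W = rng.normal(size=(n_in, n_out))            # stand-in "trained" weights
X = rng.normal(size=(200, n_in))
y = X @ (W * (rng.random(W.shape) > 0.3))     # target uses a sparse subnetwork

def fitness(mask):
    yhat = X @ (W * mask.reshape(W.shape))
    mse = np.mean((yhat - y) ** 2)
    return -mse - 0.01 * mask.mean()          # accuracy plus a mild sparsity bonus

pop = rng.integers(0, 2, size=(n_pop, W.size))
for gen in range(100):
    fit = np.array([fitness(m) for m in pop])
    keep = pop[np.argsort(fit)[::-1][: n_pop // 2]]        # truncation selection
    cuts = rng.integers(1, W.size, size=n_pop - len(keep))
    kids = np.array([np.concatenate((keep[i % len(keep)][:c],
                                     keep[(i + 1) % len(keep)][c:]))
                     for i, c in enumerate(cuts)])          # one-point crossover
    kids ^= (0.01 > rng.random(kids.shape)).astype(int)     # bit-flip mutation
    pop = np.vstack([keep, kids])

best = pop[np.argmax([fitness(m) for m in pop])]
print("active weights:", int(best.sum()), "of", W.size)
</PRE>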
<P>Application of the GA to the RBFNN has followed two paths: (1) evolution by search among whole networks, and (2) evolution of one neuron at a time. In methods of type (1), an entire network is a single individual in the population; the GA then selects, among the various network structures (different centers, weights, and widths), the best one. Methods of type (2) maintain a set of competing RBF units for a single hidden-unit addition (much like cascade correlation); successive runs of the GA then evolve the RBFNN one hidden unit at a time.</P>
<P>None of the aforementioned procedures provides the flexibility necessary to create higher-order surfaces; doing so requires the simultaneous optimization of all RBF units within a single neural network. Whitehead and Choate contributed a seminal paper that attempted exactly this [2]. Their cooperative-competitive algorithm uses a population of strings to represent one RBFNN, not a population of RBFNNs: a single bit string encodes the center and width of one RBF. The algorithm evolves RBFs that compete when doing the same job (in terms of the functional mapping) and implicitly share fitness (niching) when covering different parts of the overall functional region. In other words, if two RBFs try to cover the same area of the function space they compete, but if they supply individual contributions to the overall job of approximating the function they form niches. The algorithm as established mandated a fixed number of hidden-layer neurons and a single pre-chosen kernel type for all neurons. Subsequent testing revealed that the GA approach performed 50-70% better than the best k-means approach.</P>
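<P>The following sketch illustrates the cooperative-competitive principle in simplified form; the Gaussian kernel, the overlap measure, and the fitness definition are stand-ins, not the exact formulation of [2]. Each unit's raw fitness measures how well its activations align with the target, and dividing by a niche count (the unit's total activation overlap with the population) makes duplicated units compete while complementary units keep their fitness.</P>
<PRE>
import numpy as np

# Hedged sketch of cooperative-competitive fitness sharing: the population
# is a set of RBF units, not a set of networks.  Units with overlapping
# activation patterns split their fitness (compete); units covering
# different regions form niches and keep theirs.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(100, 2))        # training inputs
target = np.sin(3 * X[:, 0]) + X[:, 1] ** 2     # function to approximate

centers = rng.uniform(0.0, 1.0, size=(20, 2))   # one RBF per individual
widths = rng.uniform(0.05, 0.3, size=20)

# Activation of every RBF on every training point (Gaussian kernel).
d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
act = np.exp(-d2 / widths ** 2)                 # shape (100, 20)
act_n = act / np.linalg.norm(act, axis=0)       # normalize each unit's column

raw = np.abs(act_n.T @ target)                  # raw fitness: alignment with target
overlap = act_n.T @ act_n                       # pairwise activation overlap
shared = raw / overlap.sum(axis=1)              # divide by the niche count

print("raw top unit   :", int(np.argmax(raw)))
print("shared top unit:", int(np.argmax(shared)))
</PRE>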
<P>The work of Whitehead and Choate [2] is significant in that it optimizes a single RBF network by evolving all of its RBF units at once. As suggested earlier, anything that introduces more flexibility into RBF construction can potentially improve performance (as [2] demonstrated). The current work extends that result by augmenting the approach of [2]: the kernel type, the width, and the center of each RBF neuron are all allowed to vary during evolution. Each neuron's parameters are encoded as a binary bit string, and the collection of these strings comprises a single population.</P>
<P><FONT SIZE="+1"><B>GENETIC ENCODING</B></FONT></P>
<P>The center vector <IMG SRC="images/10-13i.jpg"> is defined on <I>R</I><SUP><SMALL>n</SMALL></SUP>, where <I>n</I> is the dimension of the input vector <I>x</I><SUB><SMALL>i</SMALL></SUB>. In [2], the inputs were scaled to occupy the unit hypercube [0,1]<SUP><SMALL>n</SMALL></SUP>, forcing the kernel centers (μ<SUB><SMALL>i</SMALL></SUB>) to fall within the same unit hypercube; encoding that arrangement required translating only the fractional component along each dimension of the hypercube. The encoding scheme presented here places no such restriction on center scaling: center components may be floating-point numbers outside the range [0,1].</P>
<P>The overall objective is to evolve a set of basis function parameters for a layer of neurons. The parameters are the neuron’s center, width, and basis function type. Assume a binary alphabet {0,1} in which the integer part of a center component occupies log<SUB><SMALL>2</SMALL></SUB>(max_comp_in_pop) bits (max_comp_in_pop being the largest center component in the population, with the logarithm rounded up) and the fractional part occupies <I>l</I> bits; then [log<SUB><SMALL>2</SMALL></SUB>(max_comp_in_pop) + <I>l</I>]·n bits completely encode a single center. The width can be encoded with (<I>l</I>+<I>k</I>) bits of precision as well. Equations (10.1)-(10.3) give the three basis functions a neuron can employ; thus, encoding the function type requires only 2 bits (the redundant fourth bit pattern maps to the Gaussian). In summary, the parameters specifying a neuron are encoded in the following string form:</P>
<DL>
<DD>φ<SUB><SMALL>i</SMALL></SUB> = (center)(width)(function type)
</DL>
<P>The bit length can be written as:
</P>
<DL>
<DD>|φ<SUB><SMALL>i</SMALL></SUB>| = {[log<SUB><SMALL>2</SMALL></SUB>(max_comp_in_pop) + <I>l</I>]·n} + [<I>l</I>+<I>k</I>] + [2]
</DL>
<P>Therefore, a single generation will consist of a string population that completely encodes a single RBFNN architecture. <I>Successive generations will evolve the network as opposed to evolving a population of networks</I>.</P><P><BR></P>
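<P>A short worked example of the encoding follows. The constants (max_comp_in_pop = 8, <I>l</I> = 4, <I>k</I> = 3, n = 2) and all helper names are hypothetical; the point is that the resulting string length matches the expression above: [log<SUB><SMALL>2</SMALL></SUB>(8) + 4]·2 + [4 + 3] + [2] = 23 bits.</P>
<PRE>
import math

# Hedged worked example of the string form phi_i = (center)(width)(type).
# Assumed constants: max_comp_in_pop = 8, so ceil(log2(8)) = 3 bits cover
# the integer part of a center component; l = 4 fractional bits; the width
# takes l + k = 7 bits (its integer/fraction split is an assumption); 2 bits
# select among the kernels of Eqs. (10.1)-(10.3), with the redundant fourth
# bit pattern mapped to the Gaussian.
MAX_COMP, L_BITS, K_BITS, N_DIM = 8, 4, 3, 2
INT_BITS = math.ceil(math.log2(MAX_COMP))      # bits for a center's integer part

def encode_fixed(value, int_bits, frac_bits):
    """Fixed-point binary encoding of a non-negative float."""
    q = round(value * 2 ** frac_bits)           # quantize to frac_bits precision
    return format(q, "0%db" % (int_bits + frac_bits))

def encode_neuron(center, width, ftype):
    center_bits = "".join(encode_fixed(c, INT_BITS, L_BITS) for c in center)
    width_bits = encode_fixed(width, K_BITS, L_BITS)  # l + k bits in total
    type_bits = format(ftype, "02b")                  # 00, 01, 10 pick a kernel;
    return center_bits + width_bits + type_bits       # 11 falls back to Gaussian

s = encode_neuron(center=[2.75, 5.5], width=1.25, ftype=0)
print(s, "length:", len(s))   # (3 + 4)*2 + (4 + 3) + 2 = 23 bits
</PRE>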
<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="202-204.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="206-208.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>
<hr width="90%" size="1" noshade>
<div align="center">
<font face="Verdana,sans-serif" size="1">Copyright © <a href="/reference/crc00001.html">CRC Press LLC</a></font>
</div>
</BODY>
</HTML>
<!-- END FOOTER -->