<HTML>
<HEAD>
<META name=vsisbn content="0849398010">
<META name=vstitle content="Industrial Applications of Genetic Algorithms">
<META name=vsauthor content="Charles Karr; L. Michael Freeman">
<META name=vsimprint content="CRC Press">
<META name=vspublisher content="CRC Press LLC">
<META name=vspubdate content="12/01/98">
<META name=vscategory content="Web and Software Development: Artificial Intelligence: Other">




<TITLE>Industrial Applications of Genetic Algorithms:What Can I Do with a Learning Classifier System?</TITLE>

<!-- HEADER -->

<STYLE type="text/css"> 
 <!--
 A:hover  {
 	color : Red;
 }
 -->
</STYLE>

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

<!--ISBN=0849398010//-->
<!--TITLE=Industrial Applications of Genetic Algorithms//-->
<!--AUTHOR=Charles Karr//-->
<!--AUTHOR=L. Michael Freeman//-->
<!--PUBLISHER=CRC Press LLC//-->
<!--IMPRINT=CRC Press//-->
<!--CHAPTER=16//-->
<!--PAGES=310-316//-->
<!--UNASSIGNED1//-->
<!--UNASSIGNED2//-->

<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="308-310.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="316-320.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>
<P><BR></P>
<P><FONT SIZE="+1"><B><I>Discussion</I></B></FONT></P>
<P>The performance history (Figure 16.4) shows that an ANN with genetically evolved input connectivity can classify the correct values for the input parameters. Furthermore, the results indicate that differences in the sharing function contribute to the overall final network.
</P>
<P>It was pointed out earlier that sharing drives speciation by degrading fitness. Degrading fitness based on the composition of the population led to an interesting difference in speciation. Varying levels of speciation under implicit sharing are possible by including or excluding terms in the sharing equation. When fitness is shared directly by the number of copies of a given species, uniqueness becomes a deciding factor in the size of the population. Figures 16.5 and 16.6 depict the evolutionary history of a training run on the 6 multiplexor with sharing by both copy count and firing behavior (see Equation 16.3). Using the copy count of a species generates a larger ending population. Lower population sizes (lower levels of speciation) were observed when sharing used only the first term of Equation 16.3, i.e., sharing by firing behavior only (see Figures 16.7 and 16.8).</P>
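<P>A minimal sketch of the two-term sharing idea described above, assuming a shared fitness of the form <I>f</I> / (niche size); the <CODE>behavior</CODE> and <CODE>genotype</CODE> fields are hypothetical, and this is an illustration of the principle, not a reproduction of Equation 16.3:</P>

```python
from collections import Counter

def shared_fitness(individuals, use_copy_count=True):
    """Degrade each individual's raw fitness by its niche size (a sketch).

    First term: individuals with the same firing behavior (phenotype).
    Optional second term: exact genotype copies, which penalizes
    duplicates further and drives stronger speciation.
    """
    behavior_counts = Counter(ind["behavior"] for ind in individuals)
    copy_counts = Counter(ind["genotype"] for ind in individuals)
    shared = []
    for ind in individuals:
        niche = behavior_counts[ind["behavior"]]
        if use_copy_count:
            niche += copy_counts[ind["genotype"]]
        shared.append(ind["fitness"] / niche)
    return shared
```

<P>With <CODE>use_copy_count=True</CODE>, duplicated genotypes are penalized on both terms, so a unique individual retains proportionally more fitness than its duplicated neighbors.</P>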
<P>Another point about the results involves the output activation function. The ANN consisted of a hidden layer of <I>bipolar sigmoidal</I> activation functions<SUP><SMALL><B>2</B></SMALL></SUP> and a linearly weighted (sum) output (linear activation function). Additionally, training of the ANN involved only the hidden-to-output weights. These two constraints, a linear weighted sum and one layer of modifiable weights, yield an approximator that can update its weights only via gradient descent. All nonlinear effects emerged from the GA-evolved input-to-hidden connectivity. The LCS paradigm uses GA-evolved rule conditions, much like the above ANN. This similarity leads to the aforementioned analogy of an LCS as a type of ANN (Smith &amp; Cribbs, 1994).</P>

<BLOCKQUOTE>
<HR>
<SUP><SMALL><B>2</B></SMALL></SUP><FONT SIZE="-1">Bipolar activation denotes binary output centered around zero. The active state is positive one (+1) and the rest, or &#147;off,&#148; state is negative one (-1). Sigmoidal activation indicates continuous activation with saturation occurring quickly at one of the two extremes (-1 or +1).
</FONT>
<HR>
</BLOCKQUOTE>
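<P>The architecture just described can be sketched as follows; a minimal illustration only, assuming a tanh-shaped bipolar sigmoid (per footnote 2), a fixed GA-evolved connectivity mask, and the learning rate &#951; = 0.02 from Table 16.1. The function names and array shapes are hypothetical:</P>

```python
import numpy as np

def bipolar_sigmoid(x):
    # Continuous activation saturating at -1 ("off") and +1 (active);
    # tanh has exactly this shape (see footnote 2).
    return np.tanh(x)

def forward(x, input_mask, w_in, w_out):
    """Hidden layer sees only the GA-evolved input connections (input_mask);
    the output is a plain linear weighted sum of hidden activations."""
    h = bipolar_sigmoid((w_in * input_mask) @ x)
    return h, w_out @ h

def train_step(x, target, input_mask, w_in, w_out, eta=0.02):
    """Gradient descent on the hidden-to-output weights only.
    With a linear output and one modifiable layer, the delta rule
    applies directly; w_in (and the mask) are never touched."""
    h, y = forward(x, input_mask, w_in, w_out)
    w_out += eta * (target - y) * h
    return w_out
```

<P>All of the nonlinearity lives in the masked hidden layer; the trainable part of the model is purely linear in <CODE>w_out</CODE>, which is what makes the simple delta rule sufficient.</P>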

<P>Differences between the LCS and the ANN lie in the interpretation of rule (hidden-layer neuron) firing. The ANN simply takes a <I>weighted sum</I> of all the active neurons, while the LCS sums all active rules advocating the same action. The summing effect is similar in both cases, but the difference becomes apparent when considering <I>negative-valued weights</I>. Negative weights in an ANN act as analog complements, i.e., they oppose an action. Opposition in the LCS takes the form of competing rules advocating different actions. The cumulative effect is that the LCS relies on the GA to &#147;fill in the blanks,&#148; i.e., to discover rules covering conditions where the existing rule set does not perform in the desired fashion. The ANN takes another approach, allowing the gradient descent procedure to change the interpretation of the neuron activations.</P>
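<P>The contrast drawn above can be made concrete with a small sketch; the rule structure and the values used are hypothetical:</P>

```python
def ann_decision(activations, weights):
    """ANN: one weighted sum over active neurons; a negative weight
    opposes the outcome directly (an analog complement)."""
    return sum(a * w for a, w in zip(activations, weights))

def lcs_decision(matched_rules):
    """LCS: active rules advocating the same action pool their strength;
    opposition comes only from rival rules backing a different action."""
    totals = {}
    for rule in matched_rules:
        totals[rule["action"]] = totals.get(rule["action"], 0.0) + rule["strength"]
    return max(totals, key=totals.get)
```

<P>In the ANN a single negatively weighted neuron can pull the sum down; in the LCS, opposition has to come from a competing action whose advocates collectively outvote the first.</P>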
<P><A NAME="Fig4"></A><A HREF="javascript:displayWindow('images/16-04.jpg',500,408)"><IMG SRC="images/16-04t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/16-04.jpg',500,408)"><FONT COLOR="#000077"><B>Figure 16.4</B></FONT></A>&nbsp;&nbsp;Performance (learning) history for the 6 multiplexor problem using an ANN with GA-based input partitioning. Note that this result uses the fitness and sharing scheme defined by Equations 2 and 3.</P>
<TABLE WIDTH="100%" BORDER>
<CAPTION ALIGN=LEFT><B>Table 16.1</B> Training parameters used to train the ANN for the 6 multiplexor problem.</CAPTION>
<TR>
<TH WIDTH="60%" ALIGN="LEFT">Parameter
<TH WIDTH="40%" ALIGN="LEFT">Value
<TR>
<TD>Learning Rate, &#951;
<TD>0.02
<TR>
<TD>Exponential Smoothing Constant<SUP><SMALL><B>3</B></SMALL></SUP>, &#966;<TD>0.10
<TR>
<TD>&#147;Digital&#148; Error Threshold
<TD>0.05
<TR>
<TD>GA Trigger Frequency
<TD>every 675 epochs
<TR>
<TD>Selection Block Size
<TD>2 individuals
<TR>
<TD>Probability of Recombination, p<SUB><SMALL>c</SMALL></SUB>
<TD>0.90
<TR>
<TD>Probability of Mutation, p<SUB><SMALL>m</SMALL></SUB>
<TD>0.01
<TR>
<TD>Probability of Hash Mutation, p<SUB><SMALL>h</SMALL></SUB>
<TD>0.10
</TABLE>

<BLOCKQUOTE>
<HR>
<SUP><SMALL><B>3</B></SMALL></SUP><FONT SIZE="-1">Exponential smoothing is a continuous averaging method, defined as
<P ALIGN="CENTER"><IMG SRC="images/16-04d.jpg"></P>
<BR>where &#147;s hat&#148; is the current measurement and S is the running average.
</FONT>
<HR>
</BLOCKQUOTE>
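<P>The running average in footnote 3 can be written out explicitly. This assumes the standard exponential-smoothing form <I>S</I> &#8592; &#966;&#349; + (1 &#8722; &#966;)<I>S</I> with &#966; = 0.10 from Table 16.1; the scanned equation image may differ in detail:</P>

```python
PHI = 0.10  # exponential smoothing constant, phi, from Table 16.1

def smooth(S, s_hat, phi=PHI):
    """One exponential-smoothing step: blend the current measurement
    s_hat into the running average S (assumed standard form)."""
    return phi * s_hat + (1.0 - phi) * S
```

<P>Because each step keeps a (1 &#8722; &#966;) fraction of the old average, old measurements decay geometrically rather than being dropped from a fixed window.</P>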

<P>The ANN&#146;s gradient descent mechanism can change the output weights of all active neurons; hence, it continually tunes the existing neurons&#146; weights. This continual tuning could turn an advocate neuron into an opponent if environmental cues force a weight change. The results presented are not quite as good as those of Cribbs (1995), largely because of the linear output used in this study. Cribbs&#146;s (1995) work used bipolar sigmoid activation functions throughout the ANN (including the output node). The added nonlinear activation at the output confers greater classification ability than the linear output activation used here.
</P>
<P><A NAME="Fig5"></A><A HREF="javascript:displayWindow('images/16-05.jpg',500,395)"><IMG SRC="images/16-05t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/16-05.jpg',500,395)"><FONT COLOR="#000077"><B>Figure 16.5</B></FONT></A>&nbsp;&nbsp;Generational history of population (hidden layer) size. Note this result reflects sharing by both firing behavior (phenotype) and by species count (genotype).</P>
<P><A NAME="Fig6"></A><A HREF="javascript:displayWindow('images/16-06.jpg',500,404)"><IMG SRC="images/16-06t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/16-06.jpg',500,404)"><FONT COLOR="#000077"><B>Figure 16.6</B></FONT></A>&nbsp;&nbsp;Generational history of connectivity between the input lines (layer) and the hidden layer (by percentage, 100% indicates full connectivity).</P>
<P><A NAME="Fig7"></A><A HREF="javascript:displayWindow('images/16-07.jpg',500,407)"><IMG SRC="images/16-07t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/16-07.jpg',500,407)"><FONT COLOR="#000077"><B>Figure 16.7</B></FONT></A>&nbsp;&nbsp;Generational history of population (hidden layer) size. Note this history depicts evolution with sharing by phenotype only, i.e., no sharing by species (genotype).</P>
<P><A NAME="Fig8"></A><A HREF="javascript:displayWindow('images/16-08.jpg',500,410)"><IMG SRC="images/16-08t.jpg"></A>
<BR><A HREF="javascript:displayWindow('images/16-08.jpg',500,410)"><FONT COLOR="#000077"><B>Figure 16.8</B></FONT></A>&nbsp;&nbsp;Generational history of connectivity between the input lines and the hidden layer (by percentage). Note this result emerged from sharing by firing behavior only.</P>
<P><FONT SIZE="+1"><B><I>Conclusion</I></B></FONT></P>
<P>This chapter presents a discussion of the major components of the LCS paradigm (including references to help the reader discover more about the LCS), a review of an analogy of LCS principles in a widely known machine learning technique, i.e., ANNs, and an example where GA-induced nonlinearity via input partitions contributed to the solution of a linearly inseparable problem.<SUP><SMALL><B>4</B></SMALL></SUP></P>

<BLOCKQUOTE>
<HR>
<SUP><SMALL><B>4</B></SMALL></SUP><FONT SIZE="-1">As previously pointed out, the use of a linear activation function on the output shows that the nonlinearity derived from input partitioning is what solves the problem, confirming Wilson&#146;s (1990) result.
</FONT>
<HR>
</BLOCKQUOTE>

<P>The LCS represents an interesting facet of machine learning. Genetics-based machine learning techniques take advantage of evolutionary search as their main search mechanism. Other, more traditional computational techniques are prevalent in machine learning with the GA-based avenues less explored. Finally, the presentation of <I>behavior-based sharing</I> shows a method of speciation compatible with ANNs.</P><P><BR></P>
<CENTER>
<TABLE BORDER>
<TR>
<TD><A HREF="308-310.html">Previous</A></TD>
<TD><A HREF="../ewtoc.html">Table of Contents</A></TD>
<TD><A HREF="316-320.html">Next</A></TD>
</TR>
</TABLE>
</CENTER>

<hr width="90%" size="1" noshade>
<div align="center">
<font face="Verdana,sans-serif" size="1">Copyright &copy; <a href="/reference/crc00001.html">CRC Press LLC</a></font>
</div>
<!-- all of the reference materials (books) have the footer and subfoot reveresed -->
<!-- reference_subfoot = footer -->
<!-- reference_footer = subfoot -->

</BODY>
</HTML>

<!-- END FOOTER -->
