[Figure 12 graphic not recoverable; it depicts the evaluation space containing bit strings A and B.]

Figure 12: The competing conventions problem (Schaffer, Whitley and Eshelman 1992). Bit strings A and B map to structurally and computationally equivalent networks that assign the hidden units in different orders. Because the bit strings are distinct, crossover is likely to produce an offspring that contains multiple copies of the same hidden node, yielding a network with less computational ability than either parent.
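The failure mode described in the caption can be demonstrated with a toy sketch (a hypothetical encoding, not Schaffer et al.'s actual representation): each genotype lists its hidden units in some order, and one-point crossover between two orderings of the same network duplicates some units while dropping others.

```python
# Sketch of the competing conventions problem (hypothetical encoding):
# a genotype is a list of hidden-unit labels; the two parents encode the
# SAME network but order the hidden units differently.
parent_a = ["h1", "h2", "h3"]
parent_b = ["h3", "h2", "h1"]  # same units, different convention

def one_point_crossover(a, b, point):
    """Swap tails at `point`, as a bit-string GA would."""
    return a[:point] + b[point:]

child = one_point_crossover(parent_a, parent_b, 2)
print(child)                         # ['h1', 'h2', 'h1']
print(len(set(child)) < len(child))  # True: a hidden unit is duplicated
```

The child carries two copies of h1 and no copy of h3, so it can compute less than either parent even though both parents are functionally identical.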
Finally, when the architecture is variable, deception can occur when the parents differ topologically. The types of distributed representations that might develop in a network vary widely with the number of hidden units and the network's connectivity. Thus, the distributed representations of topologically distinct networks have a greater chance of being incompatible parents. This further reduces the likelihood that crossover will produce good offspring.

In short, for crossover to be a viable operator when evolving networks, the interpretation function must somehow compensate for all three types of deceptiveness described above. This suggests that the complexity of an appropriate interpretation function should more than rival the complexity of the original learning problem. Thus, the prospect of evolving connectionist networks with crossover goes against current theory associated with both genetic algorithms and connectionist networks. Better results should be expected with reproduction operators that respect the uniqueness and potential incompatibility of the developing distributed representations.

4.3.2  Network Induction with Evolutionary Programming

Unlike genetic algorithms, evolutionary programming defines representation-dependent operators that create offspring within a specific behavioral locus of the parent (see Figure 13).
Because EP models evolution at the level of the behavior of unmixing species, recombination of distinct population members is not considered. EP's commitment to mutation as the sole reproductive operator for searching over a space is preferable when there is no sufficient calculus to guide recombination by crossover, or when separating the search and evaluation spaces does not afford an advantage.

Some previous EP systems have addressed the problem of evolving connectionist networks. Fogel, Fogel and Porto (1990) investigate training feedforward networks on some classic connectionist problems. McDonnell and Waagen (1992) use EP to evolve the connectivity of feedforward networks with a constant number of hidden units by evolving both a weight matrix and a connectivity matrix. Fogel (1992a) and Fogel (1993a) use EP to induce three-layer fully-connected feedforward networks with a variable number of hidden units that employ good strategies for playing Tic-Tac-Toe. Saravanan (1992) uses an EP algorithm to evolve the weights of designed architectures.
In most of the above studies, the mutation operator alters the parameters of network η by the function:

    w = w + N(0, α ε(η)),   ∀ w ∈ η                                (EQ 10)

where w is a weight, ε(η) is the error of the network on the task, α is a user-defined proportionality constant, and N(μ, σ) is a gaussian variable with mean μ and standard deviation σ.

The implementations of structural mutations in the studies that make structural mutations differ somewhat. McDonnell and Waagen (1992) randomly select a set of weights and alter their values with a probability based on the variance of the incident nodes' activation over the training set; connections from nodes with a high variance have less of a chance of being altered. The structural mutation used in Fogel (1992a; 1992b) adds or deletes a single hidden unit with equal probability.
As in connectionist and some genetic) 351.25 151.69 P0.39 (algorithm approaches to inducing network architectures, both of these EP studies employ) 108 133.69 P(weak heuristics of modi\336cation that limit the accessibility of certain architectures.) 108 115.69 T108 90 540 702 C108 407.82 540 702 C215.04 608.61 215.04 588.59 233.04 578.58 233.04 608.61 4 Y5 X0 KV1 H2 Z0 XN3 14 Q(Structur) 132.04 675.95 T(e) 177.39 675.95 T(space) 141.04 662.61 T7 X270 360 15.75 21.45 208.29 582.87 G0.5 H2 X270 360 15.75 21.45 208.29 582.87 A7 X180 270 25.45 21.45 208.99 582.87 G183.74 570.68 183.54 582.87 190.17 572.64 186.6 572.83 4 Y2 XV208 270 25.45 21.45 208.99 582.87 A188.04 603.47 188.04 582.87 170.04 589.74 170.04 603.47 179.04 617.19 5 Y12 XV0 XN2 X90 450 58.5 55.77 206.04 591.93 A261.06 620.7 255.21 622.07 259.32 626.45 3 L0 Z0 XN255.46 622.14 312.22 639.23 2 L2 ZN(Locus of) 317.04 660.89 T(mutation) 322.04 645.64 T2 X90 450 45 42.9 464.04 595.74 A422.9 630.55 426.53 625.78 420.58 625.02 3 L0 Z0 XN394.22 639.23 426.31 625.87 2 L2 ZN3 X90 450 9 8.58 467.04 604.32 G0 X90 450 9 8.58 467.04 604.32 A7 X90 180 22.5 12.87 458.04 591.45 G2 X90 180 22.5 12.87 458.04 591.45 A7 X180 270 22.5 13.44 458.04 592.02 G446.75 583.73 458.04 578.58 445.74 576.97 447.48 580.17 4 Y2 XV180 242 22.5 13.44 458.04 592.02 A3 X90 450 9 4.29 467.04 578.58 G0 X90 450 9 4.29 467.04 578.58 A(Mutation) 328.74 537.4 T(operation) 334.74 524.05 T429.59 582.05 435.53 582.88 433.28 577.32 3 L0 ZN396.22 552.47 435.34 582.72 2 L2 ZN224.1 561.82 219.54 565.72 225.19 567.72 3 L0 ZN318.22 547.7 219.79 565.67 2 L2 ZN124.04 510.2 524.54 695.85 RN120.99 449.14 527.59 488.88 R7 XV4 12 Q0 X0.71 (Figure 13:) 120.99 480.88 P3 F0.71 (The evolutionary pr) 178.38 480.88 P0.71 (ogramming appr) 274.64 480.88 P0.71 (oach to modeling evolution. Unlike) 355.86 480.88 P0.77 (genetic algorithm, evolutionary pr) 120.99 468.88 P0.77 (ograms perform sear) 288.1 468.88 P0.77 (ch in the space of networks.) 
390.48 468.88 P(Offspring cr) 120.99 456.88 T(eated by mutation r) 179.51 456.88 T(emain within a locus of similarity to their par) 272.68 456.88 T(ents.) 490.8 456.88 T108 90 540 702 C0 0 612 792 C202.88 339.69 412.65 359.82 C3 12 Q0 X0 K(w) 221.75 347.42 T(w) 248.33 347.42 T(N) 268.91 347.42 T0 F(0) 284.72 347.42 T2 F(a) 296.72 347.42 T(e) 304.99 347.42 T(h) 314.25 347.42 T(\050) 310.26 347.42 T(\051) 321.48 347.42 T(,) 290.72 347.42 T(\050) 279.62 347.42 T(\051) 326.08 347.42 T(+) 259.33 347.42 T3 F(w) 365.33 347.42 T2 F(h) 387.88 347.42 T(\316) 376.33 347.42 T(") 356.78 347.42 T(=) 235.75 347.42 T0 0 612 792 CFMENDPAGE%%EndPage: "74" 11%%Page: "75" 11612 792 0 FMBEGINPAGE108 63 540 702 R7 X0 KV108 711 540 720 RV0 12 Q0 X(75) 528.01 712 T108 90 540 702 R7 XV0 X-0.13 (Evolutionary programming provides distinct advantages over genetic algorithms when) 126 694 P0.14 (evolving networks. First, EP manipulates the network representation directly) 108 676 P0.14 (, thus obviat-) 476.76 676 P0.69 (ing the need for a dual representation and the associated interpretation function. Second,) 108 658 P1.14 (by avoiding crossover between networks in creating of) 108 640 P1.14 (fspring, the individuality of each) 378.55 640 P2.37 (network\325) 108 622 P2.37 (s distributed representation is respected. For these reasons, evolutionary pro-) 150.64 622 P0.97 (gramming provides a more appropriate framework for simultaneous structural and para-) 108 604 P2.69 (metric learning in recurrent networks. But rather than supply simplistic heuristics to) 108 586 P0.11 (signify when to manipulate the architecture, the interaction with the task environment can) 108 568 P1.45 (fully specify an appropriate architecture. The GNARL algorithm, presented in the next) 108 550 P(section and investigated in the remainder of this chapter) 108 532 T(, describes one such approach.) 
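The two kinds of EP mutation discussed in this section can be sketched in Python (a minimal sketch; `weights`, `error`, `alpha`, and the function names are illustrative, not taken from the original studies). Parametric mutation implements EQ 10, perturbing every weight by a gaussian whose standard deviation scales with the network's error; structural mutation adds or deletes a single hidden unit with equal probability, as in Fogel (1992a).

```python
import random

def parametric_mutation(weights, error, alpha):
    """EQ 10: w = w + N(0, alpha * error(network)) for every weight w.
    Networks with larger error receive larger perturbations."""
    sigma = alpha * error  # standard deviation, per the text's N(mu, sigma)
    return [w + random.gauss(0.0, sigma) for w in weights]

def structural_mutation(num_hidden, h_max):
    """Fogel (1992a): add or delete a single hidden unit with equal
    probability, clamped to the legal range [0, h_max]."""
    delta = random.choice([-1, 1])
    return min(max(num_hidden + delta, 0), h_max)
```

Note that when the error is zero the parametric mutation leaves the weights untouched, so well-performing networks are perturbed only slightly.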
4.4  The GNARL Algorithm

GNARL, which stands for GeNeralized Acquisition of Recurrent Links, is an evolutionary algorithm that non-monotonically constructs recurrent connectionist networks to solve a given task. The name GNARL reflects the types of networks that arise from a generalized network induction algorithm performing both structural and parametric learning. Instead of having uniform or symmetric topologies, the resulting networks have "gnarled" interconnections of hidden units which may accurately reflect the constraints inherent in the task.

The general architecture of a GNARL network is straightforward. The input and output nodes are considered to be provided by the task and are immutable by the algorithm; thus each network for a given task always has m_in input nodes and m_out output nodes. The number of hidden nodes varies from 0 to a user-supplied maximum h_max. Bias nodes are optional; if provided in an experiment, they are implemented as an additional input node with constant value one. All non-input nodes employ the standard sigmoid activation function. Links use real-valued weights, and must obey three restrictions:

R1: There can be no links to an input node.
R2: There can be no links from an output node.

R3: Given two nodes x and y, there is at most one link from x to y.

Thus GNARL networks may have no connections, sparse connections, or full connectivity. Consequently, GNARL's search space is all networks,
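Restrictions R1–R3 amount to a simple validity test on candidate links; a minimal sketch follows (the node-set and function names are illustrative, not GNARL's actual implementation):

```python
def link_allowed(src, dst, inputs, outputs, links):
    """Check restrictions R1-R3 for a proposed link src -> dst."""
    if dst in inputs:        # R1: no links *to* an input node
        return False
    if src in outputs:       # R2: no links *from* an output node
        return False
    if (src, dst) in links:  # R3: at most one link from src to dst
        return False
    return True
```

Because hidden-to-hidden links in either direction, including self-links, pass the test, recurrent topologies remain reachable under these restrictions.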
