gendemo.m
% This script demonstrates the use of the simple genetic algorithm
% routine genetic.m
%
% Remark: How genetic might be improved
% Remark:
% Remark: More options for selection, crossover, and mutation schemes,
% Remark: e.g. better scaling (no super-individual dominance),
% Remark: no self-mating,
% Remark: reproduction that handles non-positive fitness
% Remark:
% Remark: Optimize code for speed
% Remark:
% Remark: Better visualization
% Remark:
% Remark: Support continuous mutation
% Remark:
% Remark: Provide the ability to alter (enhance) each child via
% Remark: local hill climbing
% Remark:
echo on

% So far, I've attempted three problems:
% 1.) maximizing the function f(x) = x^10, 0 < x < 1
% 2.) finding the minimum of the banana function (re: bandemo.m)
% 3.) finding the highest point of a modification of peaks

% First consider the function
%    f = x^10,  0 < x < 1
%
% This example is taken from Goldberg, Chapter 3.
% To find the maximum of this function within the given range,
% you may call genetic in one of three ways:
%
% 1.) with the expression fun = 'x^10'
% 2.) calling an M-file fun.m (which does not exist here) in which the
%     fitness function is not encoded.  The function would say simply
%         function f = fun(x)
%         f = x^10;
% 3.) calling the M-file testgen (which does exist) in which the
%     fitness function has already been encoded

% First invoke genetic using an expression for the argument fun
options = foptions([1 1e-3]);
options(13) = 0.03;
options(14) = 50;
vlb = 0;
vub = 1;
bits = 30;

pause % Hit any key to continue

[x,stats,options,bf,fgen,lgen] = genetic('x^10',[],options,vlb,vub,bits);

% A few notes:
% First notice that x is returned as a real between 0 and 1
x

pause % Hit any key to continue

% Also the initial and final generations, fgen and lgen, are
% returned as reals between 0 and 1
fgen

pause % Hit any key to continue

% The total number of times the fitness function was evaluated is
% equal to ((number of generations) + 1)*size_pop
% (re: the initial population also requires size_pop function evaluations).
%
% In the present case, the fitness function was called
num_fit_call = (options(10)+1)*options(11)

pause % Hit any key to continue

% Now invoke the genetic algorithm using the encoded M-file testgen.m.
% Notice that since the function has already been encoded as a binary
% string, there is no need to supply a bits argument.
vlb = zeros(1,30);
vub = ones(1,30);
[x1,st1,options,bf1,fg1,lg1] = genetic('testgen',[],options,vlb,vub);

% Now notice that the answer (x1) and the initial and final
% populations (fg1 and lg1) are returned as binary strings
x1
size(fg1)

pause % Hit any key to continue

% These may be decoded using decode.m
x1d  = decode(x1,0,1,size(vlb,2))
fg1d = decode(fg1,0,1,size(vlb,2))

pause % Hit any key to continue

% Now consider the banana function explored in bandemo.m.
% This example will probably (GAs are probabilistic) show
% that although the simple genetic algorithm is good at finding
% the region in which a global minimum (maximum) might exist,
% it does not possess the convergence properties of calculus-based
% techniques once it is near the minimum (maximum).
%
% First recall that the banana function equation is
%    f(x) = 100*(x(2)-x(1)^2)^2 + (1-x(1))^2
% and that the goal is to find a minimum of the function
% in the range -2 < x(1) < 2 and -1 < x(2) < 3.
%
% First, a couple of details.  Since genetic will maximize a function,
% to do a minimization, set FUN = -f.  Also, since genetic.m
% (presently - this may change) expects the fitness function to
% always evaluate positive, a constant must be added.
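% ------------------------------------------------------------------------
% [Editor's sketch - not part of the original demo.]  The transformation
% described above is simply fitness = C - f: maximizing C - f is equivalent
% to minimizing f, and a sufficiently large constant C keeps the fitness
% values positive.  The two lines below only illustrate the idea at one
% sample point; the names f_sample and fitness_sample are introduced here
% for illustration only, and C = 1000 is the offset the demo itself uses.
f_sample       = 100*(0.5 - (-0.5)^2)^2 + (1 - (-0.5))^2;  % banana function at x = (-0.5, 0.5)
fitness_sample = 1000 - f_sample                           % what genetic.m would maximize there
% ------------------------------------------------------------------------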
% Thus, for the banana function, take
f = '1000 - (100*(x(2)-x(1)^2)^2 + (1-x(1))^2)'

% Recall too the contour plot of the banana function.
%
% << Please be patient while the contour plot is generated. >>
%
figure
xx = [-2:0.125:2];
yy = [-1:0.125:3];
[x,y] = meshgrid(xx,yy);
meshd = 100.*(y-x.*x).^2 + (1-x).^2;
conts = exp(3:20);
contour(xx,yy,meshd,conts)
hold on
xlabel('x1')
ylabel('x2')
title('Minimization of the Banana function')
plot(1,1,'o')
text(1.1,1,'Solution')

% We encode x(1) and x(2) to have 16 bits each
bits = [16 16];
vlb = [-2 -1];
vub = [2 3];
options = foptions([1 -1]);
options(13) = 0.0333;

pause % Hit any key to continue

[x,stats,options,bf,fgen,lgen] = genetic(f,[],options,vlb,vub,bits);

% What you probably just saw was that the genetic algorithm
% quickly (within a couple of generations) settled into a region
% near the minimum of the function.
%
% The present best fitness is at
x

% Now consider that the genetic algorithm has created 30 generations,
% each of population size 30, thus calling for 31*30 = 930
% function evaluations.  Compare this to the results of the
% calculus-based techniques found in bandemo.

pause % Hit any key to continue

% So, if GAs aren't good at converging, what CAN they do?
% Well, they are supposed to be better able to determine a
% region where a global maximum might exist for functions
% which might otherwise mislead a more classical (calculus-based,
% hill-climbing) optimization technique.
%
% Consider for example the problem of determining the highest
% peak in a region with multiple hills.  One might describe such
% a region mathematically via an adaptation of the equation which
% generates peaks.  Specifically, consider an equation of the form
%
%    z = f(x,y) = sum from i = 1 to n of  H(i)*exp( -(x-X(i))^2 - (y-Y(i))^2 )
%
% If n were one, this would be the equation of a single symmetric
% hill with a peak of height H centered at (X,Y).  One could make
% the hill nonsymmetric by sprinkling multiplicative constants in
% the exponent terms.  More complicated hill landscapes could be
% created by making H = H(x,y).  For more information type help peaks.
%
% For the present, we avoid such complications.  By merely having
% n hills of different heights close enough, a fairly interesting
% topology should be created.  Let's see what ten randomly generated
% hills in the range 0<x<6 and 0<y<6 look like.  Also define the Hi's
% to be randomly distributed between 10 and 20.

pause % Hit any key to continue

%
% << If exp is a column vector, making H a row vector allows H*exp >>
%
X = 6*rand(10,1);
Y = 6*rand(10,1);
H = 10*rand(1,10) + 10;
P1 = X;
P2 = Y;
P3 = H;

% Show a surface plot of the function.
% Note additional parameters are referred to as P1...P10 in genetic.m
f = 'P3*exp(-(x(1)-P1).^2 -(x(2)-P2).^2)'

pts = 0:0.05:6;
x_y = ones(length(pts),1)*pts;
% To give valley points some probability of reproducing
z = 5*ones(size(x_y));
for i=1:10,
   z = z + H(i)*exp(-(x_y-X(i)).^2 -(x_y'-Y(i)).^2);
end
hold off
contour(x_y,x_y',z)

pause % Hit any key to continue

surf(x_y,x_y',z);
hold on

% Does this look like a tough problem?
% Finally, let's call the genetic algorithm.
% Notice how the matrices X, Y, and H are passed as additional arguments
vlb = [0 0];
vub = [6 6];
options = foptions(1);
options(13) = 0.0333;
bits = [16 16];

pause % Hit any key to continue

[x,stats,options,bf,fg,lg] = genetic(f,[],options,vlb,vub,bits,X,Y,H);

% The highest point found by the genetic algorithm is
bf
% Which is found at
x

% Compare this to the highest point on the grid
[i,j] = find(max(max(z))==z);
z(i,j)
% Which is found at (note: row index i runs along y, column index j along x)
[pts(j) pts(i)]

% close
echo off
% end gendemo
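% ------------------------------------------------------------------------
% [Editor's sketch: the vectorized hill sum - not part of the original
% gendemo.]  The string  f = 'P3*exp(-(x(1)-P1).^2 -(x(2)-P2).^2)'  above
% evaluates the sum-of-hills formula in one shot: the element-wise exp()
% yields a 10-by-1 column of hill contributions, and left-multiplying by
% the 1-by-10 row vector P3 (the heights H) sums them.  The check below
% compares that against an explicit loop at one arbitrary sample point;
% xs, z_vec, z_loop, and k are introduced here only for this check.
xs = [3 3];                                      % arbitrary sample point
z_vec = H*exp(-(xs(1)-X).^2 - (xs(2)-Y).^2);     % vectorized: row * column = sum
z_loop = 0;
for k = 1:10
   z_loop = z_loop + H(k)*exp(-(xs(1)-X(k))^2 - (xs(2)-Y(k))^2);
end
disp([z_vec z_loop])                             % the two values agree
% ------------------------------------------------------------------------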
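% ------------------------------------------------------------------------
% [Editor's sketch: the binary decoding idea - not part of the original
% gendemo.]  Conceptually, a B-bit chromosome is mapped onto [vlb,vub] by
% reading the bits as an unsigned integer and scaling linearly.  This is
% only the standard textbook mapping; it is not claimed to be the exact
% arithmetic inside decode.m.  The names gene, ival, and xdec are
% introduced here for illustration only.
gene = round(rand(1,30));                        % a random 30-bit chromosome
ival = sum(gene .* 2.^(29:-1:0));                % bits -> integer in [0, 2^30-1]
xdec = 0 + ival*(1-0)/(2^30-1)                   % integer -> real in [vlb,vub] = [0,1]
% ------------------------------------------------------------------------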