
elm_de.m

Project: Fuzzy neural network for function approximation and classification
Language: MATLAB (.m)
Page 1 of 2
function [TrainingTime, TrainingAccuracy, TestingAccuracy]=ELM_DE(TrainingData_File, TestingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)
% minimization of a user-supplied function with respect to x(1:D),
% using the differential evolution (DE) algorithm of Rainer Storn
% (http://www.icsi.berkeley.edu/~storn/code.html)
%
% Special thanks go to Ken Price (kprice@solano.community.net) and
% Arnold Neumaier (http://solon.cma.univie.ac.at/~neum/) for their
% valuable contributions to improve the code.
%
% Strategies with exponential crossover, further input variable
% tests, and arbitrary function name implemented by Jim Van Zandt
% <jrv@vanzandt.mv.com>, 12/97.
%
% Output arguments:
% ----------------
% bestmem        parameter vector with best solution
% bestval        best objective function value
% nfeval         number of function evaluations
%
% Input arguments:
% ---------------
%
% fname          string naming a function f(x,y) to minimize
% VTR            "Value To Reach". devec3 will stop its minimization
%                if either the maximum number of iterations "itermax"
%                is reached or the best parameter vector "bestmem"
%                has found a value f(bestmem,y) <= VTR.
% D              number of parameters of the objective function
% XVmin          vector of lower bounds XVmin(1) ... XVmin(D)
%                of initial population
%                *** note: these are not bound constraints!! ***
% XVmax          vector of upper bounds XVmax(1) ... XVmax(D)
%                of initial population
% y              problem data vector (must remain fixed during the
%                minimization)
% NP             number of population members
% itermax        maximum number of iterations (generations)
% F              DE-stepsize F from interval [0, 2]
% CR             crossover probability constant from interval [0, 1]
% strategy       1 --> DE/best/1/exp           6 --> DE/best/1/bin
%                2 --> DE/rand/1/exp           7 --> DE/rand/1/bin
%                3 --> DE/rand-to-best/1/exp   8 --> DE/rand-to-best/1/bin
%                4 --> DE/best/2/exp           9 --> DE/best/2/bin
%                5 --> DE/rand/2/exp           else  DE/rand/2/bin
%                Experiments suggest that /bin likes to have a slightly
%                larger CR than /exp.
% refresh        intermediate output will be produced after "refresh"
%                iterations. No intermediate output will be produced
%                if refresh is < 1
%
%       The first four arguments are essential (though they have
%       default values, too). In particular, the algorithm seems to
%       work well only if [XVmin,XVmax] covers the region where the
%       global minimum is expected. DE is also somewhat sensitive to
%       the choice of the stepsize F. A good initial guess is to
%       choose F from interval [0.5, 1], e.g. 0.8. CR, the crossover
%       probability constant from interval [0, 1] helps to maintain
%       the diversity of the population and is rather uncritical. The
%       number of population members NP is also not very critical. A
%       good initial guess is 10*D.
%       Depending on the difficulty of the
%       problem NP can be lower than 10*D or must be higher than 10*D
%       to achieve convergence.
%       If the parameters are correlated, high values of CR work better.
%       The reverse is true for no correlation.
%
% default values in case of missing input arguments:
%       VTR = 1.e-6;
%       D = 2;
%       XVmin = [-2 -2];
%       XVmax = [2 2];
%       y = [];
%       NP = 10*D;
%       itermax = 200;
%       F = 0.8;
%       CR = 0.5;
%       strategy = 7;
%       refresh = 10;
%
% Cost function:   function result = f(x,y);
%                  has to be defined by the user and is minimized
%                  w.r.t. x(1:D).
%
% Example to find the minimum of the Rosenbrock saddle:
% ----------------------------------------------------
% Define f.m as:
%                    function result = f(x,y);
%                    result = 100*(x(2)-x(1)^2)^2+(1-x(1))^2;
%                    end
% Then type:
%
%       VTR = 1.e-6;
%       D = 2;
%       XVmin = [-2 -2];
%       XVmax = [2 2];
%       [bestmem,bestval,nfeval] = devec3("f",VTR,D,XVmin,XVmax);
%
% The same example with a more complete argument list is handled in
% run1.m
%
% About devec3.m
% --------------
% Differential Evolution for MATLAB
% Copyright (C) 1996, 1997 R. Storn
% International Computer Science Institute (ICSI)
% 1947 Center Street, Suite 600
% Berkeley, CA 94704
% E-mail: storn@icsi.berkeley.edu
% WWW:    http://http.icsi.berkeley.edu/~storn
%
% devec is a vectorized variant of DE which, however, has a
% property which differs from the original version of DE:
% 1) The random selection of vectors is performed by shuffling the
%    population array. Hence a certain vector can't be chosen twice
%    in the same term of the perturbation expression.
%
% Due to the vectorized expressions devec3 executes fairly fast
% in MATLAB's interpreter environment.
%
% This program is free software; you can redistribute it and/or modify
% it under the terms of the GNU General Public License as published by
% the Free Software Foundation; either version 1, or (at your option)
% any later version.
%
% This program is distributed in the hope that it will be useful,
% but WITHOUT ANY WARRANTY; without even the implied warranty of
% MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
% GNU General Public License for more details. A copy of the GNU
% General Public License can be obtained from the
% Free Software Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
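%
% The DE loop itself is not shown on this page (it presumably continues
% on page 2 of the listing). As a minimal sketch only, and assuming
% Storn's devec3 variable names (popold, bestmemit, pm1, pm2 are not
% defined anywhere on this page), strategy 3 (DE/rand-to-best/1) forms a
% mutant vector roughly as
%
%     ui = popold(i,:) + F*(bestmemit - popold(i,:)) + F*(pm1 - pm2);
%
% where bestmemit is the best member of the current generation and
% pm1, pm2 are two distinct, randomly chosen population members.
% Crossover then mixes ui with popold(i,:) according to CR, and the
% resulting trial vector replaces popold(i,:) only if its cost is no
% worse.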
%-----Check input variables---------------------------------------------
err=[];
XVmin=-1;
XVmax=1;
% if nargin<1, error('devec3 1st argument must be function name'); else
%   if exist(fname)<1; err(1,length(err)+1)=1; end; end;
% if nargin<2, VTR = 1.e-6; else
%   if length(VTR)~=1; err(1,length(err)+1)=2; end; end;
% if nargin<3, D = 2; else
%   if length(D)~=1; err(1,length(err)+1)=3; end; end;
% if nargin<4, XVmin = [-2 -2]; else
%   if length(XVmin)~=D; err(1,length(err)+1)=4; end; end;
% if nargin<5, XVmax = [2 2]; else
%   if length(XVmax)~=D; err(1,length(err)+1)=5; end; end;
% if nargin<6, y=[]; end;
% if nargin<7, NP = 10*D; else
%   if length(NP)~=1; err(1,length(err)+1)=7; end; end;
% if nargin<8, itermax = 200; else
%   if length(itermax)~=1; err(1,length(err)+1)=8; end; end;
% if nargin<9, F = 0.8; else
%   if length(F)~=1; err(1,length(err)+1)=9; end; end;
% if nargin<10, CR = 0.5; else
%   if length(CR)~=1; err(1,length(err)+1)=10; end; end;
% if nargin<11, strategy = 7; else
%   if length(strategy)~=1; err(1,length(err)+1)=11; end; end;
% if nargin<12, refresh = 10; else
%   if length(refresh)~=1; err(1,length(err)+1)=12; end; end;
% if length(err)>0
%   fprintf(stdout,'error in parameter %d\n', err);
%   usage('devec3 (string,scalar,scalar,vector,vector,any,integer,integer,scalar,scalar,integer,integer)');
% end

REGRESSION=0;
CLASSIFIER=1;

Gain = 1;                                           %  Gain parameter for sigmoid

%%%%%%%%%%% Load training dataset
train_data=load(TrainingData_File);
T=train_data(:,1)';
P=train_data(:,2:size(train_data,2))';
clear train_data;                                   %   Release raw training data array

%%%%%%%%%%% Load testing dataset
test_data=load(TestingData_File);
TV.T=test_data(:,1)';
TV.P=test_data(:,2:size(test_data,2))';
clear test_data;                                    %   Release raw testing data array

NumberofTrainingData=size(P,2);
NumberofTestingData=size(TV.P,2);
NumberofInputNeurons=size(P,1);
NumberofValidationData = round(NumberofTestingData / 2);

if Elm_Type~=REGRESSION
    %%%%%%%%%%%% Preprocessing the data of classification
    sorted_target=sort(cat(2,T,TV.T),2);
    label=zeros(1,1);                               %   Find and save in 'label' class label from training and testing data sets
    label(1,1)=sorted_target(1,1);
    j=1;
    for i = 2:(NumberofTrainingData+NumberofTestingData)
        if sorted_target(1,i) ~= label(1,j)
            j=j+1;
            label(1,j) = sorted_target(1,i);
        end
    end
    number_class=j;
    NumberofOutputNeurons=number_class;

    %%%%%%%%%% Processing the targets of training
    temp_T=zeros(NumberofOutputNeurons, NumberofTrainingData);
    for i = 1:NumberofTrainingData
        for j = 1:number_class
            if label(1,j) == T(1,i)
                break;
            end
        end
        temp_T(j,i)=1;
    end
    T=temp_T*2-1;

    %%%%%%%%%% Processing the targets of testing
    temp_TV_T=zeros(NumberofOutputNeurons, NumberofTestingData);
    for i = 1:NumberofTestingData
        for j = 1:number_class
            if label(1,j) == TV.T(1,i)
                break;
            end
        end
        temp_TV_T(j,i)=1;
    end
    TV.T=temp_TV_T*2-1;
end                                                 %   end if of Elm_Type

clear temp_T;
clear temp_TV_T;

%%%%%%%%%%% Split off half of the testing set as a validation set
VV.P = TV.P(:,1:NumberofValidationData);
VV.T = TV.T(:,1:NumberofValidationData);
TV.P(:,1:NumberofValidationData)=[];
TV.T(:,1:NumberofValidationData)=[];
NumberofTestingData = NumberofTestingData - NumberofValidationData;

%%%%%%%%%%% Calculate weights & biases
CR=0.8;
NP=200;
D=NumberofHiddenNeurons*(NumberofInputNeurons+1);
itermax=20;
refresh=1;
strategy = 3;
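
Only the data loading, target preprocessing, and DE parameter setup fit on this first page; the DE search over the hidden-layer parameters presumably continues on page 2. Based on the code shown, a call would look roughly like the sketch below. The file names, the hidden-neuron count, and the activation-function string are placeholders (the code on this page never reads ActivationFunction), and each row of the plain-text data files is expected to hold the target value or class label in column 1 followed by the input features.

% Hypothetical call -- file names, neuron count, and 'sig' are placeholders.
% Elm_Type: 0 = regression, 1 = classification (class labels are recoded
% internally as one-of-K vectors of -1/+1).
[TrainingTime, TrainingAccuracy, TestingAccuracy] = ...
    ELM_DE('train_set.txt', 'test_set.txt', 1, 20, 'sig');

Note that half of the testing file is split off internally as a validation set (VV) for the DE search, so only the remaining half stays in TV for testing.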
