
📄 dcl.m

📁 FISMAT accommodates different arithmetic operators, fuzzification and defuzzification algorithms, imp
💻 M
function [y2,m2x,m2y]=dcl(samples,p,c,wii,wij,rands)
% [y2,m2x,m2y]= dcl(samples,p,c,wii,wij,rands)
%
% Stochastic Unsupervised Differential-Competitive-Learning (DCL) algorithm.
%
% An autoassociative AVQ two-layer feedforward neural network is trained by
% the sample vectors in 'samples' with competitive learning. The input
% neuronal field Fx receives the sample data and passes it forward through
% the synaptic connection matrix M to the p competing neurons in field Fy.
% The metaphor of competing neurons reduces to nearest-neighbor
% classification. The system compares the current random sample vector x(t)
% in Euclidean distance to the p synaptic vectors m1(t),...,mp(t). If the
% jth synaptic vector is closest to x(t), then the jth neuron "wins" the
% competition for activation at time t. The jth competing neuron should
% behave as a class indicator function: Sj=IDj. More generally, the jth Fy
% neuron only estimates IDj.
% rands is a vector of random permutations:
% rands=randperm(number_of_samples)
%
% Reference:
% Kong, Kosko: 'Differential Competitive Learning for Centroid Estimation
% and Phoneme Recognition'.
% IEEE Transactions on Neural Networks, Vol.2, No.1, Jan. 1991, pp.118-124
%
% FSTB - Fuzzy Systems Toolbox for MATLAB
% Copyright (c) 1993-1996 by Olaf Wolkenhauer
% Control Systems Centre at UMIST
% Manchester M60 1QD, UK
%
% 20-May-1994

% p=  number of synaptic vectors mj defining the p columns of the synaptic
%     connection matrix M.
%     p is also equal to the number of competing nonlinear neurons in the
%     output field, Fy.
% n=  number of inputs or linear neurons in the input neuronal field, Fx.
% M=  synaptic connection matrix, which interconnects the input field with
%     the output field.
% x=  matrix which stores all training sample vectors, one vector per row.
%     In this function the variable is called 'samples'.
% W=  pxp matrix containing the Fy within-field synaptic connection
%     strengths. Diagonal elements wii are positive, off-diagonal elements
%     negative.
% yt= Fy neuronal activations. yt equals y(t) and yt1 equals y(t+1).

% Competitive AVQ Algorithm
%
% The AVQ system compares a current random sample vector x(t) in Euclidean
% distance to the p columns of the matrix M.
% 1.) Initialize synaptic vectors: mi(0)=x(i), i=1,...,p.
% 2.) For random sample x(t), find the closest or "winning" synaptic vector
%     mj(t) (nearest-neighbor classification).
%     Define the N synaptic vectors closest to x as "winners".
% 3.) Update the winning synaptic vector(s) mj(t) with an appropriate
%     learning algorithm.

[nu_of_v,n]=size(samples); % nu_of_v equals the number of sample vectors.

% Observing yt(2) and the second synaptic vector M(:,2):
y2=zeros(1,nu_of_v);
m2x=zeros(1,nu_of_v);
m2y=zeros(1,nu_of_v);

M=zeros(n,p);
%wii=1;  Positive diagonal elements - a winning neuron excites itself
%wij=-1; and inhibits all other neurons => off-diagonal elements negative.
W=wij.*ones(p,p)+(wii+abs(wij)).*eye(p,p); % other pos. or neg. values ? ? ?

yt=zeros(1,p);                  % initialisation ? ? ?
yt1=yt;

% Initialisation:
% (sample dependence avoids pathologies disturbing the nearest-neighbor
%  learning)
for i=1:p,
  M(:,i)=samples(i,:)';
end;

% If x is chosen randomly inside the function, no on-line calculation is
% possible ? ? ?
% rands=randperm(nu_of_v);

eunorm=ones(1,p);        % For measuring the distance to a sample vector.
runtime=clock;
for t=1:nu_of_v,
  % For random sample x(t) find the closest synaptic vector mj(t):
  sample=samples(rands(t),:);
  for i=1:p,
    eunorm(i)=norm(M(:,i)-sample');
  end;
  [winnorm, j]=min(eunorm);
  % Updating the Fy neuronal activations yi with an additive model:
  % Nonnegative signal function approximating a steep binary logistic
  % sigmoid, with some constant c>0:
  % c=1;
  Syt=1./(1+exp(-c.*yt));
  yt1=yt+(sample*M)+Syt*W;
  y2(t)=yt1(2);
  % Updating the winning synaptic vector(s) mj(t) with the differential
  % competitive learning algorithm:
  % ct=1/t;
  % harmonic-series coefficient for a decreasing gain sequence over
  % the learning period. For fast, robust stochastic approximation,
  % only the harmonic-series coefficients satisfy the constraints in [Kong].
  ct=0.1*(1-(t/nu_of_v));
  % Activation difference as an approximation of the competitive signal
  % difference DSj:
  Dyjt=sign(yt1(j)-yt(j));
  M(:,j)=M(:,j)+ct*Dyjt.*(sample'-M(:,j));
  yt=yt1;
  plot(M(1,:),M(2,:),'.','EraseMode','none'); drawnow
  m2x(t)=M(1,2);
  m2y(t)=M(2,2);
end; % of outer t loop for the DCL training period.
runtime=etime(clock,runtime)
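
A minimal usage sketch (not part of the toolbox file) is given below. The cluster centres, sample count and parameter values are assumptions chosen for illustration; wii=1, wij=-1 and c=1 follow the suggestions in the comments above, and the sketch assumes a MATLAB version that still supports the legacy 'EraseMode' line property used by the plot call inside dcl.m.

% Usage sketch (illustrative only): train the DCL network on 200
% two-dimensional samples drawn around four assumed cluster centres.
centres = [0 0; 4 0; 0 4; 4 4];        % assumed cluster centres
samples = zeros(200,2);
for k = 1:200,
  samples(k,:) = centres(rem(k,4)+1,:) + 0.5*randn(1,2);
end;
p   = 4;                     % number of competing Fy neurons / synaptic vectors
c   = 1;                     % steepness of the logistic signal function
wii = 1;  wij = -1;          % within-field excitation / lateral inhibition
rands = randperm(size(samples,1));     % random presentation order
[y2,m2x,m2y] = dcl(samples,p,c,wii,wij,rands);

Interleaving the clusters ensures that the first p samples, which dcl.m uses to initialise M, come from distinct clusters. With these settings the synaptic vectors M(:,j), plotted during training, should drift towards the four cluster centroids; y2 traces the activation of the second Fy neuron, while m2x and m2y record the trajectory of the second synaptic vector M(:,2).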
