% truckpsc.m
function [FAMbank,msum,y2,m2x,m2p,M,mdens]=truckpsc(xsec,phisec,thetasec,samples,N,p,wii,wij,c,rands)
% [FAMbank,msum,y2,m2x,m2p,M,mdens]=truckpsc(xsec,phisec,thetasec,samples,N,p,wii,wij,c,rands)
%
% Product space clustering (PSC) to generate FAM rules, using an adaptive
% vector quantisation (AVQ) system, realised by an autoassociative two-
% layer feedforward neural network. The stochastic learning system uses
% the differential competitive learning (DCL) algorithm.
%
% xsec, phisec and thetasec are matrices whose rows are pairs describing
% the intervals into which the spaces of the variables x, phi and theta
% are divided.
% samples is a matrix containing all training sample vectors.
% FAMbank is the FAM bank estimated from the training data.
% msum is a matrix of the same size as FAMbank. Each entry indicates how
% many DCL-trained synaptic vectors fell into the corresponding cell.
%
% FSTB - Fuzzy Systems Toolbox for MATLAB
% Copyright (c) 1993-1996 by Olaf Wolkenhauer
% Control Systems Centre at UMIST
% Manchester M60 1QD, UK
%
% 29-May-1994
%
% This example is taken from the paper
% [1] S.G. Kong and B. Kosko:
%     'Adaptive Fuzzy Systems for Backing up a Truck-and-Trailer',
%     IEEE Transactions on Neural Networks, Vol.3, No.2, March 1992.
% Methods are explained in
% [2] B. Kosko:
%     'Neural Networks and Fuzzy Systems',
%     Prentice Hall, 1992.
% [3] B. Kosko (June 1992):
%     'Fuzzy Systems as Universal Approximators',
%     notes to appear in the IEEE Transactions on Computers, 1993.
%
% Product space clustering
%
% Product space clustering is a form of stochastic adaptive vector
% quantisation. Adaptive vector quantisation (AVQ) systems adaptively
% quantise pattern clusters in R^n and thus estimate clusters. Stochastic
% competitive learning systems are AVQ systems.
% N = number of nearest synaptic vectors (or "winners").
% p = number of synaptic vectors mj defining the p columns of the synaptic
%     connection matrix M. p also equals the number of competing nonlinear
%     neurons in the output field, Fy.
% n = number of inputs, i.e. linear neurons in the input neuronal field, Fx.
% M = synaptic connection matrix, which interconnects the input field with
%     the output field.
% x = matrix storing all training sample vectors, one vector per row.
%     In this function the variable is called 'samples'.
% W = p-by-p matrix containing the Fy within-field synaptic connection
%     strengths. Diagonal elements wii are positive, off-diagonal elements
%     wij are negative.
% yt = Fy neuronal activations. yt equals y(t) and yt1 equals y(t+1).
%
% Competitive AVQ algorithm
%
% The AVQ system compares the current random sample vector x(t) in
% Euclidean distance to the p columns of the matrix M.
% 1.) Initialise the synaptic vectors: mi(0)=x(i), i=1,...,p.
% 2.) For the random sample x(t), find the closest or "winning" synaptic
%     vector mj(t) (nearest-neighbour classification). Define the N
%     synaptic vectors closest to x as "winners".
% 3.) Update the winning synaptic vector(s) mj(t) with an appropriate
%     learning algorithm, here the DCL algorithm.

FAMbank=zeros(length(phisec),length(xsec));
msum=FAMbank;  % msum contains the number of quantisation vectors in each
               % resulting rule cell.
% Density of m vectors in the cells (simulating a 3-D array):
mdens=zeros(length(phisec),length(xsec)*length(thetasec));
[nu_of_v,n]=size(samples);  % nu_of_v equals the number of sample vectors.
nu_of_cells=length(xsec)*length(phisec)*length(thetasec);
%p=nu_of_cells;
% Observing yt(2), M(2,1) and M(2,2):
y2=zeros(1,nu_of_v);
m2x=zeros(1,nu_of_v); m2p=m2x;
M=zeros(n,p);
% Positive diagonal elements - winning neurons excite themselves
% and inhibit all other neurons.
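% A sketch of one DCL update step for a single winner j (illustrative only,
% mirroring the update rule used in the training loop below; 'x' denotes
% the current sample row vector):
%
%   Dyjt   = sign(yt1(j)-yt(j));             % approx. competitive signal diff.
%   M(:,j) = M(:,j) + ct*Dyjt.*(x'-M(:,j));  % move towards/away from x
%
% The winning column of M is pulled towards the sample when its activation
% increased, and pushed away from it when the activation decreased.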
% => Off-diagonal elements are negative.
%wii=2; wij=-1;
W=wij.*ones(p,p)+(wii+abs(wij)).*eye(p,p);
yt=zeros(1,p); yt1=yt;
% Initialisation:
% (Sample dependence avoids pathologies disturbing the nearest-neighbour
% learning.)
for i=1:p,
  M(:,i)=samples(i,:)';  % In the truck example: x(i,:)=(x,phi), y=theta.
end;
% rands=randperm(nu_of_v);
eunorm=ones(1,p);  % For measuring the distance to a sample vector.
%figure
%axis([0 100 -90 270]); grid on;
%xlabel('x'); ylabel('phi');
%title('distribution of quantization vectors mj(t)');
%hold on
runtime=clock;
for t=1:nu_of_v,
  % For random sample x(t) find the closest synaptic vector mj(t):
  sample=samples(rands(t),:);
  for i=1:p,
    eunorm(i)=norm(M(:,i)-sample');
  end;
  [winnorm,col_in_M]=sort(eunorm);
  % Update the Fy neuronal activations yi with an additive model.
  % Nonnegative signal function approximating a steep binary logistic
  % sigmoid, with some constant c>0:
  %c=1;
  Syt=1./(1+exp(-c.*yt));
  yt1=yt+(sample*M)+Syt*W;
  y2(t)=yt1(2);
  % Update the winning synaptic vector(s) mj(t) with the differential
  % competitive learning algorithm.
  % A harmonic-series coefficient for a decreasing gain sequence over
  % the learning period would be ct=1/t.
  ct=0.1*(1-(t/nu_of_v));
  for no_of_winner=1:N,  % N winners mj(t) nearest to x(t)
    j=col_in_M(no_of_winner);
    % Activation difference as an approximation of the competitive signal
    % difference DSj (DCL learning):
    %Dyjt=sign(yt1(j)-yt(j));
    %M(:,j)=M(:,j)+ct*Dyjt.*(sample'-M(:,j));
    % Using UCL learning:
    M(:,j)=M(:,j)+ct*(sample'-M(:,j));
  end;  % of winning synaptic vector loop.
  yt=yt1;
  %plot(M(1,:),M(2,:),'.','EraseMode','none');
  m2x(t)=M(1,2); m2p(t)=M(2,2);
end;  % of outer t loop for the DCL training period.
runtime=etime(clock,runtime)

% Building the FAMbank:
% Add a FAM rule if necessary (if mj(t) fell into a FAM cell).
% (Non-overlapping sectors are assumed.)
for j=1:p,
  for co=length(xsec):-1:1,  % starting with RI, going to LE
    if M(1,j)>=xsec(co,1) & M(1,j)<=xsec(co,2),
      xs=co; break;
    else xs=0; end;
  end;
  for co=length(phisec):-1:1,  % starting with LB, going to RB
    if M(2,j)>=phisec(co,1) & M(2,j)<=phisec(co,2),
      phis=co; break;
    else phis=0; end;
  end;
  for co=length(thetasec):-1:1,  % starting with PB, going to NB
    if M(3,j)>=thetasec(co,1) & M(3,j)<=thetasec(co,2),
      thetas=co; break;
    else thetas=0; end;
  end;
  if xs>0 & phis>0 & thetas>0,  % The synaptic vector fell into a cell.
    % Increment the number of vectors in that cell:
    mdens(phis,xs+(thetas-1)*length(xsec))=...
      mdens(phis,xs+(thetas-1)*length(xsec))+1;
  end;
end;  % of testing whether synaptic vectors fit into a cell.

% Finding the theta cell with the most densely clustered data.
% (For other ways of deciding which consequent set is used, see for example
% Kosko, Pacini: 'Adaptive Fuzzy Systems for Target Tracking' in [2].)
secvec=zeros(1,length(thetasec));  % For the cell specified by x and phi,
% this vector contains the number of quantisation vectors in each cell
% corresponding to all consequents of theta.
for xs=1:length(xsec),
  for phis=1:length(phisec),
    for thetas=0:length(thetasec)-1,
      secvec(thetas+1)=mdens(phis,xs+thetas*length(xsec));
    end;
    if ~isempty(find(secvec)),
      [msum(phis,xs),FAMbank(phis,xs)]=max(secvec);
    else
      msum(phis,xs)=0; FAMbank(phis,xs)=0;
    end;
  end;
end;
% mdens is a length(phisec) x (length(xsec)*length(thetasec)) matrix.
% The fields for the variable theta are ordered from left to right.
% (Three-dimensional arrays are not possible, hence this ordering.)
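% Worked indexing example (illustrative values): the simulated 3-D cell
% (phis,xs,thetas) is stored in the flattened mdens matrix at
%
%   mdens(phis, xs+(thetas-1)*length(xsec))
%
% so with length(xsec)=5, the cell (phis=2, xs=3, thetas=4) maps to
% column 3+(4-1)*5 = 18 of row 2.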
% After building the FAMbank, the matrix msum contains the number of
% quantisation vectors in each resulting rule cell.
% The index of the consequent cell with the maximal number of quantisation
% vectors corresponds to a linguistic term description. All sets in the
% fuzzy toolbox are ordered from left to right - from most negative to most
% positive. E.g., with five linguistic terms "NB" "NS" "NZ" "PS" "PB",
% a 1 stands for "NB" or "Negative Big".
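% Usage sketch (the argument values below are illustrative assumptions, not
% prescribed by the toolbox; wii=2, wij=-1 and c=1 match the commented-out
% defaults above):
%
%   N = 1; p = size(samples,1);
%   wii = 2; wij = -1; c = 1;
%   rands = randperm(size(samples,1));
%   [FAMbank,msum] = truckpsc(xsec,phisec,thetasec,samples,N,p,wii,wij,c,rands);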