nnigls.m  (from "Neural Networks: Applications in MATLAB (example programs)")
function [W1,W2,lambda,GAMMA]=nnigls(NetDef,NN,W1,W2,trparms,repeat,GAMMA,Y,U)
%  NNIGLS
%  ------
%          Train a multi-output NNARX model using an iterated generalized
%          least squares method (IGLS).
%
%  CALL:
%     [W1,W2,lambda,GAMMA] = nnigls(NetDef,NN,W1,W2,trparms,repeat,GAMMA,Y,U)
%
%  INPUTS: 
%    U,Y,NN,W1,W2 : See NNARXM
%    trparms : Contains parameters associated with the training (see MARQ)
%              If trparms=[] it is reset to trparms = [50 0 1 0]
%    repeat  : Number of times the procedure should be repeated
%              If repeat=[] it is set to repeat=5
%    GAMMA   : Covariance matrix. If passed as [] it is initialized to 
%              the identity matrix.
%
%  OUTPUTS:
%    W1, W2, lambda: See MARQ
%    GAMMA         : Estimated covariance matrix
%                                                                                  
%  Programmed by : Magnus Norgaard, IAU/IMM, Technical University of Denmark
%  LastEditDate  : June 15, 1997
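%
%  EXAMPLE:
%     (A minimal sketch. The data matrices Y (ny x N) and U (nu x N) are
%     assumed to exist, prepared as for NNARXM; the concrete NetDef/NN
%     values below are illustrative, not prescribed by this function.)
%       NetDef  = ['HHHHH';'LL---'];  % 5 tanh hidden units, 2 linear outputs
%       NN      = [2 2 1;2 2 1];      % [na nb nk] for each of the 2 outputs
%       trparms = [100 0 1 0];        % Training parameters (see MARQ)
%       [W1,W2,lambda,GAMMA] = nnigls(NetDef,NN,[],[],trparms,5,[],Y,U);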

% >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   INITIALIZATIONS   <<<<<<<<<<<<<<<<<<<<<<<<<<<<<
[ny,N]  = size(Y);                 % Size of data set
[ny,NNn]= size(NN);
na = NN(:,1);
if NNn==1
  nb = zeros(ny,1);                % nnar model (column vectors so that the
  nk = zeros(ny,1);                % nmax computation below works for ny>1)
  nu = 0;
  nab= na;
else
  [nu,N] = size(U); 
  nb     = NN(:,2:1+nu);           % nnarx model
  nk     = NN(:,2+nu:1+2*nu);
  if nu>1,
    nab = na + sum(nb')';
  else
    nab = na + nb;
  end
end
if isempty(repeat), repeat = 5; end 
nmax        = max(max([na nb+nk-1]));


% -- Initialize weights if necessary --
if isempty(W1) | isempty(W2),
  hidden = length(NetDef(1,:));    % Number of hidden neurons
  W1 = rand(hidden,sum(nab)+1)-0.5;
  W2 = rand(ny,hidden+1)-0.5;
end

% -- Initialize 'trparms' if necessary --
if isempty(trparms), trparms=[50 0 1 0]; end



% >>>>>>>>>>>>>>>>>>>>  CONSTRUCT THE REGRESSION MATRIX PHI   <<<<<<<<<<<<<<<<<<<<<
% Each column of PHI holds the regressors for one time instant t=nmax+1..N.
% Rows are ordered output by output: first the na(o) delayed outputs
% y_o(t-1),...,y_o(t-na(o)), then for each input kk the nb(o,kk) delayed
% inputs u_kk(t-nk(o,kk)),...,u_kk(t-nk(o,kk)-nb(o,kk)+1).
PHI = zeros(sum(nab),N-nmax);
jj  = nmax+1:N;
index = 0;
for o=1:ny,
  for k = 1:na(o), PHI(k+index,:)    = Y(o,jj-k); end
  index = index+na(o);
  for kk = 1:nu,
    for k = 1:nb(o,kk), PHI(k+index,:) = U(kk,jj-k-nk(o,kk)+1); end
    index = index + nb(o,kk);
  end
end


% >>>>>>>>>>>>>>>>>>>>         CALL TRAINING FUNCTION         <<<<<<<<<<<<<<<<<<<<<
if isempty(GAMMA), GAMMA=eye(ny); end
GAMMAi = inv(GAMMA);
for iglsiter=1:repeat,
  S  = sqrtm(GAMMAi);              % Whitening matrix, S = GAMMA^(-1/2)
  YS = S*Y;                        % Whitened outputs
  [W1,W2,PIvec,iteration,lambda]=marq(NetDef,W1,W2,PHI,YS(:,nmax+1:N),trparms);
  [Yhat,E] = nneval(NetDef,W1,W2,PHI,YS(:,nmax+1:N),1);
  E     = (GAMMA*S')*E;            % Un-whiten residuals (GAMMA*S' = GAMMA^(1/2))
  GAMMA = (E*E')/(N-nmax);         % Re-estimate the noise covariance
  GAMMAi= inv(GAMMA);
end
