
📄 ardemo_hzh.m

📁 Predicting time series with neural networks
💻 MATLAB
% ARdemo.m - Time series prediction demonstration
%
% calls tsgenf_hzh.m, autocorr.m
% then performs prediction using
% (a) auto-regressive method
% (b) TDNN method
%
% copyright (C) 2004 by Hao Zhihua
% created: 12/4/2004
% modified: 12/5/2004 add time series analysis,
%                     AR model is updated

clear all, clf

% generate time series data: load the sunspot data up to the year 2003
load sunspot2003;
[xx_scale,xmin,xmax]=scale(sunspot2003,0,1); % normalised to 0-1; xx_scale is the normalised analysis data
xx=xx_scale;  % make x zero mean

nmax=304;  % length of time series
nr=294;    % length of this sequence used for training
nstep=1;   % n-step prediction
nlag=9;    % # of past samples used for prediction (order of the time series model)

disp('****************************')
disp(['Generate a time series of ' int2str(nmax) ' points.'])
disp(['Use the first ' int2str(nr) ' points for training, and'])
disp(['last ' int2str(nmax-nr) ' points for testing'])
disp(['Use ' int2str(nstep) '-step prediction based on past ' int2str(nlag) ' samples.'])
disp('****************************')

[x0,train,test]=tsgenf_hzh(xx_scale,nmax,nr,nstep,nlag);
% x0 is the time series
% train is nr-nlag-nstep+1 by nlag+1,  test is nmax-nr by nlag+1
%
% test= [x(nmax-nlag-nstep+1)  ...  x(nmax-nstep) |        x(nmax)]
%       [                      ...                |      x(nmax-1)]
%       [x(nr-nlag-nstep+2)    ... x(nr-nstep+1)  |        x(nr+1)]
%
% train=[x(nr-nlag-nstep+1)    ... x(nr-nstep)    |        x(nr)  ]
%       [x(nr-nlag-nstep)      ... x(nr-nstep-1)  |       x(nr-1) ]
%       [                      ...                |               ]
%       [x(1)      x(2)        ... x(nlag)        |  x(nlag+nstep)]

% **************************
% analyzing the time series
% **************************
rxx=autocorr(xx,50);  % compute the first 50 auto-correlation lags
figure(1),clf,subplot(211),plot([1:nmax],xx)
subplot(212),stem([0:49],rxx),title('auto-correlation lags')
disp('Press any key to continue ...');
%pause

% **************************
% auto-regressive method   *
% **************************
%a0=levinson(rxx(1:nlag+1),nlag+1);  % a row vector, a(1)=1
a0=arburg(xx(1:nr),nlag)    % a row vector, a(1)=1; should the train data be used to get the coefficients instead?
a=fliplr(-a0(2:nlag+1));    % filter coefficients a=[-a(nlag) -a(nlag-1) ... -a(1)]
% the order is reversed because the train matrix is organized so
% that each row is [x(n-nlag) x(n-nlag+1) ... x(n-1)]
dap=train(:,1:nlag)*a';     % AR model training set fitting result
tap=test(:,1:nlag)*a';      % AR model nstep prediction results; tap is the direct (one-shot) prediction
lag=test(1,1:nlag);         % last nlag of training data
for i=1:length(tap),        % recursive prediction using only the last nlag training samples;
                            % tr is the recursive prediction result
   tr(i)=lag*a';            % i-th prediction
   lag=[lag(2:nlag) tr(i)];
end

figure(2),subplot(211),plot([nlag+1:nr],train(:,nlag+1),'g-',...
   [nlag+1:nr],dap,'b:')
legend('original','training out')
figure(2),subplot(212),plot([nr+1:nmax],test(:,nlag+1),'g-',...
   [nr+1:nmax],tap,'c',...
   [nr+1:nmax],tr,'kd')
figure(1)

e1=test(:,nlag+1)-tap;
e2=test(:,nlag+1)-tr';
sum(e1.*e1)/length(tap)   % result is 0.0107
sum(e2.*e2)/length(tr)    % result is 0.0827
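
The script calls a helper scale(sunspot2003,0,1) that is not included on this page. Below is a minimal min-max normalisation sketch; only the call signature is taken from the script above, the body is an assumption.

function [y, xmin, xmax] = scale(x, lo, hi)
% SCALE  Min-max normalise a vector to the range [lo, hi].
% Hypothetical reconstruction of the helper used by ardemo_hzh.m;
% only the call [xx_scale,xmin,xmax]=scale(...,0,1) is taken from the script.
xmin = min(x);
xmax = max(x);
y = lo + (hi - lo) * (x - xmin) / (xmax - xmin);
end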
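
tsgenf_hzh.m is also not shown here. The sketch below builds the lag matrices from the row layout documented in the script's comments; it is a hypothetical reconstruction, with rows ordered oldest-first so that test(1,1:nlag) holds the last nlag training samples, which is how the script indexes the matrices.

function [x0, train, test] = tsgenf_hzh(xx, nmax, nr, nstep, nlag)
% TSGENF_HZH  Build lagged design matrices for nstep-ahead prediction.
% Hypothetical reconstruction: each row is [x(n-nlag-nstep+1) ... x(n-nstep) | x(n)],
% matching the layout documented in ardemo_hzh.m.
x0 = xx(1:nmax);
x0 = x0(:);                          % force a column vector
nrows = nmax - nlag - nstep + 1;     % total number of (input, target) rows
X = zeros(nrows, nlag+1);
for k = 1:nrows
    n = nlag + nstep + k - 1;        % index of the target sample x(n)
    X(k,:) = [x0(n-nlag-nstep+1 : n-nstep)' , x0(n)];
end
ntrain = nr - nlag - nstep + 1;      % rows whose target lies in the training span
train  = X(1:ntrain, :);
test   = X(ntrain+1:end, :);
end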
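
The header lists a companion autocorr.m, and the stem plot expects 50 values for lags 0 to 49. A simple stand-in for that companion function could look like the following (hypothetical; the Econometrics Toolbox autocorr would return one extra lag):

function rxx = autocorr(x, nlags)
% AUTOCORR  Normalised sample autocorrelation for lags 0..nlags-1.
% Hypothetical stand-in for the companion autocorr.m; it returns nlags
% values so that stem(0:nlags-1, rxx) matches the plotting call above.
x = x(:) - mean(x);
N = length(x);
rxx = zeros(nlags, 1);
for k = 0:nlags-1
    rxx(k+1) = sum(x(1:N-k) .* x(k+1:N)) / sum(x.^2);
end
end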
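
The header also mentions a TDNN method, but that branch is not shown on this page. The sketch below is an illustrative stand-in, not the author's code: it trains a small feedforward network on the same lag matrix (equivalent to a time-delay network fed explicit delays), assumes the Neural Network / Deep Learning Toolbox, and picks the hidden-layer size of 10 arbitrarily.

% Hypothetical sketch of the neural-network branch mentioned in the header;
% not the author's original TDNN code.  The script's matrix variable "train"
% shadows the toolbox training function of the same name, so the matrices
% are copied and the variable is cleared first.
Xtr = train;  Xte = test;  clear train
P   = Xtr(:,1:nlag)';         % inputs:  nlag past samples per column
T   = Xtr(:,nlag+1)';         % targets: the value nstep ahead
net = feedforwardnet(10);     % one hidden layer with 10 neurons (arbitrary choice)
net = train(net, P, T);       % toolbox training (Levenberg-Marquardt by default)
tnn = net(Xte(:,1:nlag)')';   % one-shot predictions on the test rows
e3  = Xte(:,nlag+1) - tnn;
sum(e3.*e3)/length(tnn)       % test MSE of the network predictions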
