
📄 sirdemo2.m

📁 An excellent particle filter (evolutionary/sequential Monte Carlo) program for nonlinear, non-Gaussian environments
💻 MATLAB
% PURPOSE : To estimate the input-output mapping with inputs x
%           and outputs y generated by the following nonlinear,
%           nonstationary state space model:
%           x(t+1) = 0.5x(t) + [25x(t)]/[1 + x(t)^(2)]
%                    + 8cos(1.2t) + process noise
%           y(t)   = x(t)^(2)/20 + 6 squareWave(0.05(t-1)) + 3
%                    + time-varying measurement noise
%           using a multi-layer perceptron (MLP) and both the EKF and
%           the hybrid importance-sampling resampling (SIR) algorithm.
% AUTHOR  : Nando de Freitas - Thanks for the acknowledgement :-)
% DATE    : 08-09-98

clear;
echo off;

% INITIALISATION AND PARAMETERS:
% ==============================
N = 120;               % Number of time steps.
t = 1:1:N;             % Time.
x0 = 0.1;              % Initial input.
x = zeros(N,1);        % Input observation.
y = zeros(N,1);        % Output observation.
x(1,1) = x0;           % Initial input.
actualR = 2;           % Measurement noise variance.
actualQ = 1e-2;        % Process noise variance.
numSamples = 50;       % Number of Monte Carlo samples per time step.
s1 = 10;               % Neurons in the hidden layer.
s2 = 1;                % Neurons in the output layer - only one in this implementation.
Q = 1e-2;              % Process noise variance.
R = 2;                 % Measurement noise variance.
initVar1 = 10;         % Variance of prior for weights in the hidden layer.
initVar2 = 10;         % Variance of prior for weights in the output layer.
KalmanR = 2;           % Kalman filter measurement noise covariance hyperparameter.
KalmanQ = 1e-2;        % Kalman filter process noise covariance hyperparameter.
KalmanP = 1;           % Kalman filter initial weights covariance hyperparameter.

% GENERATE PROCESS AND MEASUREMENT NOISE:
% =======================================
v = sqrt(actualR)*sin(0.1*t)'.*randn(N,1);   % Time-varying measurement noise.
w = sqrt(actualQ)*randn(N,1);                % Process noise.

% GENERATE INPUT-OUTPUT DATA:
% ===========================
y(1,1) = (x(1,1)^(2))./20 + v(1,1);
for t=2:N,
  x(t,1) = 0.5*x(t-1,1) + 25*x(t-1,1)/(1+x(t-1,1)^(2)) + 8*cos(1.2*(t-1)) + w(t,1);
  y(t,1) = (x(t,1).^(2))./20 + 6*square(0.05*(t-1)) + 3 + v(t,1);
end;

figure(1)
clf;
subplot(211)
plot(x)
ylabel('Input','fontsize',15);
xlabel('Time','fontsize',15);
subplot(212)
plot(y)
ylabel('Output','fontsize',15);
xlabel('Time','fontsize',15);
fprintf('Press a key to continue')
pause;
fprintf('\n')

% PERFORM EXTENDED KALMAN FILTERING TO TRAIN MLP:
% ===============================================
fprintf('Training the MLP with EKF')
fprintf('\n')
tInit = clock;
[ekfp,theta,thetaR,PR,innovations] = mlpekf(x',y',s1,s2,KalmanR,KalmanQ,KalmanP,initVar1,N);
durationekf = etime(clock,tInit);

% PERFORM SEQUENTIAL MONTE CARLO FILTERING TO TRAIN MLP:
% ======================================================
fprintf('Training the MLP with hybrid SIR')
fprintf('\n')
tInit = clock;
[samples, q, m] = hybridsir(x,y,s1,s2,numSamples,Q,initVar1,initVar2,R,KalmanR,KalmanQ,KalmanP,N);
durationsir = etime(clock,tInit);

% COMPUTE CENTROID, MAP AND VARIANCE ESTIMATES:
% =============================================
% Posterior mean estimate of the weights:
predictedWeights = mean(samples);

% Posterior mean estimate of the one-step-ahead prediction:
prediction = mean(m);

% MAP estimate of the one-step-ahead prediction:
bins = 20;
predmap = zeros(N,1);
for t=1:N
  [p,pos] = hist(m(:,t),bins);
  map = find(p==max(p));
  predmap(t,1) = pos(map(1));
end;

% Posterior standard deviation estimate of the weights:
xstd = std(samples);

% COMPUTE ONE-STEP-AHEAD PREDICTION ERRORS:
% =========================================
errorsir = norm(prediction-y');
errorekf = norm(ekfp'-y);
errorsirmap = norm(predmap-y);
fprintf('Posterior mean error = %f',errorsir);
fprintf('\n');
fprintf('MAP error            = %f',errorsirmap)
fprintf('\n')
fprintf('EKF error            = %f',errorekf)
fprintf('\n')
fprintf('The EKF took %f seconds.\n',durationekf)
fprintf('The hybrid SIR took %f seconds.\n',durationsir)

% PLOT RESULTS:
% =============
figure(1)
clf;
subplot(221)
plot(1:length(x),y,'g',1:length(x),prediction,'r',1:length(x),predmap,'m',1:length(x),ekfp,'b--')
legend('True value','Posterior mean estimate','MAP estimate','EKF estimate');
ylabel('One-step-ahead prediction','fontsize',15)
xlabel('Time','fontsize',15)
subplot(222)
plot(prediction,y,'+')
ylabel('True output','fontsize',15)
xlabel('Posterior mean estimate','fontsize',15)
hold on
c = min(y):1:max(y);
plot(c,c,'r');
axis([-10 30 -10 30]);
hold off
subplot(223)
plot(ekfp,y,'+')
ylabel('True output','fontsize',15)
xlabel('EKF estimate','fontsize',15)
hold on
c = min(y):1:max(y);
plot(c,c,'r');
axis([-10 30 -10 30]);
hold off

% Plot the evolution of the predictive density:
[r,c] = size(q);
domain = zeros(r,1);
range = zeros(r,1);
thex = [-5:.4:25];
subplot(224)
hold on
ylabel('Time','fontsize',15)
xlabel('Sample space','fontsize',15)
zlabel('Output density','fontsize',15)
v = [0 1];
caxis(v);
for t=1:5:100,
  [range,domain] = hist(m(:,t),thex);
  waterfall(domain,t,range)
end;
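The demo above relies on the external helpers `mlpekf` and `hybridsir` (not included in this file) to train MLP weights. As a simpler, self-contained illustration of the SIR idea itself, the following Python/NumPy sketch runs a plain bootstrap particle filter that tracks the state x(t) of the same benchmark model. The particle count (500), the initial particle spread, and the use of a constant measurement-noise variance R (instead of the demo's time-varying noise) are assumptions made here for clarity, not part of the original script.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 120        # time steps, as in the demo
n_part = 500   # particles (assumed here; the demo uses 50 samples for MLP weights)
Q, R = 1e-2, 2.0   # process / measurement noise variances (constant R assumed)

def f(x, t):
    """State transition of the benchmark model."""
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)

def square_wave(t):
    """MATLAB-style square(t): +1 where sin(t) >= 0, else -1."""
    return np.where(np.sin(t) >= 0, 1.0, -1.0)

def h(x, t):
    """Measurement function, including the square-wave offset."""
    return x**2 / 20.0 + 6.0 * square_wave(0.05 * t) + 3.0

# Simulate the model.
x_true = np.zeros(N)
y = np.zeros(N)
x_true[0] = 0.1
y[0] = h(x_true[0], 0) + np.sqrt(R) * rng.standard_normal()
for t in range(1, N):
    x_true[t] = f(x_true[t - 1], t - 1) + np.sqrt(Q) * rng.standard_normal()
    y[t] = h(x_true[t], t) + np.sqrt(R) * rng.standard_normal()

# Bootstrap SIR filter: propagate, weight by likelihood, resample.
parts = x_true[0] + rng.standard_normal(n_part)   # initial particle cloud
est = np.zeros(N)
for t in range(N):
    if t > 0:
        parts = f(parts, t - 1) + np.sqrt(Q) * rng.standard_normal(n_part)
    logw = -0.5 * (y[t] - h(parts, t))**2 / R     # Gaussian log-likelihood
    w_t = np.exp(logw - logw.max())               # normalise stably
    w_t /= w_t.sum()
    est[t] = np.sum(w_t * parts)                  # posterior-mean estimate
    parts = parts[rng.choice(n_part, size=n_part, p=w_t)]  # multinomial resampling

rmse = np.sqrt(np.mean((est - x_true) ** 2))
print(f"particle-filter RMSE on the state: {rmse:.2f}")
```

Because the measurement x²/20 is symmetric in the sign of x, the likelihood is bimodal; the cos(1.2t) forcing term is what lets the filter (like the demo's hybrid SIR) disambiguate the two modes over time.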
