📄 bpn01.m
clear
clc
%Load the training inputs P.
P=xlsread('input.xls');%The content of "input.xls" is shown in Table 3.
%Target outputs T, one value for each row of P.
T=[0.1059 0.1513 0.1884 0.2724 0.3356 0.3948 0.4842 0.2724 0.3370 0.4072 0.5337 0.5791 0.5516 0.5488 0.6052 0.6479 0.6396 0.7428];
%Input ranges (min/max) of the four input features.
Pr=[min(P(:,1)),max(P(:,1));
min(P(:,2)),max(P(:,2));
min(P(:,3)),max(P(:,3));
min(P(:,4)),max(P(:,4))];
%Create the BPN: 6 tansig hidden neurons, 1 purelin output neuron, trained with Levenberg-Marquardt.
net=newff(Pr,[6,1],{'tansig','purelin'},'trainlm');
%Network initialization with random weights and biases.
%(These settings must come after newff, otherwise newff overwrites them.)
net.layers{1}.initFcn='initwb';
net.inputWeights{1,1}.initFcn='rands';
net.biases{1,1}.initFcn='rands';
net.biases{2,1}.initFcn='rands';
net=init(net);
%Training parameters.
net.trainParam.show=1;
net.trainParam.lr=0.01;
net.trainParam.epochs=500;
net.trainParam.goal=1e-4;
%Train the network, then simulate it on the training inputs to check the fit.
net=train(net,P',T);
a=sim(net,P')
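%(Illustrative sketch, not part of the original script.) A quick visual check
%of the training fit: overlay the simulated output a on the targets T.
figure; plot(T,'o-'); hold on; plot(a,'x--');
legend('target T','BPN output a');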
%Test the network. Because the 2003 value (0.4443) was strongly distorted by
%SARS, we replace it with the mean of the 2002 and 2004 values,
%[P(2002)+P(2004)]/2=0.9374, for a more accurate prediction.
P=[0.6052 0.6479 0.6396 0.7428;
0.6479 0.6396 0.7428 0.8748;
0.6396 0.7428 0.8748 0.9051;
0.7428 0.8748 0.9051 1.0000;
0.8748 0.9051 1.0000 0.9374;
0.9051 1.0000 0.9374 0.8748];
Y=sim(net,P')
%Calculate the relative error of the test predictions.
T=[0.8748 0.9051 1.0000 0.4443 0.8748 1.0481];
Err=(Y-T)./T %If the error is small enough, you can go on to the next step.
%Next, you can make predictions by entering new rows of P, as sketched below.
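%(Illustrative sketch, not part of the original script.) Continuing the
%sliding-window pattern used above, a forecast for the next year feeds the
%four most recent normalized values as one input row; the numbers below are
%an assumed example window, adjust them to your own data.
Pnext=[1.0000 0.9374 0.8748 1.0481]; %assumed window of the last four known values
Ynext=sim(net,Pnext')                %predicted (normalized) value for the next year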