
📄 bp2dd.m

📁 Backpropagation implemented in MATLAB, with Indonesian-language comments
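The script below trains a 2-2-1 sigmoid network on the XOR problem. The error terms it computes rely on the identity that the logistic sigmoid σ(x) = 1/(1+e⁻ˣ) has derivative σ(x)·(1−σ(x)), so the gradient can be written entirely in terms of the unit's output. A minimal sketch verifying that identity numerically (the helper names `sigmoid` and `sigmoid_deriv` are illustrative, not from the script):

```python
import math

def sigmoid(x):
    """Logistic sigmoid with slope 1 (the script uses sigmf with slope 0.5)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_deriv(x):
    """Derivative expressed through the output itself: s * (1 - s)."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Central-difference check of sigma'(x) == sigma(x) * (1 - sigma(x))
x, h = 0.7, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(abs(numeric - sigmoid_deriv(x)) < 1e-6)  # True
```

This is why expressions such as `Out_Hid*(1-Out_Hid)` appear in the delta computations below: the derivative falls out of values already computed in the forward pass.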
💻 M
%%%%%%%% Back Propagation Algorithm : Naresh K Bansal%%%%%%%%%%
clc;
clear all;
close all;
tic;
Inp = [0 0; 0 1; 1 0; 1 1]; %%%%%%% Input
Targ = [0 ; 1 ; 1 ; 0];
%Tp = 0.20;      %%%%%%% Target Output
%Xo(1) = input('Enter First Input : ');
%Xo(2) = input('Enter Second Input : ');
%Tp = input('Enter Targeted Output : ');
Wi = [1.9 1.9; 17 17]; %%%% Initialize Input Weights
Wo = [-20 20]'; %%%% Initialize Output Weights
Bi = [0.5 -1]; %%%%%%%% Bias to Hidden Units
Bo = -1; %%%%%%%%%%%%% Bias to Output Unit
eta = 0.3;  %%%%%%%% Learning Constant
%Out_Hid = 0.5; %%%%%%%%%%%%%% Just to start the process
S = 1;
i = 1;
p = 1;
%%%%%%%%% Sigmoidal Activation Function Used == tansig %%%%%%%%%%

%while abs(Tp - Out_Hid) > 0.0001
%save Naresh Wo;
%for S = 1:50,
Xo = Inp(1,:);
L = Wi*Xo' + Bi';  %%%%%%%%%%%%%%%%%%% L holds the net input to each hidden node
%O = tansig(L);  %%%%%%%% Passing the output through the activation function
O = sigmf(L,[0.5 0]); %%%% logistic sigmoid, slope 0.5 (sigmf needs the Fuzzy Logic Toolbox; 1./(1+exp(-0.5*L)) is equivalent)
%%%%%%%%%%%%%% Error Term %%%%%%%%%
%% CHECK AGAIN
%%%%%%%%%% For Output Units
M = Wo'*O + Bo;

%Out_Hid = tansig(M);
Out_Hid = sigmf(M,[0.5 0]);

Tp = Targ(1);

MSE = 10;

while MSE > 0.25
%while abs(Tp - Out_Hid) > 0.20
L = Wi*Xo' + Bi'; % net input to the hidden layer
O = sigmf(L,[0.5 0]); % output of the hidden layer
M = Wo'*O + Bo; % net input to the output unit
Out_Hid = sigmf(M,[0.5 0]); % network output

%%%%%%%Error for output units
    del_o = Out_Hid*(1-Out_Hid)*(Tp - Out_Hid); %%% delta for the output unit (the 0.5 slope factor of sigmf is effectively absorbed into eta)

%%% For Hidden Units

%del_j1 = O*(1-O)*(sum(Wo.*del_o));
%del_j2 = O*(1-O)*(sum(Wo.*del_o));

del_j = O.*(1-O).*(del_o.*(Wo));  %%% 2-by-1 vector: one error term per hidden node

%%%%%%%%%% Weight Updates %%%%%%%%%%%%%%%%%%%%
%%% Here O should be the vector of outputs coming from the hidden nodes

%Wo = Wo';

Wo = Wo + (eta*del_o).*O;  %%% Output weight update

%%% CHECK IT AGAIN           eta*del*x OR eta*del*O1

Wi = Wi + eta.*(del_j*Xo);  %%% Input weight update (del_j*Xo is the 2-by-2 outer product)


disp('Iteration Number');disp(S);
Mean_Square_Error = MSE
Err(S) = abs(Tp - Out_Hid);
if(mod(S,4) == 0)
MSE = sqrt(sum(Err(S-3:S).^2)); % root of the summed squared error over the last 4 patterns (one full pass)
% plot(S,MSE);
% hold on
end

S = S+1;
%i = i+1;
i = mod(i,4)+1;
Xo = Inp(i,:);
Tp = Targ(i);
% if(S>50),
% for g = 1:4:S-5,
%     
%        K(p) = sqrt(sum(Err(g:g+4).^2));
%     p = p+1;
% end
% plot(K)
% hold on
% grid on;
% xlabel('Iteration Number --->');
% ylabel('Mean Square Error');
% title('Convergence of BackPropagation Algorithm')
% end
% if(S==200),
%     pause(10);
% end
end

%toc;
disp('***************************  RESULTS    ***************************************')

Final_Input_Weights = Wi
Final_Output_Weights = Wo
Total_Number_of_Iterations_Taken = S
Total_Time_Taken_in_Seconds = toc

%********************************** Plot Results **********************************


%plot(Converge(1, 2:S)')
% p = 1;
for i = 1:4:S-5
    K(p) = sqrt(sum(Err(i:i+4).^2)); % error over a sliding window of patterns
    p = p+1;
end

%MSE = sqrt(sum(Err(S-3:S).^2));
plot(K)
hold on
grid on;
xlabel('Iteration Number --->');
ylabel('Mean Square Error');
title('Convergence of BackPropagation Algorithm')
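For readers without MATLAB, the same algorithm can be sketched in pure Python. This is a hedged re-implementation, not the author's code: `train_xor`, the random initialization, and the learning-rate and epoch values are illustrative choices (the sigmoid here uses slope 1 rather than the script's 0.5).

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(eta=0.5, epochs=5000, seed=1):
    """Train a 2-2-1 sigmoid network on XOR with plain backpropagation.

    Returns the network's outputs for the four XOR patterns after training.
    """
    rng = random.Random(seed)
    # input->hidden weights (2x2), hidden->output weights (2), and biases
    wi = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    bi = [rng.uniform(-1, 1) for _ in range(2)]
    wo = [rng.uniform(-1, 1) for _ in range(2)]
    bo = rng.uniform(-1, 1)
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

    for _ in range(epochs):
        for x, t in data:
            # forward pass
            h = [sigmoid(wi[j][0]*x[0] + wi[j][1]*x[1] + bi[j]) for j in range(2)]
            y = sigmoid(wo[0]*h[0] + wo[1]*h[1] + bo)
            # deltas, using the logistic derivative s*(1-s) as in the script
            d_o = y * (1 - y) * (t - y)
            d_h = [h[j] * (1 - h[j]) * d_o * wo[j] for j in range(2)]
            # weight and bias updates (delta rule)
            for j in range(2):
                wo[j] += eta * d_o * h[j]
                wi[j][0] += eta * d_h[j] * x[0]
                wi[j][1] += eta * d_h[j] * x[1]
                bi[j] += eta * d_h[j]
            bo += eta * d_o

    outs = []
    for x, _ in data:
        h = [sigmoid(wi[j][0]*x[0] + wi[j][1]*x[1] + bi[j]) for j in range(2)]
        outs.append(sigmoid(wo[0]*h[0] + wo[1]*h[1] + bo))
    return outs

outs = train_xor()
```

Unlike the script, this sketch also updates the biases each step; the MATLAB version keeps `Bi` and `Bo` fixed at their initial values.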
