📄 backpropagation_stochastic_multioutput.m

📁 From: Solutions and program code for selected end-of-chapter exercises and computer exercises in Chapters 1, 3, and 5 of Duda, Pattern Classification (2nd edition)
💻 MATLAB (M-file)
function [test_targets, tvh, Wh, Wo, J] = Backpropagation_Stochastic_MultiOutput(train_patterns, train_targets, test_patterns, params)

% Classify using a backpropagation network with a stochastic learning algorithm
% Inputs:
%   train_patterns      - Training patterns (one column per sample)
%   train_targets       - Training targets (one column per sample)
%   test_patterns       - Test patterns
%   params              - [Nh, No, Theta, eta, Mit]: number of hidden units,
%                         number of output units, convergence criterion,
%                         learning rate, and maximum number of iterations
%
% Outputs:
%   test_targets        - Predicted targets
%   tvh                 - Train_verify / Test_hiddens
%   Wh                  - Hidden unit weights
%   Wo                  - Output unit weights
%   J                   - Error throughout the training
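%
% Example call on synthetic data (a sketch only: the data, sizes, and
% parameter values below are illustrative and not taken from the exercise):
%   labels         = randi(3, 1, 300);                 % class index 1..3
%   train_patterns = randn(2, 300);                    % 2-D input patterns
%   train_targets  = -ones(3, 300);                    % one-of-c coding in {-1,+1}
%   train_targets(sub2ind(size(train_targets), labels, 1:300)) = 1;
%   test_patterns  = randn(2, 100);
%   params         = [10, 3, 0.01, 0.1, 5000];         % [Nh, No, Theta, eta, Mit]
%   [test_targets, tvh, Wh, Wo, J] = Backpropagation_Stochastic_MultiOutput(...
%       train_patterns, train_targets, test_patterns, params);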

Nh      = params(1);    % number of hidden units
No      = params(2);    % number of output units
Theta   = params(3);    % convergence criterion
eta     = params(4);    % learning rate
Mit     = params(5);    % maximum number of iterations
iter    = 1;

[Ni, M] = size(train_patterns);  % Ni: number of inputs (features), M: number of training samples

% Find the number of classes
% 'unique(A)' returns the values of A with all repetitions removed
if No == 1
    Uc = length(unique(train_targets));
else
    Uc = size(unique(train_targets', 'rows'), 1);   % each distinct target column is one class
end

% If there are only two classes, remap the targets to {-1, 1}
if (Uc == 2)
    train_targets = (train_targets > 0)*2 - 1;   % 'train_targets > 0' is a logical matrix of the
                                                 % same size: 1 where the entry is positive, 0 otherwise
end
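% For instance, two-class targets stored as [0 1 1 0] become [-1 1 1 -1]
% (illustrative values, not from the original exercise), so they lie inside
% the output range of the tanh unit defined at the bottom of this file.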

% Initialize the net with small random weights
w0  = max(abs(std(train_patterns')'));   % scale taken from the spread of the training inputs
Wh  = rand(Nh, Ni+1).*w0*2 - w0;         % hidden weights, uniform in [-w0, w0] (the +1 is the bias)
Wo  = rand(No, Nh+1).*w0*2 - w0;         % output weights, uniform in [-w0, w0]

% Rescale each weight matrix so the initial net activations fall in the
% active (roughly linear) region of the sigmoid
Wo  = Wo/mean(std(Wo'))*(Nh+1)^(-0.5);
Wh  = Wh/mean(std(Wh'))*(Ni+1)^(-0.5);

Theta = Theta*ones(No,1);   % one convergence threshold per output unit
rate  = 10*Theta;           % start above the threshold to force at least one pass
J     = 1e3*ones(No,1);     % error history, one row per output unit

while any(rate > Theta) && (iter < Mit)
    % Randomly choose one training example
    i  = randperm(M);   % 'randperm(n)' returns a random permutation of the integers 1:n
    m  = i(1);
    Xm = train_patterns(:,m);
    tk = train_targets(:,m);
    
    %Forward propagate the input:
    %First to the hidden units
    gh				= Wh*[Xm; 1];       % '1' is the bias value
    [y, dfh]		= activation(gh);
    %Now to the output unit
    go				= Wo*[y; 1];
    [zk, dfo]	= activation(go);
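    % In matrix form: y = f(Wh*[Xm;1]) at the hidden layer and zk = f(Wo*[y;1])
    % at the output layer, where f is the tanh sigmoid defined at the bottom of
    % this file and dfh, dfo are its derivatives evaluated at the same points.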
    
    %Now, evaluate delta_k at the output: delta_k = (tk-zk)*f'(net)
    delta_k		= (tk - zk).*dfo;
    
    %...and delta_j at the hidden units: delta_j = f'(net_j) * sum_k w_kj * delta_k
    delta_j = zeros(1, Nh);
    for j = 1:Nh
        delta_j(j) = dfh(j)*Wo(:,j)'*delta_k;
    end
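    % A vectorized equivalent of the loop above (a sketch, not part of the
    % original code); it yields a column vector, so the Wh update below would
    % then use delta_j instead of delta_j':
    %   delta_j = dfh .* (Wo(:,1:Nh)' * delta_k);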
    %w_kj <- w_kj + eta*delta_k*y_j
    Wo				= Wo + eta*delta_k*[y;1]';
    
    %w_ji <- w_ji + eta*delta_j*[Xm;1]
    Wh				= Wh + eta*delta_j'*[Xm;1]';
    
    iter 			= iter + 1;

    % Calculate the mean squared error over the whole training set
    Ji = zeros(No,1);
    for t = 1:M
        err = (train_targets(:,t) - activation(Wo*[activation(Wh*[train_patterns(:,t); 1]); 1])).^2;
        Ji  = Ji + err;
    end
    J = [J, Ji/M];
    if No == 1
        rate = abs(J(:,iter) - J(:,iter-1))/J(:,iter-1)*100;   % relative change, in percent
    else
        rate = abs(J(:,iter) - J(:,iter-1));                   % absolute change per output unit
    end
end
% Plot the learning curves of the first (and, if present, second) output unit
plot(J(1,2:end), 'b'); hold on;
if No > 1
    plot(J(2,2:end), 'g');
end
rate    % echo the final convergence rate
disp(['Backpropagation stopped after ' num2str(iter) ' iterations.'])

% Compute the network outputs on the training set (train_verify)
train_verify = zeros(No, size(train_patterns,2));
for i = 1:size(train_patterns,2),
	train_verify(:,i) = activation(Wo*[activation(Wh*[train_patterns(:,i); 1]); 1]);
end

% Classify the test patterns
test_targets = zeros(No,size(test_patterns,2));
test_hiddens = zeros(Nh,size(test_patterns,2));
for i = 1:size(test_patterns,2)
    test_hiddens(:,i) = activation(Wh*[test_patterns(:,i); 1]);
    test_targets(:,i) = activation(Wo*[test_hiddens(:,i); 1]);   % reuse the hidden outputs
end

% For the two-class case, threshold the outputs at zero
if (Uc == 2)
    test_hiddens = test_hiddens > 0;
    test_targets = test_targets > 0;
end

% Return either the test-set hidden activations or the training-set outputs,
% depending on the number of output units (see the 'tvh' output above)
if No == 1
    tvh = test_hiddens;
else
    tvh = train_verify;
end
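
% For the multi-output (one-of-c) case, a caller could turn test_targets into
% class labels by taking the largest output in each column (a usage sketch,
% not part of the original interface):
%   [~, predicted_labels] = max(test_targets, [], 1);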


function [f, df] = activation(x)
% Tanh sigmoid f(x) = a*tanh(b*x) and its derivative, applied element-wise

a  = 1.716;
b  = 2/3;
f  = a*tanh(b*x);
df = a*b*sech(b*x).^2;
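
% A quick numerical check of the derivative above (a command-line sketch; the
% anonymous function 'fa' duplicates the formula because this subfunction is
% not visible outside this file):
%   fa     = @(x) 1.716*tanh((2/3)*x);
%   x      = linspace(-3, 3, 7);
%   df_ana = 1.716*(2/3)*sech((2/3)*x).^2;
%   df_num = (fa(x + 1e-6) - fa(x - 1e-6)) / 2e-6;
%   max(abs(df_ana - df_num))   % should be negligible (around 1e-9 or smaller)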
