
📄 fisher_nc_snn.m

📁 Neural network toolbox
💻 MATLAB (M-file)
function F = fisher_nc_snn(X, net, data)
%FISHER_NC_SNN Approximation of the derivatives of the cost function with
%              respect to connection weights and biases (Fisher matrix).
%
% Syntax
%
%   F = fisher_nc_snn(net, data)
%   F = fisher_nc_snn(X, net, data)
%
% Description
%
%   FISHER_NC_SNN takes
%    net    - a net_struct
%    data   - the data for net.costFcn.name
%    X      - a vector containing connection weights and biases (optional)
%   and returns
%    F      - an approximation of the Fisher matrix.
%
% Algorithm
%
%   The approximation is obtained by ignoring the cross terms that
%   occur from changes in weights across layers, and by neglecting
%   correlations between the derivatives gamma_l and the activities y_{l-1}.
%   (See: On Natural Learning and Pruning in Multilayered
%         Perceptrons; Tom Heskes; Neural Computation 12)
%
%#function lintf_snn exptf_snn logsigtf_snn radbastf_snn tansigtf_snn
%#function dlintf_snn dexptf_snn dlogsigtf_snn dradbastf_snn dtansigtf_snn
%#function wcf_snn d2wcf_snn

% Sort out the two calling conventions.
if (nargin == 2)         % fisher_nc_snn(net, data)
   data = net;
   net  = X;
elseif (nargin == 3)     % fisher_nc_snn(X, net, data)
   net = setx_snn(net, X);
end

M = net.numLayers;
q = size(data.P, 2);     % number of training patterns

% Index ranges of each layer's weights and biases within the parameter vector.
indb = 1;
inde = 0;
for m = 1:M
    indb = inde + 1;
    inde = inde + prod(size(net.weights{m}));
    Xindices{m}.w = [indb:inde];
    indb = inde + 1;
    inde = inde + size(net.biases{m}, 1);
    Xindices{m}.b = [indb:inde];
end

% Forward pass: net inputs N{m} and activities V{m} for every layer.
N{1}  = net.weights{1}*data.P + repmat(net.biases{1}, 1, size(data.P, 2));
V{1}  = feval(net.transferFcn{1}, N{1});
n0    = size(net.weights{1}, 2);
nl{1} = size(net.biases{1}, 1);
if (M > 1)
    for m = 2:M
       N{m}  = net.weights{m}*V{m-1} + repmat(net.biases{m}, 1, q);
       V{m}  = feval(net.transferFcn{m}, N{m});
       nl{m} = size(net.biases{m}, 1);
    end
end

% Backward pass: gamma{m} holds the derivatives of the network outputs with
% respect to the net inputs of layer m, stacked over all output units and patterns.
gamma{M} = repmat(...
             reshape(...
               feval(feval(net.transferFcn{M}, 'deriv'), N{M}, V{M}), ...
               nl{M}*q, 1), ...
             1, nl{M}) ...
           .* ...
           reshape(shiftdim(repdim(diag(ones(nl{M},1)), q), 1), nl{M}*q, nl{M});

for m = [(M-1):-1:1]
    gamma{m} = (gamma{m+1}*net.weights{m+1}) .* ...
                reshape(...
                  shiftdim(...
                    repdim(...
                      feval(feval(net.transferFcn{m}, 'deriv'), N{m}, V{m})', ...
                      nl{M}), ...
                    2), ...
                  nl{M}*q, nl{m});
end

% Second derivative of the cost function with respect to the network outputs.
H = feval(feval(net.costFcn.name, '2deriv'), net, data, V{M}, V{M});

% Input and activity correlation matrices, and the per-layer GAMMA terms.
C_0 = (data.P * data.P')/q;
for m = 1:M
    GAMMA{m} = gamma{m}' * H * gamma{m};
    C{m}     = (V{m} * V{m}')/q;
end

% Assemble the Fisher approximation; blocks coupling different layers stay zero.
F = zeros(size(getx_snn(net), 1));
F(Xindices{1}.w, Xindices{1}.w) = kron(C_0, GAMMA{1});
F(Xindices{1}.b, Xindices{1}.b) = GAMMA{1};
F(Xindices{1}.w, Xindices{1}.b) = kron(mean(data.P, 2), GAMMA{1});
F(Xindices{1}.b, Xindices{1}.w) = F(Xindices{1}.w, Xindices{1}.b)';
if (M > 1)
   for m = 2:M
       F(Xindices{m}.w, Xindices{m}.w) = kron(C{m-1}, GAMMA{m});
       F(Xindices{m}.b, Xindices{m}.b) = GAMMA{m};
       F(Xindices{m}.w, Xindices{m}.b) = kron(mean(V{m-1}, 2), GAMMA{m});
       F(Xindices{m}.b, Xindices{m}.w) = F(Xindices{m}.w, Xindices{m}.b)';
   end
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function B = repdim(A, q)
% REPDIM replicates A q times along a new trailing dimension.
sizeA = size(A);
B = reshape(repmat(reshape(A, prod(sizeA), 1), 1, q), [sizeA q]);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
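
For reference, the blocks assembled in the final loop can be restated in LaTeX. Here W_m and b_m denote the weights and biases of layer m, V_0 stands for the input matrix data.P, and \bar{v}_{m-1} for the mean activity of the previous layer; this notation is introduced only for exposition and does not appear in the file.

\[
C_{m-1} = \frac{1}{q}\, V_{m-1} V_{m-1}^{\top}, \qquad
\Gamma_m = \gamma_m^{\top} H \, \gamma_m ,
\]
\[
F_{W_m W_m} \approx C_{m-1} \otimes \Gamma_m, \qquad
F_{b_m b_m} \approx \Gamma_m, \qquad
F_{W_m b_m} \approx \bar{v}_{m-1} \otimes \Gamma_m ,
\]

with all blocks coupling different layers left at zero, which is exactly the "ignore cross terms across layers" approximation described in the header comment.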

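A hedged usage sketch follows. It only relies on what the file itself reveals: the two calling conventions from the docstring, the net_struct fields that fisher_nc_snn reads (numLayers, weights, biases, transferFcn, costFcn.name), the data.P field, and the function names listed in the %#function pragmas. How a net_struct is normally constructed in the surrounding snn toolbox, and which additional data fields the cost function wcf_snn expects (e.g. targets), are assumptions and may differ.

% Minimal sketch, assuming the snn toolbox is on the path and a net_struct
% may be populated by hand (the toolbox may provide its own constructor).
nIn = 3; nHid = 5; nOut = 2; q = 100;

net.numLayers    = 2;
net.weights      = {0.1*randn(nHid, nIn), 0.1*randn(nOut, nHid)};
net.biases       = {zeros(nHid, 1),       zeros(nOut, 1)};
net.transferFcn  = {'tansigtf_snn', 'lintf_snn'};   % names from the %#function pragmas
net.costFcn.name = 'wcf_snn';                       % cost function, also from the pragmas

data.P = randn(nIn, q);    % training inputs, one pattern per column
% (any further fields, such as targets, are whatever net.costFcn.name requires)

F = fisher_nc_snn(net, data);        % Fisher approximation at the current weights
% or, at an arbitrary parameter vector X:
X = getx_snn(net);
F = fisher_nc_snn(X, net, data);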