
mlpjacob.m

From: Toolkit for recursive Bayesian estimation
Language: MATLAB (.m)
function g = mlpjacob(net, x)
%MLPJACOB Backpropagate gradient of error function for 2-layer network.
%
%   Description
%   G = MLPJACOB(NET, X) takes a network data structure NET and an
%   input vector X and returns a matrix G whose J, K element contains
%   the derivative of network output K with respect to input parameter J.
%
%   See also
%   MLP, MLPGRAD, MLPBKP
%
%   Copyright (c) Christopher M Bishop, Ian T Nabney (1996, 1997)
%                 This function coded by Rudolph van der Merwe (2002)

[y, z, a] = mlpfwd(net, x);

% Select the output-unit deltas according to the output function.
switch net.outfn
  case 'linear'
    deltas = eye(net.nout);
  case 'logistic'
    deltas = diag(y.*(1 - y));
  case 'softmax'
    deltas = diag(y) - y'*y;
  otherwise
    error(['Unknown output function ', net.outfn]);
end

% Backpropagate the deltas through the second-layer weights and the
% tanh hidden-unit derivative (1 - z.^2).
delhid = deltas*net.w2';
delhid = delhid.*repmat((1.0 - z.*z), net.nout, 1);

% Finally, evaluate the first-layer gradients.
g = (delhid*net.w1')';
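The Jacobian above can be verified against central finite differences. A minimal sketch in Python/NumPy, re-implementing the linear-output case with tanh hidden units (the weight names w1, b1, w2, b2 and the random network are illustrative assumptions, not part of the original toolkit):

```python
import numpy as np

rng = np.random.default_rng(0)
nin, nhid, nout = 3, 5, 2

# Hypothetical 2-layer network weights (tanh hidden layer, linear output).
w1 = rng.standard_normal((nin, nhid)); b1 = rng.standard_normal(nhid)
w2 = rng.standard_normal((nhid, nout)); b2 = rng.standard_normal(nout)

def forward(x):
    """Forward pass: returns output y and hidden activations z."""
    z = np.tanh(x @ w1 + b1)
    return z @ w2 + b2, z

def jacobian(x):
    """Analytic Jacobian G[j, k] = dy_k / dx_j, as in mlpjacob's
    linear-output branch: w1 @ diag(1 - z^2) @ w2."""
    _, z = forward(x)
    return w1 @ ((1 - z**2)[:, None] * w2)   # shape (nin, nout)

x = rng.standard_normal(nin)
J = jacobian(x)

# Central finite-difference check of each input dimension.
eps = 1e-6
J_num = np.zeros((nin, nout))
for j in range(nin):
    dx = np.zeros(nin); dx[j] = eps
    J_num[j] = (forward(x + dx)[0] - forward(x - dx)[0]) / (2 * eps)
```

The analytic and numerical Jacobians should agree to roughly the square root of machine precision, which is a standard sanity check before trusting a backpropagated derivative.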
