📄 grad1.m
function df = grad1(x)
%GRAD1 Calculates the Jacobian of the objective function for training NNPLS.
% Routine to calculate the Jacobian of fun1 for optimization with leastsq.
% grad1 is different from gradient, which is used with the conjugate
% gradient routine cgrdsrch.
% See Optimization Toolbox.
% I/O syntax is df = grad1(x)
% Copyright
% Thomas Mc Avoy
% 1994
% Distributed by Eigenvector Technologies
% Modified by BMW 5-8-95
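% The indexing below implies a single-hidden-layer sigmoidal net for the
% NNPLS inner relation, u = beta(n+1) + sum_j beta(j)*s(kin(1,j)*t + kin(2,j))
% with n hidden sigmoids s(.). The packing of x is read off the code below;
% which row of kin holds the weights and which the biases is an assumption.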
global Tscores Uscores
% these global variables need to be added to inner.m
t = Tscores;                             % input scores from the PLS outer model
u = Uscores;                             % output scores the inner net is fit to
n = (length(x)-1)/3;                     % number of hidden sigmoids (x has 3n+1 entries)
beta = x(1:n+1);                         % n+1 output-layer weights
kin = [x(n+2:2*n+1)'; x(2*n+2:3*n+1)'];  % remaining 2n entries as a 2-by-n matrix
% gradnet1 calculates the Jacobian
df=gradnet1(t,u,kin,beta);
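
Below is a minimal calling sketch, not part of the original distribution. It assumes a residual routine fun1.m is on the path, that leastsq follows the old Optimization Toolbox convention x = leastsq('fun', x0, options, 'gradfun'), and uses made-up score data; only the packing of x is taken from grad1 itself.

% Hypothetical usage sketch for grad1 (assumptions noted above).
global Tscores Uscores
Tscores = linspace(-2,2,25)';                 % example input scores t
Uscores = tanh(Tscores) + 0.05*randn(25,1);   % example output scores u

n     = 3;                                    % number of hidden sigmoids (chosen here)
beta0 = 0.1*randn(n+1,1);                     % initial output-layer weights
kin0  = 0.1*randn(2,n);                       % initial hidden-layer parameters
x0    = [beta0; kin0(1,:)'; kin0(2,:)'];      % pack to match grad1's indexing

options = [];                                 % default options for the old leastsq
x  = leastsq('fun1', x0, options, 'grad1');   % fit the inner-relation net
df = grad1(x);                                % Jacobian at the solution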