grad1.m
From "偏最小二乘算法在MATLAB中的实现" (Implementation of the Partial Least Squares Algorithm in MATLAB) · M code · 23 lines
function df=grad1(x)
%GRAD1 Calculates the Jacobian of the objective function for training NNPLS.
% Routine to calculate the Jacobian of fun1 for optimization with leastsq.
% Note that grad1 differs from the gradient routine used with the
% conjugate gradient search cgrdsrch.
% See the Optimization Toolbox.
% I/O syntax is df = grad1(x)
% Copyright
% Thomas Mc Avoy
% 1994
% Distributed by Eigenvector Technologies
% Modified by BMW 5-8-95
global Tscores Uscores
% these global variables must also be declared in inner.m
t=Tscores;              % input scores from the PLS outer model
u=Uscores;              % output scores to be fitted by the inner network
n=(length(x)-1)/3;      % number of hidden units implied by the packed vector x
beta=x(1:n+1);          % output-layer parameters (n weights plus one extra term)
kin=[x(n+2:2*n+1)';x(2*n+2:3*n+1)'];  % input-layer parameters, reshaped 2-by-n
% gradnet1 calculates the Jacobian
df=gradnet1(t,u,kin,beta);
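For context, a minimal sketch of how a routine like this would be wired into the (1990s-era) Optimization Toolbox `leastsq` call is given below. The hidden-unit count `n`, the residual function `fun1`, and the option settings are assumptions for illustration, not part of the original file:

```matlab
% Hypothetical usage sketch: fit one NNPLS inner relation with leastsq,
% supplying grad1 as the user-defined Jacobian. fun1 and gradnet1 are
% assumed to exist as in the original toolbox distribution.
global Tscores Uscores
Tscores = t;                 % input scores from the PLS outer model
Uscores = u;                 % output scores to be fitted
n  = 3;                      % example number of hidden units (assumed)
x0 = randn(3*n+1,1);         % packed parameters: beta followed by the two kin rows
opts = foptions;             % old-style options vector for the 1994 toolbox
opts(7) = 1;                 % flag that a user-supplied gradient is provided
x = leastsq('fun1',x0,opts,'grad1');
```

The key point is the packing convention: `x` holds `n+1` output-layer parameters first, then the two rows of `kin` back to back, which is exactly what the unpacking code in `grad1` assumes.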