Code search: gradient
Found about 2,951 source-code results matching "gradient".
www.eeworm.com/read/395537/8168827
m uvw.m
function f = uvw(xy);
%
% uvw
% Objective function to minimize in Conjugate Gradient Homework.
% Vectorized to handle a 2 x n input matrix xy,
% whose columns give the evaluation points in the
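The uvw.m hit above shows only a docstring fragment. As a rough illustration of the kind of routine it feeds (nonlinear conjugate-gradient minimisation of a 2-D objective), here is a minimal Fletcher-Reeves sketch in Python/NumPy. The quadratic objective, names, and tolerances are placeholders, not the original homework function.

```python
import numpy as np

def f(xy):
    # Placeholder 2-D objective (the real uvw.m is not shown in full).
    x, y = xy[0], xy[1]
    return (x - 1.0) ** 2 + 10.0 * (y + 2.0) ** 2

def grad(xy):
    x, y = xy[0], xy[1]
    return np.array([2.0 * (x - 1.0), 20.0 * (y + 2.0)])

def conjugate_gradient(x0, tol=1e-6, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start along steepest descent
    for _ in range(max_iter):
        gd = g @ d
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * gd:   # Armijo backtracking
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if d @ g_new >= 0.0:                # lost descent direction: restart
            d = -g_new
        g = g_new
    return x

x_star = conjugate_gradient([5.0, 5.0])
print(x_star)        # converges near the minimiser (1, -2)
```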
www.eeworm.com/read/394381/8227702
m fminusub.m
function [x,FVAL,GRADIENT,HESSIAN,EXITFLAG,OUTPUT] = fminusub(funfcn,x,verbosity,options,Fval,Gval,Hval,varargin)
%FMINUSUB Finds the minimum of a function of several variables.
% Copyright (c)
www.eeworm.com/read/415311/11077208
m backpropagation_cgd.m
function [D, Wh, Wo] = Backpropagation_CGD(train_features, train_targets, params, region)
% Classify using a backpropagation network with a batch learning algorithm and conjugate gradient descent
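Only the signature of `Backpropagation_CGD` appears above. The same idea — batch backpropagation where the weight update is driven by a conjugate-gradient optimiser instead of plain gradient steps — can be sketched with `scipy.optimize.minimize(method='CG')`. The XOR data, network shape, and variable names here are placeholders, not the original toolbox code.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy 2-class problem: XOR-like points.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([0.0, 1.0, 1.0, 0.0])

n_in, n_hid = 2, 8

def unpack(w):
    i = n_in * n_hid
    Wh = w[:i].reshape(n_in, n_hid)      # input -> hidden weights
    j = i + n_hid
    bh = w[i:j]                          # hidden biases
    Wo = w[j:j + n_hid]                  # hidden -> output weights
    bo = w[-1]                           # output bias
    return Wh, bh, Wo, bo

def loss_and_grad(w):
    Wh, bh, Wo, bo = unpack(w)
    # Forward pass over the whole batch.
    H = np.tanh(X @ Wh + bh)
    y = H @ Wo + bo
    e = y - T
    loss = 0.5 * np.mean(e ** 2)
    # Backward pass (batch backpropagation).
    dy = e / len(T)
    dWo = H.T @ dy
    dbo = dy.sum()
    dH = np.outer(dy, Wo) * (1.0 - H ** 2)
    dWh = X.T @ dH
    dbh = dH.sum(axis=0)
    return loss, np.concatenate([dWh.ravel(), dbh, dWo, [dbo]])

w0 = 0.5 * rng.standard_normal(n_in * n_hid + 2 * n_hid + 1)
# jac=True tells SciPy that loss_and_grad returns (loss, gradient).
res = minimize(loss_and_grad, w0, jac=True, method='CG')
print(res.fun)   # final batch training error
```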
www.eeworm.com/read/334860/12568150
m fminusub.m
function [x,FVAL,GRADIENT,HESSIAN,EXITFLAG,OUTPUT] = fminusub(funfcn,x,verbosity,options,Fval,Gval,Hval,varargin)
%FMINUSUB Finds the minimum of a function of several variables.
% Copyright (c)
www.eeworm.com/read/216626/4889417
bas_fun triangle.dg.1.bas_fun
3
2 0
0.5 0.5
1 0 0 0
./triangle.DG.1.bas_fun.so
lambda_1 gradient_lambda_1
2 0
0.0 0.5
1 0 0 0
./triangle.DG.1.bas_fun.so
lambda_2 gradient_lambda_2
2 0
0.5 0.0
1 0 0 0
./triangle.DG.1.bas_fun.s
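The .bas_fun listings above pair each barycentric basis function `lambda_i` with a `gradient_lambda_i` on a reference triangle. As a generic sketch (not the library's file format or code), the gradients of the P1 barycentric coordinates are constant over the triangle and can be computed from the vertex coordinates alone:

```python
import numpy as np

# Rows are the vertex coordinates of a triangle (placeholder geometry).
V = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])

def barycentric_gradients(V):
    # For P1 elements each lambda_i is affine, lambda_i = c0 + cx*x + cy*y,
    # so its gradient (cx, cy) is constant.  The coefficients follow from
    # the interpolation conditions lambda_i(V_j) = delta_ij.
    A = np.hstack([np.ones((3, 1)), V])   # evaluate [1, x, y] at each vertex
    C = np.linalg.solve(A, np.eye(3))     # column i: coefficients of lambda_i
    return C[1:, :].T                     # row i: grad(lambda_i) = (cx, cy)

G = barycentric_gradients(V)
print(G)   # the three gradients; they sum to zero since sum(lambda_i) = 1
```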
www.eeworm.com/read/216626/4889431
bas_fun triangle.rt.1.bas_fun
3
1 0
0.5 0.5
1 0 0 0
./triangle.RT.1.bas_fun.so
lambda_1 gradient_lambda_1
1 1
0.0 0.5
1 0 0 0
./triangle.RT.1.bas_fun.so
lambda_2 gradient_lambda_2
1 2
0.5 0.0
1 0 0 0
./triangle.RT.1.bas_fun.s
www.eeworm.com/read/216626/4889531
bas_fun twin_triangle.2.bas_fun
9
0 0
0.0 0.0
2 0 0 0
./twin_triangle.2.bas_fun.so
phi_1 gradient_phi_1
0 1
1.0 0.0
2 0 0 0
./twin_triangle.2.bas_fun.so
phi_2 gradient_phi_2
0 2
0.5 0.5
2 0 0 0
./twin_triangle.2.bas_fun.so
phi
www.eeworm.com/read/216626/4889554
bas_fun tetrahedron.1.d.bas_fun
4
3 0
0.0 0.0 0.0
1 0 0 0 0
./tetrahedron.1.D.bas_fun.so
lambda_1 gradient_lambda_1
3 0
1.0 0.0 0.0
1 0 0 0 0
./tetrahedron.1.D.bas_fun.so
lambda_2 gradient_lambda_2
3 0
0.0 1.0 0.0
1 0 0 0 0
./t
www.eeworm.com/read/177674/9442393
m demolgd1.m
%DEMOLGD1 Demonstrate simple MLP optimisation with on-line gradient descent
%
% Description
% The problem consists of one input variable X and one target variable
% T with data generated by sampling X
www.eeworm.com/read/176823/9483098
m demolgd1.m
%DEMOLGD1 Demonstrate simple MLP optimisation with on-line gradient descent
%
% Description
% The problem consists of one input variable X and one target variable
% T with data generated by sampling X
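Both DEMOLGD1 hits describe the same Netlab demo, whose MATLAB source is not shown here. Its idea — on-line (per-pattern) gradient descent for a small MLP with one input variable X and one target T — can be sketched as follows; the data, network size, and learning rate are placeholders, not the demo's actual values.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D regression data: the target is a noisy sine of the input.
X = rng.uniform(0.0, 1.0, size=40)
T = np.sin(2 * np.pi * X) + 0.05 * rng.standard_normal(40)

n_hid = 8
W1 = 2.0 * rng.standard_normal(n_hid)   # input -> hidden weights
b1 = rng.standard_normal(n_hid)         # hidden biases
W2 = rng.standard_normal(n_hid)         # hidden -> output weights
b2 = 0.0                                # output bias

def forward(x):
    h = np.tanh(W1 * x + b1)
    return h, W2 @ h + b2

eta = 0.05   # learning rate
for epoch in range(2000):
    for i in rng.permutation(len(X)):        # on-line: one pattern at a time
        h, y = forward(X[i])
        e = y - T[i]
        # Backpropagate the single-pattern squared error.
        gW2 = e * h
        gb2 = e
        gh = e * W2 * (1.0 - h ** 2)
        gW1 = gh * X[i]
        gb1 = gh
        W2 -= eta * gW2; b2 -= eta * gb2
        W1 -= eta * gW1; b1 -= eta * gb1

mse = np.mean([(forward(x)[1] - t) ** 2 for x, t in zip(X, T)])
print(mse)   # final mean-squared training error
```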