Code search: Gradient

Found about 2,951 source-code results matching "Gradient"

Code results: 2,951
www.eeworm.com/read/395332/8184142

c filter.c

// Filtering for Image with a variety of filtering kernels // // CV_PREWITT_3x3_V A gradient filter (vertical Prewitt operator). // -1 0 1 // -1 0 1 // -1 0 1 // CV_PREWITT_3x
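The snippet above shows the vertical Prewitt kernel used by this filter. As a rough illustration of what such a gradient filter computes (not the `filter.c` implementation itself), here is a minimal pure-Python sketch that correlates the kernel with a small image; the `convolve3x3` helper and the test image are assumptions for illustration:

```python
# Sketch: applying the vertical Prewitt kernel from the snippet above.
# It responds strongly to horizontal intensity changes (vertical edges).
PREWITT_V = [[-1, 0, 1],
             [-1, 0, 1],
             [-1, 0, 1]]

def convolve3x3(image, kernel):
    """Valid-mode 3x3 correlation over a 2D list-of-lists image."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(h - 2):
        row = []
        for x in range(w - 2):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    acc += kernel[ky][kx] * image[y + ky][x + kx]
            row.append(acc)
        out.append(row)
    return out

# A step edge: left half 0, right half 10 -> strong response everywhere.
img = [[0, 0, 10, 10]] * 4
print(convolve3x3(img, PREWITT_V))  # -> [[30, 30], [30, 30]]
```

Real image-filtering code would also handle borders and normalization; this sketch only shows the kernel's effect on an ideal step edge.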
www.eeworm.com/read/294886/8195827

m nnd12ls.m

function nnd12ls(cmd,arg1) %NND12LS Conjugate gradient line search demonstration. % % This demonstration requires the Neural Network Toolbox. % First Version, 8-31-95. %=====================
www.eeworm.com/read/367442/9748220

m gganders.m

function [alpha,theta,solution,minr,t,maxerr]=... gganders(MI,SG,J,tmax,stopCond,t,alpha,theta) % GGANDERS solves Generalized Anderson's task, generalized gradient. % [alpha,theta,solution,minr,t,m
www.eeworm.com/read/414357/11119051

asv nnd12cg.asv

function nnd12cg(cmd,arg1) %NND12CG Conjugate gradient backpropagation demonstration. % % This demonstration requires the Neural Network Toolbox. % Copyright 1994-2002 PWS Publishing Company an
www.eeworm.com/read/414357/11119196

m nnd12cg.m

function nnd12cg(cmd,arg1) %NND12CG Conjugate gradient backpropagation demonstration. % % This demonstration requires the Neural Network Toolbox. % Copyright 1994-2002 PWS Publishing Company an
www.eeworm.com/read/147096/12584632

m graderr.m

function graderr(finite_diff_deriv, analytic_deriv, evalstr2) %GRADERR Used to check gradient discrepancy in optimization routines. % Copyright (c) 1990-94 by The MathWorks, Inc. err=max(max(a
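The `graderr.m` result above checks how far an analytic gradient is from a finite-difference estimate. A minimal sketch of that idea in Python (the helper names, the central-difference scheme, and the example function are assumptions, not the MathWorks implementation):

```python
# Sketch of the idea behind GRADERR: compare an analytic gradient
# against a finite-difference estimate and report the largest gap.

def finite_diff_grad(f, x, h=1e-6):
    """Central-difference gradient estimate of f at point x."""
    grad = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

def grad_err(fd_grad, analytic_grad):
    """Maximum absolute discrepancy between the two gradients."""
    return max(abs(a - b) for a, b in zip(fd_grad, analytic_grad))

# Example: f(x, y) = x^2 + 3xy has analytic gradient (2x + 3y, 3x).
f = lambda v: v[0] ** 2 + 3 * v[0] * v[1]
x0 = [1.0, 2.0]
fd = finite_diff_grad(f, x0)
err = grad_err(fd, [2 * x0[0] + 3 * x0[1], 3 * x0[0]])
print(err)
```

A small `err` (on the order of `h**2` for central differences) suggests the analytic gradient is coded correctly; a large one usually points to a sign or indexing bug.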
www.eeworm.com/read/134893/13972131

m nnd12ls.m

function nnd12ls(cmd,arg1) %NND12LS Conjugate gradient line search demonstration. % % This demonstration requires the Neural Network Toolbox. % First Version, 8-31-95. %=====================
www.eeworm.com/read/101557/15826753

m graderr.m

function graderr(finite_diff_deriv, analytic_deriv, evalstr2) %GRADERR Used to check gradient discrepancy in optimization routines. % Copyright (c) 1990-94 by The MathWorks, Inc. err=max(max(a
www.eeworm.com/read/165864/10048490

m calcdeltajacobian.m

function jac = CalcDeltaJacobian (x,array,h1,h2,measuredDelay) % function jac = CalcDeltaJacobian (x,array,h1,h2,measuredDelay) % % Computes gradient of the time delay difference function % (see CalcD
www.eeworm.com/read/161189/10439665

m cgls.m

function [X,rho,eta,F] = cgls(A,b,k,reorth,s) %CGLS Conjugate gradient algorithm applied implicitly to the normal equations. % % [X,rho,eta,F] = cgls(A,b,k,reorth,s) % % Performs k steps of the c
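The `cgls.m` result above applies conjugate gradients implicitly to the normal equations A'A x = A'b, i.e. it solves a least-squares problem without ever forming A'A. A minimal pure-Python sketch of that technique (a textbook CGLS recurrence, not the `cgls.m` code, and without its `reorth` and `s` options):

```python
def matvec(A, v):
    """A @ v for a list-of-lists matrix."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def matvec_t(A, v):
    """A' @ v without forming the transpose explicitly."""
    return [sum(A[i][j] * v[i] for i in range(len(A)))
            for j in range(len(A[0]))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cgls(A, b, k):
    """k steps of CG applied implicitly to A'A x = A'b."""
    n = len(A[0])
    x = [0.0] * n
    r = b[:]                 # residual b - A x (x starts at 0)
    s = matvec_t(A, r)       # normal-equations residual A' r
    p = s[:]
    gamma = dot(s, s)
    for _ in range(k):
        q = matvec(A, p)
        alpha = gamma / dot(q, q)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * qi for ri, qi in zip(r, q)]
        s = matvec_t(A, r)
        gamma_new = dot(s, s)
        p = [si + (gamma_new / gamma) * pi for si, pi in zip(s, p)]
        gamma = gamma_new
    return x

# Overdetermined 3x2 system: least-squares solution of A x ~ b.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
print(cgls(A, b, 2))
```

Because A'A here is 2x2 and symmetric positive definite, two CG steps recover the least-squares solution (x, y) = (1, 2) up to roundoff; working with A and A' directly is what makes CGLS attractive for large sparse problems.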