Code search: Gradient

Found about 2,951 source-code results matching "Gradient"

www.eeworm.com/read/386625/8734521

m pcgs.m

%PCGS Preconditioned conjugate gradient squared method % % [X,RESIDS,ITS]=PCGS(A,B,X0,RTOL,PRTOL,MAX_IT,MAX_TIME,MAX_MFLOP) % solves the system AX = B using the preconditioned conjuga…
www.eeworm.com/read/386625/8734557

m pcg.m

%PCG Preconditioned conjugate gradient method % % [X,RESIDS,ITS]=PCG(A,B,X0,RTOL,PRTOL,MAX_IT,MAX_TIME,MAX_MFLOP) % solves the system AX = B using the preconditioned conjugate gradie…
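The pcg.m entry above only shows the help text, but its interface (system AX = B, initial guess X0, relative tolerance RTOL, iteration cap MAX_IT, returned residual history) is the standard preconditioned conjugate gradient contract. A rough Python sketch of a routine with that shape follows; the names `pcg` and `M_inv` here are illustrative, not taken from the listed file, and this is not the file's actual implementation:

```python
import numpy as np

def pcg(A, b, x0, rtol=1e-8, max_it=1000, M_inv=None):
    """Preconditioned conjugate gradient for a symmetric positive
    definite system A x = b.  M_inv applies the inverse of the
    preconditioner to a residual vector (identity if None).
    Returns (x, residual_norms, iterations), mirroring the
    [X,RESIDS,ITS] outputs described in the snippet."""
    if M_inv is None:
        M_inv = lambda r: r
    x = np.asarray(x0, dtype=float).copy()
    r = b - A @ x                    # initial residual
    z = M_inv(r)                     # preconditioned residual
    p = z.copy()                     # first search direction
    rz = r @ z
    resids = [np.linalg.norm(r)]
    for it in range(1, max_it + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        resids.append(np.linalg.norm(r))
        if resids[-1] <= rtol * resids[0]:
            return x, resids, it
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p    # conjugate update of direction
        rz = rz_new
    return x, resids, max_it
```

With `M_inv = lambda r: r / np.diag(A)` this becomes Jacobi-preconditioned CG; without a preconditioner it reduces to plain CG.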
www.eeworm.com/read/429481/8806340

cnt skinseg.cnt

:Base skinseg.hlp :Title skinseg help 1 Segmentation Parameters 2 Image Smoothing=IMAGE_SMOOTHING@skinseg.hlp 2 Percentage of Highest Gradient Edges=HIGHEST_GRADIENTS@skinseg.hlp 2 Minimum Region
www.eeworm.com/read/377948/9256331

m nnd12ls.m

function nnd12ls(cmd,arg1) %NND12LS Conjugate gradient lines search demonstration. % % This demonstration requires the Neural Network Toolbox. % First Version, 8-31-95. %=====================…
www.eeworm.com/read/375212/9369011

m nplsbld1.m

function [wts,upred]= nplsbld1(t,u,ii,n,plots) %NNPLSBLD1 Carries out NNPLS when model structure is already known. % A conjugate gradient optimization subroutine is supplied. If the user % has the…
www.eeworm.com/read/361257/10062725

m nnd12ls.m

function nnd12ls(cmd,arg1) %NND12LS Conjugate gradient lines search demonstration. % % This demonstration requires the Neural Network Toolbox. % First Version, 8-31-95. %=====================…
www.eeworm.com/read/424063/10501002

m graderr.m

function graderr(finite_diff_deriv, analytic_deriv, evalstr2) %GRADERR Used to check gradient discrepancy in optimization routines. % Copyright (c) 1990-94 by The MathWorks, Inc. err=max(max(a…
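The graderr.m snippet describes the classic sanity check for optimization code: compare a finite-difference approximation of the gradient against the analytic one and report the largest discrepancy. The MathWorks routine's internals are not shown above; the following is a minimal Python analogue under that description (the name `grad_err` and the central-difference choice are assumptions, not details from graderr.m):

```python
import numpy as np

def grad_err(f, grad_f, x, h=1e-6):
    """Maximum absolute discrepancy between a central
    finite-difference gradient of f and the analytic gradient
    grad_f, evaluated at the point x."""
    x = np.asarray(x, dtype=float)
    fd = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        # central difference in coordinate i: O(h^2) accurate
        fd[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return float(np.max(np.abs(fd - grad_f(x))))
```

A large returned value usually means a sign error or a missing term in the analytic gradient rather than finite-difference noise.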
www.eeworm.com/read/352364/10559332

f90 iccg在解有限差分方程中的应用.f90 (ICCG applied to solving finite-difference equations)

SUBROUTINE ICCG(A,N,N1,N2,M1,M2,B,X,D,R,P,Q,EPS,ITR,IER,S) !*********************************************************************** ! INCOMPLETE CHOLESKY DECOMPOSITION CONJUGATED GRADIENT METH…
www.eeworm.com/read/159921/10588509

m gganders.m

function [alpha,theta,solution,minr,t,maxerr]=... gganders(MI,SG,J,tmax,stopCond,t,alpha,theta) % GGANDERS solves Generalized Anderson's task, generalized gradient. % [alpha,theta,solution,minr,t,m…
www.eeworm.com/read/421949/10677201

m gganders.m

function [alpha,theta,solution,minr,t,maxerr]=... gganders(MI,SG,J,tmax,stopCond,t,alpha,theta) % GGANDERS solves Generalized Anderson's task, generalized gradient. % [alpha,theta,solution,minr,t,m…