Code search: Gradient

Found about 2,951 source-code matches for "Gradient"

Code results: 2,951
www.eeworm.com/read/414357/11119009

m nnd12ls.m

function nnd12ls(cmd,arg1)
%NND12LS Conjugate gradient line search demonstration.
%
% This demonstration requires the Neural Network Toolbox.
% Copyright 1994-2002 PWS Publishing Company and T…
www.eeworm.com/read/413912/11137204

m rbfbkp.m

function g = rbfbkp(net, x, z, n2, deltas)
%RBFBKP Backpropagate gradient of error function for RBF network.
%
% Description
% G = RBFBKP(NET, X, Z, N2, DELTAS) takes a network data structure NET
% to…
www.eeworm.com/read/411674/11233818

m contents.m

% Pre-image problem for RBF kernel.
%
% rbfpreimg  - Schoelkopf's fixed-point algorithm.
% rbfpreimg2 - Gradient optimization.
% rbfpreimg3 - Kwok-Tsang's algorithm.
%
% About: Statistical Pattern…
www.eeworm.com/read/375212/9369069

m nnpls1.m

function [n,wts,upred]=nnpls1(t,u,ttest,utest,ii,opts)
%NNPLS1 Calculates a single NN-PLS factor
% Routine to carry out NNPLS. A conjugate gradient optimization
% subroutine is supplied. If the u…
www.eeworm.com/read/279380/10442817

m grad.m

function [G,Gx,Gy,Gz] = grad(dx,dy,dz)
% [G] = grad(dx,dy,dz)
% Creates the 3D finite volume gradient operator
% operator is set up to handle variable grid discretization
% dx,dy,dz are vectors contai…
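grad.m assembles the discrete gradient as a matrix that differences cell-centered values across faces, with dx holding the (possibly non-uniform) cell widths. A minimal 1D sketch of that idea in Python — the function name and dense-matrix layout are my own illustration, not taken from grad.m, whose exact 3D stencil may differ:

```python
def gradient_operator_1d(dx):
    """Build a dense matrix G mapping n cell-centered values to the
    n-1 interior face gradients on a possibly non-uniform 1D grid.
    dx[i] is the width of cell i."""
    n = len(dx)
    G = [[0.0] * n for _ in range(n - 1)]
    for i in range(n - 1):
        # distance between the centers of cells i and i+1
        h = 0.5 * (dx[i] + dx[i + 1])
        G[i][i] = -1.0 / h      # upwind cell enters with minus sign
        G[i][i + 1] = 1.0 / h   # downwind cell enters with plus sign
    return G
```

On a uniform grid (all dx equal to 1) this reduces to the familiar first-difference matrix; a production version, like grad.m, would store G sparsely and extend the same pattern to y and z.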
www.eeworm.com/read/159921/10588555

m gganders2.m

function [alpha,theta,solution,minr,t,maxerr]=...
    gganders2(MI,SG,J,tmax,stopCond,t,alpha,theta)
% GGANDERS2 solves Generalized Anderson's task, generalized gradient.
% [alpha,theta,solution,minr,t…
www.eeworm.com/read/421949/10677249

m gganders2.m

function [alpha,theta,solution,minr,t,maxerr]=...
    gganders2(MI,SG,J,tmax,stopCond,t,alpha,theta)
% GGANDERS2 solves Generalized Anderson's task, generalized gradient.
% [alpha,theta,solution,minr,t…
www.eeworm.com/read/448535/7531255

m ellipsecg.m

% Plot contours of an ellipse with large eigenvalue disparity
% and the results of conjugate gradient.
% Copyright 1999 by Todd K. Moon
v1 = [1;1]; v2 = [1; -1];
lambda1 = 100; lambda2 = 5;
www.eeworm.com/read/448535/7531305

m conjgrad1.m

function [x,D] = conjgrad1(Q,b)
% function [x,D] = conjgrad1(Q,b)
%
% Solve the equation Qx = b using conjugate gradient, where Q is symmetric
%
% Q = symmetric matrix
% b = right-hand side
%
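conjgrad1.m solves Qx = b by conjugate gradient for symmetric Q. A minimal Python sketch of the same iteration (assuming Q is also positive definite, as CG requires; the function name and list-of-lists representation are mine, not from conjgrad1.m):

```python
def conjgrad(Q, b, tol=1e-10):
    """Solve Q x = b for symmetric positive-definite Q by conjugate gradient."""
    n = len(b)
    x = [0.0] * n
    r = b[:]               # residual r = b - Q x (x starts at zero)
    d = r[:]               # first search direction is the residual
    rs_old = sum(ri * ri for ri in r)
    for _ in range(n):     # in exact arithmetic CG finishes in <= n steps
        Qd = [sum(Q[i][j] * d[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(d[i] * Qd[i] for i in range(n))
        x = [x[i] + alpha * d[i] for i in range(n)]
        r = [r[i] - alpha * Qd[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        # new direction: residual made Q-conjugate to the previous direction
        d = [r[i] + (rs_new / rs_old) * d[i] for i in range(n)]
        rs_old = rs_new
    return x
```

This is why the ellipsecg.m demo above pairs badly-conditioned ellipses with CG: the larger the eigenvalue disparity of Q, the more the early iterations zig-zag before the conjugate directions pay off.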
www.eeworm.com/read/396828/8088373

m traingd_snn.m

function [net, result] = traingd_snn(net, dataLV, dataVV, dataTV)
%TRAINGD_SNN Gradient Descent training.
%
% Syntax
%
%   [net, tr_info] = traingd_snn(net, dataLV)
%   [net, tr_info] = traingd_snn(…
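traingd_snn trains a network by plain gradient descent: repeatedly step the parameters opposite the error gradient, scaled by a learning rate. The core update, sketched in Python on a one-dimensional quadratic loss (the names and the toy objective are illustrative only, not from traingd_snn):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)   # the update every gradient-descent trainer shares
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2*(x - 3);
# the minimizer is x = 3.
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

A network trainer applies the same update to every weight at once, with the gradient supplied by backpropagation (as in the rbfbkp.m entry above) rather than a closed-form derivative.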