Code search: Gradient

Found about 2,951 source-code results matching "Gradient"
www.eeworm.com/read/216626/4889581

bas_fun rectangle.rt.1.bas_fun

4 1 0 0.0 -1.0 1 0 0 0 ./rectangle.RT.1.bas_fun.so lambda_1 gradient_lambda_1 1 1 1.0 0.0 1 0 0 0 ./rectangle.RT.1.bas_fun.so lambda_2 gradient_lambda_2 1 2 0.0 1.0 1 0 0 0 ./rectangle.RT.1.ba
www.eeworm.com/read/216626/4889590

bas_fun four_tetrahedron.1.bas_fun

7 0 0 0.0 0.0 0.0 1 0 0 0 0 ./four_tetrahedron.1.bas_fun.so lambda_1 gradient_lambda_1 0 1 1.0 0.0 0.0 1 0 0 0 0 ./four_tetrahedron.1.bas_fun.so lambda_2 gradient_lambda_2 0 2 0.0 1.0 0.0 1 0 0 0
www.eeworm.com/read/216626/4889600

bas_fun four_tetrahedron.1.d.bas_fun

4 3 0 0.0 0.0 0.0 1 0 0 0 0 ./four_tetrahedron.1.D.bas_fun.so lambda_1 gradient_lambda_1 3 0 1.0 0.0 0.0 1 0 0 0 0 ./four_tetrahedron.1.D.bas_fun.so lambda_2 gradient_lambda_2 3 0 0.0 1.0 0.0 1 0
www.eeworm.com/read/389321/8533632

m mkramp.m

% IM = mkRamp(SIZE, DIRECTION, SLOPE, INTERCEPT, ORIGIN)
%
% Compute a matrix of dimension SIZE (a [Y X] 2-vector, or a scalar)
% containing samples of a ramp function, with given gradient DIRECTIO
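The snippet above describes a ramp-image generator. A minimal Python/NumPy sketch of the same idea, assuming (as the MATLAB help text suggests but does not fully show here) that DIRECTION is an angle in radians, SLOPE is the change per pixel along that direction, and ORIGIN defaults to the image center:

```python
import numpy as np

def mk_ramp(size, direction=0.0, slope=1.0, intercept=0.0, origin=None):
    """Sample a 2-D ramp: intercept + slope * distance along DIRECTION.

    size: (Y, X) pair or a scalar; direction: angle in radians measured
    from the X axis; origin: (y, x) point where the ramp equals
    INTERCEPT (defaults to the image center). Parameter semantics are
    an assumption based on the truncated help text above.
    """
    if np.isscalar(size):
        size = (size, size)
    ydim, xdim = size
    if origin is None:
        origin = ((ydim - 1) / 2.0, (xdim - 1) / 2.0)
    # Per-axis increments of the ramp's gradient vector.
    xinc = slope * np.cos(direction)
    yinc = slope * np.sin(direction)
    y = yinc * (np.arange(ydim) - origin[0])
    x = xinc * (np.arange(xdim) - origin[1])
    # Outer sum via broadcasting gives the full 2-D ramp.
    return intercept + y[:, None] + x[None, :]
```

With `direction=0` the ramp increases left to right and is constant down each column.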
www.eeworm.com/read/397477/8043482

m modskew.m

function [chm, snrk] = modskew(ch,sk,p);
% Adjust the sample skewness of a vector/matrix, using gradient projection,
% without affecting its sample mean and variance.
%
% This operation is not an ort
www.eeworm.com/read/331444/12828240

m mkramp.m

% IM = mkRamp(SIZE, DIRECTION, SLOPE, INTERCEPT, ORIGIN)
%
% Compute a matrix of dimension SIZE (a [Y X] 2-vector, or a scalar)
% containing samples of a ramp function, with given gradient DIRECTION
%
www.eeworm.com/read/101039/6260180

jbx compierecolor.jbx

[PropertyInfo]
flat,boolean,false,false, , ,true,
flatColor,Color,false,false, , ,true,
gradient,boolean,false,false, , ,true,
gradientLowerColor,Color,false,false, , ,t
www.eeworm.com/read/234502/14110953

1st readme.1st

GRADIENT VECTOR FLOW DEMONSTRATION USING MATLAB
Chenyang Xu and Jerry Prince
Image Analysis and Communications Laboratory
Johns Hopkins University
June 17, 1997, updated on September 9, 1999
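The readme above only names the technique. As a hedged sketch (not the Xu–Prince MATLAB code itself), gradient vector flow diffuses the gradient of an edge map f, iterating the descent update u ← u + μ∇²u − |∇f|²(u − f_x) and likewise for v:

```python
import numpy as np

def gvf(f, mu=0.2, iters=100):
    """Gradient vector flow field (u, v) of a 2-D edge map f.

    Gradient-descent iteration for the GVF energy: each step adds a
    Laplacian smoothing term weighted by mu and a data term that pins
    (u, v) to grad(f) where the edge map's gradient is strong.
    Boundary handling here is periodic (np.roll), an assumption for
    brevity.
    """
    fy, fx = np.gradient(f)
    mag2 = fx**2 + fy**2          # squared gradient magnitude of f

    def lap(a):
        # 5-point Laplacian with wrap-around boundaries.
        return (np.roll(a, 1, 0) + np.roll(a, -1, 0)
                + np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

    u, v = fx.copy(), fy.copy()
    for _ in range(iters):
        u = u + mu * lap(u) - mag2 * (u - fx)
        v = v + mu * lap(v) - mag2 * (v - fy)
    return u, v
```

Far from edges mag2 vanishes, so the field there is purely diffused, which is what lets GVF-driven snakes reach into concave regions.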
www.eeworm.com/read/233016/14173608

m bpdn_obj.m

function [obj,grad,hess] = FnameBPDN( x )
% [obj,grad,hess] = FnameBPDN( x )
% computes the objective value, gradient and diagonal Hessian
% of the linear function lambda e'x, where la
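The help text is cut off, so the exact formulation in FnameBPDN is unknown. A hedged sketch of a BPDN-style objective and its gradient, assuming the standard nonnegative-split formulation in which the l1 penalty reduces to the linear term lambda·e'x mentioned in the snippet:

```python
import numpy as np

def bpdn_smooth(x, A, b, lam):
    """Objective and gradient of 0.5*||A x - b||^2 + lam * sum(x)
    for x >= 0 (nonnegative split), where the penalty is the linear
    function lam * e'x. A, b, lam are hypothetical inputs; the real
    FnameBPDN's signature takes only x, with the rest presumably
    supplied as globals or a closure in the truncated code.
    """
    r = A @ x - b
    obj = 0.5 * (r @ r) + lam * np.sum(x)
    grad = A.T @ r + lam * np.ones_like(x)
    return obj, grad
```

The gradient of the linear penalty is just lam times the all-ones vector, matching the "lambda e'x" description.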
www.eeworm.com/read/224369/14595526

m sep96.m

% sep96.m implements the learning rule described in Bell \& Sejnowski, Vision
% Research, in press for 1997, that contained the natural gradient (w'w).
%
% Bell & Sejnowski hold the patent for this le
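The rule the snippet refers to is the Infomax ICA update with Amari's natural gradient: the plain gradient is post-multiplied by W'W, giving ΔW ∝ (I + (1 − 2g(u))u')W with g the logistic sigmoid, which avoids a matrix inverse. A minimal Python sketch of one batch update (not the sep96.m code itself):

```python
import numpy as np

def infomax_natural_gradient_step(W, x_batch, lr=0.01):
    """One natural-gradient Infomax ICA update:
    W <- W + lr * (I + (1 - 2*g(u)) u') W, averaged over the batch,
    where u = W x and g is the logistic sigmoid. The W'W factor of
    the natural gradient is what turns the plain Infomax gradient
    (which needs inv(W')) into this inverse-free form.
    """
    n = x_batch.shape[1]               # samples stored as columns
    u = W @ x_batch                    # unmixed signals
    y = 1.0 / (1.0 + np.exp(-u))       # logistic nonlinearity g(u)
    I = np.eye(W.shape[0])
    dW = (n * I + (1.0 - 2.0 * y) @ u.T) @ W / n
    return W + lr * dW
```

Repeated over shuffled batches, this drives W toward an unmixing matrix whose outputs are maximally independent.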