Code search: Gradient
About 2,951 source-code results for "Gradient"
www.eeworm.com/read/232704/14185040
m gradient_example.m
%gradient_example.m
%compute the gradient field of a 2D Gaussian function
v=-2:0.25:2;
[x,y]=meshgrid(v,v);              % generate independent variables x, y
z=exp(-(x.^2+y.^2+0.5*x.*y));     % 2D Gaussian function
[px,py]=gradient(z,0.25);         % numerical gradient on the 0.25-spaced grid
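The search preview cuts this snippet off mid-call; the completion above assumes the grid spacing 0.25 from `v`. As a quick sanity check on the idea, here is a pure-Python sketch (no NumPy; the function names `analytic_grad` and `numeric_grad` are my own, not from the original) showing that a central difference, which MATLAB's `gradient` uses at interior points, approximates the analytic gradient of the same Gaussian:

```python
import math

def f(x, y):
    # the 2D Gaussian from the snippet: exp(-(x^2 + y^2 + 0.5*x*y))
    return math.exp(-(x**2 + y**2 + 0.5 * x * y))

def analytic_grad(x, y):
    # closed-form partial derivatives of f
    fx = -(2 * x + 0.5 * y) * f(x, y)
    fy = -(2 * y + 0.5 * x) * f(x, y)
    return fx, fy

def numeric_grad(x, y, h=1e-6):
    # central differences, as gradient() computes at interior grid points
    fx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    fy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return fx, fy
```

At any sample point the two agree to roughly O(h²), e.g. `analytic_grad(0.5, -0.3)` versus `numeric_grad(0.5, -0.3)`.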
www.eeworm.com/read/129915/14217725
m gradient_descent.m
function Min = gradient_descent(a, theta, eta, fun)
% Minimize a function using the basic gradient descent algorithm
%
% Inputs:
%   a     - initial search point
%   theta - convergence criterion (tolerance)
%   eta   - step size (learning rate)
%   fun   - handle of the function to minimize
%
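The function body is not visible in the preview. As an illustration of the basic algorithm the header describes, here is a minimal one-dimensional sketch in Python; approximating the derivative of `fun` by a central difference and stopping when the step falls below `theta` are my assumptions, since the original's update rule and convergence test are cut off:

```python
def gradient_descent(a, theta, eta, fun):
    # Minimize fun starting from a with fixed step size eta.
    # The derivative is estimated by a central difference (assumption);
    # iteration stops once the step is smaller than theta (assumption).
    h = 1e-6
    x = a
    while True:
        g = (fun(x + h) - fun(x - h)) / (2 * h)  # numerical derivative
        step = eta * g
        x -= step                                # descend along -gradient
        if abs(step) < theta:
            return x
```

For a convex quadratic such as `fun = lambda x: (x - 3)**2`, this converges to the minimizer at 3 for any sufficiently small `eta`.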
www.eeworm.com/read/223324/14645524
bmp gradient2.bmp
www.eeworm.com/read/223324/14645526
bmp gradient4.bmp
www.eeworm.com/read/223324/14645531
bmp gradient256.bmp
www.eeworm.com/read/223324/14645535
bmp gradient16.bmp
www.eeworm.com/read/123019/14652000
gif gradient_progress.gif
www.eeworm.com/read/218367/14925463
m forward_gradient.m
function [fdy,fdx]=forward_gradient(f)
% [fdy,fdx]=forward_gradient(f)
% forward differences of f: fdx along rows, fdy along columns,
% zero-padded in the last row/column
[nr,nc]=size(f);
fdx=zeros(nr,nc);
fdy=zeros(nr,nc);
a=f(2:nr,:)-f(1:nr-1,:);
fdx(1:nr-1,:)=a;
b=f(:,2:nc)-f(:,1:nc-1);
fdy(:,1:nc-1)=b;
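The preview truncates the column-difference line; the completion above mirrors the row-difference line symmetrically. The same logic in a pure-Python sketch (lists of lists standing in for the matrix; the function name mirrors the .m file but is my transcription, not the original code):

```python
def forward_gradient(f):
    # forward differences: fdx[i][j] = f[i+1][j] - f[i][j] (rows),
    # fdy[i][j] = f[i][j+1] - f[i][j] (columns),
    # zero-padded in the last row/column, as in forward_gradient.m
    nr, nc = len(f), len(f[0])
    fdx = [[0.0] * nc for _ in range(nr)]
    fdy = [[0.0] * nc for _ in range(nr)]
    for i in range(nr - 1):
        for j in range(nc):
            fdx[i][j] = f[i + 1][j] - f[i][j]
    for i in range(nr):
        for j in range(nc - 1):
            fdy[i][j] = f[i][j + 1] - f[i][j]
    return fdy, fdx  # same output order as the MATLAB signature
```

On a 2×2 input such as `[[1, 2], [4, 8]]`, the last row of `fdx` and last column of `fdy` stay zero.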
www.eeworm.com/read/218367/14925473
m backward_gradient.m
function [bdy,bdx]=backward_gradient(f)
% [bdy,bdx]=backward_gradient(f)
% backward differences of f: bdx along rows, bdy along columns,
% zero-padded in the first row/column
[nr,nc]=size(f);
bdx=zeros(nr,nc);
bdy=zeros(nr,nc);
bdx(2:nr,:)=f(2:nr,:)-f(1:nr-1,:);
bdy(:,2:nc)=f(:,2:nc)-f(:,1:nc-1);
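Again the preview cuts the last line short; the completion mirrors the row-difference line. A pure-Python sketch of the same backward scheme (companion to the forward version above; the transcription is mine):

```python
def backward_gradient(f):
    # backward differences: bdx[i][j] = f[i][j] - f[i-1][j] (rows),
    # bdy[i][j] = f[i][j] - f[i][j-1] (columns),
    # zero-padded in the first row/column, as in backward_gradient.m
    nr, nc = len(f), len(f[0])
    bdx = [[0.0] * nc for _ in range(nr)]
    bdy = [[0.0] * nc for _ in range(nr)]
    for i in range(1, nr):
        for j in range(nc):
            bdx[i][j] = f[i][j] - f[i - 1][j]
    for i in range(nr):
        for j in range(1, nc):
            bdy[i][j] = f[i][j] - f[i][j - 1]
    return bdy, bdx  # same output order as the MATLAB signature
```

Note that the interior values equal those of the forward scheme shifted by one index; only the zero-padded border differs (first row/column here, last row/column there).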