cg.m
%
% xstar=cg(func,grad,x0,tol,maxiter)
%
% Use the Fletcher-Reeves conjugate gradient method to minimize a function f(x).
%
% func     name of the function f(x)
% grad     name of the gradient del f(x)
% x0       initial guess
% tol      stopping tolerance
% maxiter  maximum number of iterations allowed
%
function xstar=cg(func,grad,x0,tol,maxiter)
global FUN;
global X;
global P;
%
% Figure out the size of the problem.
%
n=size(x0,1);
%
% Initialize x and oldx. We use oldx=10*x just to make sure that
% it isn't too close to x.
%
x=x0;
fx=feval(func,x);
oldfx=fx*10;
oldx=x*10;
%
% The main loop. While the current solution isn't good enough, keep
% trying... Stop after maxiter iterations in the worst case.
%
% Initialize the iteration count.
%
iter=0;
%
% Get the gradient at the initial point.
%
g=feval(grad,x);
oldg=g;
oldp=zeros(size(g));
%
% The loop.
%
while (iter < maxiter),
%
% Check the termination criteria.
%
  if ((norm(g,2) < sqrt(tol)*(1+abs(fx))) && ...
      (abs(oldfx-fx) < tol*(1+abs(fx))) && ...
      (norm(oldx-x,2) < sqrt(tol)*(1+norm(x,2)))),
%    disp('Problem solved');
%    iter
    xstar=x;
    return;
  end;
%
% Fletcher-Reeves update: beta=g'*g/(oldg'*oldg), p=-g+beta*oldp.
%
  beta=g'*g/(oldg'*oldg);
  p=-g+beta*oldp;
%
% If p is not a good descent direction, then restart with a steepest
% descent step.
%
  if (g'*p >= 0),
    p=-g;
  end;
%
% Setup to minimize f(x+alpha*p).
%
  X=x;
  P=p;
  FUN=func;
%
% Minimize along the line.
%
  [a,u,b]=bracket('subfun');
%  [alphamin,falphamin,newa,newb]=gssearch('subfun',a,u,b,1.0e-5);
  [alphamin,falphamin,newa,newb]=gssearch('subfun',a,u,b,1.0e-10);
%
% The new point is at X+alphamin*P. Update oldx, oldfx, and oldp.
%
  oldx=x;
  oldfx=fx;
  oldp=p;
  x=X+alphamin*P;
  fx=falphamin;
%
% Compute the gradient at the new point.
%
  oldg=g;
  g=feval(grad,x);
%
% Update the iteration counter.
%
  iter=iter+1;
%
% End of the loop.
%
end;
%
% Save the results.
%
xstar=x;
%
% Print out some information.
%
disp('CG method failed');
disp('Iterations were');
iter
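A minimal usage sketch, under stated assumptions: `cg` delegates its line search to `bracket` and `gssearch` (via the helper `subfun` and the `FUN`/`X`/`P` globals), so those files from the same collection must be on the MATLAB path. The quadratic objective, its gradient, and all numeric values below are illustrative choices, not part of the original file; `feval` accepts either function names or function handles, so handles are used here for brevity.

```matlab
% Hypothetical example: minimize f(x) = x'*A*x/2 - b'*x,
% whose gradient is A*x - b, so the minimizer solves A*x = b.
A = [4 1; 1 3];
b = [1; 2];

f    = @(x) 0.5*x'*A*x - b'*x;   % objective f(x)
delf = @(x) A*x - b;             % gradient of f(x)

x0 = [0; 0];                     % initial guess
xstar = cg(f, delf, x0, 1.0e-8, 100);
```

For this strictly convex quadratic, `xstar` should approach the solution of `A*x = b`, which can be checked against `A\b`.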