conjgrad.m
function [x,ev,j] = conjgrad (x0,tol,v,m,f)
%-----------------------------------------------------------------------
% Usage:       [x,ev,j] = conjgrad (x0,tol,v,m,f)
%
% Description: Use the Fletcher-Reeves version of the conjugate
%              gradient method to solve the following n-dimensional
%              unconstrained optimization problem:
%
%                  minimize: f(x)
%
% Inputs:       x0  = n by 1 vector containing initial guess
%               tol = error tolerance used to terminate search (tol >= 0)
%               v   = order of method:
%
%                        v        method
%                        ---------------------------
%                        1   steepest descent
%                        n   full conjugate gradient
%                        ---------------------------  
%
%               m   = maximum number of iterations (m >= 1)
%               f   = string specifying name of objective function:
%                     f(x). The form of f is:
%
%                       function y = f(x)
%
%                     When f is called with the n by 1 vector x, it 
%                     must return the scalar value of the objective 
%                     function f(x). 
%
% Outputs:      x  = n by 1 solution vector
%               ev = number of scalar function evaluations performed
%               j  = number of iterations performed. If j < m, then
%                    the following convergence criterion was satisfied,
%                    where eps denotes the machine epsilon, h is the
%                    step length, and d is the search direction:
%
%                       (||df(x)/dx|| <= tol) or (h*||d|| <= 2*sqrt(eps)*||x||) 
%
% Notes:        If f(x) is quadratic, conjgrad should converge in at
%               most n iterations. 
%-----------------------------------------------------------------------

% Initialize

   n = length(x0);
   chkvec (x0,1,'conjgrad');
   tol = args (tol,0,tol,3,'conjgrad');
   v   = args (v,1,n,4,'conjgrad');
   m   = args (m,1,m,5,'conjgrad');
   e = 2*sqrt(eps);
   h = 1;
   j = 0;
   k = 0;
   ev = 0;
   d  = zeros (n,1);        
   g1 = zeros (n,1);
   gamma = tol + 1;      
   err = 0;
   dx = 1;
   x  = x0;

   hwbar = waitbar(0,'Computing Optimum: conjgrad');
   while (j < m) && (gamma > tol) && (dx > e*norm(x,inf)) && ~err

% Compute gradient and its norm 

      waitbar (max(j/m,tol/gamma),hwbar)
      r = dot (g1,g1);           % ||g||^2 from previous iteration, for beta
      g1 = gradfmu (f,'','',x,0);
      ev = ev + 2*n;
      gamma = norm (g1,inf);
      
% Compute search direction 
         
      if k < 1
         d = -g1;
      else 
         beta = dot(g1,g1)/r;
         d = -g1 + beta*d;
      end
      delta = norm (d,inf);
            
% Perform a line search along direction d 

      [a,b,c,err,eval] = bracket (delta,x,d,0,f,'','');
      ev = ev + eval;
      if ~err 
         [h,eval] = golden (x,d,a,c,e,0,f,'','');
         ev = ev + eval;
         dx = h*norm(d,inf);
         x = x + h*d;
      end

% Update indices, check for restart 
   
      j = j + 1;
      k = k + 1;
      if k == v 
         k = 0; 
      end
   end
   close(hwbar)
%-----------------------------------------------------------------------
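The helper `gradfmu` is not part of this listing, but the bookkeeping `ev = ev + 2*n` suggests it uses central differences, which cost two function evaluations per component. A minimal NumPy sketch of such an estimator (the name `grad_central` and the step size `h` are my own choices, not the toolbox's interface):

```python
import numpy as np

def grad_central(f, x, h=1e-6):
    """Estimate the gradient of f at x by central differences.

    Costs 2*n scalar evaluations of f, matching the ev = ev + 2*n
    bookkeeping in conjgrad above."""
    n = len(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)   # O(h^2) accurate
    return g
```

For f(x) = x1^2 + 3*x2 at x = (1, 2) this returns approximately (2, 3).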

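The line search is delegated to the external `bracket` and `golden` routines, also not shown here. As a stand-in, here is a self-contained golden-section search for the step h minimizing f(x + h*d) over a given bracket [a, c]; the function name and tolerance are assumptions, not the toolbox's actual interface:

```python
import numpy as np

def golden_step(f, x, d, a, c, tol=1e-8):
    """Golden-section search for the step h minimizing f(x + h*d) on [a, c]."""
    r = (np.sqrt(5.0) - 1.0) / 2.0        # inverse golden ratio, about 0.618
    b, e = c - r * (c - a), a + r * (c - a)
    fb, fe = f(x + b * d), f(x + e * d)
    while (c - a) > tol:
        if fb < fe:                       # minimum lies in [a, e]
            c, e, fe = e, b, fb
            b = c - r * (c - a)
            fb = f(x + b * d)
        else:                             # minimum lies in [b, c]
            a, b, fb = b, e, fe
            e = a + r * (c - a)
            fe = f(x + e * d)
    return 0.5 * (a + c)
```

For f(x) = ||x||^2 from x = (1, 0) along d = (-1, 0) with bracket [0, 2], the minimizing step is h = 1.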

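The Notes above state that conjgrad should converge in at most n iterations when f is quadratic. The sketch below, a hypothetical NumPy translation specialized to f(x) = 0.5*x'*A*x - b'*x with an exact line search in place of bracket/golden, reproduces the same Fletcher-Reeves direction update and restart logic:

```python
import numpy as np

def fr_quadratic(A, b, x0, tol=1e-10, restart=None):
    """Fletcher-Reeves conjugate gradient on f(x) = 0.5*x'Ax - b'x,
    with an exact step length and restart every `restart` iterations
    (restart = n mirrors the full conjugate gradient case v = n)."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    restart = n if restart is None else restart
    d = np.zeros(n)
    g_old_sq = 0.0
    k = 0
    iters = 0
    for j in range(10 * n):
        g = A @ x - b                          # gradient of the quadratic
        if np.linalg.norm(g, np.inf) < tol:
            break
        if k == 0:
            d = -g                             # (re)start: steepest descent
        else:
            d = -g + (g @ g / g_old_sq) * d    # Fletcher-Reeves update
        g_old_sq = g @ g
        h = -(g @ d) / (d @ (A @ d))           # exact step along d
        x = x + h * d
        iters += 1
        k = (k + 1) % restart                  # periodic restart, as in conjgrad
    return x, iters
```

With restart = 1 every direction is the raw negative gradient, which is the steepest-descent case (v = 1) described in the header.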

