
📄 nonlinear_conjugate_gradient.m

📁 RankNGG algorithm implementation. Includes DLL files; source code is MATLAB.
function [x, iter, time_elapsed] = nonlinear_conjugate_gradient(start_x, NCG_parameters, gradient_function, gradient_arguments, verbose)
% Minimize a function using nonlinear conjugate gradient with a secant line search and Polak-Ribiere updates.
%
% See nonlinear_conjugate_gradient_driver for an example of how to use this function.
%
% Uses only gradient information.
%
% No function evaluations needed.
%
% No second derivatives needed.
%
% This is the Polak-Ribiere variant of the nonlinear conjugate gradient
% algorithm, which needs only the gradient and does not require evaluating
% the function itself. By using a secant method for the line search, the
% algorithm also avoids computing second derivatives (the Hessian matrix).
%
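% In outline, the line search finds a step alpha along the direction d with
%
%   eta(alpha) = f'(x + alpha*d)'*d = 0,
%
% via secant updates alpha <- alpha*eta/(eta_prev - eta); the search
% direction is then updated with the Polak-Ribiere coefficient
%
%   beta = r_new'*(r_new - r_old) / (r_old'*r_old),
%
% restarting with steepest descent whenever beta <= 0 or every n iterations.
%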
%% Input
% * start_x            --> d x 1 column vector, starting value.
% * NCG_parameters     --> parameters of the algorithm (see below).
% * gradient_function  --> name of the function which evaluates the gradient [returns a d x 1 column vector].
% * gradient_arguments --> function-specific arguments for the gradient_function.
% * verbose            --> if 1, prints the progress of each iteration.
%
% Nonlinear Conjugate Gradient parameters
%
% * NCG_parameters.epsilon_CG      --> CG error tolerance
% * NCG_parameters.epsilon_secant  --> secant method error tolerance
% * NCG_parameters.iter_max_CG     --> maximum number of CG iterations
% * NCG_parameters.iter_max_secant --> maximum number of secant method iterations
% * NCG_parameters.sigma_0         --> secant method step parameter
%
% Gradient function should be of the form
% 
% [gradient]=gradient_function(x, gradient_arguments);
%
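% For instance, a minimal sketch of a conforming gradient function for a
% hypothetical quadratic f(x) = 0.5*x'*A*x - b'*x (A symmetric), with A and b
% supplied as fields of gradient_arguments:
%
%   function [gradient]=quadratic_gradient(x, gradient_arguments)
%   % gradient of 0.5*x'*A*x - b'*x is A*x - b, a d x 1 column vector
%   gradient = gradient_arguments.A*x - gradient_arguments.b;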
%
%% Output
% * x            --> d x 1 column vector, the minimizer.
% * iter         --> number of outer iterations.
% * time_elapsed --> time taken in seconds.
%
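%% Example
% A minimal, hypothetical call; the parameter values below are illustrative
% only, and my_gradient/args stand in for a user-supplied gradient function
% and its arguments:
%
%   params.epsilon_CG      = 1e-6;
%   params.epsilon_secant  = 1e-4;
%   params.iter_max_CG     = 100;
%   params.iter_max_secant = 20;
%   params.sigma_0         = 0.1;
%   x0 = zeros(d,1);   % starting point, d x 1
%   [x,iter,time_elapsed] = nonlinear_conjugate_gradient(x0, params, 'my_gradient', args, 1);
%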
%% Signature
% Author: Vikas Chandrakant Raykar
% E-Mail: vikas.raykar@siemens.com, vikas@cs.umd.edu
% Date: June 29 2006
%
% See also: nonlinear_conjugate_gradient_driver, preconditioned_nonlinear_conjugate_gradient
%

x=start_x;
n=length(x);                % problem dimension

t0=clock;                   % start timer

i=0;                        % outer (CG) iteration counter
k=0;                        % iterations since the last restart
r=-feval(gradient_function,x, gradient_arguments);   % residual = negative gradient
s=r;                        % no preconditioner, so s = r
d=s;                        % initial search direction: steepest descent
delta_new=r'*d;             % squared norm of the residual, ||r||^2
delta_0=delta_new;

RHS_CG=NCG_parameters.epsilon_CG*NCG_parameters.epsilon_CG*delta_0;       % CG stopping threshold: epsilon_CG^2 * delta_0
RHS_secant=NCG_parameters.epsilon_secant*NCG_parameters.epsilon_secant;   % secant stopping threshold: epsilon_secant^2

while i < NCG_parameters.iter_max_CG && delta_new > RHS_CG
    
   
    j=0;
    delta_d=d'*d;

    % Line search using the secant method: find a step alpha with
    % eta(alpha) = f'(x + alpha*d)'*d = 0 along the direction d.
    % The first step probes at x + sigma_0*d to obtain a second slope sample.
    alpha=-NCG_parameters.sigma_0;
    eta_prev=feval(gradient_function,x+NCG_parameters.sigma_0*d, gradient_arguments)'*d;
    eta=feval(gradient_function,x, gradient_arguments)'*d;
    alpha=alpha*eta/(eta_prev-eta);     % secant step
    x=x+alpha*d;
    eta_prev=eta;
    j=j+1;
    
    while j < NCG_parameters.iter_max_secant && (alpha*alpha*delta_d) > RHS_secant

        eta=feval(gradient_function,x, gradient_arguments)'*d;
        alpha=alpha*eta/(eta_prev-eta);
        x=x+alpha*d;
        eta_prev=eta;
        j=j+1;

    end
    
    % Polak-Ribiere update:
    % beta = r_new'*(r_new - r_old)/(r_old'*r_old) = (delta_new - delta_mid)/delta_old
    r=-feval(gradient_function,x, gradient_arguments);
    delta_old=delta_new;
    delta_mid=r'*s;                 % r_new'*r_old (s still holds the old residual)
    s=r;
    delta_new=r'*s;
    beta=(delta_new-delta_mid)/delta_old;
    k=k+1;
   
    % Restart with the steepest-descent direction every n iterations,
    % or whenever the Polak-Ribiere coefficient is non-positive.
    if k==n || beta<=0
        d=s;
        k=0;
    else
        d=s+beta*d;
    end
    
    i=i+1;    
    
    if verbose==1
        fprintf(1,'Iteration %3d[%4d] Line search iterations=%3d Error=%e[%e]\n',i,NCG_parameters.iter_max_CG,j,sqrt(delta_new/delta_0),NCG_parameters.epsilon_CG);
    end
        
end

time_elapsed=etime(clock,t0);

iter=i;

return;
