/*
** optmum.src - General Nonlinear Optimization
** (C) Copyright 1988-1998 by Aptech Systems, Inc.
** All Rights Reserved.
**
** This Software Product is PROPRIETARY SOURCE CODE OF APTECH
** SYSTEMS, INC. This File Header must accompany all files using
** any portion, in whole or in part, of this Source Code. In
** addition, the right to create such files is strictly limited by
** Section 2.A. of the GAUSS Applications License Agreement
** accompanying this Software Product.
**
** If you wish to distribute any portion of the proprietary Source
** Code, in whole or in part, you must first obtain written
** permission from Aptech Systems.
**
** Written by Ronald Schoenberg
**
** CONTENTS LINE
** -------- ----
** PROC OPTMUM 29
** Global Variables 69
** Using OPTMUM recursively 232
** Source Code 249
**
**-------------------**------------------**-------------------**-----------**
**-------------------**------------------**-------------------**-----------**
**
** PROC OPTMUM
**
** FORMAT
** { x,f,g,retcode } = optmum(&fct,x0)
**
** INPUT
**
** &fct - pointer to a procedure that computes the function to
** be minimized. This procedure must have one input
** argument, a vector of parameter values, and one
** output argument, the value of the function evaluated
** at the input vector of parameter values.
**
** x0 - vector of start values
**
** OUTPUT
** x - vector of parameters at minimum
** f - function evaluated at x
** g - gradient evaluated at x
** retcode - return code:
**
** 0 normal convergence
** 1 forced exit
** 2 maximum number of iterations exceeded
** 3 function calculation failed
** 4 gradient calculation failed
** 5 Hessian calculation failed
** 6 step length calculation failed
** 7 function cannot be evaluated at initial parameter values
** 8 number of elements in the gradient vector inconsistent
** with number of starting values
** 9 gradient function returned a column vector rather than
** the required row vector
** 10 secant update failed
** 20 Hessian failed to invert
**
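** EXAMPLE
**
**    A minimal sketch of a call (the function and start values
**    here are illustrative, not from the original file):
**
**        proc fct(x);
**            retp( (x[1]-1)^2 + (x[2]+2)^2 );
**        endp;
**
**        x0 = { 0, 0 };
**        { x,f,g,retcode } = optmum(&fct,x0);
**
**    On normal convergence retcode is 0 and x holds the minimizer.
**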
**
**-------------------**------------------**-------------------**-----------**
**-------------------**------------------**-------------------**-----------**
**
** GLOBAL VARIABLES LINE
**
**    (default values in parentheses)
** __title - string, title ("") 84
** _opalgr - scalar, optimization algorithm (2) 86
** _opstep - scalar, selects type of step length (2) 93
** _opshess - scalar or KxK matrix, selects starting Hessian (0) 105
** _opfhess - KxK matrix, contains final Hessian 111
** _opmbkst - scalar, # of backsteps in computing steplength (10) 114
** _opgtol - scalar, convergence tolerance for gradient (1e-5) 120
** _opgdprc - scalar, pointer to gradient procedure (0) 124
** _ophsprc - scalar, pointer to Hessian procedure (0) 143
** _opgdmd - scalar, numerical gradient method (1) 158
** _opparnm - Kx1 Char. vector, parameter names (0) 163
** _opdfct - scalar, criterion for change in function (.001) 165
** _opditer - scalar, # of iters to switch algorithms (20) 169
** _opmiter - scalar, maximum number of iterations (1e+5) 173
** _opmtime - scalar, maximum time in iterations in minutes (1e+5) 175
** _oprteps - scalar, radius of random direction (0) 183
** _opusrch - scalar, flag for user-controlled line search (0) 188
** _opdelta - scalar, floor of Hessian eigenvalues in NEWTON (.1) 192
** _opstmth - string, contains starting method ("") 196
** _opmdmth - string, contains "middle" method ("") 204
** _opkey - scalar, keyboard capture flag (1) 210
** _opgrdh - scalar, increment size for computing gradient (0) 219
**
** __title - string, title of run
**
** _opalgr - scalar, indicator for optimization method:
**              = 1  SD (steepest descent)
**              = 2  BFGS (Broyden, Fletcher, Goldfarb, Shanno - default)
** = 3 Scaled BFGS
** = 4 Self-Scaling DFP (Davidon, Fletcher, Powell)
** = 5 NEWTON (Newton-Raphson)
** = 6 Polak-Ribiere Conjugate Gradient
**
** _opstep - scalar, indicator determining the method for computing step
** length.
**              = 1, steplength = 1
**              = 2, STEPBT (default)
**              = 3, golden section steplength
**              = 4, Brent's method
**
**              Usually _opstep = 2 will be best.  If the optimization bogs
**              down, try setting _opstep = 1 or 3.  _opstep = 3 will generate
**              slow iterations but faster convergence, while _opstep = 1 will
**              generate fast iterations but slower convergence.
**
** _opshess - scalar or KxK matrix, determines the starting hessian for
** BFGS, DFP, and Newton methods.
** = 0, start with identity matrix (default)
**              = 1, compute a starting hessian
** = matrix, user-defined starting hessian.
**
** _opfhess - KxK matrix, contains the final Hessian, if one has been
** calculated. If the inversion of the Hessian fails during
** NEWTON iterations this matrix can be analyzed for linear
** dependencies which will suggest tactics for re-specifying
** the model.
**
** _opmbkst - scalar, maximum number of backsteps taken to find step length.
** Default = 10.
**
** _opgtol - scalar, convergence tolerance for gradient of estimated
**              coefficients.  Default = 1e-5.  When this criterion has been
**              satisfied, OPTMUM will exit the iterations.
**
** _opgdprc - scalar, pointer to a procedure that computes the gradient of the
** function with respect to the parameters. For example,
** the instruction:
**
** _opgdprc=&gradproc
**
**              will tell OPTMUM that a gradient procedure exists and
**              where to find it.  The user-provided procedure has a
** single input argument, a vector of parameter values, and
** a single output argument, a vector of gradients of the
** function with respect to the parameters evaluated at the
** vector of parameter values. For example, suppose the
** procedure is named gradproc and the function is a quadratic
** function with one parameter: y=x^2+2*x+1, then
**
** proc gradproc(x); retp(2*x+2); endp;
**
** Default = 0, i.e., no gradient procedure has been provided.
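**              For the same quadratic, a complete call supplying
**              this gradient might look like (a sketch; the names
**              fct and gradproc are illustrative):
**
**                  proc fct(x);      retp(x^2 + 2*x + 1); endp;
**                  proc gradproc(x); retp(2*x + 2);       endp;
**
**                  _opgdprc = &gradproc;
**                  x0 = 1;
**                  { x,f,g,retcode } = optmum(&fct,x0);
**
**              The minimum of this function is at x = -1.
**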
**
** _ophsprc - scalar, pointer to a procedure that computes the hessian,
** i.e., the matrix of second order partial derivatives of the
** function with respect to the parameters. For example, the
** instruction:
**
** _ophsprc=&hessproc;
**
** will tell OPTMUM that a procedure has been provided for the
** computation of the hessian and where to find it. The
** procedure that is provided by the user must have a single
** input argument, the vector of parameter values, and a single
** output argument, the symmetric matrix of second order
** derivatives of the function evaluated at the parameter
** values.
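**
**              For the one-parameter quadratic above, y = x^2+2*x+1,
**              the corresponding Hessian procedure would be (a
**              sketch; the name hessproc is illustrative):
**
**                  proc hessproc(x); retp(2); endp;
**
**                  _ophsprc = &hessproc;
**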
**
** _opgdmd - scalar, method for computing numerical gradient.
** = 0, central difference
** = 1, forward difference (default)
** = 2, forward difference, Richardson Extrapolation
**
** _opparnm - Kx1 character vector, parameter labels.
**
** _opdfct - scalar, criterion for change in function which will cause
** OPTMUM to switch algorithms when __design is nonzero.
** Default = .001.
**
** _opditer - scalar, criterion for maximum number of iterations before
** switching algorithms when __design is nonzero.
** Default = 20.
**
** _opmiter - scalar, maximum number of iterations. Default = 1e+5.
**
** _opmtime - scalar, maximum time in iterations in minutes.
** Default = 1e+5, about 10 weeks.
**
** _oprteps - scalar, if _oprteps is set to a nonzero value (1e-2, say)
** and all other line search methods fail then
** OPTMUM will attempt a random direction with radius
** determined by _oprteps.
**