range,   with  two  "parameters,"  e.g.,  along  the  North-South  and
East-West directions.  We wish to  find  the  lowest  valley  in  this
terrain.  ASA approaches this problem similar to using a bouncing ball
that can bounce over mountains from valley to valley.  We start  at  a
high  "temperature,"  where  the  temperature is an ASA parameter that
mimics the effect of a fast moving particle in a hot object like a hot
molten  metal,  thereby  permitting the ball to make very high bounces
and being able to bounce over any mountain to access any valley, given
enough  bounces.   As  the  temperature is made relatively colder, the
ball cannot bounce so high, and it also can settle to  become  trapped
in relatively smaller ranges of valleys.
<P>
We imagine that our mountain range is aptly described by a &quot;cost
function.&quot;  We define probability distributions of the two directional
parameters, called generating distributions since they generate
possible valleys or states we are to explore.  We define another
distribution, called the acceptance distribution, which depends on the
difference of cost functions of the present generated valley we are to
explore and the last saved lowest valley.  The acceptance distribution
decides probabilistically whether to stay in a new lower valley or  to
bounce  out  of  it.   All the generating and acceptance distributions
depend on temperatures.
<P>
The ASA code was first developed in 1987 as Very  Fast  Simulated
Reannealing  (VFSR)  to deal with the necessity of performing adaptive
global optimization on multivariate nonlinear  stochastic  systems[2].
The first published use of VFSR for a complex system was in combat
analysis, using a model of combat first developed in  1986,  and  then
applied  to  exercise  and  simulation data in a series of papers that
spanned 1988-1993[3].  The first applications to combat analysis  used
code written in RATFOR and converted into FORTRAN.  Other applications
since then have used new code  written  in  C.   (The  ASA-NOTES  file
contains some comments on interfacing ASA with FORTRAN codes.)
<P>
In  November  1992, the VFSR C-code was rewritten, e.g., changing
to the use of long descriptive names, and made publicly  available  as
version  6.35  under a &quot;copyleft&quot; GNU General Public License (GPL)[4],
and copies were placed in NETLIB and STATLIB.
<P>
Beginning in January 1993, many adaptive features were developed,
largely  in  response  to  users'  requests, leading to this ASA code.
Until 1996, ASA was located at <A HREF="http://www.alumni.caltech.edu/~ingber/">http://www.alumni.caltech.edu/~ingber/</A>.
Pointers  were  placed  in  NETLIB  and STATLIB to this location.  ASA
versions 1.1 through 5.13 retained the GPL,  but  subsequent  versions
through  this  one  have  incorporated a simpler <A HREF="#ASA-LICENSE">ASA-LICENSE</A>, based in
part  on  a  University  of  California  license,  that  protects  the
integrity  of  the  algorithm, promotes widespread usage, and requires
reference to current source code.  As the archive grew, more room  and
maintenance  was  required, and in February 1996 the site was moved to
the present ingber.com location.  Pointers were placed in the  Caltech
site  to  this  location.   <A HREF="http://alumni.caltech.edu/~ingber">http://alumni.caltech.edu/~ingber</A>  is  the
mirror homepage for the ASA site.  Beginning in January 2007, ASA also
is listed at <A HREF="http://asa-caltech.sourceforge.net">http://asa-caltech.sourceforge.net</A>
(<A HREF="http://asa-caltech.sf.net">http://asa-caltech.sf.net</A>).
<P>
ASA has been examined in the context of a review  of  methods  of
simulated   annealing   using   annealing   versus  quenching  (faster
temperature schedules than  permitted  by  basic  heuristic  proof  of
ergodicity)[5].   A  paper  has  indicated  how  this technique can be
enhanced by combining it with some other powerful algorithms, e.g., to
produce  an  algorithm  for  parallel computation[6].  ASA is now used
world-wide  across  many  disciplines[7,8,9,10],  including   specific
disciplines         such         as        finance[11,12,13,14,15,16],
neuroscience[17,18,19,20], and combat analyses[21,22,23,24,25].   Some
papers  illustrate  the  combined  use  of  ASA  for  optimization and
sampling[26].  The <A HREF="http://www.ingber.com/asa_papers.html">http://www.ingber.com/asa_papers.html</A> file in the
ASA archive contains references to some patents and papers using ASA
and VFSR.
<BR>
<P>
5.2.  <A NAME="Outline-of-ASA-Algorithm">Outline of ASA Algorithm</A> [<A HREF="#To-Top-of-ASA-READMEhtml">To-Top-of-ASA-README.html</A>]
<P>
Details of the ASA algorithm are best obtained from the published
papers.  There are three parts to its basic structure.
<P>
5.2.1.  <A NAME="Generating-Probability-Density-Function">Generating Probability Density Function</A> [<A HREF="#To-Top-of-ASA-READMEhtml">To-Top-of-ASA-README.html</A>]
<P>
In a D-dimensional parameter space with parameters p^i having
ranges [A_i, B_i], about the k'th last saved point (e.g., a local
optimum), p_k^i, a new point is generated using a distribution defined
by the product of distributions for each parameter, g^i(y^i; T_i),  in
terms  of  random  variables  y^i  in [-1, 1], where p_k+1^i = p_k^i +
y^i(B_i - A_i), and &quot;temperatures&quot; T_i,
<BR>
<BR> g^i(y^i; T_i) = 1/[2(|y^i| + T_i) ln(1 + 1/T_i)].
<BR>
The   DEFINE_OPTIONS   <A HREF="#USER-GENERATING-FUNCTION-FALSE">USER_GENERATING_FUNCTION</A>   permits   using   an
alternative to this ASA distribution function.
<P>
5.2.2.  <A NAME="Acceptance-Probability-Density-Function">Acceptance Probability Density Function</A> [<A HREF="#To-Top-of-ASA-READMEhtml">To-Top-of-ASA-README.html</A>]
<P>
The cost functions at the new and last saved points are compared
via their difference, C(p_k+1) - C(p_k), using a uniform random
generator, U in [0, 1), in a &quot;Boltzmann&quot; test: If
<BR>
<BR> exp[-(C(p_k+1) - C(p_k))/T_cost] &gt; U,
<BR>
where T_cost is the &quot;temperature&quot; used for this  test,  then  the  new
point  is  accepted  as  the  new  saved point for the next iteration.
Otherwise, the last  saved  point  is  retained.   The  DEFINE_OPTIONS
<A HREF="#USER-ACCEPT-ASYMP-EXP-FALSE">USER_ACCEPT_ASYMP_EXP</A>    or    <A HREF="#USER-ACCEPT-THRESHOLD-FALSE">USER_ACCEPT_THRESHOLD</A>    permit   using
alternatives to this Boltzmann distribution function.
<P>
5.2.3.  <A NAME="Reannealing-Temperature-Schedule">Reannealing Temperature Schedule</A> [<A HREF="#To-Top-of-ASA-READMEhtml">To-Top-of-ASA-README.html</A>]
<P>
The annealing schedule for each parameter temperature, T_i, from
a starting temperature T_0i, is
<BR>
<BR> T_i(k_i) = T_0i exp(-c_i k_i^(1/D)).
<BR>
This is discussed further below.
<P>
The  annealing  schedule  for  the  cost temperature is developed
similarly to the  parameter  temperatures.   However,  the  index  for
reannealing  the cost function, k_cost, is determined by the number of
accepted points, instead of the number of generated points as used for
the parameters.  This choice was made because the Boltzmann acceptance
criterion uses an exponential distribution, which is not as fat-tailed
as the ASA distribution used for the parameters.  This schedule can be
modified using several OPTIONS.  In particular, the Pre-Compile
DEFINE_OPTIONS <A HREF="#USER-COST-SCHEDULE-FALSE">USER_COST_SCHEDULE</A> permits quite arbitrary functional
modifications for this annealing schedule.
<P>
As  determined  by  the  Program  Options selected, the parameter
&quot;temperatures&quot; may be periodically adaptively reannealed, or increased
relative   to  their  previous  values,  using  their  relative  first
derivatives with respect to the cost function,  to  guide  the  search
&quot;fairly&quot; among the parameters.
<P>
As determined by the Program Options selected, the reannealing of
the cost temperature resets the scale of the annealing of the cost
acceptance criterion as
<BR>
<BR> T_cost(k_cost) = T_0cost exp(-c_cost k_cost^(1/D)).
<BR>
The new T_0cost is taken to be the minimum of the current initial cost
temperature and the maximum of the absolute values  of  the  best  and
last   cost  functions  and  their  difference.   The  new  k_cost  is
calculated taking T_cost as the maximum of the current value  and  the
absolute  value  of  the  difference  between  the last and best saved
minima of the cost function, constrained not  to  exceed  the  current
initial cost temperature.  This procedure essentially resets the scale
of the annealing of the cost  temperature  within  the  scale  of  the
current best or last saved minimum.
<P>
This default algorithm for reannealing the cost temperature takes
advantage of the ASA importance sampling that relates most
specifically to the parameter temperatures.  While often quite
efficient for some systems, it may lead to dwelling too long in local
minima for other systems.  In such cases, the user may also experiment
with alternative algorithms effected using the
<A HREF="#OPTIONS--gt-Reanneal-Cost-1-">Reanneal_Cost</A> OPTIONS, discussed below.  For example, ASA provides an
alternative calculation for the cost temperature, when <A HREF="#OPTIONS--gt-Reanneal-Cost-1-">Reanneal_Cost</A> &lt;
-1  or  &gt; 1, that periodically calculates the initial and current cost
temperatures or  just  the  initial  cost  temperature,  resp.,  as  a
deviation over a sample of cost functions.
<P>
These  reannealing  algorithms  can  be changed adaptively by the
user  as  described  below  in  the  sections  <A HREF="#USER-REANNEAL-COST-FALSE">USER_REANNEAL_COST</A>  and
<A HREF="#USER-REANNEAL-PARAMETERS-FALSE">USER_REANNEAL_PARAMETERS</A>.
<P>
5.3.  <A NAME="Efficiency-Versus-Necessity">Efficiency Versus Necessity</A> [<A HREF="#To-Top-of-ASA-READMEhtml">To-Top-of-ASA-README.html</A>]
<P>
ASA  is not necessarily an &quot;efficient&quot; code.  For example, if you
know that your cost function to be optimized is something close  to  a
parabola,  then  a  simple  gradient  Newton search method most likely
would be faster than ASA.  ASA is  believed  to  be  faster  and  more
robust  than  other  simulated  annealing  techniques for most complex
problems with multiple local optima; again, be careful  to  note  that
some  problems  are  best  treated by other algorithms.  If you do not
know much about the structure of your system, and especially if it has
complex constraints, and you need to search for a global optimum, then
this ASA code is heartily recommended to you.
<P>
In the context of efficiency and necessity, the  user  should  be
alert to recognize that any sampling or optimization program generally
should be considered as complementary to, not a substitute for, gaining
knowledge of a particular system.  Unlike relatively &quot;canned&quot; codes
that exist for (quasi-)linear systems, nonlinear systems typically are
non-typical.   Often  some  homework  must  be  done to understand the
system, and tuning often is required of numerical algorithms  such  as
ASA.  For example, while principal component analysis (PCA) often
suffices to generate good (quasi-)orthogonal or (quasi-)independent
sets  of  parameters,  this is not true for general nonlinear systems.
While such innovations as reannealing take good advantage of ASA,
which offers independent distributions for each parameter, this generally
may not be a good  substitute  for  a  user-defined  front-end,  e.g.,
before  the  call  to <A HREF="#double-asa-">asa ()</A> or even embedded within the <A HREF="#double-cost-function-">cost_function</A>
(), to interpret and define relevant parameters.
<P>
The ASA-NOTES file contains the sections  @@Number  of  Generated
States Required and @@Judging Importance-Sampling, recommending use of
log-log plots to extrapolate the number of generated  states  required
to  attain  a  global  minimum,  possibly  as  a  function of selected
OPTIONS.
<P>
6.  <A NAME="Outline-of-Use">Outline of Use</A> [<A HREF="#To-Top-of-ASA-READMEhtml">To-Top-of-ASA-README.html</A>]
<P>
Set up the ASA interface: Your program should be divided into two
basic  modules.   (1)  The user calling procedure, containing the cost
function to be minimized (or its negative  if  you  require  a  global
maximum), is contained in asa_usr.c, asa_usr.h and asa_usr_cst.c.  (2)
The ASA optimization procedure is contained in asa.c and asa.h.  The
file  asa_usr_asa.h  contains  definitions  and  macros common to both
asa.h  and  asa_usr.h.   Furthermore,  there  are  some   options   to
explore/read below.  It is assumed there will be no confusion over the
standard uses of the term &quot;parameter&quot; in different contexts, e.g.,  as
