
📄 cg.cc

📁 COOOL: CWP Object-Oriented Optimization Library. COOOL is a collection of C++ classes.
//============================================================
// COOOL           version 1.1           ---     Nov,  1995
//   Center for Wave Phenomena, Colorado School of Mines
//============================================================
//
//   This code is part of a preliminary release of COOOL (CWP
// Object-Oriented Optimization Library) and associated class
// libraries.
//
// The COOOL library is a free software. You can do anything you want
// with it including make a fortune.  However, neither the authors,
// the Center for Wave Phenomena, nor anyone else you can think of
// makes any guarantees about anything in this package or any aspect
// of its functionality.
//
// Since you've got the source code you can also modify the
// library to suit your own purposes. We would appreciate it
// if the headers that identify the authors are kept in the
// source code.
//
//=============================
// Definition of the conjugate gradient class
// Non-linear conjugate gradient algorithm
// author:   Wenceslau Gouveia
// modified: H. Lydia Deng, 02/23/94, 03/14/94
//=============================

#include <CG.hh>
#include <defs.hh>

static const char* myNameIs = "Conjugate Gradient";

const char* ConjugateGradient::className() const
{
    return (myNameIs);
}

ConjugateGradient::ConjugateGradient(LineSearch* p, int it, double eps)
    : LineSearchOptima(p)
{
    iterMax = it;
    tol     = eps;
    iterNum = 0;
}

ConjugateGradient::ConjugateGradient(LineSearch* p, int it, double eps, int verb)
    : LineSearchOptima(p, verb)
{
    iterMax = it;
    tol     = eps;
    iterNum = 0;
}

Model<double> ConjugateGradient::optimizer(Model<double>& model0)
{
    // reset the residue history for every new optimizer
    iterNum   = 0;
    isSuccess = 0;
    if (residue != NULL)
    {
        delete residue;
        residue = new List<double>;
    }

    int n = model0.modSize();
    Model<double>  model1(model0);      // new model
    Vector<double> search(n);           // search direction
    Vector<double> g0(n);               // old gradient vector
    Vector<double> g1(n);               // new gradient vector
    double beta;                        // beta parameter
    double lambda  = .025;              // line search parameter
    double descent = 0.;                // descent direction

    // Beginning iterations
    g0 = ls->gradient(model0);

    // check the gradient, in case the initial model is the optimal, Lydia 03/08/95
    double err = (double)sqrt(g0*g0);
    if (isVerbose) cerr << "Initial residue : " << err << endl;
    NonQuadraticOptima::appendResidue(err);     // residual
    if (err < tol)
    {
        if (isVerbose) cerr << "Initial guess was great! \n";
        isSuccess = 1;
        return model0;
    }

    // Considering first iteration
    search  = -1. * g0;
    descent = search * g0;
    model1  = ls->search(model0, search, descent, lambda);
    g1      = ls->gradient(model1);     // Gradient at new model
    err     = (double)sqrt(g1*g1);
    if (isVerbose)
        cerr << "Iteration (0) : " << "current value of the objective function: "
             << ls->currentValue() << "\t current residue: " << err << endl;
    NonQuadraticOptima::appendResidue(err);     // residual

    iterNum = 0;
    double temp;
    do
    {
        iterNum++;
        temp  = 1./(g0*g0);
        beta  = (g1-g0)*g1;
        beta *= temp;                   // computation Polak & Ribiere
        search  = beta * search - g1;   // search direction
        descent = search * g1;          // descent
        if (descent > 0.)
        {
            if (isVerbose)
                cerr << "Reset searching directions to gradient! \n";
            search  = -g1;
            descent = search * g1;
        }

        // save the old model and gradient before new search
        model0 = model1;
        g0 = g1;

        model1 = ls->search(model0, search, descent, lambda);   // line search
        g1 = ls->gradient(model1);
        err = (double)sqrt(g1*g1);
        if (isVerbose)
            cerr << "Iteration (" << iterNum << ") : " << "current value of the objective function: "
                 << ls->currentValue() << "\t current residue: " << err << endl;
        NonQuadraticOptima::appendResidue(err); // residual
    } while (residue->last() > tol && iterNum < iterMax);       // stopping criterion

    if (residue->last() <= tol) isSuccess = 1;

    return (model1);                    // hopefully the answer
}

Model<long> ConjugateGradient::optimizer(Model<long>& model0)
{
    Model<double> temp(model0);
    temp = optimizer(temp);
    Model<long> m(temp);
    return m;
}
