
cgls.cc

Part of COOOL (the CWP Object-Oriented Optimization Library), a collection of C++ classes from the Center for Wave Phenomena, Colorado School of Mines.
//============================================================
// COOOL           version 1.1           ---     Nov,  1995
//   Center for Wave Phenomena, Colorado School of Mines
//============================================================
//
//   This code is part of a preliminary release of COOOL (CWP
// Object-Oriented Optimization Library) and associated class
// libraries.
//
// The COOOL library is free software. You can do anything you want
// with it, including make a fortune.  However, neither the authors,
// the Center for Wave Phenomena, nor anyone else you can think of
// makes any guarantees about anything in this package or any aspect
// of its functionality.
//
// Since you've got the source code, you can also modify the
// library to suit your own purposes. We would appreciate it
// if the headers that identify the authors are kept in the
// source code.
//
//=============================
// author:  H. Lydia Deng, 03/04/94, 03/13/94
// Definition of the linear conjugate-gradient class
//=============================

#include <CGLS.hh>

LSConjugateGradient::~LSConjugateGradient()
{
    delete dataError;
    delete modelError;
    delete search;
}

LSConjugateGradient::LSConjugateGradient(int n, LinearForward* p,
                                         Vector<double>* data,
                                         int it, double eps)
    : QuadraticOptima(n, p, data)
{
    iterMax    = it;
    tol        = eps;
    m          = p->modelSize();
    n          = p->dataSize();
    search     = new Vector<double>(m);
    modelError = new Vector<double>(m);
    dataError  = new Vector<double>(n);
}

LSConjugateGradient::LSConjugateGradient(int n, LinearForward* p,
                                         Vector<double>* data,
                                         int it, double eps, int verb)
    : QuadraticOptima(n, p, data, verb)
{
    iterMax    = it;
    tol        = eps;
    m          = p->modelSize();
    n          = p->dataSize();
    search     = new Vector<double>(m);
    modelError = new Vector<double>(m);
    dataError  = new Vector<double>(n);
}

// Apply the adjoint operator to the data-space residual to obtain the
// model-space gradient; scale holds its squared L2 norm.
void LSConjugateGradient::getModelError(Vector<double>& v)
{
    *modelError = ((MisFitFcn*)fp)->adjointOperation(v);
    scale       = modelError->norm2S();
}

// Compute the step length alpha along the current search direction
// and update the data-space residual accordingly.
void LSConjugateGradient::upDating()
{
    Vector<double> aux(((MisFitFcn*)fp)->experiments());
    aux         = ((MisFitFcn*)fp)->operateOn(*search);
    alpha       = scale / (aux * aux);
    *dataError -= alpha * aux;
    scaleOld    = scale;
}

// Form the next conjugate search direction with beta = scale/scaleOld.
void LSConjugateGradient::conjugateDirection()
{
    double beta = scale / scaleOld;
    *search     = *modelError + beta * (*search);
}

// Main conjugate-gradient loop: iterate until iterMax is reached or the
// residue drops below tol.
Model<double> LSConjugateGradient::optimizer(Model<double>& model)
{
    *dataError = ((MisFitFcn*)fp)->updateError(model);
    getModelError(*dataError);
    *search = *modelError;

    double value = fp->performance(model);
    if (isVerbose) cerr << "the current residue: " << value << endl;
    QuadraticOptima::appendResidue(value);

    while (fp->iterations() < iterMax && residue->last() > tol) {
        upDating();
        getModelError(*dataError);
        model = model.update(1, alpha, *search);
        conjugateDirection();
        value = fp->performance(model);
        if (isVerbose) cerr << "the current residue: " << value << endl;
        QuadraticOptima::appendResidue(value);
    }
    return model;
}

// Integer models are optimized in double precision and converted back.
Model<long> LSConjugateGradient::optimizer(Model<long>& model)
{
    Model<double> temp(model);
    temp = optimizer(temp);
    Model<long> optm(temp);
    return optm;
}
