
📄 outputinfo.h

📁 MultiBoost is a multi-class AdaBoost algorithm implemented in C++. Unlike the traditional AdaBoost algorithm, which mainly solves binary classification problems, it handles multi-class classification directly.
/*
* This file is part of MultiBoost, a multi-class
* AdaBoost learner/classifier
*
* Copyright (C) 2005-2006 Norman Casagrande
* For information write to nova77@gmail.com
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with this library; if not, write to the Free Software
* Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
*/

/**
* \file OutputInfo.h Outputs the step-by-step information.
*/

#ifndef __OUTPUT_INFO_H
#define __OUTPUT_INFO_H

#include <fstream>
#include <string>
#include <vector>
#include <map>

#include "IO/InputData.h"

using namespace std;

namespace MultiBoost {

// forward declaration to avoid an include
class BaseLearner;

//////////////////////////////////////////////////////////////////////////

/**
* Format and output step-by-step information.
* With this class it is possible to output and update
* the error rates, margins and the edge.
* These functions must be called at each iteration with the
* newly found weak hypothesis, but \b before the update of the
* weights.
* @warning Don't forget to begin the list of information
* printed with outputIteration(), and close it with
* a call to endLine()!
* @date 16/11/2005
*/
class OutputInfo
{
public:

   /**
   * The constructor. Creates the object and opens the output file.
   * @param outputInfoFile The name of the file which will be updated
   * with the data.
   * @date 14/11/2005
   */
   OutputInfo(const string& outputInfoFile);

   /**
   * Just output the iteration number.
   * @param t The iteration number.
   * @date 14/11/2005
   */
   void outputIteration(int t);

   /**
   * Output the error of the given data.
   * The error is computed by holding the information on the previous
   * weak hypotheses. In AdaBoost, the error is computed with the formula
   * \f[
   * {\bf g}(x) = \sum_{t=1}^T \alpha^{(t)} {\bf h}^{(t)}(x),
   * \f]
   * we therefore update the \f${\bf g}(x)\f$ vector (for each example)
   * each time this method is called:
   * \f[
   * {\bf g} = {\bf g} + \alpha^{(t)} {\bf h}^{(t)}(x).
   * \f]
   * @remark Any number of data sets can have a gTable. Each one is
   * stored in a map that uses the pointer of the data as key.
   * @param pData The input data.
   * @param pWeakHypothesis The current weak hypothesis.
   * @see table
   * @see _gTableMap
   * @date 16/11/2005
   */
   void outputError(InputData* pData, BaseLearner* pWeakHypothesis);

   /**
   * Output the minimum margin and the sum of the below-zero margins.
   * These two values are useful for an analysis of the training process.
   *
   * The margins represent the per-class weighted correct rate, that is
   * \f[
   * \rho_{i, \ell} = \sum_{t=1}^T \alpha^{(t)} h_\ell^{(t)}(x_i) y_i.
   * \f]
   * The \b first \b value that this method outputs is the minimum margin, that is
   * \f[
   * \rho_{min} = \min_{i, \ell} \rho_{i, \ell},
   * \f]
   * which is normalized by the sum of alpha:
   * \f[
   * \frac{\rho_{min}}{\sum_{t=1}^T \alpha^{(t)}}.
   * \f]
   * This can give a useful measure of the size of the functional margin.
   *
   * The \b second \b value which this method outputs is simply the sum of the
   * margins below zero.
   * @param pData The input data.
   * @param pWeakHypothesis The current weak hypothesis.
   * @date 16/11/2005
   */
   void outputMargins(InputData* pData, BaseLearner* pWeakHypothesis);

   /**
   * Output the edge. It is the measure of the accuracy of the current
   * weak hypothesis relative to random guessing, and is defined as
   * \f[
   * \gamma = \sum_{i=1}^n \sum_{\ell=1}^k w_{i, \ell}^{(t)} h_\ell^{(t)}(x_i).
   * \f]
   * @param pData The input data.
   * @param pWeakHypothesis The current weak hypothesis.
   * @date 16/11/2005
   */
   void outputEdge(InputData* pData, BaseLearner* pWeakHypothesis);

   /**
   * End of line in the file stream.
   * Call it when all the needed information has been outputted.
   * @date 16/11/2005
   */
   void endLine() { _outStream << endl; }

protected:

   /**
   * A table representing the votes for each example.
   * Example:
   * \verbatim
   Ex_1:  Class 0, Class 1, Class 2, .. , Class k
   Ex_2:  Class 0, Class 1, Class 2, .. , Class k
   ..
   Ex_n:  Class 0, Class 1, Class 2, .. , Class k \endverbatim
   * @date 16/11/2005
   */
   typedef vector< vector<double> > table;

   ofstream                _outStream; //!< The output stream.

   /**
   * Maps the data to its g(x) table.
   * It is needed to keep this information saved from iteration to
   * iteration.
   * @see table
   * @see outputError()
   * @date 16/11/2005
   */
   map<InputData*, table>  _gTableMap;

   /**
   * Maps the data to the margins table.
   * It is needed to keep this information saved from iteration to
   * iteration.
   * @see table
   * @see outputMargins()
   * @date 16/11/2005
   */
   map<InputData*, table>  _margins;

   /**
   * Maps the data to the sum of the alpha.
   * It is needed to keep this information saved from iteration to
   * iteration.
   * @see outputMargins()
   * @date 16/11/2005
   */
   map<InputData*, double> _alphaSums;

};

} // end of namespace MultiBoost

#endif // __OUTPUT_INFO_H
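The class @warning above prescribes a strict call order inside the boosting loop: outputIteration() first, endLine() last, and everything before the weight update. As a minimal sketch of how a caller might drive this class (the function name logIteration and the variable names are assumptions for illustration; only OutputInfo's interface comes from this header):

#include "OutputInfo.h"
#include "IO/InputData.h"

using namespace MultiBoost;

// Log one line of step-by-step information for iteration t.
// Per the class @warning, call this BEFORE the example weights are updated.
void logIteration(OutputInfo& info, int t,
                  InputData* pTrainingData, BaseLearner* pWeakHypothesis)
{
   info.outputIteration(t);                            // begin the line with the iteration number
   info.outputError(pTrainingData, pWeakHypothesis);   // updates g(x) and prints the error
   info.outputMargins(pTrainingData, pWeakHypothesis); // min margin and sum of below-zero margins
   info.outputEdge(pTrainingData, pWeakHypothesis);    // edge of the current weak hypothesis
   info.endLine();                                     // close the line in the output file
}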
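The outputError() documentation describes keeping a running score table g(x) that grows by alpha * h(x) at each iteration. A self-contained sketch of that bookkeeping follows; this is not MultiBoost's actual implementation (OutputInfo.cpp is not shown on this page), and the classify callback is a hypothetical stand-in for evaluating the weak hypothesis:

#include <vector>

typedef std::vector< std::vector<double> > table; // mirrors the typedef in OutputInfo

// Accumulate g[i][l] += alpha * h_l(x_i) for every example i and class l,
// i.e. g = g + alpha^(t) * h^(t)(x) from the outputError() docs.
void updateGTable(table& g, double alpha, int numExamples, int numClasses,
                  double (*classify)(int example, int cls))
{
   if (g.empty())
      g.assign(numExamples, std::vector<double>(numClasses, 0.0)); // first iteration

   for (int i = 0; i < numExamples; ++i)
      for (int l = 0; l < numClasses; ++l)
         g[i][l] += alpha * classify(i, l);
}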
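Likewise, the two statistics outputMargins() reports can be read directly off an accumulated margin table rho and the running sum of the alpha values; in the real class, _margins and _alphaSums hold the equivalent state per data set. A sketch with illustrative names, under the same caveat that the actual code lives in OutputInfo.cpp:

#include <algorithm>
#include <limits>
#include <vector>

// Given rho[i][l] = sum_t alpha^(t) * h_l^(t)(x_i) * y_i and the sum of alphas,
// compute the normalized minimum margin and the sum of below-zero margins.
void marginStats(const std::vector< std::vector<double> >& rho, double alphaSum,
                 double& minMargin, double& belowZeroSum)
{
   minMargin    = std::numeric_limits<double>::max();
   belowZeroSum = 0.0;

   for (size_t i = 0; i < rho.size(); ++i)
   {
      for (size_t l = 0; l < rho[i].size(); ++l)
      {
         minMargin = std::min(minMargin, rho[i][l]); // rho_min = min over i, l
         if (rho[i][l] < 0)
            belowZeroSum += rho[i][l];               // sum of margins below zero
      }
   }

   minMargin /= alphaSum; // normalize by sum_t alpha^(t)
}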
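Finally, the edge formula in outputEdge()'s documentation is a plain weighted sum over examples and classes. A sketch under the same assumptions, where the weights and votes tables are hypothetical inputs rather than the class's own members:

#include <vector>

// gamma = sum_i sum_l w[i][l] * h_l(x_i): the edge of the current weak
// hypothesis relative to random guessing.
double computeEdge(const std::vector< std::vector<double> >& weights,
                   const std::vector< std::vector<double> >& votes)
{
   double gamma = 0.0;
   for (size_t i = 0; i < weights.size(); ++i)
      for (size_t l = 0; l < weights[i].size(); ++l)
         gamma += weights[i][l] * votes[i][l];
   return gamma;
}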
