
AdditiveRegression.java

wekaUT: semi-supervised learning classifiers built on Weka, developed at the University of Texas at Austin.
Java
Page 1 of 2
/*
 *    This program is free software; you can redistribute it and/or modify
 *    it under the terms of the GNU General Public License as published by
 *    the Free Software Foundation; either version 2 of the License, or
 *    (at your option) any later version.
 *
 *    This program is distributed in the hope that it will be useful,
 *    but WITHOUT ANY WARRANTY; without even the implied warranty of
 *    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 *    GNU General Public License for more details.
 *
 *    You should have received a copy of the GNU General Public License
 *    along with this program; if not, write to the Free Software
 *    Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
 */

/*
 *    AdditiveRegression.java
 *    Copyright (C) 2000 Mark Hall
 *
 */

package weka.classifiers.meta;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.DecisionStump;
import weka.classifiers.rules.ZeroR;

import java.io.*;
import java.util.*;

import weka.core.*;
import weka.classifiers.meta.*;

/**
 * Meta classifier that enhances the performance of a regression base
 * classifier. Each iteration fits a model to the residuals left by the
 * classifier on the previous iteration. Prediction is accomplished by
 * adding the predictions of each classifier. Smoothing is accomplished
 * through varying the shrinkage (learning rate) parameter. <p>
 *
 * <pre>
 * Analysing:  Root_relative_squared_error
 * Datasets:   36
 * Resultsets: 2
 * Confidence: 0.05 (two tailed)
 * Date:       10/13/00 10:00 AM
 *
 *
 * Dataset                   (1) m5.M5Prim | (2) AdditiveRegression -S 0.7 \
 *                                         |    -B weka.classifiers.meta.m5.M5Prime
 *                          ----------------------------
 * auto93.names              (10)    54.4  |    49.41 *
 * autoHorse.names           (10)    32.76 |    26.34 *
 * autoMpg.names             (10)    35.32 |    34.84 *
 * autoPrice.names           (10)    40.01 |    36.57 *
 * baskball                  (10)    79.46 |    79.85
 * bodyfat.names             (10)    10.38 |    11.41 v
 * bolts                     (10)    19.29 |    12.61 *
 * breastTumor               (10)    96.95 |    96.23 *
 * cholesterol               (10)   101.03 |    98.88 *
 * cleveland                 (10)    71.29 |    70.87 *
 * cloud                     (10)    38.82 |    39.18
 * cpu                       (10)    22.26 |    14.74 *
 * detroit                   (10)   228.16 |    83.7  *
 * echoMonths                (10)    71.52 |    69.15 *
 * elusage                   (10)    48.94 |    49.03
 * fishcatch                 (10)    16.61 |    15.36 *
 * fruitfly                  (10)   100    |   100    *
 * gascons                   (10)    18.72 |    14.26 *
 * housing                   (10)    38.62 |    36.53 *
 * hungarian                 (10)    74.67 |    72.19 *
 * longley                   (10)    31.23 |    28.26 *
 * lowbwt                    (10)    62.26 |    61.48 *
 * mbagrade                  (10)    89.2  |    89.2
 * meta                      (10)   163.15 |   188.28 v
 * pbc                       (10)    81.35 |    79.4  *
 * pharynx                   (10)   105.41 |   105.03
 * pollution                 (10)    72.24 |    68.16 *
 * pwLinear                  (10)    32.42 |    33.33 v
 * quake                     (10)   100.21 |    99.93
 * schlvote                  (10)    92.41 |    98.23 v
 * sensory                   (10)    88.03 |    87.94
 * servo                     (10)    37.07 |    35.5  *
 * sleep                     (10)    70.17 |    71.65
 * strike                    (10)    84.98 |    83.96 *
 * veteran                   (10)    90.61 |    88.77 *
 * vineyard                  (10)    79.41 |    73.95 *
 *                          ----------------------------
 *                              (v| |*) |   (4|8|24)
 *
 * </pre> <p>
 *
 * For more information see: <p>
 *
 * Friedman, J.H. (1999). Stochastic Gradient Boosting. Technical Report
 * Stanford University. http://www-stat.stanford.edu/~jhf/ftp/stobst.ps. <p>
 *
 * Valid options from the command line are: <p>
 *
 * -B classifierstring <br>
 * Classifierstring should contain the full class name of a classifier
 * followed by options to the classifier.
 * (required).<p>
 *
 * -S shrinkage rate <br>
 * Smaller values help prevent overfitting and have a smoothing effect
 * (but increase learning time).
 * (default = 1.0, ie no shrinkage). <p>
 *
 * -M max models <br>
 * Set the maximum number of models to generate. Values <= 0 indicate
 * no maximum, ie keep going until the reduction in error threshold is
 * reached.
 * (default = -1). <p>
 *
 * -D <br>
 * Debugging output. <p>
 *
 * @author Mark Hall (mhall@cs.waikato.ac.nz)
 * @version $Revision: 1.1.1.1 $
 */
public class AdditiveRegression extends Classifier
  implements OptionHandler,
	     AdditionalMeasureProducer,
	     WeightedInstancesHandler {

  /**
   * Base classifier.
   */
  protected Classifier m_Classifier = new weka.classifiers.trees.DecisionStump();

  /**
   * Class index.
   */
  private int m_classIndex;

  /**
   * Shrinkage (Learning rate). Default = no shrinkage.
   */
  protected double m_shrinkage = 1.0;

  /**
   * The list of iteratively generated models.
   */
  private FastVector m_additiveModels = new FastVector();

  /**
   * Produce debugging output.
   */
  private boolean m_debug = false;

  /**
   * Maximum number of models to produce. -1 indicates keep going until the error
   * threshold is met.
   */
  protected int m_maxModels = -1;

  /**
   * Returns a string describing this attribute evaluator
   * @return a description of the evaluator suitable for
   * displaying in the explorer/experimenter gui
   */
  public String globalInfo() {
    return " Meta classifier that enhances the performance of a regression "
      +"base classifier. Each iteration fits a model to the residuals left "
      +"by the classifier on the previous iteration. Prediction is "
      +"accomplished by adding the predictions of each classifier. "
      +"Reducing the shrinkage (learning rate) parameter helps prevent "
      +"overfitting and has a smoothing effect but increases the learning "
      +"time.  For more information see: Friedman, J.H. (1999). Stochastic "
      +"Gradient Boosting. Technical Report Stanford University. "
      +"http://www-stat.stanford.edu/~jhf/ftp/stobst.ps.";
  }

  /**
   * Default constructor specifying DecisionStump as the classifier
   */
  public AdditiveRegression() {
    this(new weka.classifiers.trees.DecisionStump());
  }

  /**
   * Constructor which takes base classifier as argument.
   *
   * @param classifier the base classifier to use
   */
  public AdditiveRegression(Classifier classifier) {
    m_Classifier = classifier;
  }

  /**
   * Returns an enumeration describing the available options.
   *
   * @return an enumeration of all the available options.
   */
  public Enumeration listOptions() {
    Vector newVector = new Vector(4);

    newVector.addElement(new Option(
	      "\tFull class name of classifier to use, followed\n"
	      + "\tby scheme options. (required)\n"
	      + "\teg: \"weka.classifiers.bayes.NaiveBayes -D\"",
	      "B", 1, "-B <classifier specification>"));

    newVector.addElement(new Option(
	      "\tSpecify shrinkage rate. "
	      +"(default=1.0, ie. no shrinkage)\n",
	      "S", 1, "-S"));

    newVector.addElement(new Option(
	      "\tTurn on debugging output.",
	      "D", 0, "-D"));

    newVector.addElement(new Option(
	      "\tSpecify max models to generate. "
	      +"(default = -1, ie. no max; keep going until error reduction threshold "
	      +"is reached)\n",
	      "M", 1, "-M"));

    return newVector.elements();
  }

  /**
   * Parses a given list of options. Valid options are:<p>
   *
   * -B classifierstring <br>
   * Classifierstring should contain the full class name of a classifier
   * followed by options to the classifier.
   * (required).<p>
   *
   * -S shrinkage rate <br>
   * Smaller values help prevent overfitting and have a smoothing effect
   * (but increase learning time).
   * (default = 1.0, ie. no shrinkage). <p>
   *
   * -D <br>
   * Debugging output. <p>
   *
   * -M max models <br>
   * Set the maximum number of models to generate. Values <= 0 indicate
   * no maximum, ie keep going until the reduction in error threshold is
   * reached.
   * (default = -1). <p>
   *
   * @param options the list of options as an array of strings
   * @exception Exception if an option is not supported
   */
  public void setOptions(String[] options) throws Exception {

    setDebug(Utils.getFlag('D', options));

    String classifierString = Utils.getOption('B', options);
    if (classifierString.length() == 0) {
      throw new Exception("A classifier must be specified"
			  + " with the -B option.");
    }
    String [] classifierSpec = Utils.splitOptions(classifierString);
    if (classifierSpec.length == 0) {
      throw new Exception("Invalid classifier specification string");
    }
    String classifierName = classifierSpec[0];
    classifierSpec[0] = "";
    setClassifier(Classifier.forName(classifierName, classifierSpec));

    String optionString = Utils.getOption('S', options);
    if (optionString.length() != 0) {
      Double temp;
      temp = Double.valueOf(optionString);
      setShrinkage(temp.doubleValue());
    }

    optionString = Utils.getOption('M', options);
    if (optionString.length() != 0) {
      setMaxModels(Integer.parseInt(optionString));
    }
    Utils.checkForRemainingOptions(options);
  }

  /**
   * Gets the current settings of the Classifier.
   *
   * @return an array of strings suitable for passing to setOptions
   */
  public String [] getOptions() {

    String [] options = new String [7];
    int current = 0;

    if (getDebug()) {
      options[current++] = "-D";
    }
    options[current++] = "-B";
    options[current++] = "" + getClassifierSpec();
    options[current++] = "-S";
    options[current++] = "" + getShrinkage();
    options[current++] = "-M";
    options[current++] = "" + getMaxModels();

    while (current < options.length) {
      options[current++] = "";
    }
    return options;
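The scheme the javadoc above describes (each stage fits a base model to the current residuals; the final prediction is the shrinkage-weighted sum of all stage outputs) can be sketched without any Weka dependency. The `AdditiveSketch` class below, with its one-split `Stump` base learner on 1-D inputs, is an illustrative invention for this page, not part of Weka:

```java
import java.util.Arrays;

// Hypothetical minimal sketch of additive regression with shrinkage.
// Not the Weka implementation: base learner, class, and method names
// are made up for illustration.
public class AdditiveSketch {

  /** A one-split regression stump on 1-D inputs. */
  static final class Stump {
    double threshold, leftMean, rightMean;

    void fit(double[] x, double[] r) {
      double bestErr = Double.MAX_VALUE;
      for (double t : x) {                     // try each x value as a split point
        double ls = 0, rs = 0; int ln = 0, rn = 0;
        for (int i = 0; i < x.length; i++) {
          if (x[i] <= t) { ls += r[i]; ln++; } else { rs += r[i]; rn++; }
        }
        double lm = ln > 0 ? ls / ln : 0, rm = rn > 0 ? rs / rn : 0;
        double err = 0;
        for (int i = 0; i < x.length; i++) {   // squared error of this split
          double p = x[i] <= t ? lm : rm;
          err += (r[i] - p) * (r[i] - p);
        }
        if (err < bestErr) { bestErr = err; threshold = t; leftMean = lm; rightMean = rm; }
      }
    }

    double predict(double xi) { return xi <= threshold ? leftMean : rightMean; }
  }

  final double shrinkage;                      // corresponds to the -S option
  final Stump[] models;                        // corresponds to the -M option

  AdditiveSketch(double shrinkage, int maxModels) {
    this.shrinkage = shrinkage;
    this.models = new Stump[maxModels];
  }

  void build(double[] x, double[] y) {
    double[] residual = Arrays.copyOf(y, y.length);
    for (int m = 0; m < models.length; m++) {
      models[m] = new Stump();
      models[m].fit(x, residual);              // fit this stage to the residuals
      for (int i = 0; i < x.length; i++)       // subtract the shrunken prediction
        residual[i] -= shrinkage * models[m].predict(x[i]);
    }
  }

  double predict(double xi) {                  // sum of shrunken stage outputs
    double sum = 0;
    for (Stump s : models) sum += shrinkage * s.predict(xi);
    return sum;
  }

  public static void main(String[] args) {
    double[] x = {1, 2, 3, 4, 5, 6};
    double[] y = {1, 1, 1, 5, 5, 5};           // a simple step function
    AdditiveSketch ar = new AdditiveSketch(0.5, 20);
    ar.build(x, y);
    System.out.println(ar.predict(2.0));       // close to 1
    System.out.println(ar.predict(5.0));       // close to 5
  }
}
```

With shrinkage 1.0 a single stump would fit this step function in one stage; at 0.5 each stage removes only half of the remaining residual, which is the smoothing effect the `-S` option documentation describes. The real class is configured the same way through its options, e.g. `-B weka.classifiers.trees.DecisionStump -S 0.5 -M 20`.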
