
📄 ClassBalancedND.java

📁 An implementation of a classifier that reuses part of the Weka source code. The project can be imported into Eclipse and run.
💻 JAVA
📖 Page 1 of 2
      firstCopy[i] = firstInds[sortedFirst[i]];
    }
    firstInds = firstCopy;
    for (int i = 0; i < sortedSecond.length; i++) {
      secondCopy[i] = secondInds[sortedSecond[i]];
    }
    secondInds = secondCopy;

    // Unify indices to improve hashing
    if (firstInds[0] > secondInds[0]) {
      int[] help = secondInds;
      secondInds = firstInds;
      firstInds = help;
      int help2 = second;
      second = first;
      first = help2;
    }
    m_Range = new Range(Range.indicesToRangeList(firstInds));
    m_Range.setUpper(data.numClasses() - 1);
    Range secondRange = new Range(Range.indicesToRangeList(secondInds));
    secondRange.setUpper(data.numClasses() - 1);

    // Change the class labels and build the classifier
    MakeIndicator filter = new MakeIndicator();
    filter.setAttributeIndex("" + (data.classIndex() + 1));
    filter.setValueIndices(m_Range.getRanges());
    filter.setNumeric(false);
    filter.setInputFormat(data);
    m_FilteredClassifier = new FilteredClassifier();
    if (data.numInstances() > 0) {
      m_FilteredClassifier.setClassifier(Classifier.makeCopies(classifier, 1)[0]);
    } else {
      m_FilteredClassifier.setClassifier(new weka.classifiers.rules.ZeroR());
    }
    m_FilteredClassifier.setFilter(filter);

    // Save reference to hash table at current node
    m_classifiers = table;

    if (!m_classifiers.containsKey(getString(firstInds) + "|" + getString(secondInds))) {
      m_FilteredClassifier.buildClassifier(data);
      m_classifiers.put(getString(firstInds) + "|" + getString(secondInds), m_FilteredClassifier);
    } else {
      m_FilteredClassifier = (FilteredClassifier) m_classifiers.get(getString(firstInds) + "|" +
                                                                    getString(secondInds));
    }

    // Create two successors if necessary
    m_FirstSuccessor = new ClassBalancedND();
    if (first == 1) {
      m_FirstSuccessor.m_Range = m_Range;
    } else {
      RemoveWithValues rwv = new RemoveWithValues();
      rwv.setInvertSelection(true);
      rwv.setNominalIndices(m_Range.getRanges());
      rwv.setAttributeIndex("" + (data.classIndex() + 1));
      rwv.setInputFormat(data);
      Instances firstSubset = Filter.useFilter(data, rwv);
      m_FirstSuccessor.generateClassifierForNode(firstSubset, m_Range,
                                                 rand, classifier, m_classifiers);
    }
    m_SecondSuccessor = new ClassBalancedND();
    if (second == 1) {
      m_SecondSuccessor.m_Range = secondRange;
    } else {
      RemoveWithValues rwv = new RemoveWithValues();
      rwv.setInvertSelection(true);
      rwv.setNominalIndices(secondRange.getRanges());
      rwv.setAttributeIndex("" + (data.classIndex() + 1));
      rwv.setInputFormat(data);
      Instances secondSubset = Filter.useFilter(data, rwv);
      m_SecondSuccessor = new ClassBalancedND();

      m_SecondSuccessor.generateClassifierForNode(secondSubset, secondRange,
                                                  rand, classifier, m_classifiers);
    }
  }

  /**
   * Returns default capabilities of the classifier.
   *
   * @return      the capabilities of this classifier
   */
  public Capabilities getCapabilities() {
    Capabilities result = super.getCapabilities();

    // class
    result.disableAllClasses();
    result.enable(Capability.NOMINAL_CLASS);
    result.enable(Capability.MISSING_CLASS_VALUES);

    // instances
    result.setMinimumNumberInstances(1);

    return result;
  }

  /**
   * Builds tree recursively.
   *
   * @param data contains the (multi-class) instances
   * @throws Exception if the building fails
   */
  public void buildClassifier(Instances data) throws Exception {

    // can classifier handle the data?
    getCapabilities().testWithFail(data);

    // remove instances with missing class
    data = new Instances(data);
    data.deleteWithMissingClass();

    Random random = data.getRandomNumberGenerator(m_Seed);

    if (!m_hashtablegiven) {
      m_classifiers = new Hashtable();
    }

    // Check which classes are present in the
    // data and construct initial list of classes
    boolean[] present = new boolean[data.numClasses()];
    for (int i = 0; i < data.numInstances(); i++) {
      present[(int) data.instance(i).classValue()] = true;
    }
    StringBuffer list = new StringBuffer();
    for (int i = 0; i < present.length; i++) {
      if (present[i]) {
        if (list.length() > 0) {
          list.append(",");
        }
        list.append(i + 1);
      }
    }

    Range newRange = new Range(list.toString());
    newRange.setUpper(data.numClasses() - 1);

    generateClassifierForNode(data, newRange, random, m_Classifier, m_classifiers);
  }

  /**
   * Predicts the class distribution for a given instance
   *
   * @param inst the (multi-class) instance to be classified
   * @return the class distribution
   * @throws Exception if computing fails
   */
  public double[] distributionForInstance(Instance inst) throws Exception {

    double[] newDist = new double[inst.numClasses()];
    if (m_FirstSuccessor == null) {
      for (int i = 0; i < inst.numClasses(); i++) {
        if (m_Range.isInRange(i)) {
          newDist[i] = 1;
        }
      }
      return newDist;
    } else {
      double[] firstDist = m_FirstSuccessor.distributionForInstance(inst);
      double[] secondDist = m_SecondSuccessor.distributionForInstance(inst);
      double[] dist = m_FilteredClassifier.distributionForInstance(inst);
      for (int i = 0; i < inst.numClasses(); i++) {
        if ((firstDist[i] > 0) && (secondDist[i] > 0)) {
          System.err.println("Panik!!");
        }
        if (m_Range.isInRange(i)) {
          newDist[i] = dist[1] * firstDist[i];
        } else {
          newDist[i] = dist[0] * secondDist[i];
        }
      }
      return newDist;
    }
  }

  /**
   * Returns the list of indices as a string.
   *
   * @param indices the indices to return as string
   * @return the indices as string
   */
  public String getString(int[] indices) {

    StringBuffer string = new StringBuffer();
    for (int i = 0; i < indices.length; i++) {
      if (i > 0) {
        string.append(',');
      }
      string.append(indices[i]);
    }
    return string.toString();
  }

  /**
   * @return a description of the classifier suitable for
   * displaying in the explorer/experimenter gui
   */
  public String globalInfo() {

    return
        "A meta classifier for handling multi-class datasets with 2-class "
      + "classifiers by building a random class-balanced tree structure.\n\n"
      + "For more info, check\n\n"
      + getTechnicalInformation().toString();
  }

  /**
   * Outputs the classifier as a string.
   *
   * @return a string representation of the classifier
   */
  public String toString() {

    if (m_classifiers == null) {
      return "ClassBalancedND: No model built yet.";
    }
    StringBuffer text = new StringBuffer();
    text.append("ClassBalancedND");
    treeToString(text, 0);

    return text.toString();
  }

  /**
   * Returns string description of the tree.
   *
   * @param text the buffer to add the node to
   * @param nn the node number
   * @return the next node number
   */
  private int treeToString(StringBuffer text, int nn) {

    nn++;
    text.append("\n\nNode number: " + nn + "\n\n");
    if (m_FilteredClassifier != null) {
      text.append(m_FilteredClassifier);
    } else {
      text.append("null");
    }
    if (m_FirstSuccessor != null) {
      nn = m_FirstSuccessor.treeToString(text, nn);
      nn = m_SecondSuccessor.treeToString(text, nn);
    }
    return nn;
  }

  /**
   * Main method for testing this class.
   *
   * @param argv the options
   */
  public static void main(String[] argv) {
    runClassifier(new ClassBalancedND(), argv);
  }
}
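
The listing above shows only part of the file (the page indicator reads 1 of 2). As a quick way to exercise the classifier after importing the project into Eclipse, the sketch below shows how it might be driven through the standard Weka API. This is a minimal sketch and not part of the downloaded project: the dataset path iris.arff, the J48 base learner, and the demo class name are placeholders, and it assumes the project's ClassBalancedND class and the Weka jar are both on the classpath (adjust the package/import to match the project layout).

import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ClassBalancedNDDemo {

  public static void main(String[] args) throws Exception {
    // Load a nominal-class dataset; "iris.arff" is a placeholder path.
    Instances data = DataSource.read("iris.arff");
    data.setClassIndex(data.numAttributes() - 1);

    // Wrap a two-class base learner (J48 here, purely as an example)
    // in the nested-dichotomies meta classifier from the listing above.
    ClassBalancedND nd = new ClassBalancedND();
    nd.setClassifier(new J48());

    // Ten-fold cross-validation via Weka's standard evaluation API.
    Evaluation eval = new Evaluation(data);
    eval.crossValidateModel(nd, data, 10, new Random(1));
    System.out.println(eval.toSummaryString());

    // Build on the full data and print the per-class distribution
    // returned by distributionForInstance for the first instance.
    nd.buildClassifier(data);
    double[] dist = nd.distributionForInstance(data.instance(0));
    for (int i = 0; i < dist.length; i++) {
      System.out.println(data.classAttribute().value(i) + ": " + dist[i]);
    }
  }
}

Note that buildClassifier creates a fresh Hashtable whenever one has not been supplied, so each cross-validation fold in the sketch trains its own set of shared two-class models.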
