⭐ 虫虫下载站

📄 decisiontreelearner.java

📁 Java source code of a decision tree learner from the data mining toolkit YALE — a useful reference for researchers studying and improving learning algorithms.
💻 JAVA
/*
 *  YALE - Yet Another Learning Environment
 *  Copyright (C) 2001-2004
 *      Simon Fischer, Ralf Klinkenberg, Ingo Mierswa, 
 *          Katharina Morik, Oliver Ritthoff
 *      Artificial Intelligence Unit
 *      Computer Science Department
 *      University of Dortmund
 *      44221 Dortmund,  Germany
 *  email: yale-team@lists.sourceforge.net
 *  web:   http://yale.cs.uni-dortmund.de/
 *
 *  This program is free software; you can redistribute it and/or
 *  modify it under the terms of the GNU General Public License as 
 *  published by the Free Software Foundation; either version 2 of the
 *  License, or (at your option) any later version. 
 *
 *  This program is distributed in the hope that it will be useful, but
 *  WITHOUT ANY WARRANTY; without even the implied warranty of
 *  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 *  General Public License for more details.
 *
 *  You should have received a copy of the GNU General Public License
 *  along with this program; if not, write to the Free Software
 *  Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307
 *  USA.
 */
package edu.udo.cs.yale.operator.learner.decisiontree;

import edu.udo.cs.yale.example.ExampleReader;
import edu.udo.cs.yale.example.ExampleSet;
import edu.udo.cs.yale.example.SplittedExampleSet;
import edu.udo.cs.yale.example.Example;
import edu.udo.cs.yale.example.Attribute;
import edu.udo.cs.yale.example.Tools;
import edu.udo.cs.yale.tools.ParameterService;
import edu.udo.cs.yale.tools.LogService;
import edu.udo.cs.yale.tools.Ontology;
import edu.udo.cs.yale.operator.OperatorException;

import java.util.*;

/** DecisionTreeLearner is an internal (i.e. pure Java) classification machine learning algorithm based on
 *  the ID3 algorithm by Quinlan. In each step the most promising attribute is determined by calculating
 *  the information gain. The example set is then partitioned according to the values of this attribute,
 *  and the algorithm is applied recursively to the partitions. The trees resulting from the recursive
 *  calls are attached as children together with their respective attribute values. Recursion stops
 *  when all examples of a subset have the same label or the subset becomes empty.
 *  <br/>
 *  Whereas ID3 can only handle categorical attributes, this implementation can also handle
 *  continuous attributes, although not efficiently. Unlike C4.5, it cannot deal with
 *  missing values and does not prune the tree.
 *  For numerical data we recommend the {@link edu.udo.cs.yale.operator.learner.weka.WekaLearner} with
 *  the J48 decision tree inducer.
 *
 *  @yale.xmlclass DecisionTreeLearner
 *  @see edu.udo.cs.yale.operator.learner.decisiontree.Tree
 *  @author Ingo
 *  @version $Id: DecisionTreeLearner.java,v 2.3 2004/08/27 11:57:38 ingomierswa Exp $
 */
public class DecisionTreeLearner extends ID3Learner {


    /** Creates a new decision tree from the given attribute. */
    Tree createNewDecisionTree(ExampleSet exampleSet, Attribute bestAttribute, boolean ratioGain, int defaultGoal) throws OperatorException {
        // nominal attribute: delegate to the ID3 implementation
        if (Ontology.ATTRIBUTE_VALUE_TYPE.isA(bestAttribute.getValueType(), Ontology.NOMINAL)) {
            return super.createNewDecisionTree(exampleSet, bestAttribute, ratioGain, defaultGoal);
        } else { // continuous attribute: binary split at a threshold
            if (exampleSet.getSize() == 0)
                return null;

            double threshold = Tools.getThreshold(exampleSet, bestAttribute);
            SplittedExampleSet splitted = SplittedExampleSet.splitByAttribute(exampleSet, bestAttribute, threshold);

            // create the new decision tree node
            Tree decisionTree = new Tree(exampleSet.getLabel(), bestAttribute);

            splitted.selectSingleSubset(0); // less or equal
            Premise premise = new SimplePremise(bestAttribute, "<=", threshold);
            Tree child = makeDecisionTree(splitted, ratioGain, defaultGoal);
            if (child == null)
                child = new Tree(splitted.getLabel(), defaultGoal);
            decisionTree.addChild(premise, child);

            splitted.selectSingleSubset(1); // greater
            premise = new SimplePremise(bestAttribute, ">", threshold);
            child = makeDecisionTree(splitted, ratioGain, defaultGoal);
            if (child == null)
                child = new Tree(splitted.getLabel(), defaultGoal);
            decisionTree.addChild(premise, child);

            // if a subset was empty, its child defaults to the default goal (see above);
            // return the finished tree
            return decisionTree;
        }
    }
}
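The information-gain criterion mentioned in the class documentation above can be sketched independently of the YALE API. The following self-contained example (class and method names are illustrative, not part of YALE) computes the Shannon entropy of a label set and the information gain of splitting it by a categorical attribute, which is the quantity ID3 maximizes when choosing the "most promising" attribute:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of ID3's information-gain criterion; not part of YALE.
public class InfoGainSketch {

    // Shannon entropy H(S) = -sum_i p_i * log2(p_i) of a list of class labels.
    static double entropy(List<String> labels) {
        Map<String, Integer> counts = new HashMap<>();
        for (String l : labels)
            counts.merge(l, 1, Integer::sum);
        double h = 0.0;
        for (int c : counts.values()) {
            double p = (double) c / labels.size();
            h -= p * (Math.log(p) / Math.log(2));
        }
        return h;
    }

    // Information gain of splitting `labels` by the parallel list of
    // attribute values: H(S) - sum_v (|S_v| / |S|) * H(S_v).
    static double informationGain(List<String> labels, List<String> attributeValues) {
        Map<String, List<String>> partitions = new HashMap<>();
        for (int i = 0; i < labels.size(); i++)
            partitions.computeIfAbsent(attributeValues.get(i), k -> new ArrayList<>())
                      .add(labels.get(i));
        double remainder = 0.0;
        for (List<String> subset : partitions.values())
            remainder += ((double) subset.size() / labels.size()) * entropy(subset);
        return entropy(labels) - remainder;
    }

    public static void main(String[] args) {
        // A perfectly predictive attribute: gain equals the full label entropy (1 bit).
        List<String> labels = List.of("yes", "yes", "no", "no");
        List<String> attr   = List.of("a", "a", "b", "b");
        System.out.println(informationGain(labels, attr)); // prints 1.0
    }
}
```

ID3 evaluates this gain for every candidate attribute and splits on the maximizer; the gain-ratio variant (the `ratioGain` flag in the method above) additionally normalizes by the entropy of the attribute's value distribution to avoid favoring many-valued attributes.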
