
📄 KnnClassifier.java

📁 An open-source Java toolkit for natural language processing. LingPipe already offers a rich set of features.
💻 JAVA
📖 Page 1 of 2
/*
 * LingPipe v. 3.5
 * Copyright (C) 2003-2008 Alias-i
 *
 * This program is licensed under the Alias-i Royalty Free License
 * Version 1 WITHOUT ANY WARRANTY, without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the Alias-i
 * Royalty Free License Version 1 for more details.
 *
 * You should have received a copy of the Alias-i Royalty Free License
 * Version 1 along with this program; if not, visit
 * http://alias-i.com/lingpipe/licenses/lingpipe-license-1.txt or contact
 * Alias-i, Inc. at 181 North 11th Street, Suite 401, Brooklyn, NY 11211,
 * +1 (718) 290-9170.
 */
package com.aliasi.classify;

import com.aliasi.corpus.ClassificationHandler;

import com.aliasi.util.AbstractExternalizable;
import com.aliasi.util.BoundedPriorityQueue;
import com.aliasi.util.Compilable;
import com.aliasi.util.Distance;
import com.aliasi.util.FeatureExtractor;
import com.aliasi.util.ObjectToDoubleMap;
import com.aliasi.util.Proximity;
import com.aliasi.util.ScoredObject;

import com.aliasi.symbol.MapSymbolTable;

import com.aliasi.matrix.EuclideanDistance;
import com.aliasi.matrix.SparseFloatVector;
import com.aliasi.matrix.Vector;

import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.io.Serializable;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;

/**
 * A <code>KnnClassifier</code> implements k-nearest-neighbor
 * classification based on feature extraction and a vector proximity
 * or distance.  K-nearest-neighbor classification is a kind of
 * memory-based learning in which every training instance is stored
 * along with its category.  To classify an object, the k nearest
 * training examples to the object being classified are found.
 * Each of the k nearest neighbors votes for its training category.
 * The resulting classification scores are the result of voting.
 * It is possible to weight the votes by proximity.
 *
 * <p>K-nearest-neighbor classifiers are particularly effective for
 * highly irregular classification boundaries where linear classifiers
 * like the perceptron have a hard time discriminating instances.  For
 * instance, it's possible to learn checkerboard patterns in 2D space
 * using k-nearest-neighbor classification.
 *
 * <h3>Construction</h3>
 *
 * <p>A k-nearest-neighbor classifier is constructed using a feature
 * extractor, the number of neighbors k to consider, a vector
 * distance or proximity, and a boolean indicator of whether to
 * weight results by proximity or treat the nearest neighbors
 * equally.
 *
 * <h3>Training</h3>
 *
 * <p>Training simply involves storing the feature vector for
 * each training instance along with its category.  The vectors
 * are stored as instances of {@link SparseFloatVector} for
 * space efficiency.  They are constructed using a symbol table
 * for features based on the specified feature extractor for
 * this classifier.
 *
 * <h3>Appropriate Distance and Proximity Functions</h3>
 *
 * <p>Nearness is defined by the proximity or distance functions over
 * vectors supplied at construction time.  As objects move nearer to
 * one another, their distance decreases and their proximity
 * increases.  Distance measures are converted into proximities behind
 * the scenes by inversion, as it leaves all proximities positive and
 * finite:
 *
 * <pre>
 *     proximity(v1,v2) = 1 / (1 + distance(v1,v2))</pre>
 *
 * This will scale distance functions that return results between 0
 * and positive infinity to return proximities between 0 and 1.
 *
 * <p><b>Warning:</b>
 * Distance functions used for k-nearest-neighbors classification
 * should not return negative values; any zero or negative values
 * will be converted to <code>Double.POSITIVE_INFINITY</code>.
 *
 * <h3>Classification</h3>
 *
 * <p>Classification involves finding the k nearest neighbors to a
 * query point.  Both training instances and instances to be classified
 * are converted to feature mappings using the specified feature
 * extractor, and then encoded as sparse vectors using an implicitly
 * managed feature symbol table.
 *
 * <p>The first step in classification is simply collecting the
 * k nearest neighbors.  That is, the training examples that have
 * the greatest proximity to the example being classified.
 * Given the set of k training examples that are closest to the
 * test point, one of two strategies is used for determining scores.
 * In the simple, unweighted case, the score of a category is
 * simply the number of vectors in the k nearest neighbors with
 * that category.  In the weighted case, each vector in the
 * k nearest neighbors contributes its proximity, and the final score
 * is the sum of all proximities.
 *
 * <h3>Choosing the Number of Neighbors k</h3>
 *
 * <p>In most cases, it makes sense to try to optimize for the number
 * of neighbors k using cross-validation or held-out data.
 *
 * <p>In the weighted case, it sometimes makes sense to take the
 * maximum number of neighbors k to be very large, potentially even
 * <code>Integer.MAX_VALUE</code>.  This is because the examples are
 * weighted by proximity, and those far away may have vanishingly
 * small proximities.
 *
 * <h3>Serialization and Compilation</h3>
 *
 * <p>There is no compilation required for k-nearest-neighbors
 * classification, so serialization and compilation produce the
 * same result.  The object read back in after serialization or
 * compilation should be identical to the one serialized or
 * compiled.
 *
 * <h3>Implementation Notes</h3>
 *
 * <p>This is a brute force implementation of k-nearest neighbors in
 * the sense that every training example is multiplied by every object
 * being classified.  Some k-nearest-neighbor implementations attempt
 * to efficiently index the training examples, using techniques such
 * as <a href="http://en.wikipedia.org/wiki/Kd-tree">KD trees</a>, so
 * that search for nearest neighbors can be more efficient.
 *
 * <h3>References</h3>
 *
 * <p>K-nearest-neighbor classification is widely used, and well
 * described in several texts:
 *
 * <ul>
 * <li>Hastie, T., R. Tibshirani, and J. H. Friedman.  2001.
 * <i>Elements of Statistical Learning</i>.  Springer-Verlag.</li>
 * <li>Witten, I. and E. Frank.  <i>Data Mining, 2nd Edition</i>.
 * Morgan Kaufmann.</li>
 * <li>Wikipedia: <a href="http://en.wikipedia.org/wiki/Nearest_neighbor_(pattern_recognition)">K Nearest Neighbor Algorithm</a></li>
 * </ul>
 *
 * @author  Bob Carpenter
 * @version 3.1.3
 * @since   LingPipe3.1
 */
public class KnnClassifier<E>
    implements Classifier<E,ScoredClassification>,
               ClassificationHandler<E,Classification>,
               Compilable,
               Serializable {

    static final long serialVersionUID = 5692985587478284405L;

    final FeatureExtractor<? super E> mFeatureExtractor;
    final int mK;
    final Proximity<Vector> mProximity;
    final boolean mWeightByProximity;
    final List<Integer> mTrainingCategories;
    final List<SparseFloatVector> mTrainingVectors;
    final MapSymbolTable mFeatureSymbolTable;
    final MapSymbolTable mCategorySymbolTable;

    KnnClassifier(FeatureExtractor<? super E> featureExtractor,
                  int k,
                  Proximity<Vector> proximity,
                  boolean weightByProximity,
                  List<Integer> trainingCategories,
                  List<SparseFloatVector> trainingVectors,
                  MapSymbolTable featureSymbolTable,
                  MapSymbolTable categorySymbolTable) {
        mFeatureExtractor = featureExtractor;
        mK = k;
        mProximity = proximity;
        mWeightByProximity = weightByProximity;
        mTrainingCategories = trainingCategories;
        mTrainingVectors = trainingVectors;
        mFeatureSymbolTable = featureSymbolTable;
        mCategorySymbolTable = categorySymbolTable;
    }

    /**
     * Construct a k-nearest-neighbor classifier based on the
     * specified feature extractor and using the specified number of
     * neighbors.  The distance measure will be taken to be {@link
     * EuclideanDistance}.  The nearest neighbors will not be
     * weighted by proximity, and thus all have equal votes.
     *
     * @param featureExtractor Feature extractor for training and
     * classification instances.
     * @param k Maximum number of neighbors to use during
     * classification.
     */
    public KnnClassifier(FeatureExtractor<? super E> featureExtractor,
                         int k) {
        this(featureExtractor,k,EuclideanDistance.DISTANCE);
    }

    /**
     * Construct a k-nearest-neighbor classifier based on the
     * specified feature extractor, specified maximum number of
     * neighbors, and specified distance function.  The nearest
     * neighbors will not be weighted by proximity, and thus all have
     * equal votes.
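
The Javadoc above describes construction, training, and classification in prose; the sketch below shows how those steps could fit together end to end. It is only an illustration: the TokenCountExtractor class, the example texts, and the categories are made up here, and the features(...), handle(...), and classify(...) calls are assumed to follow the LingPipe 3.x interfaces this class implements (FeatureExtractor, ClassificationHandler, Classifier), not a verified build.

import com.aliasi.classify.Classification;
import com.aliasi.classify.KnnClassifier;
import com.aliasi.classify.ScoredClassification;
import com.aliasi.util.FeatureExtractor;

import java.util.HashMap;
import java.util.Map;

public class KnnClassifierSketch {

    // Toy feature extractor: bag-of-words counts over whitespace-separated
    // tokens.  The features(...) signature returning Map<String,? extends Number>
    // is assumed from the LingPipe 3.x FeatureExtractor interface.
    static class TokenCountExtractor implements FeatureExtractor<CharSequence> {
        public Map<String,? extends Number> features(CharSequence in) {
            Map<String,Integer> counts = new HashMap<String,Integer>();
            for (String tok : in.toString().split("\\s+")) {
                Integer c = counts.get(tok);
                counts.put(tok, c == null ? 1 : c + 1);
            }
            return counts;
        }
    }

    public static void main(String[] args) {
        // k = 3 neighbors, default Euclidean distance, unweighted votes
        // (the two-argument constructor shown in the source above).
        KnnClassifier<CharSequence> classifier =
            new KnnClassifier<CharSequence>(new TokenCountExtractor(), 3);

        // Training just stores each feature vector with its category.
        classifier.handle("cheap meds buy now", new Classification("spam"));
        classifier.handle("meeting agenda for tuesday", new Classification("ham"));
        classifier.handle("buy cheap watches now", new Classification("spam"));

        // Classification votes among the k nearest stored vectors.
        ScoredClassification c = classifier.classify("buy now");
        System.out.println("best category=" + c.bestCategory());
    }
}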
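The "Classification" section of the Javadoc distinguishes unweighted voting, where each of the k neighbors counts once, from proximity-weighted voting, where each neighbor contributes proximity = 1 / (1 + distance). The standalone snippet below only works through that scoring rule on a few made-up distances; it does not call any LingPipe code.

import java.util.HashMap;
import java.util.Map;

public class KnnScoringSketch {
    public static void main(String[] args) {
        // Hypothetical distances from a query point to its k = 3 nearest
        // training vectors, paired with their categories.
        String[] categories = { "spam", "spam", "ham" };
        double[] distances  = { 0.5,     1.0,    4.0 };

        Map<String,Double> unweighted = new HashMap<String,Double>();
        Map<String,Double> weighted   = new HashMap<String,Double>();

        for (int i = 0; i < categories.length; ++i) {
            // Distance is inverted to a proximity in (0,1]: 1 / (1 + d).
            double proximity = 1.0 / (1.0 + distances[i]);
            add(unweighted, categories[i], 1.0);        // one vote per neighbor
            add(weighted,   categories[i], proximity);  // vote weighted by proximity
        }

        // unweighted: spam=2.0, ham=1.0
        // weighted:   spam = 1/1.5 + 1/2.0 = 1.1667, ham = 1/5.0 = 0.2
        System.out.println("unweighted=" + unweighted);
        System.out.println("weighted=" + weighted);
    }

    static void add(Map<String,Double> scores, String category, double value) {
        Double old = scores.get(category);
        scores.put(category, old == null ? value : old + value);
    }
}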
