neuralnetwork.java: Java source code for the data mining software ALPHAMINER (page 1 of 2)
/*
 *    This program is free software; you can redistribute it and/or modify
 *    it under the terms of the GNU General Public License as published by
 *    the Free Software Foundation; either version 2 of the License, or
 *    (at your option) any later version.
 *
 *    This program is distributed in the hope that it will be useful,
 *    but WITHOUT ANY WARRANTY; without even the implied warranty of
 *    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 *    GNU General Public License for more details.
 *
 *    You should have received a copy of the GNU General Public License
 *    along with this program; if not, write to the Free Software
 *    Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
 */

/**
 * Title: XELOPES Data Mining Library
 * Description: The XELOPES library is an open platform-independent and data-source-independent library for Embedded Data Mining.
 * Copyright: Copyright (c) 2002 Prudential Systems Software GmbH
 * Company: ZSoft (www.zsoft.ru), Prudsys (www.prudsys.com)
 * @author Michael Thess
 * @version 1.2
 */

package com.prudsys.pdm.Models.Regression.NeuralNetwork;

import java.util.Enumeration;
import java.util.Hashtable;

import com.prudsys.pdm.Core.CategoricalAttribute;
import com.prudsys.pdm.Core.Category;
import com.prudsys.pdm.Core.MetaDataOperations;
import com.prudsys.pdm.Core.MiningAttribute;
import com.prudsys.pdm.Core.MiningDataSpecification;
import com.prudsys.pdm.Core.MiningException;
import com.prudsys.pdm.Input.MiningVector;
import com.prudsys.pdm.Models.Supervised.Classifier;

/**
 * A NeuralNetwork object contains the whole neural network (NN). <p>
 *
 * The creation of the NN is realized in three steps:
 * 1. Create all neural layers,
 * 2. Add the neural layers using the method addNeuralLayer,
 * 3. Connect the neurons of the layers using the method connectLayers. <p>
 *
 * In addition, more fine-grained connection methods are available: the
 * method connectLayers connects two individual layers, and the method
 * connectNodes connects two individual neurons. <p>
 *
 * Corresponds to the PMML element NeuralNetwork. Further ideas from Weka's
 * neural network implementation and from the Neural Network package
 * are used.
 *
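 * A minimal construction sketch (assuming inputLayer, hiddenLayer and
 * outputLayer have already been created elsewhere as NeuralLayer objects;
 * layer construction itself is not shown in this class):
 * <pre>
 * NeuralNetwork net = new NeuralNetwork();
 * net.addNeuralLayer(inputLayer);    // layer of NeuralInput nodes
 * net.addNeuralLayer(hiddenLayer);   // layer of Neuron nodes
 * net.addNeuralLayer(outputLayer);   // layer of NeuralOutput nodes
 * net.connectAllLayers();            // connect input -> hidden -> output
 * </pre>
 *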
 * @see NeuralLayer
 * @see NeuralNode
 * @see com.prudsys.pdm.Adapters.PmmlVersion20.NeuralNetwork
 */
public class NeuralNetwork extends com.prudsys.pdm.Cwm.Core.Class implements Classifier {

  // -----------------------------------------------------------------------
  //  Variables declarations
  // -----------------------------------------------------------------------
  /** Meta data. */
  protected MiningDataSpecification metaData;

  /** Name of target attribute. */
  protected String className;

  /** Layers of the neural network. */
  protected NeuralLayer[] neuralLayer = null;

  /** Hashtable containing all neurons with ID as key. */
  protected Hashtable neurons = new Hashtable(100);

  /** Reference of activation function for the whole network. */
  protected ActivationFunction activationFunction = null;

  /** Indicates that a bias term is used in the whole network. */
  protected boolean useBias = true;

  /** Threshold value when no bias term is used. For whole network. */
  protected double threshold = Category.MISSING_VALUE;


  // -----------------------------------------------------------------------
  //  Constructor
  // -----------------------------------------------------------------------
  /**
   * Empty constructor.
   */
  public NeuralNetwork() {
  }

  // -----------------------------------------------------------------------
  //  Getter and setter methods
  // -----------------------------------------------------------------------
  /**
   * Returns meta data.
   *
   * @return meta data
   */
  public MiningDataSpecification getMetaData() {
    return metaData;
  }

  /**
   * Sets new meta data.
   *
   * @param metaData new meta data
   */
  public void setMetaData(MiningDataSpecification metaData) {
    this.metaData = metaData;
  }

  /**
   * Returns name of target attribute.
   *
   * @return name of target attribute
   */
  public String getClassName() {
    return className;
  }

  /**
   * Sets name of target attribute.
   *
   * @param className new name of target attribute
   */
  public void setClassName(String className) {
    this.className = className;
  }

  /**
   * Returns the activation function for the whole network. If it is
   * null, the activation functions are specified on the neural layer level.
   *
   * @return activation function of network
   */
  public ActivationFunction getActivationFunction() {
    return activationFunction;
  }

  /**
   * Sets the activation function for the whole network. If set to
   * null, the activation functions are specified on the neural layer level.
   *
   * @param activationFunction new activation function of network
   */
  public void setActivationFunction(ActivationFunction activationFunction) {
    this.activationFunction = activationFunction;

    if (activationFunction != null) {
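      // Propagate the network-level activation function to every layer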
      for (int i = 0; i < getNumberOfLayers(); i++)
        neuralLayer[i].setActivationFunction(activationFunction);
    }
  }

  /**
   * Returns true if a bias term is used for the whole network. A bias is equivalent
   * to an input connection set at a constant level. If the value is false,
   * the use of bias can also be specified on the neural layer level.
   *
   * @return true if bias term is used, false otherwise
   */
  public boolean isUseBias() {
    return useBias;
  }

  /**
   * Sets whether a bias term is used in the whole network. A bias is equivalent
   * to an input connection set at a constant level. If the value is false,
   * the use of bias can also be specified on the neural layer level.
   *
   * @param useBias true if a bias term is to be used
   */
  public void setUseBias(boolean useBias) {
    this.useBias = useBias;

    if (useBias) {
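      // Propagate the bias setting to every layer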
      for (int i = 0; i < getNumberOfLayers(); i++)
        neuralLayer[i].setUseBias(useBias);
    }
  }

  /**
   * Returns the threshold value. Usually required when no bias term
   * is used. If the value is missing, the threshold can also be specified
   * on the neural layer level.
   *
   * @return threshold value
   */
  public double getThreshold() {
    return threshold;
  }

  /**
   * Sets a new threshold value. Usually required when no bias term
   * is used. If the value is missing, the threshold can also be specified
   * on the neural layer level.
   *
   * @param threshold new threshold value
   */
  public void setThreshold(double threshold) {
    this.threshold = threshold;

    if ( !Category.isMissingValue(threshold) ) {
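      // Propagate the (non-missing) threshold value to every layer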
      for (int i = 0; i < getNumberOfLayers(); i++)
        neuralLayer[i].setThreshold(threshold);
    }
  }

  // -----------------------------------------------------------------------
  //  Network topology methods
  // -----------------------------------------------------------------------
  /**
   * Returns neural node of specified ID from the network.
   *
   * @param id ID of neural node
   * @return neural node of specified ID, null if not found
   */
  public NeuralNode getNeuralNodeFromId(String id) {

    if (id == null) return null;

    return (NeuralNode) neurons.get(id);
  }

  /**
   * Removes all neural layers.
   */
  public void removeAllLayers() {

    neuralLayer = null;
    neurons.clear();
  }

  /**
   * Returns number of neural layers.
   *
   * @return number of neural layers
   */
  public int getNumberOfLayers() {

    return ( neuralLayer != null ) ? neuralLayer.length : 0;
  }

  /**
   * Returns input layer.
   *
   * @return input layer, null if not found
   */
  public NeuralLayer getInputLayer() {

    int nLay = getNumberOfLayers();
    if (nLay == 0)
      return null;

    if ( neuralLayer[0].getLayerType() != NeuralLayer.NEURAL_INPUT )
      return null;

    return neuralLayer[0];
  }

  /**
   * Returns output layer.
   *
   * @return output layer, null if not found
   */
  public NeuralLayer getOutputLayer() {

    int nLay = getNumberOfLayers();
    if (nLay == 0)
      return null;

    if ( neuralLayer[nLay-1].getLayerType() != NeuralLayer.NEURAL_OUTPUT )
      return null;

    return neuralLayer[nLay-1];
  }

  /**
   * Returns the set of layers associated with a model. Starts from
   * the input side. First layer is input layer (contains NeuralInput nodes),
   * last layer is output layer (contains NeuralOutput nodes). Hidden layers
   * in between contain Neuron nodes.
   *
   * @return neural layers of this model
   */
  public NeuralLayer[] getNeuralLayer() {
    return neuralLayer;
  }

  /**
   * Assigns the neural layers to the network. Starts from
   * the input side. First layer is input layer (contains NeuralInput nodes),
   * last layer is output layer (contains NeuralOutput nodes). Hidden layers
   * in between contain Neuron nodes.
   *
   * @param neuralLayer array of neural layers to be added
   * @exception MiningException cannot set neural layers
   */
  public void setNeuralLayer(NeuralLayer[] neuralLayer) throws MiningException {

    this.neuralLayer = neuralLayer;

    neurons.clear();
    for (int i = 0; i < getNumberOfLayers(); i++) {
      neuralLayer[i].setNeuralNetwork(this);

      int nnodes = neuralLayer[i].getNumberOfNodes();
      for (int j = 0; j < nnodes; j++) {
        NeuralNode NN = neuralLayer[i].getNeuralNodes()[j];
        String id = NN.getId();
        if (id == null || neurons.get(id) != null)
          throw new MiningException("layer " + i + " contains neuron with invalid ID");
        neurons.put(id, NN);
      }
    }
  }

  /**
   * Adds a new layer to the network.
   *
   * @param layer new neural layer to add
   * @exception MiningException cannot add layer
   */
  public void addNeuralLayer(NeuralLayer layer) throws MiningException {

    // Add layer: grow the layer array by one and append the new layer
    int nlay = getNumberOfLayers();
    NeuralLayer[] lay2 = new NeuralLayer[nlay + 1];
    for (int i = 0; i < nlay; i++)
      lay2[i] = neuralLayer[i];
    lay2[nlay] = layer;
    neuralLayer = lay2;
    neuralLayer[nlay].setNeuralNetwork(this);

    // Add neurons:
    int nnodes = layer.getNumberOfNodes();
    for (int i = 0; i < nnodes; i++) {
      NeuralNode NN = layer.getNeuralNodes()[i];
      String id = NN.getId();
      if (id == null || neurons.get(id) != null)
        throw new MiningException("layer contains neuron with invalid ID");
      neurons.put(id, NN);
    }
  }

  /**
   * Connects two layers by connecting all nodes of the parent layer
   * with all nodes of the child layer by means of the method connectNodes.
   * Notice that it is also possible (though rarely used) to connect two
   * layers which are not neighbours, e.g. to connect layer 2 with layer 5.
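   * For example (layer numbers purely illustrative; net is a network whose
   * layers have already been added):
   * <pre>
   * net.connectLayers(0, 1);   // fully connect the input layer to layer 1
   * net.connectLayers(2, 5);   // shortcut between non-neighbouring layers
   * </pre>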
   *
   * @param parNr number of the parent layer
   * @param childNr number of the child layer
   * @throws MiningException couldn't connect the layers
   */
  public void connectLayers(int parNr, int childNr) throws MiningException {

    // Check valid numbers:
    if (parNr < 0 || childNr < parNr || childNr >= getNumberOfLayers() )
      throw new MiningException("invalid layer number");

    // Connect layers:
    NeuralLayer parlay   = neuralLayer[parNr];
    NeuralLayer childlay = neuralLayer[childNr];
    for (int i = 0; i < parlay.getNumberOfNodes(); i++)
      for (int j = 0; j < childlay.getNumberOfNodes(); j++) {
        connectNodes( parlay.getNeuralNodes()[i], childlay.getNeuralNodes()[j], true );
      }
  }

  /**
   * Connects all n layers of the network by successively calling the
   * method connectLayers for the layer pairs (0,1), (1,2), ..., (n-3,n-2)
   * and then connecting the output layer to its previous layer via a
   * one-to-one relation.
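   * For a four-layer network, for example, this amounts to connectLayers(0, 1)
   * and connectLayers(1, 2), followed by a one-to-one connection of each
   * output node with a node of layer 2.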
   *
   * @throws MiningException could not connect all layers
   */
  public void connectAllLayers() throws MiningException {

    int nlay = getNumberOfLayers();
    if (nlay < 3)
      throw new MiningException("There must be at least three layers");

    // Connect the layers except output one:
    for (int i = 0; i < nlay-2; i++)
      connectLayers(i, i+1);

    // Connect output layer:
    NeuralLayer outputLayer = neuralLayer[nlay-1];
    NeuralLayer prevLayer   = neuralLayer[nlay-2];
