
layer.java

Neural network source code: implements a BP (backpropagation) neural network and provides the layer container used by the BP-based training algorithm.
Language: Java
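A note on the algorithm, added here for context: the Layer class below only stores neurons and delegates the actual weight correction to its LearningRule, so the listing itself contains no update formula. Standard backpropagation adjusts each weight by gradient descent on the output error:

    \Delta w_{ij} = \eta \, \delta_j \, o_i, \qquad
    \delta_j =
    \begin{cases}
      (t_j - o_j)\, f'(\mathrm{net}_j) & \text{output layer} \\
      f'(\mathrm{net}_j) \sum_k \delta_k w_{jk} & \text{hidden layer}
    \end{cases}

where \eta is the learning rate, o_i the output of neuron i, t_j the target value, \mathrm{net}_j the weighted input (computed here by the InputFunction), and f the transfer function. These symbols are the usual textbook notation, not identifiers from this package.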
package net.openai.ai.nn.network;

import java.io.*;
import java.util.*;

import net.openai.ai.nn.learning.*;
import net.openai.ai.nn.transfer.*;
import net.openai.ai.nn.training.*;
import net.openai.ai.nn.input.*;

/**
 *  This class is a container for neurons and keeps state
 *  information for this layer.
 */
public class Layer implements Serializable {

    //  A container for the neurons for this layer.
    private Vector neurons = null;

    //  The learning rule for this layer
    private LearningRule learningRule = null;

    //  The transfer function for this layer
    private TransferFunction transferFunction = null;

    //  The input function for this layer.
    private InputFunction inputFunction = null;

    //  name of this layer
    private String name = "Layer";

    //  variables describing the layer type
    protected static final int INPUT_LAYER = 0;
    protected static final int HIDDEN_LAYER = 1;
    protected static final int OUTPUT_LAYER = 2;

    //  the type of this layer
    private int layerType = -1;

    public Layer() {
        neurons = new Vector();
    }

    /**
     * Processes this layer, calling the input function,
     * then the transfer function, and sets all variables for the
     * learning process to be completed.
     */
    public void calculate() {
        //  if this is an input layer, no calculation is necessary
        if(layerType == INPUT_LAYER)
            return;

        //  adjust this later to not have all three checks but just
        //  catch a NULL exception...
        if(inputFunction == null) {
            db("No input function has been set.");
            return;
        }
        if(learningRule == null) {
            db("No learning rule has been set.");
            return;
        }
        if(transferFunction == null) {
            db("No transfer function has been set.");
            return;
        }

        Enumeration e = neurons.elements();
        while(e.hasMoreElements()) {
            Neuron neuron = (Neuron) e.nextElement();
            if(neuron instanceof BiasNeuron) {
                continue;
            }
            //  call the input function for each neuron
            inputFunction.calculateInput(neuron);
            //db("calculated input: " + neuron.getInput());
            transferFunction.transfer(neuron);
        }
    }

    /**
     *  Calls the learning rule to see if it's ready to process this layer.
     *
     * @return boolean Tells whether this layer is ready to be
     * processed for learning.
     */
    public boolean readyToLearn() {
        if(learningRule == null) {
            db("No learning rule has been set.");
            return false;
        }
        return learningRule.ready(this);
    }

    /**
     * The learning method for the layer.  This method just handles
     * higher-level concerns and leaves the implementation of the
     * learning to the learning rule.
     */
    public void learn(TrainingElement trainingElement)
        throws NetworkConfigurationException {
        if(learningRule == null)
            throw new NetworkConfigurationException("No Learning Rule set for"
                                                    + " this layer.");
        learningRule.correctLayer(this, trainingElement);
    }

    /**
     * Get the neurons for this layer.
     * @return Vector All neurons in the layer.
     */
    public Vector getNeurons() {
        return neurons;
    }

    /**
     * Seed the neurons with input.
     *
     * @param inputs A collection of values to seed the neurons with.
     */
    public void seedNeurons(Vector inputs) {
        int neuronsSize = neurons.size();
        int inputsSize = inputs.size();
        //  check that we have matching sizes
        if(inputsSize != neuronsSize) {
            db("The number of inputs (" + inputsSize + ") does not match "
               + "the number of neurons (" + neuronsSize + ").");
        }
        for(int i = 0; i < inputsSize; i++) {
            String inputString = (String) inputs.elementAt(i);
            double input = 0;
            try {
                input = Double.parseDouble(inputString);
            } catch (NumberFormatException nfe) {
                db("Input (" + inputString + ") cannot be parsed, "
                   + "setting input to 0.");
                nfe.printStackTrace();
            }
            Neuron neuron = (Neuron) neurons.elementAt(i);
            //  yes this looks weird, but this is used only for the
            //  input layer and it does no activation/processing
            neuron.setOutput(input);
        }
    }

    /**
     * Add the bias neuron.
     */
    public void addBias() {
        addNeuron(new BiasNeuron());
    }

    /**
     * Get the value of learningRule.
     * @return Value of learningRule.
     */
    public LearningRule getLearningRule() {
        return learningRule;
    }

    /**
     * Set the value of learningRule.
     * @param learningRule Value to assign to learningRule.
     */
    public void setLearningRule(LearningRule learningRule) {
        this.learningRule = learningRule;
    }

    /**
     * Get the value of transferFunction.
     * @return Value of transferFunction.
     */
    public TransferFunction getTransferFunction() {
        return transferFunction;
    }

    /**
     * Set the value of transferFunction.
     * @param transferFunction Value to assign to transferFunction.
     */
    public void setTransferFunction(TransferFunction transferFunction) {
        this.transferFunction = transferFunction;
    }

    /**
     * Get the value of inputFunction.
     * @return Value of inputFunction.
     */
    public InputFunction getInputFunction() {
        return inputFunction;
    }

    /**
     * Set the value of inputFunction.
     * @param inputFunction Value to assign to inputFunction.
     */
    public void setInputFunction(InputFunction inputFunction) {
        this.inputFunction = inputFunction;
    }

    /**
     * Returns the number of neurons in this layer.
     * @return number of neurons in this layer.
     */
    public int getSize() {
        //  should we return -1 if neurons is null
        //  or let the exception propagate?
        //  looking at the Vector class, they let the exception propagate.
        return neurons.size();
    }

    /**
     * Sets the number of neurons in this layer. This should behave much
     * like Vector.setSize(): if i > the current size, new neurons
     * are added to the layer, and if i < the current size, any neuron at
     * index i or greater is discarded.
     * @param i The number of neurons to be held in this layer.
     */
    public void setSize(int i) {
        //  if the layer is to have fewer neurons than it currently has,
        //  just call setSize on the Vector and let that class handle it.
        if(i < neurons.size()) {
            neurons.setSize(i);
            return;
        }
        //  if the layer is to have more neurons than it currently has,
        //  create new neurons and place them in the Vector.
        int newNeurons = i - neurons.size();
        for(int j = 0; j < newNeurons; j++) {
            addNeuron();
        }
    }

    /**
     * Creates and adds a single neuron to the layer.
     */
    public void addNeuron() {
        Neuron neuron = new Neuron();
        addNeuron(neuron);
    }

    /**
     * Adds a single neuron to the layer.
     */
    public void addNeuron(Neuron neuron) {
        neurons.add(neuron);
    }

    /**
     * Clears all the existing neurons.
     */
    public void clear() {
        neurons.removeAllElements();
    }

    /**
     * Gets the name for this layer.
     *
     * @return String - The name of this layer.
     */
    public String getName() {
        return name;
    }

    /**
     * Sets the name for this layer.
     *
     * @param name The name for this layer.
     */
    public void setName(String name) {
        this.name = name;
    }

    /**
     * Get the layer type.
     *
     * @return int - layer type
     */
    public int getLayerType() {
        return layerType;
    }

    /**
     * Set the layer type.
     *
     * @param layerType The type of this layer (input, hidden, output)
     */
    public void setLayerType(int layerType) {
        if(layerType == Layer.INPUT_LAYER
           || layerType == Layer.HIDDEN_LAYER
           || layerType == Layer.OUTPUT_LAYER)
            this.layerType = layerType;
        else
            db("Received an invalid type for this layer.");
    }

    /**
     * Returns a String representation of this layer.
     *
     * @return String A representation of this layer.
     */
    public String toString() {
        return name;
    }

    private void db(String s) {
        if(Network.getDebug())
            System.err.println("Layer: " + s);
    }
}
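For orientation, the sketch below shows how this Layer API might be exercised. It is a hypothetical driver, not part of the original source: the class name LayerDemo is invented, it is placed in the same package so the protected layer-type constants are visible, and only methods that appear in the listing above are called. A hidden or output layer would additionally need an InputFunction, TransferFunction and LearningRule wired up before calculate() does any real work; those implementations are assumed to live elsewhere in the net.openai.ai.nn.* packages.

package net.openai.ai.nn.network;   // same package, so Layer.INPUT_LAYER etc. are accessible

import java.util.Vector;

/**
 * Hypothetical driver class (not part of the original source) showing how an
 * input layer is sized, typed and seeded with values.
 */
public class LayerDemo {
    public static void main(String[] args) {
        Layer inputLayer = new Layer();
        inputLayer.setName("input");
        inputLayer.setLayerType(Layer.INPUT_LAYER);  // calculate() is a no-op for input layers
        inputLayer.setSize(3);                       // creates three neurons
        // inputLayer.addBias();                     // optionally appends a BiasNeuron
        //                                           // (skipped by calculate())

        // seedNeurons() expects String values, parses them to doubles and writes
        // each value straight into the corresponding neuron's output.
        Vector inputs = new Vector();
        inputs.add("0.5");
        inputs.add("1.0");
        inputs.add("-0.25");
        inputLayer.seedNeurons(inputs);

        System.out.println(inputLayer + " now holds " + inputLayer.getSize() + " neurons");

        // A hidden or output layer would also need its collaborators set before
        // calculate() performs the forward pass, for example (names are placeholders):
        //
        //   hidden.setLayerType(Layer.HIDDEN_LAYER);
        //   hidden.setInputFunction(someInputFunction);       // e.g. a weighted-sum input
        //   hidden.setTransferFunction(someTransferFunction); // e.g. a sigmoid transfer
        //   hidden.setLearningRule(someLearningRule);         // e.g. a backpropagation rule
        //   hidden.calculate();
    }
}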
