
📄 nn_util.c

📁 Some insights on genetic algorithms, particularly on the implementation of simple genetic programming.
💻 C
📖 Page 1 of 4
/**********************************************************************
  nn_util.c
 **********************************************************************

  nn_util - Simple multi-layer Neural Network routines.

  Copyright ©2001-2003, The Regents of the University of California.
  All rights reserved.

  Primary author: "Stewart Adcock" <stewart@linux-domain.com>

  The latest version of this program should be available at:
  http://www.linux-domain.com/

  This program is free software; you can redistribute it and/or modify
  it under the terms of the GNU General Public License as published by
  the Free Software Foundation; either version 2 of the License, or
  (at your option) any later version.  Alternatively, if your project
  is incompatible with the GPL, I will probably agree to requests
  for permission to use the terms of any other license.

  This program is distributed in the hope that it will be useful, but
  WITHOUT ANY WARRANTY WHATSOEVER.

  A full copy of the GNU General Public License should be in the file
  "COPYING" provided with this distribution; if not, see:
  http://www.gnu.org/

 **********************************************************************

  Synopsis:     Multi-layer NN trained using backpropagation with
                momentum.

                Warning: This code contains almost no error checking!

                This code uses neuronal input/response in the range
                0.0 <= x <= 1.0.  Note that best results will be
                achieved if data is similarly normalized.

                Most of these functions are NOT thread-safe!

  To do:        Need to define data from external sources.
                Alternative switching functions.
                Automated functions for "leave-one-out" validation.
                Full support for weight decay method starting at a given epoch.
                Node pruning. (prune zeros and +/- ones (by using BIAS as replacement))

 **********************************************************************/

#include "gaul/nn_util.h"

/*
 * Yucky global variables.
 */
static float      **train_data=NULL;       /* Input data for training. */
static float      **test_data=NULL;        /* Input data for testing. */
static float      **eval_data=NULL;        /* Input data for evaluation. */
static int        num_train_data=0;        /* Number of training target items. */
static int        num_test_data=0;         /* Number of testing target items. */
static int        num_eval_data=0;         /* Number of evaluation target items. */
static int        max_train_data=0;        /* Maximum number of training target items. */
static int        max_test_data=0;         /* Maximum number of testing target items. */
static int        max_eval_data=0;         /* Maximum number of evaluation target items. */
static float      **train_property=NULL;   /* Training target property. */
static float      **test_property=NULL;    /* Testing target property. */
static float      **eval_property=NULL;    /* Evaluation target property. */
static int        num_train_prop=0;        /* Number of training target properties. */
static int        num_test_prop=0;         /* Number of testing target properties. */
static int        num_eval_prop=0;         /* Number of evaluation target properties. */
static float      **predict_data=NULL;     /* Input data for prediction. */
static int        num_predict_data=0;      /* Number of sets of input data to predict. */
static int        max_predict_data=0;      /* Maximum number of sets of input data to predict. */
static char       **train_labels=NULL;     /* Labels for training data. */
static char       **test_labels=NULL;      /* Labels for test data. */
static char       **eval_labels=NULL;      /* Labels for evaluation data. */
static char       **predict_labels=NULL;   /* Labels for prediction data. */

/**********************************************************************
  NN_diagnostics()
  synopsis:     Display diagnostic information.
  parameters:   none
  return:       none
  last updated: 13 Mar 2002
 **********************************************************************/

void NN_diagnostics(void)
  {

  printf("=== nn_util diagnostic information ===========================\n");
  printf("Version:                   %s\n", GA_VERSION_STRING);
  printf("Build date:                %s\n", GA_BUILD_DATE_STRING);
  printf("Compilation machine characteristics:\n%s\n", GA_UNAME_STRING);
  printf("--------------------------------------------------------------\n");
  printf("NN_DEBUG:                  %d\n", NN_DEBUG);
  printf("NN_MAX_FNAME_LEN:          %d\n", NN_MAX_FNAME_LEN);
  printf("NN_DATA_ALLOC_SIZE:        %d\n", NN_DATA_ALLOC_SIZE);
  printf("NN_SIGNAL_OFF:             %f\n", NN_SIGNAL_OFF);
  printf("NN_SIGNAL_ON:              %f\n", NN_SIGNAL_ON);
  printf("NN_DEFAULT_BIAS:           %f\n", NN_DEFAULT_BIAS);
  printf("NN_DEFAULT_SEED:           %d\n", NN_DEFAULT_SEED);
  printf("NN_DEFAULT_MOMENTUM:       %f\n", NN_DEFAULT_MOMENTUM);
  printf("NN_DEFAULT_RATE:           %f\n", NN_DEFAULT_RATE);
  printf("NN_DEFAULT_GAIN:           %f\n", NN_DEFAULT_GAIN);
  printf("NN_DEFAULT_DECAY:          %f\n", NN_DEFAULT_DECAY);
  printf("NN_DEFAULT_MAX_EPOCHS:     %d\n", NN_DEFAULT_MAX_EPOCHS);
  printf("NN_DEFAULT_TEST_STEP:      %d\n", NN_DEFAULT_TEST_STEP);
  printf("NN_DEFAULT_STOP_RATIO:     %f\n", NN_DEFAULT_STOP_RATIO);
  printf("--------------------------------------------------------------\n");
  printf("structure                  sizeof\n");
  printf("layer_t:                   %lu\n", (unsigned long) sizeof(layer_t));
  printf("network_t:                 %lu\n", (unsigned long) sizeof(network_t));
  printf("--------------------------------------------------------------\n");

  return;
  }

/**********************************************************************
  NN_display_summary()
  synopsis:     Display a summary of a Neural Network datastructure.
  parameters:   network_t *network
  return:       none
  last updated: 04 Dec 2001
 **********************************************************************/

void NN_display_summary(network_t *network)
  {
  int		l;		/* Layer index. */

  printf("num_layers = %d num_neurons =", network->num_layers);
  for (l=0; l<network->num_layers; l++)
    printf(" %d", network->layer[l].neurons);
  printf("\nmomentum = %f rate = %f gain = %f bias = %f decay = %f\n",
             network->momentum,
             network->rate,
             network->gain,
             network->bias,
             network->decay);

  return;
  }

/**********************************************************************
  NN_new()
  synopsis:     Allocate and initialise a Neural Network datastructure.
  parameters:   int num_layers	Number of layers (incl. input+output)
                int *neurons	Array containing number of nodes per layer.
  return:       network_t *network
  last updated: 01 Mar 2002
 **********************************************************************/

network_t *NN_new(int num_layers, int *neurons)
  {
  network_t	*network;	/* The new network. */
  int		l;		/* Layer index. */
  int		i;		/* Neuron index. */

  network = (network_t*) s_malloc(sizeof(network_t));

  network->layer = (layer_t*) s_malloc(num_layers*sizeof(layer_t));
  network->num_layers = num_layers;

  network->layer[0].neurons       = neurons[0];
  network->layer[0].output        = (float*) s_calloc(neurons[0]+1, sizeof(float));
  network->layer[0].error         = (float*) s_calloc(neurons[0]+1, sizeof(float));
  network->layer[0].weight        = NULL;
  network->layer[0].weight_save   = NULL;
  network->layer[0].weight_change = NULL;
  network->layer[0].output[0]     = NN_DEFAULT_BIAS;

  for (l=1; l<num_layers; l++)
    {
    network->layer[l].neurons       = neurons[l];
    network->layer[l].output        = (float*)  s_calloc(neurons[l]+1, sizeof(float));
    network->layer[l].error         = (float*)  s_calloc(neurons[l]+1, sizeof(float));
    network->layer[l].weight        = (float**) s_calloc(neurons[l]+1, sizeof(float*));
    network->layer[l].weight_save   = (float**) s_calloc(neurons[l]+1, sizeof(float*));
    network->layer[l].weight_change = (float**) s_calloc(neurons[l]+1, sizeof(float*));
    network->layer[l].output[0]     = NN_DEFAULT_BIAS;

    for (i=1; i<=neurons[l]; i++)
      {
      network->layer[l].weight[i]        = (float*) s_calloc(neurons[l-1]+1, sizeof(float));
      network->layer[l].weight_save[i]   = (float*) s_calloc(neurons[l-1]+1, sizeof(float));
      network->layer[l].weight_change[i] = (float*) s_calloc(neurons[l-1]+1, sizeof(float));
      }
    }

/* Tuneable parameters: */
  network->momentum = NN_DEFAULT_MOMENTUM;
  network->rate = NN_DEFAULT_RATE;
  network->gain = NN_DEFAULT_GAIN;
  network->bias = NN_DEFAULT_BIAS;
  network->decay = NN_DEFAULT_DECAY;

  return network;
  }

/**********************************************************************
  NN_clone()
  synopsis:     Allocate and initialise a Neural Network datastructure
                using the contents of an existing datastructure.
  parameters:   network_t *network
  return:       network_t *network
  last updated: 01 Mar 2002
 **********************************************************************/

network_t *NN_clone(network_t *src)
  {
  network_t	*network;	/* The new network. */
  int		l;		/* Layer index. */
  int		i;		/* Neuron index. */

  network = (network_t*) s_malloc(sizeof(network_t));

  network->layer = (layer_t*) s_malloc(src->num_layers*sizeof(layer_t));
  network->num_layers = src->num_layers;

  network->layer[0].neurons = src->layer[0].neurons;
  network->layer[0].output  = (float*) s_malloc((src->layer[0].neurons+1)*sizeof(float));
  memcpy(network->layer[0].output, src->layer[0].output,
         (src->layer[0].neurons+1)*sizeof(float));
  network->layer[0].error   = (float*) s_malloc((src->layer[0].neurons+1)*sizeof(float));
  memcpy(network->layer[0].error, src->layer[0].error,
         (src->layer[0].neurons+1)*sizeof(float));
  network->layer[0].weight        = NULL;
  network->layer[0].weight_save   = NULL;
  network->layer[0].weight_change = NULL;

  for (l=1; l<src->num_layers; l++)
    {
    network->layer[l].neurons = src->layer[l].neurons;
    network->layer[l].output  = (float*)  s_malloc((src->layer[l].neurons+1)*sizeof(float));
    memcpy(network->layer[l].output, src->layer[l].output,
           (src->layer[l].neurons+1)*sizeof(float));
    network->layer[l].error   = (float*)  s_malloc((src->layer[l].neurons+1)*sizeof(float));
    memcpy(network->layer[l].error, src->layer[l].error,
           (src->layer[l].neurons+1)*sizeof(float));
    network->layer[l].weight        = (float**) s_malloc((src->layer[l].neurons+1)*sizeof(float*));
    network->layer[l].weight_save   = (float**) s_malloc((src->layer[l].neurons+1)*sizeof(float*));
    network->layer[l].weight_change = (float**) s_malloc((src->layer[l].neurons+1)*sizeof(float*));

    for (i=1; i<=src->layer[l].neurons; i++)
      {
      network->layer[l].weight[i] = (float*) s_malloc((src->layer[l-1].neurons+1)*sizeof(float));
      memcpy(network->layer[l].weight[i], src->layer[l].weight[i],
             (src->layer[l-1].neurons+1)*sizeof(float));
      network->layer[l].weight_save[i] = (float*) s_malloc((src->layer[l-1].neurons+1)*sizeof(float));
      memcpy(network->layer[l].weight_save[i], src->layer[l].weight_save[i],
             (src->layer[l-1].neurons+1)*sizeof(float));
      network->layer[l].weight_change[i] = (float*) s_malloc((src->layer[l-1].neurons+1)*sizeof(float));
      memcpy(network->layer[l].weight_change[i], src->layer[l].weight_change[i],
             (src->layer[l-1].neurons+1)*sizeof(float));
      }
    }

/* Tuneable parameters: */
  network->momentum = src->momentum;
  network->rate = src->rate;
  network->gain = src->gain;
  network->bias = src->bias;
  network->decay = src->decay;

  return network;
  }

/**********************************************************************
  NN_copy()
  synopsis:     Copy the data in one Neural Network datastructure over
                the data in another.
  parameters:   network_t *src
                network_t *dest
  return:       none
  last updated: 01 Mar 2002
 **********************************************************************/

void NN_copy(network_t *src, network_t *dest)
  {
  int		l;		/* Layer index. */
  int		i;		/* Neuron index. */

  if (dest->num_layers != src->num_layers) die("Incompatible topology for copy (layers)");
  for (l=0; l<src->num_layers; l++)
    if (dest->layer[l].neurons != src->layer[l].neurons) die("Incompatible topology for copy (neurons)");

  memcpy(dest->layer[0].output, src->layer[0].output,
         (src->layer[0].neurons+1)*sizeof(float));
  memcpy(dest->layer[0].error, src->layer[0].error,
         (src->layer[0].neurons+1)*sizeof(float));
  dest->layer[0].weight        = NULL;
  dest->layer[0].weight_save   = NULL;
  dest->layer[0].weight_change = NULL;

  for (l=1; l<src->num_layers; l++)
    {
    memcpy(dest->layer[l].output, src->layer[l].output,
           (src->layer[l].neurons+1)*sizeof(float));
    memcpy(dest->layer[l].error, src->layer[l].error,
           (src->layer[l].neurons+1)*sizeof(float));

    for (i=1; i<=src->layer[l].neurons; i++)
      {
      memcpy(dest->layer[l].weight[i], src->layer[l].weight[i],
             (src->layer[l-1].neurons+1)*sizeof(float));
      memcpy(dest->layer[l].weight_save[i], src->layer[l].weight_save[i],
             (src->layer[l-1].neurons+1)*sizeof(float));
      memcpy(dest->layer[l].weight_change[i], src->layer[l].weight_change[i],
             (src->layer[l-1].neurons+1)*sizeof(float));
      }
    }

/* Tuneable parameters: */
  dest->momentum = src->momentum;
  dest->rate = src->rate;
  dest->gain = src->gain;
  dest->bias = src->bias;
  dest->decay = src->decay;

  return;
  }

/**********************************************************************
  NN_set_layer_bias()
  synopsis:     Change the bias of a single layer of a network to a
                given value.
  parameters:   network_t	*network
                const int	layer
                const float	bias
  return:       none
  last updated: 01 Mar 2002
 **********************************************************************/

void NN_set_layer_bias(network_t *network, const int layer, const float bias)
  {

  if (layer<0 || layer>=network->num_layers)
    dief("Invalid layer %d (0-%d)", layer, network->num_layers-1);

  network->layer[layer].output[0] = bias;

  return;
  }

/**********************************************************************
  NN_set_bias()
  synopsis:     Change the bias of all layers in a network to a given
                value.
  parameters:   network_t	*network
                const float	bias
  return:       none
  last updated: 03 Dec 2001
 **********************************************************************/

void NN_set_bias(network_t *network, const float bias)
  {
  int l; 	/* Loop variable over layers. */

  if (network->bias != bias)
    {
    network->bias = bias;
    for (l=0; l<network->num_layers; l++)
      network->layer[l].output[0] = bias;
    }

  return;
  }

/**********************************************************************
  NN_set_gain()
  synopsis:     Change the gain of a network to a given value.
  parameters:   network_t *network
                float    gain
  return:       none
  last updated: 3 Dec 2001
 **********************************************************************/

void NN_set_gain(network_t *network, const float gain)
  {
  network->gain = gain;

  return;
  }

/**********************************************************************
  NN_set_rate()
  synopsis:     Change the learning rate of a network to a given value.
  parameters:   network_t *network
                float    rate
  return:       none
  last updated: 3 Dec 2001
 **********************************************************************/

void NN_set_rate(network_t *network, const float rate)
  {
  network->rate = rate;

  return;
  }

/**********************************************************************
  NN_set_momentum()
  synopsis:     Change the momentum of a network to a given value.
  parameters:   network_t *network
                float    momentum
  return:       none
  last updated: 3 Dec 2001
 **********************************************************************/

void NN_set_momentum(network_t *network, const float momentum)
  {
  network->momentum = momentum;

  return;
  }

/**********************************************************************
  NN_set_decay()
  synopsis:     Change the weight decay of a network to a given value.
  parameters:   network_t	*network
                const float	decay
  return:       none
  last updated: 01 Mar 2002
 **********************************************************************/

void NN_set_decay(network_t *network, const float decay)
  {
  network->decay = decay;

  return;
  }

/**********************************************************************
