
kohnet.c - Statistical Pattern Recognition Algorithm Package (C++)
Page 1 of 2 (the listing below is truncated mid-way through KohNet::learn)
/******************************************************************************/
/*                                                                            */
/*  KOHNET - All principal routines for KohNet processing                     */
/*                                                                            */
/* Copyright (c) 1993 by Academic Press, Inc.                                 */
/*                                                                            */
/* All rights reserved.  Permission is hereby granted, until further notice,  */
/* to make copies of this diskette, which are not for resale, provided these  */
/* copies are made from this master diskette only, and provided that the      */
/* following copyright notice appears on the diskette label:                  */
/* (c) 1993 by Academic Press, Inc.                                           */
/*                                                                            */
/* Except as previously stated, no part of the computer program embodied in   */
/* this diskette may be reproduced or transmitted in any form or by any means,*/
/* electronic or mechanical, including input into storage in any information  */
/* system for resale, without permission in writing from the publisher.       */
/*                                                                            */
/* Produced in the United States of America.                                  
*/
/*                                                                            */
/* ISBN 0-12-479041-0                                                         */
/*                                                                            */
/******************************************************************************/

#include <stdio.h>
#include <string.h>
#include <math.h>
#include <ctype.h>
#include <stdlib.h>
#include "const.h"       // System and limitation constants, typedefs, structs
#include "classes.h"     // Includes all class headers
#include "funcdefs.h"    // Function prototypes

static void free_non_null ( void **p ) ;

/*
--------------------------------------------------------------------------------

   Constructor

   Note that some normalization methods generate an extra input.
   Therefore we always allocate nin+1 length vectors, even though
   we may not need the extra weight.

   The parameter 'executable' determines whether work areas for
   output neuron activations are also allocated.  These are needed
   if we will ever apply inputs and want to compute outputs.

   In case of malloc failure, we set 'ok' to zero so the user knows about it.
   Also, we always leave unallocated pointers set to NULL.  There is no
   hard reason for doing this; calling programs should always know enough not
   to reference them.  However, it is simply good style.  Most compilers are
   much better at producing code that intercepts NULL pointer references than
   just wild pointers.  An ounce of prevention...

--------------------------------------------------------------------------------
*/

KohNet::KohNet (
   int n_inputs ,
   int n_outputs ,
   KohParams *kp ,  // Specialized parameters
   int executable , // Also allocate hidden and output neurons?
   int zero         // Zero all weights?
   )
{
   int i, n ;

   outmod = OUTMOD_CLASSIFY ;
   nin = n_inputs ;
   nout = n_outputs ;
   normalization = kp->normalization ;
   exe = executable ;
   neterr = 1.0 ;
   confusion = NULL ;
   out_coefs = out = NULL ;

   ok = 0 ;   // Indicates failure of malloc (What a pessimist!)

   if (exe && (confusion = (int *) MALLOC ( (nout+1) * sizeof(int) )) == NULL)
      return ;

   n = nout * (nin+1) ; // Some normalizations generate extra input

   if (((out_coefs = (double *) MALLOC ( n * sizeof(double) )) == NULL)
    || (exe && (out = (double *) MALLOC ( nout * sizeof(double) )) == NULL)) {
      free_non_null ( (void **) &out_coefs ) ;
      free_non_null ( (void **) &confusion ) ;
      return ;
      }

   if (zero) {
      while (n--)
         out_coefs[n] = 0.0 ;
      }

   if (exe)
      memset ( confusion , 0 , (nout+1) * sizeof(int) ) ;

   ok = 1 ;   // Indicate to caller that all mallocs succeeded
}

/*
   Local routine to free non-null pointers
*/

static void free_non_null ( void **p )
{
   if (*p != NULL) {
      FREE ( *p ) ;
      *p = NULL ;
      }
}

/*
--------------------------------------------------------------------------------

   Destructor

--------------------------------------------------------------------------------
*/

KohNet::~KohNet ()
{
   if (! ok)    // If constructor's mallocs failed
      return ;  // there is nothing to free
   FREE ( out_coefs ) ;
   if (exe) {
      FREE ( out ) ;
      FREE ( confusion ) ;
      }
}

/*
--------------------------------------------------------------------------------

   copy_weights - Copy the weights from one network to another
                  Note that this is NOT like a copy or assignment,
                  as it does not copy other parameters.
                  In fact,
                  it gets sizes from the calling instance!

--------------------------------------------------------------------------------
*/

void KohNet::copy_weights ( KohNet *dest , KohNet *source )
{
   int n ;

   dest->neterr = source->neterr ;
   if (source->exe  &&  dest->exe) // These may be important too!
      memcpy ( dest->confusion , source->confusion , (nout+1) * sizeof(int) ) ;
   n = nout * (nin+1) ;
   memcpy ( dest->out_coefs , source->out_coefs , n * sizeof(double) ) ;
}

/*
--------------------------------------------------------------------------------

   zero_weights - Zero all weights in a network

--------------------------------------------------------------------------------
*/

void KohNet::zero_weights ()
{
   int n ;

   neterr = 1.0 ;
   n = nout * (nin+1) ;
   while (n--)
      out_coefs[n] = 0.0 ;
}

/*
--------------------------------------------------------------------------------

   KOH_NORM - Routines for normalizing Kohonen vectors to unit length

   in_norm - Normalize an input vector by computing a normalizing
             factor and the synthetic last input.
             The input vector itself is not touched.
             It is assumed that all inputs are in the range -1 to 1.
             The end result is that if the inputs are multiplied by
             normfac, that vector with synth appended has unit length.

   wt_norm - Normalize a weight vector in place.
             The synthetic last
             component is NOT computed.

--------------------------------------------------------------------------------
*/

void KohNet::in_norm (
   double *input ,   // Input vector
   double *normfac , // Output: multiply input by this
   double *synth     // Output: synthetic last input
   )
{
   double length, d ;

   length = veclen ( nin , input ) ; // Squared length
   if (length < 1.e-30)              // Safety
      length = 1.e-30 ;

   if (normalization == 0) {      // Multiplicative
      *normfac = 1.0 / sqrt ( length ) ;
      *synth = 0.0 ;
      }

   else if (normalization == 1) { // Z
      *normfac = 1.0 / sqrt ( nin ) ;
      d = (double) nin - length ;
      if (d > 0.0)
         *synth = sqrt ( d ) * *normfac ;
      else                // If the inputs are all -1 to 1
         *synth = 0.0 ;   // this error never occurs
      }
}

void KohNet::wt_norm ( double *w )
{
   int i ;
   double len ;

   len = veclen ( nin , w ) ;     // Ignore last weight
   if (len < 1.e-30)              // Safety
      len = 1.e-30 ;

   if (normalization == 0) {      // Multiplicative
      len = 1.0 / sqrt ( len ) ;
      for (i=0 ; i<nin ; i++)
         w[i] *= len ;
      w[nin] = 0. ;
      }

   else if (normalization == 1) { // Z
      len += w[nin] * w[nin] ;
      len = 1.0 / sqrt ( len ) ;
      for (i=0 ; i<=nin ; i++)
         w[i] *= len ;
      }
}

/*
--------------------------------------------------------------------------------

   trial - Compute the outputs for a given input by evaluating the network.
           It is assumed that all inputs are from -1 to 1, but not
           necessarily normalized (that is done here).

--------------------------------------------------------------------------------
*/

void KohNet::trial ( double *input )
{
   int i ;
   double normfac, synth, *optr ;

   in_norm ( input , &normfac , &synth ) ;  // Normalize input

   for (i=0 ; i<nout ; i++) {
      optr = out_coefs + i * (nin+1) ;  // i'th weight vector
      out[i] = dotprod ( nin , input , optr ) * normfac
               + synth * optr[nin] ;
      out[i] = 0.5 * (out[i] + 1.0) ;   // Remap -1,1 to 0,1
      if (out[i] > 1.0)   // Only trivially happens due to rounding
         out[i] = 1.0 ;
      if (out[i] < 0.0)
         out[i] = 0.0 ;
      }
}

/*
--------------------------------------------------------------------------------

   winner - Return the subscript of the winning neuron.
            This is identical to 'trial' above except that
            it also returns the normalization info and winner.

--------------------------------------------------------------------------------
*/

int KohNet::winner (
   double *input ,   // Input vector
   double *normfac , // Output: multiply input by this
   double *synth     // Output: synthetic last input
   )
{
   int i, win ;
   double biggest, *optr ;

   in_norm ( input , normfac , synth ) ;  // Normalize input

   biggest = -1.e30 ;
   win = 0 ;   // Initialize so a valid subscript is returned in all cases

   for (i=0 ; i<nout ; i++) {
      optr = out_coefs + i * (nin+1) ;  // i'th weight vector
      out[i] = dotprod ( nin , input , optr ) * *normfac
               + *synth * optr[nin] ;
      out[i] = 0.5 * (out[i] + 1.0) ;   // Remap -1,1 to 0,1
      if (out[i] > biggest) {
         biggest = out[i] ;
         win = i ;
         }
      if (out[i] > 1.0)   // Only trivially happens due to rounding
         out[i] = 1.0 ;
      if (out[i] < 0.0)
         out[i] = 0.0 ;
      }

   return win ;
}

/*
--------------------------------------------------------------------------------
   learn

--------------------------------------------------------------------------------
*/

void KohNet::learn (
   TrainingSet *tptr ,       // Training set
   struct LearnParams *lptr  // Learning parameters
   )
{
   int i, key, tset ;
   int iter ;         // Iterations (epochs)
   int n_retry ;      // Number of random retries
   int nwts ;         // Total number of weights
   int *won ;         // Counts how many times each neuron won
   int winners ;      // How many neurons won per epoch
   char msg[80] ;     // For messages to user
   double *work ;     // Scratch for additive learning
   double *correc ;   // Scratch for cumulative correction vector
   double rate ;      // Current learning rate
   double bigerr ;    // Biggest single error in epoch
   double bigcorr ;   // Biggest cumulative correction in epoch
   double best_err ;  // Minimum error so far
   double *dptr ;     // Points to a training case
   KohNet *bestnet ;  // Preserve best here
   KohParams *kp ;    // User's parameters here

   if (! exe) {   // Should NEVER happen, but good style to aid debugging
      error_message ( "Internal error in KohNet::learn" ) ;
      exit ( 1 ) ;
      }

   kp = lptr->kp ;  // Simplify pointing to parameters

   neterr = 1.0 ;

/*
   If this is multiplicative normalization, make sure all training
   cases are non-null.
*/

   if (normalization == 0) {      // Multiplicative
      for (tset=0 ; tset<tptr->ntrain ; tset++) {
         dptr = tptr->data + (nin+1) * tset ;
         if (veclen ( nin , dptr ) < 1.e-30) {
            error_message (
                      "Multiplicative normalization has null training case" ) ;
            return ;
            }
         }
      }

/*
   Allocate scratch memory, then initialize weights
*/

   MEMTEXT ( "KOHNET::learn new bestnet" ) ;
   bestnet = new KohNet ( nin , nout , kp , 0 , 1 ) ;
   if ((bestnet == NULL)  ||  (! bestnet->ok)) {
      memory_message ( "to learn" ) ;
      if (bestnet != NULL)
         delete bestnet ;
      return ;
      }

   nwts = nout * (nin+1) ;

   MEMTEXT ( "KOHNET: Learn scratch" ) ;
   won = (int *) MALLOC ( nout * sizeof(int) ) ;
   correc = (double *) MALLOC ( nwts * sizeof(double) ) ;
   if (! kp->learn_method)  // Needed only for additive method
      work = (double *) MALLOC ( (nin+1) * sizeof(double) ) ;
