NEURAL NETWORK PC TOOLS
SOFTWARE USER'S GUIDE
$Revision: 1.0 $ $Date: 18 Sep 1989 9:38:14 $
INTRODUCTION
The software described in this User's Guide accompanies the chapter on
Neural Network PC Tool Implementations in the book
Neural Network PC Tools: A Practical Guide, to be published by
Academic Press in 1990. This software may be copied and distributed
AS LONG AS IT IS NOT MODIFIED. In particular, any problems with the
source code should be brought to the attention of the authors.
If you use this software, consider it shareware and please send
$5.00 to the authors at the following address: Roy Dobbins, 5833
Humblebee Road, Columbia, MD 21045. As additions are made to this
software diskette, such as self-organizing (Kohonen) networks, the
price will increase. It is anticipated that the price
for the diskette sold in conjunction with the book will be about $20.
BACKGROUND
Much excitement surrounds the apparent ability of artificial neural
networks to imitate the brain's capacity to make decisions and
draw conclusions when presented with complex, noisy, and/or partial
information. This software is for the engineer or programmer who is
interested in solving practical problems with neural networks.
It is a myth that the only way to achieve results with neural networks
is with a million dollars, a supercomputer, and an interdisciplinary
team of Nobel laureates. There are some commercial vendors out there
who would like you to believe that, though.
Using simple hardware and software tools, it is possible to solve
practical problems that would otherwise be impossible or impractical
to tackle. Neural network tools (NNTs) offer a solution to some
problems that can't be solved any other way known to the authors.
THE BACK-PROPAGATION NNT: BATCHNET
This release contains both source and executable code for a "standard"
three-layer back-propagation neural network. The executable program
is called batchnet.exe; its source code is in the file batchnet.c.
The program for generating the random weights used as input to the
training run is weights.exe; its source code is in weights.c. These
files were compiled using Turbo C v2.0, but can also be compiled with
Microsoft C.
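For readers who want a feel for what batchnet.c computes, here is a
minimal sketch of one back-propagation training step for a three-layer
network with sigmoid activations. The array sizes, variable names, and
omission of bias terms are illustrative assumptions, not the actual
batchnet source; in the real program the starting weights come from
weights.exe rather than being zeroed.

    /* Minimal sketch of one back-propagation step for a three-layer
       network.  Illustrative only; not the batchnet.c source. */
    #include <math.h>

    #define NIN  4                      /* input nodes  */
    #define NHID 3                      /* hidden nodes */
    #define NOUT 2                      /* output nodes */

    static double w1[NHID][NIN];        /* input-to-hidden weights  */
    static double w2[NOUT][NHID];       /* hidden-to-output weights */
    static double dw1[NHID][NIN];       /* previous updates, kept   */
    static double dw2[NOUT][NHID];      /*   for the momentum term  */

    static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

    void train_step(const double in[NIN], const double target[NOUT],
                    double eta, double alpha)
    {
        double hid[NHID], out[NOUT], dout[NOUT], dhid[NHID];
        int i, j, k;

        /* Forward pass: input -> hidden -> output. */
        for (j = 0; j < NHID; j++) {
            double sum = 0.0;
            for (i = 0; i < NIN; i++) sum += w1[j][i] * in[i];
            hid[j] = sigmoid(sum);
        }
        for (k = 0; k < NOUT; k++) {
            double sum = 0.0;
            for (j = 0; j < NHID; j++) sum += w2[k][j] * hid[j];
            out[k] = sigmoid(sum);
        }

        /* Backward pass: output deltas, then hidden deltas. */
        for (k = 0; k < NOUT; k++)
            dout[k] = (target[k] - out[k]) * out[k] * (1.0 - out[k]);
        for (j = 0; j < NHID; j++) {
            double sum = 0.0;
            for (k = 0; k < NOUT; k++) sum += dout[k] * w2[k][j];
            dhid[j] = sum * hid[j] * (1.0 - hid[j]);
        }

        /* Weight updates: learning coefficient eta plus momentum
           alpha applied to the previous update. */
        for (k = 0; k < NOUT; k++)
            for (j = 0; j < NHID; j++) {
                dw2[k][j] = eta * dout[k] * hid[j] + alpha * dw2[k][j];
                w2[k][j] += dw2[k][j];
            }
        for (j = 0; j < NHID; j++)
            for (i = 0; i < NIN; i++) {
                dw1[j][i] = eta * dhid[j] * in[i] + alpha * dw1[j][i];
                w1[j][i] += dw1[j][i];
            }
    }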
They were compiled using the 80x87 emulator mode, so that they will
run even if you don't have a coprocessor. If you have a coprocessor
and want batchnet to run faster, which may be especially important in
training, you can recompile batchnet.c using the 80x87 option. Always
use the compact model.
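If you rebuild from the command line, the Turbo C invocations would
look roughly like the two lines below (flag spellings are our
recollection of Turbo C 2.0, so check your compiler manual): -mc
selects the compact model; the first line uses the default 80x87
emulation, while -f87 on the second generates inline coprocessor
instructions.

    tcc -mc batchnet.c
    tcc -mc -f87 batchnet.c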
To run the batchnet program, you must specify the run file that it
will use. Look at the demo.bat and demo.run files to see what we
mean. Demo.bat also illustrates one of the options for batchnet: you
can specify the interval of iterations between error printouts. (The
error is the mean sum-squared error of the output nodes.)
The other option for batchnet is to specify what sum-squared error is
required for the program to terminate training. The default value is
0.04. The default number of iterations between error printouts is
100.
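In batch-file form, a training run that prints the error every 100
iterations might look something like the lines below. The option
spelling -e100 and the weights command line are hypothetical, invented
for illustration; the authoritative syntax is whatever demo.bat itself
uses.

    rem hypothetical syntax -- see demo.bat for the real thing
    weights demo.run
    batchnet -e100 demo.run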
In the run file, you specify a number of things. Look at demo.run in
detail to see what they are; an explanation follows the run data for
the two runs, telling what goes where.
First, you specify the number of runs. The demo has two. This is
fairly typical. You often have a training run followed by a test run,
as is the case in the demo.
You then specify the filenames for a number of files:
- the output file, which gives the values of the output nodes for each
  pattern on the last iteration (or the only iteration, if you are in
  testing mode and there is only one iteration);
- the error file, which records the average sum-squared error every
  specified number of iterations;
- the source pattern file (values normalized between 0 and 1);
- the input weights file (generated by weights.exe for a training run;
  for a testing run, use the output weights file produced by training);
- the output weights file, which gives the weight values after the
  last iteration.
Note that the pattern files have values for each input node followed
by values for each output node followed by an ID field that you can
use to identify each pattern in some way. The input and output node
values should be between 0 and 1.
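As an illustration (the numbers and ID are invented), one pattern for
a network with four input nodes and two output nodes might look like:

    0.21 0.87 0.05 0.66   1.0 0.0   spike017

that is, four input node values, two output node values, and a
free-form ID.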
Following the filenames, you specify, for each run, the number of
input patterns, the number of epochs (iterations over the entire
pattern set), the number of input nodes, the number of hidden nodes,
the number of output nodes, the value of the learning coefficient eta,
and the value of the momentum factor alpha. The number of epochs
varies a lot during training, but is often in the range of 100-1000;
during testing, you do only one iteration.
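Putting the pieces together, a run file for the training-then-testing
pair described above might be laid out as sketched below. The ordering
follows the description above; the filename wts.in and the parameter
values are invented, so treat demo.run as the authoritative example.

    2                                    number of runs
    train.out train.err train.pat wts.in train.wts
    200 150 4 3 2 0.15 0.075             npats nepochs nin nhid nout eta alpha
    test.out test.err test.pat train.wts test.wts
    200 1 4 3 2 0.15 0.075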
Sample files are given that you can run with demo.bat; reference
copies of the output files you will get when you run the demo are
already on the diskette as mytest.out, mytrain.out, mytrain.wts,
mytest.wts, mytrain.err, and mytest.err. You will get similar files
without the "my" prefix when you run demo.bat, and you can compare
corresponding files to see that they are the same.
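Under DOS, the fc (file compare) utility is a convenient way to make
the comparison, for example:

    fc train.out mytrain.out
    fc test.out mytest.out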
All you have to do is run "demo.bat" in order to both train and test
the batchnet artificial neural network on the patterns in the
train.pat and test.pat files. These pattern files are built from
actual electroencephalogram (EEG) spike parameter data, and illustrate
the use of a parameter-based NNT.
The training phase of demo.bat will probably take about 45 minutes
on a 4.77 MHz 8088 PC with a coprocessor. A 10 MHz Grid 80286 laptop
with no coprocessor takes about 140 minutes. The coprocessor makes the
difference!