Quintessential documentation of the ESN learning toolbox
V 1.0 Copyright: Fraunhofer IAIS 2006 / Patents pending
Revision 1, Feb 23, 2007, H. Jaeger
Revision 2, June 23, 2007, H. Jaeger
Revision 3, June 27, 2007, H. Jaeger
*************** Terms of Use ******************************
This software is free for non-commercial use. If commercial use is
intended, contact Fraunhofer IAIS (www.iais.fraunhofer.de) who have
claimed international patents on ESN algorithms (pending).
This software is intended for research use by experienced Matlab users
and includes no warranties or services. Bug reports are gratefully
received by Herbert Jaeger (h.jaeger [at] iu-bremen.de)
*************** Main References ****************************
The methods used in this toolbox for training ESNs are described in
detail in the following papers:
H. Jaeger (2001), The "echo state" approach to analyzing and training
recurrent neural networks. GMD Report 148, GMD - German National
Research Institute for Computer Science, 2001.
H. Jaeger (2002), A tutorial on training recurrent neural networks,
covering BPTT, RTRL, EKF and the "echo state network" approach.
Fraunhofer Institute for Autonomous Intelligent Systems (AIS) /
International University Bremen, 2002.
************* A quick start *******************************
The demo script "demoScript" shows how to load a data set, create a
network, train an ESN, test it on new data, and plot the output of the
network.
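For orientation, here is a minimal sketch of that workflow. The routine
names exist in this toolbox, but the argument lists below are
assumptions made for illustration; demoScript and the inline
documentation of each routine give the authoritative usage.

    % Hedged workflow sketch: assumes inputSequence / outputSequence are
    % already loaded; the argument lists are assumptions, see demoScript
    % for the actual calls.
    nInputUnits    = size(inputSequence, 2);
    nOutputUnits   = size(outputSequence, 2);
    nInternalUnits = 100;  nForgetPoints = 100;
    [trainIn,  testIn]  = split_train_test(inputSequence, 0.5);
    [trainOut, testOut] = split_train_test(outputSequence, 0.5);
    esn        = generate_esn(nInputUnits, nInternalUnits, nOutputUnits);
    trainedEsn = train_esn(trainIn, trainOut, esn, nForgetPoints);
    predicted  = test_esn(testIn, trainedEsn, nForgetPoints);
    nmse       = compute_error(predicted, testOut);
    plot_sequence(testOut, predicted, 100);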
************ Coding and Naming Conventions **************************
This toolbox tries to respect the coding/naming conventions found at
http://www.mathworks.com/matlabcentral/files/2529/MatlabStyle1p5.pdf
There are also a few conventions regarding the format of input and teacher data:
- Input sequences are given as matrices of size nTimeSteps x
nInputDimensions
- Output sequences are given as matrices of size nTimeSteps x
nOutputDimensions
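For example, a 2-dimensional input signal observed over 1000 time steps
is stored with one row per time step and one column per channel:

    nTimeSteps = 1000;
    t = (1:nTimeSteps)';
    inputSequence = [sin(t/10), cos(t/25)];  % one row per time step
    size(inputSequence)  % [1000 2], i.e. nTimeSteps x nInputDimensions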
*********** The files from this package ****************************
All files from the toolbox come with the standard inline
documentation.
In alphabetical order, here is what each routine does:
analogToUnitCoded:
- helper function for coding a 1-dim analog signal into a unit-coded
signal (aka "local coding")
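To illustrate the idea of unit coding: a scalar can be represented by
which of nUnits equal bins of its range it falls into. The hard binning
below is only a sketch of this general idea; the actual coding scheme of
analogToUnitCoded (hard or smooth bins, bin count) is documented in the
function itself.

    % Hard-binned unit ("local") coding of a value v in [0,1] -- a sketch,
    % not necessarily the scheme used by analogToUnitCoded
    nUnits = 10;  v = 0.37;
    idx = min(floor(v * nUnits) + 1, nUnits);
    coded = zeros(1, nUnits);  coded(idx) = 1;   % indicator of bin 4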
compute_error:
- computes the normalized mean squared error of the esn, given the
output of the esn and the actual teacher.
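A common definition of this error divides the mean squared error by the
variance of the teacher; whether compute_error normalizes in exactly
this way (and how it treats washout steps) is stated in its inline
documentation. A sketch for a 1-dimensional signal:

    % NMSE sketch (the exact normalization in compute_error may differ)
    teacher   = sin((1:200)'/5);                % example teacher signal
    esnOutput = teacher + 0.05*randn(200, 1);   % example network output
    nmse      = mean((esnOutput - teacher).^2) / var(teacher);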
compute_statematrix:
- runs the input through the ESN. The reservoir (plus input) states of
the units are collected in a matrix, which is returned by the function.
compute_teacher:
- scales, shifts and applies the inverse output activation function on
the teacher.
demoScript:
- see the "A quick start" section above
generate_esn:
- generates an ESN reservoir and the associated input and output
feedback weight matrices. Information on how to handle the input,
how to train etc, can be specified as parameters to this
function. The generated esn is a cell structure carrying many kinds
of information; it is the central data structure when doing ESN
experiments with this toolbox.
generate_internal_weights:
- generates the random reservoir weight matrix (sparse)
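A standard recipe for such a matrix is to draw a sparse random matrix
and rescale it to a spectral radius below 1; the density and target
radius below are assumptions, not necessarily the defaults of
generate_internal_weights.

    % Sparse random reservoir scaled to spectral radius 0.9 (sketch)
    N   = 100;
    W0  = sprandn(N, N, 0.1);      % 10% connectivity
    rho = abs(eigs(W0, 1));        % largest eigenvalue magnitude
    W   = 0.9 * (W0 / rho);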
generate_NARMA_sequence:
- generates training and test sequences according to a NARMA equation.
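A widely used benchmark of this kind is the 10th-order NARMA system of
Atiya and Parlos; whether generate_NARMA_sequence implements exactly
this equation or a parameterized variant is documented in the function
itself.

    % 10th-order NARMA sketch (the equation actually used by
    % generate_NARMA_sequence is not assumed here)
    n = 1200;  u = 0.5 * rand(n, 1);  y = zeros(n, 1);
    for t = 10 : n-1
        y(t+1) = 0.3*y(t) + 0.05*y(t)*sum(y(t-9:t)) ...
                 + 1.5*u(t-9)*u(t) + 0.1;
    end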
identity:
- a helper function that just outputs its input
leaky_esn:
- computes an update of the internal state using the equation for
leaky integrator neurons (the "classical" leaky integrator neuron
equation described and used in some earlier papers by Jaeger et al.)
leaky1_esn:
- computes an update of the internal state using another equation for
leaky integrator neurons (preferable to leaky_esn)
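Both variants mix the previous state with a nonlinearly transformed
drive from input, reservoir, and output feedback; the exact equations
and the names of the rate/time-constant parameters are given in the
inline documentation of leaky_esn and leaky1_esn. A generic sketch of
one such step:

    % One generic leaky-integrator step (sketch; leaky_esn / leaky1_esn
    % may differ in detail)
    N = 100;  K = 2;  L = 1;            % reservoir/input/output sizes
    W   = 0.9 * sprandn(N, N, 0.1);
    Win = randn(N, K);  Wfb = randn(N, L);
    x = zeros(N, 1);  u = randn(K, 1);  y = zeros(L, 1);
    a = 0.3;                            % leaking rate
    xNew = (1 - a)*x + a*tanh(Win*u + W*x + Wfb*y);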
load_esn:
- loads a trained or untrained esn from a file
makeTimeConstantsFromRange:
- a little helper function that generates a vector of time constants
from a lower and an upper boundary
myeigs:
- a quick-fix version of Matlab's built-in function eigs, which was
buggy at the time
normalizeData_min0:
- shifts time series such that the minimal value per channel becomes 0
normalizeData01:
- scales and shifts time series such that they range in [0 1] in each
channel
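Per channel this amounts to subtracting the column minimum and dividing
by the column range:

    % [0,1] normalization per channel (sketch)
    data   = randn(100, 3);
    lo     = min(data);  hi = max(data);
    normed = (data - repmat(lo, 100, 1)) ./ repmat(hi - lo, 100, 1);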
plain_esn:
- generates the internal state of an ESN with standard
additive-sigmoid neurons
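With tanh assumed as the standard sigmoid, one such update reads
(reusing the matrices set up in the leaky-integrator sketch above):

    % Standard additive-sigmoid state update (sketch)
    xNew = tanh(Win*u + W*x + Wfb*y);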
plot_sequence:
- plots the teacher and the output of the network on the same graph.
plot_states:
- plots traces of the states for some reservoir neurons
pseudoinverse:
- computes the output weights using the pseudoinverse method
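Given a collected state matrix S (nTimeSteps x nStates, as produced by
compute_statematrix) and a transformed teacher matrix T (nTimeSteps x
nOutputDimensions, as produced by compute_teacher), the least-squares
output weights follow in one line; the orientation of the weight matrix
below is an assumption.

    % Output weights via the Moore-Penrose pseudoinverse (sketch)
    S = randn(500, 103);  T = randn(500, 1);   % example shapes
    Wout = (pinv(S) * T)';                     % nOutputDimensions x nStates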
save_esn:
- saves an ESN structure to a file
scaleshiftData:
- first shifts data by given shifts, then scales by given scalings
sigmoid01:
- computes the sigmoid of the input and returns it
sigmoid01_inv:
- computes the inverse of the sigmoid, useful for transforming the
teacher
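The logistic sigmoid maps the real line onto (0,1), and its inverse (the
logit) undoes this, which is what makes the pair useful for teacher
transformation:

    % Logistic sigmoid and its inverse, presumably what sigmoid01 /
    % sigmoid01_inv compute (an assumption about these routines)
    x    = linspace(-3, 3, 7);
    s    = 1 ./ (1 + exp(-x));    % values in (0,1)
    xRec = log(s ./ (1 - s));     % recovers x up to rounding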
split_train_test:
- splits a sequence into two parts (train and test)
test_esn:
- runs an already trained esn on an input sequence.
train_esn:
- trains an ESN (i.e., computes the output weights) using the chosen
method
twi_esn:
- computes the internal state based on the time warping invariant
method. This method is still in the testing phase.
unitCodedToAnalog:
- helper for data transformation of unit-coded ("local coding") signal
into one-dimensional analog signal
wiener_hopf:
- computes output weights using the Wiener-Hopf solution to the MSE equation
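This yields the same least-squares weights as the pseudoinverse route,
but via the smaller correlation matrices S'*S and S'*T, which is cheaper
when nTimeSteps greatly exceeds the number of states. A sketch with S
and T as in the pseudoinverse entry above:

    % Wiener-Hopf / normal-equations solution (sketch)
    Wout = ((S' * S) \ (S' * T))';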