readme.parameter
This is the documentation for the parameter file. The parameter file is
loaded by LSTM with the -c option and consists of two sections.

:: Memory cell (MC) configuration

Here the number of MCs and the parameters for each MC can be set.

number of memory cell blocks : Number of memory cell blocks. More blocks
    lead to a more complex network but can result in overfitting.
block size : Block size. A memory cell block can have more than one unit.
    Default 1.
initial input bias : Initial input bias. A good choice is to set all input
    biases to different negative values, e.g. -2.0, -3.0, -4.0, etc.
initial input gate bias : Initial input gate bias. Default -1.0.
initial output gate bias : Initial output gate bias. Default -1.0.
initial output weight : Initial weight from the memory cell to the output.
    A good choice is to set half of all MCs negative (-1.0) and the other
    half positive (1.0), and to give each positive/negative pair the same
    initial input bias.

:: Other parameters

Here parameters such as the window size, the learning rate, and the
locations of the datasets are set.

window size : Window size for the LSTM scan process.
output bias : Output bias for the output neuron.
input data .. : Locations of the datasets.
target value 0 : Target value for the positive class.
target value 1 : Target value for the negative class.
learning rate : Learning rate for the training process. Note that too high
    a learning rate results in a training process that does not converge,
    while too low a learning rate slows training down.
half interval .. : Half interval length for random weight initialization.
    Default 0.1, so weights are randomly set within [-0.1, 0.1].
performing test after ? epochs : Number of epochs after which a test is
    run during training.
write weights after ? epochs : Number of epochs after which a weight file
    is written.
initialization of random generator : Seed for the random generator, given
    here as a fixed value. '0' seeds with the current time.
reset the net after each sequence? : Whether the net should be reset after
    each sequence. 1 : yes, 0 : no. 1 is the default.
weight update after sequence or epoch? : Online (1) or batch (0) learning.
stop learning after n epochs : Number of epochs after which training
    should stop.

Important settings for different datasets:

size of training set : Maximum number of training sequences, positives
    plus negatives.
max length of training set : Maximum length of a sequence in the training
    dataset.
size of test set : Maximum number of test sequences, positives plus
    negatives.
max length of test set : Maximum length of a sequence in the test dataset.
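The suggested scheme for "initial output weight" can be sketched as
follows. This is only an illustration of the pairing idea, not code from
the LSTM program; the function name and the dictionary layout are
hypothetical.

```python
def paired_init(num_blocks):
    """Illustrative pairing: half of the memory cell blocks get output
    weight +1.0, half get -1.0, and each +/- pair shares the same
    (negative) initial input bias, e.g. -2.0, -3.0, ..."""
    assert num_blocks % 2 == 0, "pairing needs an even number of blocks"
    cells = []
    for i in range(num_blocks // 2):
        bias = -2.0 - i  # a different negative bias per pair: -2.0, -3.0, ...
        cells.append({"output_weight": +1.0, "input_bias": bias})
        cells.append({"output_weight": -1.0, "input_bias": bias})
    return cells
```

For four blocks this yields output weights +1, -1, +1, -1 with biases
-2.0, -2.0, -3.0, -3.0, matching the "same bias per pair" advice above.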
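The "weight update after sequence or epoch?" switch selects between
online and batch learning. A minimal sketch of the difference, with
assumed names (grad_fn stands in for whatever computes the gradient for
one sequence; none of this is taken from the LSTM source):

```python
def run_epoch(weights, sequences, grad_fn, lr, online):
    """One training epoch.
    online=True  (flag 1): update the weights after every sequence.
    online=False (flag 0): accumulate gradients over all sequences and
                           apply a single update at the end of the epoch."""
    acc = [0.0] * len(weights)
    for seq in sequences:
        grads = grad_fn(weights, seq)
        if online:
            weights = [w - lr * g for w, g in zip(weights, grads)]
        else:
            acc = [a + g for a, g in zip(acc, grads)]
    if not online:
        weights = [w - lr * a for w, a in zip(weights, acc)]
    return weights
```

For a gradient that does not depend on the weights the two modes give the
same result after one epoch; in general, online updates change the
weights between sequences and the two modes diverge.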