usage.h
// *-c++-*-
//=============================================================================
//= University of Illinois at Urbana-Champaign
//= Department of Computer Science
//= Dr. Dan Roth - Cognitive Computation Group
//=
//= Project:  SNoW
//=
//= Module:   Usage.h
//= Version:  3.2.0
//= Authors:  Andrew Carlson, Nick Rizzolo
//= Date:     xx/xx/99
//=
//= Comments: If you are updating the description of a
//=           command line argument, make sure you update
//=           every instance of its description in this
//=           file.
//=============================================================================

#ifndef USAGE_H__
#define USAGE_H__

char* usage[] = {
"SNoW\n",
"\n",
"NAME\n",
"  SNoW - Sparse Network of Winnows learning system\n",
"\n\n",
"SYNOPSIS\n",
"  snow -train -I {string} -F {string} [ -AaBbcdEefGgiLlMmOoPpRrSsTtuVvWwz ]\n",
"  snow -test -I {string} -F {string} [ -abEefGgiLlmOopRSstVvwz ]\n",
"  snow -interactive -I {string} -F {string} [ -AEfLoPpRvWw ]\n",
"  snow -evaluate -x {string} -F {string} [ -befmpRVvw ]\n",
#ifdef SERVER_MODE_
"  snow -server {integer} -F {string} [ -befLlmopVvw ]\n",
#endif
"\n\n",
"DESCRIPTION\n",
"  SNoW can be run in the following modes:\n",
"\n",
"  -train\n",
"    The system is run in training mode and the input file is considered to\n",
"    be a set of labeled training examples.\n",
"\n",
"  -test\n",
"    The system is run in a batch test mode.  The input file consists of\n",
"    examples which are classified by the system.\n",
"\n",
"  -interactive\n",
"    This mode allows the user to (1) evaluate specific examples, and (2)\n",
"    control the training process by specifying the targets to be promoted\n",
"    or demoted.\n",
"\n",
"  -evaluate\n",
"    The system classifies a single example given as a parameter to the\n",
"    program.\n",
"\n",
#ifdef SERVER_MODE_
"  -server\n",
"    The system is run as a server in test mode.  Clients send examples,\n",
"    and the server classifies them and returns results.  Before the client\n",
"    starts sending examples, it can send a string of options in the same\n",
"    format as on the command line.  Any of the command line options that\n",
"    are valid for the -server mode are valid when a client sends them\n",
"    except for -e.  The server sends, and expects to receive, a four byte,\n",
"    big endian integer representing the size in bytes of the data about to\n",
"    be transmitted immediately before that data is transmitted.\n",
#else
"  NOTE: In order to enable server functionality, the executable must be\n",
"        recompiled with the make variable 'SERVER' defined; e.g. by\n",
"        typing 'gmake SERVER=1'.\n",
#endif
"\n",
"  Depending on the mode chosen, some options are required, some are\n",
"  optional, and some are superfluous.  Also, some options have slightly\n",
"  different descriptions depending on the mode.  (See below.)\n",
"\n\n",
"OPTIONS\n",
"  Required options for mode -train:\n",
"\n",
"  -F {string}\n",
"    Specifies the name of a file to which the resulting network is written\n",
"    after training.  No default.\n",
"\n",
"  -I {string}\n",
"    Specifies the name of a file from which training examples are read.\n",
"    No default.\n",
"\n",
"  Architecture definition options for mode -train:\n",
"\n",
"  -A {string}\n",
"    Specifies the name of a file from which to read the algorithm\n",
"    definition.  No default.\n",
"\n",
"  -B <:targets>\n",
"    Adds a Naive Bayes algorithm to the network for the given targets.\n",
"\n",
"  -P <:targets>\n",
"    Adds a Perceptron algorithm with default parameters to the network for\n",
"    the given targets.\n",
"\n",
"  -P <learning_rate:targets>\n",
"    Adds a Perceptron algorithm to the network for the given targets,\n",
"    generating the default weight.\n",
"\n",
"  -P <learning_rate,threshold,default_weight:targets>\n",
"    Adds a Perceptron algorithm to the network for the given targets.\n",
"\n",
"  -W <:targets>\n",
"    Adds a Winnow algorithm with default parameters to the network for the\n",
"    given targets.\n",
"\n",
"  -W <alpha,beta:targets>\n",
"    Adds a Winnow algorithm for the given targets, generating the default\n",
"    weight.\n",
"\n",
"  -W <alpha,beta,threshold,default_weight:targets>\n",
"    Adds a Winnow algorithm to the network for the given targets.\n",
"\n",
"  If no architecture is given, a default Winnow algorithm will be used.\n",
"  Defaults:\n",
"    learning_rate = .1, alpha = 1.35, beta = 1 / 1.35, threshold = 4\n",
"    default_weight = 3 * threshold / average example size\n",
"\n",
"  Other options for mode -train:\n",
"\n",
"  -a <+ | ->\n",
"    '+' forces all non-discarded features to be written to the network.\n",
"    The default is '-', in which case features that have not yet reached\n",
"    the eligibility threshold are not written to the network.\n",
"\n",
"  -b {double}\n",
"    Specifies the smoothing parameter to be used in Naive Bayes.  In\n",
"    training mode, this option only makes sense when used in conjunction\n",
"    with the -T parameter.  Default 15.0.\n",
"\n",
"  -c {integer}\n",
"    Specifies an interval, as a number of examples presented in a given\n",
"    cycle, after which the network will be tested on the examples in the\n",
"    file specified by the -T parameter.  Results are displayed after each\n",
"    interval.  The default is 0, in which case the network is never tested\n",
"    during training.\n",
"\n",
"  -d <none | abs:{double} | rel>\n",
"    Specifies the feature discarding method.  Default 'none'.\n",
"\n",
"  -E {string}\n",
"    Specifies the name of a file in which to write information about\n",
"    mistakes during testing.  In training mode, this option is only useful\n",
"    in conjunction with the -T parameter.  No default.\n",
"\n",
"  -e <count:{integer} | percent:{double}>\n",
"    Sets the feature eligibility method.  Features are not included in\n",
"    activation calculations until they become eligible.  Naive Bayes\n",
"    always considers every feature eligible, so it ignores the setting of\n",
"    this parameter.  Default is 'count:2'.\n",
"\n",
"  -f <+ | ->\n",
"    '+' enables automatic insertion of the \"fixed\" feature into every\n",
"    example.  This feature's weight then acts as a dynamic threshold.\n",
"    Default '+'.\n",
"\n",
"  -G <+ | ->\n",
"    When set to '+', this option enables the Gradient Descent algorithm in\n",
"    conjunction with -P and the Exponentiated Gradient Descent algorithm\n",
"    with -W.  This parameter has no effect when used with -B.  Also, it\n",
"    cannot be used in conjunction with either '-O +' or '-t +'.\n",
"    Default '-'.\n",
"\n",
"  -g <+ | ->[,<+ | ->]\n",
"    This option enables generation of conjunctions of active features in\n",
"    each example when set to '+', and disables conjunction generation when\n",
"    set to '-'.  By default, conjunctions are generated when the total\n",
"    number of distinct active features in the training data is less than\n",
"    100 and the maximum feature ID in the training data is less than\n",
"    10000.\n",
"\n",
"    Users also have the option of writing the new examples to disk by\n",
"    specifying a second argument.  If '-g +,+' is specified, input\n",
"    examples will be written to disk with conjunctions added.  They will\n",
"    be output into a file whose name is the original filename concatenated\n",
"    with '.conjunctions'.  If no conjunctions are generated, the file will\n",
"    be left empty.  The default is '-g <unset>,-', where no examples are\n",
"    written to disk.\n",
"\n",
"  -i <+ | ->\n",
"    In training mode, this option makes sense only when used in\n",
"    conjunction with the -T parameter and apart from learning curve\n",
"    generation.  It specifies whether incremental learning should be used.\n",
"    '+' presents each example to the network for training.  The resulting\n",
"    network is written at the end of testing with .new appended to its\n",
"    filename.  Default '-'.\n",
"\n",
"  -L {long}\n",
"    Specifies the limit to the number of targets printed with the -o\n",
"    parameter.  This option makes sense only when used in conjunction with\n",
"    the -o parameter.  Default ULONG_MAX.\n",
"\n",
"  -l <+ | ->\n",
"    In training mode, this option makes sense only when used in\n",
"    conjunction with the -T parameter.  It specifies whether test examples\n",
"    are labeled or not.  Default '+'.\n",
"\n",
"  -M <+ | ->\n",
"    This parameter controls how examples are stored in memory.  '-' makes\n",
"    SNoW parse and train one example at a time.  At the beginning of the\n",
"    next cycle, the input stream is rewound, and the parsing and training\n",
"    process begins again.  '+' makes SNoW parse every training example,\n",
"    storing them all in an array, before training begins.  It uses much\n",
"    more memory but runs increasingly faster than '-' as the number of\n",
"    training cycles increases.  Default '-'.\n",
"\n",
"  -m <+ | ->\n",
"    This option specifies whether to train with multiple labels.  '+'\n",
"    means that a given target will not treat other targets' IDs as\n",
"    features when they are encountered in an example.  '-' means that a\n",
"    given target treats all IDs as features except for its own ID.\n",
"    Default '+'.\n",
"\n",
"  -n <+ | ->\n",
"    When this option is specified with '+', SNoW will look for an\n",
"    existing network (with the name supplied to the -F parameter) to\n",
"    train, and will overwrite the original network.  If no network is\n",
"    provided, a new one is created.\n",
"\n",
"  -O <+ | ->[,<+ | ->]\n",
"    '+' enables ordered targets (a.k.a. constraint classification) mode.\n",
"    '+' in the second argument is used to enable a more conservative\n",
"    version of the algorithm.  If the first argument has been set to '+'\n",
"    and the second is unspecified, it is the same as specifying '+,-'.\n",
"    Default '-,-'.\n",
"\n",
"  -o < accuracy | winners | softmax | allpredictions | allactivations |\n",
"       allboth >\n",
"    In training mode, this option makes sense only in conjunction with the\n",
"    -T parameter.  See its complete description in the -test mode options\n",
"    section.  Default 'accuracy'.\n",
"\n",
"  -p {double}\n",
"    Specifies the prediction threshold.  In training mode, this option\n",
"    makes sense only in conjunction with the -T parameter.  SNoW will not\n",
"    make a prediction if the activation of the most activated target minus\n",
"    the activation of the second most activated target is less than or\n",
"    equal to the prediction threshold.  Default -1.\n",
"\n",
"  -R {string}\n",
"    Specifies the name of a file in which to write the testing results.\n",
"    In training mode, this option only makes sense in conjunction with the\n",
"    -T parameter.  Default STDOUT.\n",
"\n",
"  -r {integer}\n",
"    This option sets the number of cycles through the training data.\n",
"    Default 2.\n",
"\n",
"  -S {double}[,{double}]\n",
"    Sets the thickness of the separator between positive and negative\n",
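
The -server description above specifies the wire framing only in prose: every message is preceded by a four-byte, big-endian integer giving the size of the payload that follows. The sketch below illustrates that framing with standard POSIX socket calls. It is not part of usage.h or of the SNoW sources; the helper names sendFramed and recvFramed, the socket descriptor fd, and the example option string are assumptions made purely for illustration.

// Illustrative sketch only -- not taken from SNoW.  It shows the framing
// described for -server mode: a four-byte, big-endian length prefix sent
// immediately before each payload, in both directions.
#include <arpa/inet.h>   // htonl, ntohl
#include <sys/socket.h>  // send, recv, MSG_WAITALL
#include <sys/types.h>   // ssize_t
#include <cstdint>
#include <string>

// Hypothetical helper: send one framed message (length prefix, then bytes).
bool sendFramed(int fd, const std::string& payload) {
  uint32_t len = htonl(static_cast<uint32_t>(payload.size()));
  if (send(fd, &len, sizeof len, 0) != static_cast<ssize_t>(sizeof len))
    return false;
  return send(fd, payload.data(), payload.size(), 0) ==
         static_cast<ssize_t>(payload.size());
}

// Hypothetical helper: receive one framed message into 'payload'.
bool recvFramed(int fd, std::string& payload) {
  uint32_t len = 0;
  if (recv(fd, &len, sizeof len, MSG_WAITALL) != static_cast<ssize_t>(sizeof len))
    return false;
  payload.resize(ntohl(len));
  return payload.empty() ||
         recv(fd, &payload[0], payload.size(), MSG_WAITALL) ==
             static_cast<ssize_t>(payload.size());
}

Under this reading of the DESCRIPTION, a client would first send its option string (any option legal for -server except -e) as one framed message, then send each example the same way and read back the framed classification result with the same receive logic.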