                         Ltree Documentation

                             Joao Gama
                 LIACC, FEP - University of PORTO
                         jgama@ncc.up.pt

                             Version 1
                              10/12/97

Note: You may not distribute this code without the permission of Joao Gama.
---------------------------------------------------------------------------
Ltree is a system able to build multivariate trees by integrating a
linear discriminant with a decision tree by means of constructive
induction.
---------------------------------------------------------------------------
This source code is supplied "as is" without warranty of any kind, and
its author disclaims any and all warranties, including but not limited
to any implied warranties of merchantability and fitness for a
particular purpose, and any warranties of non-infringement. The user
assumes all liability and responsibility for use of this source code,
and the author will not be liable for damages of any kind resulting
from its use. Without limiting the generality of the foregoing, the
author does not warrant that the Source Code will be error-free, will
operate without interruption, or will meet the needs of the user.
---------------------------------------------------------------------------
References:
-----------
J. Gama, "Probabilistic Linear Tree", in Proceedings of the 14th
	International Conference on Machine Learning, 1997,
	Morgan Kaufmann

J. Gama, "Oblique Linear Tree", in Proceedings of the 2nd International
	Symposium on Intelligent Data Analysis, 1997, LNAI 1280,
	Springer Verlag

If you use the Ltree software in the context of any of your
publications, please reference the above papers.
---------------------------------------------------------------------------
Installing Ltree:
-----------------
You will need to create the Ltree image file:

  type: make

Directory Data contains a sample training, test and domain file in the
format that the program expects. They are called iris.data, iris.domain
and iris.test respectively.
To test Ltree, type the following:

  Ltree -f iris.data -u -v 3

---------------------------------------------------------------------------
To run Ltree:
-------------
The Ltree code will run on UNIX machines. Options are specified on the
command line (some are required, others are optional).

Required:
 -f <stem_name>

Optional:
FLAG PARAMETER  RANGE      Comments
 -u                        Test on unseen examples stored in the <stem_name>.test file
 -g                        Set off Gain Ratio criterion; uses Gain instead
 -m <integer>              Minimum nr. of examples at one node
 -c <integer>   [0,100]    Level of pruning
 -s <integer>   [1,100]    Branch factor
 -w <integer>   [1,...]    Smoothing weight factor
 -v <integer>   [1,2,3]    Level of verbosity
 -k <integer>   [1,...]    Minimum ratio examples/attributes for generating LM
 -d <integer>   [1,...]    Maximum depth for generating linear combinations
 -l <integer>              Level of pruning of attributes on linear combinations
---------------------------------------------------------------------------
Example:
--------
localhost:~/Ltree/Distribution> Ltree -f iris

Decision Tree: (Nodes: 5, Leaves: 3, Depth: 3, Errors: 2.000)

petalwidth <= 0.800
|       setosa (50.00/0.00) [ 0.996 0.002 0.002 ]
petalwidth > 0.800
|       Linear_7 <= -0.618
|       |       virginica (50.00/1.00) [ 0.000 0.022 0.978 ]
|       Linear_7 > -0.618
|       |       versicolor (50.00/1.00) [ 0.000 0.978 0.022 ]

Linear_5        +18.241+10.976*[1]+20.124*[2]-29.025*[3]-39.088*[4]
Linear_6        +31.525+3.213*[1]+3.510*[2]-7.538*[3]-14.764*[4]
Linear_7        +530.189+3.559*[1]+7.070*[2]-7.854*[3]-13.388*[4]+511.957*[5]-511.958*[6]

Linear Tree Learning Time:   0.02 (sec)
---------------------------------------------------------------------------
Ltree's output:
---------------
Ltree always outputs the pruned decision tree obtained.
When used to classify unseen examples, with the -u option,
the user can control the verbosity level of the output
with the option -v <integer>.
By default Ltree only outputs the error rate, nr.
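The tree printed above can be applied by hand. The following Python
sketch (an illustration, not part of Ltree) evaluates that example tree
on a single instance. It assumes that attributes [1]..[4] are the four
iris measurements in file order and that [5] and [6] in Linear_7 refer
to the values of the constructed attributes Linear_5 and Linear_6 --
both are assumptions about the printed notation, not statements from
this README:

```python
# Sketch: apply the example tree printed above to one instance.
# Assumptions: [1]..[4] = sepal length, sepal width, petal length,
# petal width; [5], [6] = the constructed attributes Linear_5, Linear_6.

def linear(coefs, x):
    """coefs[0] is the intercept; coefs[i] multiplies x[i-1]."""
    return coefs[0] + sum(c * v for c, v in zip(coefs[1:], x))

def classify(x):
    """x = [sepallength, sepalwidth, petallength, petalwidth]."""
    if x[3] <= 0.800:                      # petalwidth <= 0.800
        return "setosa"
    # Constructed attributes, coefficients copied from the output above.
    l5 = linear([+18.241, +10.976, +20.124, -29.025, -39.088], x)
    l6 = linear([+31.525, +3.213, +3.510, -7.538, -14.764], x)
    l7 = linear([+530.189, +3.559, +7.070, -7.854, -13.388,
                 +511.957, -511.958], x + [l5, l6])
    return "virginica" if l7 <= -0.618 else "versicolor"

print(classify([5.1, 3.5, 1.4, 0.2]))   # prints "setosa"
```

The setosa branch depends only on petal width, matching the first split
of the printed tree; the other two classes go through the oblique split
on Linear_7.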
of errors and nr. of unseen examples.
Increasing the verbosity level, it is possible to obtain:
 - a confusion matrix
 - how each example is classified
 - a probability class distribution for each example
---------------------------------------------------------------------------
Input File Format:
------------------
The program expects at least two text files:

  <problem>.data
  <problem>.domain

When used to classify unseen examples, these must be in the file
<problem>.test.

The file <problem>.domain contains information about the attributes:
names, types and possible values. There must be one line per attribute.
The first item on each line must be the name of the attribute, followed
by the character ":" and a comma-separated list of the possible values.
An attribute with real values must be declared as continuous.
The last line must contain the name of the class, followed by ":",
followed by the possible class values.
For example:

	att1: continuous
	att2: a, b, c
	class: a, b

declares a two-class problem defined by two attributes.
The class can take the values a and b.
The first attribute takes real values, and the second attribute takes
three values: a, b, c.

All data files (train and test) must have the following format.
Each example is on one line.
Each line contains the attribute values in the same order as
they are declared in the "domain" file.
The last value of each example is the class attribute.
Missing values are specified by "?".
The values for each attribute must be of one type. The permitted types
are: symbolic and numeric. The numeric values can be integer or real.
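The domain format described above is simple enough to parse in a few
lines. The following Python sketch (a hypothetical helper for
illustration, not part of the Ltree sources) reads a <problem>.domain
file into a list of (name, values) pairs:

```python
# Sketch: parse a <problem>.domain file in the format described above.
# Each line is "name: continuous" or "name: v1, v2, ...".
# Returns (name, "continuous") for real-valued attributes and
# (name, [values]) otherwise; the last entry is the class declaration.

def parse_domain(text):
    attrs = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        name, _, rest = line.partition(":")
        rest = rest.strip()
        if rest == "continuous":
            attrs.append((name.strip(), "continuous"))
        else:
            attrs.append((name.strip(), [v.strip() for v in rest.split(",")]))
    return attrs

domain = parse_domain("att1: continuous\natt2: a, b, c\nclass: a, b\n")
print(domain)
# [('att1', 'continuous'), ('att2', ['a', 'b', 'c']), ('class', ['a', 'b'])]
```

With the attribute order from the domain file in hand, each line of the
.data file can be split on commas and matched positionally, with the
last field taken as the class label.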
The program does not handle numeric values in exponent form, e.g. 1.2e3.
---------------------------------------------------------------------------
Contents:
---------
drwxr-xr-x   2 jgama    users        2048 May  8 18:31 Code
drwxr-xr-x   2 jgama    users        1024 May  8 14:28 Data
drwxr-xr-x   2 jgama    users        1024 May  8 11:30 Docs
-rw-r--r--   1 jgama    users        5821 May  8 18:21 README

Code:
total 132
-rw-r--r--   1 jgama    users       13928 May  2 21:52 BuildTree.c
-rw-r--r--   1 jgama    users        1968 May  8 14:27 BuildTree.h
-rw-r--r--   1 jgama    users       17279 May  8 14:13 Ci_instances.c
-rw-r--r--   1 jgama    users        4333 May  8 11:42 Ci_instances.h
-rw-r--r--   1 jgama    users        2373 May  8 14:14 Combine.c
-rw-r--r--   1 jgama    users         799 May  8 14:29 Combine.h
-rw-r--r--   1 jgama    users        6238 May  2 21:54 Ltree.c
-rw-r--r--   1 jgama    users        3213 May  2 21:54 Ltree_u.c
-rw-r--r--   1 jgama    users        1357 May  2 22:03 Makefile
-rw-r--r--   1 jgama    users        5012 May  7 18:55 classify.c
-rw-r--r--   1 jgama    users         717 May  8 18:08 classify.h
-rw-r--r--   1 jgama    users       17363 May  2 22:01 discrim.c
-rw-r--r--   1 jgama    users        1059 May  8 22:25 discrim.h
-rw-r--r--   1 jgama    users        7012 May  8 14:15 distributions.c
-rw-r--r--   1 jgama    users         956 May  8 14:30 distributions.h
-rw-r--r--   1 jgama    users        8064 May  8 14:16 entropia.c
-rw-r--r--   1 jgama    users        1078 May  8 14:31 entropia.h
-rw-r--r--   1 jgama    users         351 May  8 10:31 externs.i
-rw-r--r--   1 jgama    users        3954 May  8 14:27 prune.c
-rw-r--r--   1 jgama    users         764 May  8 14:31 prune.h
-rw-r--r--   1 jgama    users         172 May  8 18:11 teste.c
-rw-r--r--   1 jgama    users        7203 May  2 21:56 tree.c
-rw-r--r--   1 jgama    users        2057 May  8 14:32 tree.h
-rw-r--r--   1 jgama    users        9993 May  8 14:46 utils.c
-rw-r--r--   1 jgama    users        2643 May  8 14:49 utils.h

Data:
total 10
-rw-r--r--   1 jgama    users        3800 May  7 18:59 iris
-rw-r--r--   1 jgama    users        3427 May  7 18:59 iris.data
-rw-r--r--   1 jgama    users         126 May  7 12:47 iris.domain
-rw-r--r--   1 jgama    users         381 May  7 18:59 iris.test

Docs:
total 100
-rw-r--r--   1 jgama    users      100732 May 29 11:29 ltree.ps.gz
