<dd>Initializes special constraints. Returns a sequence of initial constraints. Each constraint in the returned sequence is itself a two-item sequence (intended to be a tuple). The first item of the tuple is a document object, with at least its <var>fvec</var> attribute set to a support vector object or a list of support vector objects. The second item is a number, indicating that the inner product of the document's feature vector with the linear weights must be greater than or equal to that number (or, in the nonlinear case, that the evaluation of the kernel on the feature vector with the current model must be greater). This initializes the optimization problem by allowing the introduction of special constraints. Typically no special constraints are necessary.<p>Note that the <var>docnum</var> attribute of each document returned by the user is ignored; these must take particular values anyway. Regarding the <var>slackid</var> of each document, the slack IDs 0 through <code>len(sample)-1</code> inclusive are reserved for the training examples in the sample. If you leave the slackid of a document as <code>None</code>, which is the default for <code>svmlight.create_doc</code>, the document encoded as a constraint will get <var>slackid</var>=<code>len(sample)</code>+<var>i</var>, where <var>i</var> is the position of the constraint within the returned list.<p>If this function is not implemented, it is equivalent to returning an empty list, i.e., no constraints.</dd> <dt><a class="bookmark" name="detail-init_struct_model"><code><b>init_struct_model</b></code></a>(<i>sample, sm, sparm</i>)</dt><dd>Initializes the learning model.<p>Initialize the structure model <var>sm</var>. Its primary purpose is to set <var>sm.size_psi</var> to the maximum feature index that <code>psi</code> will return. 
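A minimal sketch of this, assuming each pattern <var>x</var> is a list of <code>(index, value)</code> feature pairs and that <code>psi</code> never emits an index larger than those appearing in the patterns (both assumptions for illustration only):

```python
def init_struct_model(sample, sm, sparm):
    # Sketch: assume each pattern x is a list of (index, value) pairs
    # and that psi never emits an index larger than those in the data.
    sm.size_psi = max(idx for x, y in sample for idx, val in x)
```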
Its ancillary purpose is to attach to <var>sm</var> any information that the user code needs. This function returns nothing.</dd> <dt><a class="bookmark" name="detail-loss"><code><b>loss</b></code></a>(<i>y, ybar, sparm</i>)</dt><dd>Return the loss of <var>ybar</var> relative to the true labeling <var>y</var>.<p>Returns the loss for the correct label <var>y</var> and the predicted label <var>ybar</var>. If <var>y</var> and <var>ybar</var> are identical, the loss must be 0; as they grow more dissimilar, the returned value should increase. <var>sparm.loss_function</var> holds the loss function option specified on the command line via the <code>-l</code> option.<p>If this function is not implemented, the default behavior is to perform 0/1 loss based on the truth of <code>y==ybar</code>.</dd> <dt><a class="bookmark" name="detail-parse_struct_parameters"><code><b>parse_struct_parameters</b></code></a>(<i>sparm</i>)</dt><dd>Sets attributes of <var>sparm</var> based on command line arguments.<p>This gives the user code a chance to change <var>sparm</var> based on the custom command line arguments. The command line arguments are stored in <var>sparm.argv</var> as a list of strings, and have also been preprocessed into the dictionary <var>sparm.argd</var>. For example, if the custom command line arguments were <code>--key1 value1 --key2 value2</code>, then <var>sparm.argd</var> would equal <code>{'key1':'value1', 'key2':'value2'}</code>. This function returns nothing. 
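A minimal sketch of such a function (the <code>--degree</code> option and its default value are hypothetical, not part of the module):

```python
def parse_struct_parameters(sparm):
    # Sketch: read a hypothetical --degree custom option from the
    # preprocessed argument dictionary, falling back to a default.
    sparm.degree = int(sparm.argd.get('degree', 2))
```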
It is called only during learning, not classification.<p>If this function is not implemented, any custom command line arguments (aside from <code>--m</code>, of course) are ignored and <var>sparm</var> remains unchanged.</dd> <dt><a class="bookmark" name="detail-print_struct_help"><code><b>print_struct_help</b></code></a>(<i></i>)</dt><dd>Prints help for badly formed command line arguments when learning.<p>If this function is not implemented, the program prints the default SVM<sup><i>struct</i></sup> help string as well as a note about the use of the <code>--m</code> option to load a Python module.</dd> <dt><a class="bookmark" name="detail-print_struct_learning_stats"><code><b>print_struct_learning_stats</b></code></a>(<i>sample, sm, cset, alpha, sparm</i>)</dt><dd>Print statistics once learning has finished.<p>This is called after training, primarily to compute and print statistics (e.g., training error) regarding the learned model on the training sample. You may also use it to make final changes to <var>sm</var> before it is written out to a file. For example, if you defined any non-pickleable attributes in <var>sm</var>, this is a good time to turn them into pickleable objects before <var>sm</var> is written out. 
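A minimal sketch of such a statistic (the <code>sm.predict</code> callable here is hypothetical, assumed to have been attached to the model by the user's own code):

```python
def print_struct_learning_stats(sample, sm, cset, alpha, sparm):
    # Sketch: report zero/one training error, assuming a hypothetical
    # sm.predict callable that the user code attached to the model.
    errors = sum(1 for x, y in sample if sm.predict(x) != y)
    print('Training error: %d/%d' % (errors, len(sample)))
```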
Also passed in is the set of constraints <var>cset</var>, as a sequence of (left-hand side, right-hand side) two-element tuples, and <var>alpha</var>, a sequence of the same length holding the Lagrange multiplier for each constraint.<p>If this function is not implemented, the default behavior is equivalent to <code>print [loss(e[1], classify(e[0], sm, sparm)) for e in sample]</code>.</dd> <dt><a class="bookmark" name="detail-print_struct_testing_stats"><code><b>print_struct_testing_stats</b></code></a>(<i>sample, sm, sparm, teststats</i>)</dt><dd>Print statistics once classification has finished.<p>This is called after all test predictions are made to allow the display of any summary statistics that have been accumulated in the <var>teststats</var> object through use of the <code>eval_prediction</code> function.<p>If this function is not implemented, the default behavior is equivalent to <code>print teststats</code>.</dd> <dt><a class="bookmark" name="detail-psi"><code><b>psi</b></code></a>(<i>x, y, sm, sparm</i>)</dt><dd>Return a feature vector describing pattern <var>x</var> and label <var>y</var>.<p>This returns a sequence representing the feature vector describing the relationship between a pattern <var>x</var> and a label <var>y</var>. What <code>psi</code> returns depends on the problem; its particulars are described in the Tsochantaridis paper. The return value should be either a support vector object of the type returned by <code>svmlight.create_svector</code>, or a list of support vector objects.</dd> <dt><a class="bookmark" name="detail-read_struct_examples"><code><b>read_struct_examples</b></code></a>(<i>filename, sparm</i>)</dt><dd>Reads and returns <var>x</var>,<var>y</var> example pairs from a file.<p>This reads the examples contained in the file at path <var>filename</var> and returns them as a sequence. Each element of the sequence should be an object <var>e</var> where <var>e[0]</var> and <var>e[1]</var> are the pattern <var>x</var> and label <var>y</var> respectively. 
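A minimal sketch for a hypothetical line-oriented file format (one example per line: an integer label followed by <code>index:value</code> feature pairs; the format itself is an assumption for illustration):

```python
def read_struct_examples(filename, sparm):
    # Sketch: parse a hypothetical format where each line holds a
    # label followed by index:value feature pairs, e.g. "1 3:0.5 7:1.0".
    examples = []
    with open(filename) as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue  # skip blank lines
            y = int(tokens[0])
            x = [(int(i), float(v)) for i, v in
                 (tok.split(':') for tok in tokens[1:])]
            examples.append((x, y))
    return examples
```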
Specifically, the intention is that each element be a two-element tuple containing an <var>x</var>-<var>y</var> pair.</dd> <dt><a class="bookmark" name="detail-read_struct_model"><code><b>read_struct_model</b></code></a>(<i>filename, sparm</i>)</dt><dd>Load the structure model from a file.<p>Return the structmodel stored in the file at path <var>filename</var>, or <code>None</code> if the file could not be read for some reason.<p>If this function is not implemented, the default behavior is equivalent to <code>return pickle.load(file(filename))</code>.</dd> <dt><a class="bookmark" name="detail-write_label"><code><b>write_label</b></code></a>(<i>fileptr, y</i>)</dt><dd>Write a predicted label to an open file.<p>This is called during classification to write a string representation of <var>y</var> to the file <var>fileptr</var>. Note that unlike other functions, <var>fileptr</var> is an actual open file, not a filename. It is not to be closed by this function; any attempt to close it is ignored.<p>If this function is not implemented, the default behavior is equivalent to <code>fileptr.write(repr(y)+'\n')</code>.</dd> <dt><a class="bookmark" name="detail-write_struct_model"><code><b>write_struct_model</b></code></a>(<i>filename, sm, sparm</i>)</dt><dd>Dump the structmodel <var>sm</var> to a file.<p>Write the structmodel <var>sm</var> to a file at path <var>filename</var>.<p>If this function is not implemented, the default behavior is equivalent to <code>pickle.dump(sm, file(filename,'w'))</code>.</dd></dl><a class="bookmark" name="svmlight"><h2><code>svmlight</code> Extension Module</h2></a>There are some functions within the basic SVM<sup><i>light</i></sup> package that a C implementation of SVM<sup><i>struct</i></sup> can use. From Python, however, these functions are, of course, inaccessible. 
For this reason, SVM<sup><i>python</i></sup> provides an extension module, <code>svmlight</code>, that the Python instantiation modules can import and use to gain access to the following sometimes-useful functions.<dl> <dt><code><b>classify_example</b></code>(<i>sm, sv</i>)</dt><dd>Classify a feature vector with the model's kernel function.<p>Given a feature vector <var>sv</var>, classify it according to the kernel and learned support vectors in the structure model <var>sm</var>. This is equivalent to the C function <code>classify_example(sm.svm_model, doc)</code>, where <var>doc.fvec</var> holds the vector contained in <var>sv</var>.</dd> <dt><code><b>create_doc</b></code>(<i>sv, [costfactor=1.0, [slackid=None, [docnum=None]]]</i>)</dt><dd>Create a Python document object.<p>This is the rough analogue of the C function <code>create_example</code>, but since the function arguments have been rearranged considerably, it has been renamed to avoid confusion. All arguments except the first one are optional. The first argument specifies what the document's <var>fvec</var> attribute will hold, and should be a support vector object. The rest of the arguments are named according to the attributes they set in the document.</dd> <dt><code><b>create_svector</b></code>(<i>words, [userdefined=<code>''</code>, [factor=1.0, [kernel_id=0]]]</i>)</dt><dd>Create a Python support vector object.<p>Given a feature list of words <var>words</var>, create a Python support vector object. (Note that, unlike other places, this is actually a single support vector object and not a list of support vector objects.) All arguments except the first one are optional.</dd> <dt><code><b>kernel</b></code>(<i>kp, sv1, sv2</i>)</dt>
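To illustrate how these pieces fit together, here is a non-normative sketch of building the <var>words</var> list for a multiclass <code>psi</code>. The blocking scheme, the class numbering, and the feature count are all assumptions made for this example; the resulting list is what one would then pass to <code>svmlight.create_svector</code>:

```python
def multiclass_psi_words(x, y, num_feats):
    # Sketch: shift each (index, value) feature of pattern x into the
    # block of indices reserved for class y.  The returned list is
    # what would be handed to svmlight.create_svector(words).
    return [(idx + y * num_feats, val) for idx, val in x]
```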