<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html><head><title>R: k-Nearest Neighbour Cross-Validatory Classification</title>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<link rel="stylesheet" type="text/css" href="../../R.css">
</head><body>

<table width="100%" summary="page for knn.cv {class}"><tr><td>knn.cv {class}</td><td align="right">R Documentation</td></tr></table>
<h2>k-Nearest Neighbour Cross-Validatory Classification</h2>


<h3>Description</h3>

<p>
k-nearest neighbour cross-validatory classification from training set.
</p>


<h3>Usage</h3>

<pre>
knn.cv(train, cl, k = 1, l = 0, prob = FALSE, use.all = TRUE)
</pre>


<h3>Arguments</h3>

<table summary="R argblock">
<tr valign="top"><td><code>train</code></td>
<td>
matrix or data frame of training set cases.
</td></tr>
<tr valign="top"><td><code>cl</code></td>
<td>
factor of true classifications of the training set.
</td></tr>
<tr valign="top"><td><code>k</code></td>
<td>
number of neighbours considered.
</td></tr>
<tr valign="top"><td><code>l</code></td>
<td>
minimum vote for a definite decision, otherwise <code>doubt</code>. (More
precisely, fewer than <code>k-l</code> dissenting votes are allowed, even
if <code>k</code> is increased by ties.)
</td></tr>
<tr valign="top"><td><code>prob</code></td>
<td>
If this is true, the proportion of the votes for the winning class
is returned as attribute <code>prob</code>.
</td></tr>
<tr valign="top"><td><code>use.all</code></td>
<td>
controls handling of ties. If true, all neighbours whose distance is equal
to that of the <code>k</code>th nearest are included. If false, a random
selection of the tied neighbours is made so that exactly <code>k</code>
neighbours are used.
</td></tr>
</table>

<h3>Details</h3>

<p>
This uses leave-one-out cross-validation.
For each row of the training set <code>train</code>, the <code>k</code> nearest
(in Euclidean distance) other
training set vectors are found, and the classification is decided by
majority vote, with ties broken at random. If there are ties for the
<code>k</code>th nearest vector, all candidates are included in the vote.
</p>
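<p>
As an illustration of the procedure just described (not part of the original
documentation), the sketch below writes out the leave-one-out loop explicitly
with <code>knn</code> on the iris data; the variable name <code>loo.pred</code>
and the choice <code>k = 3</code> are illustrative only. Because ties are
broken at random, individual predictions may differ between runs.
</p>

<pre>
library(class)
data(iris3)
train &lt;- rbind(iris3[,,1], iris3[,,2], iris3[,,3])
cl &lt;- factor(c(rep("s",50), rep("c",50), rep("v",50)))

## leave one row out, train on the rest, classify the held-out row
loo.pred &lt;- character(nrow(train))
for (i in seq_len(nrow(train))) {
  loo.pred[i] &lt;- as.character(knn(train[-i, ], train[i, , drop = FALSE],
                                  cl[-i], k = 3))
}
loo.pred &lt;- factor(loo.pred, levels = levels(cl))

## should broadly agree with the built-in leave-one-out routine
table(loo.pred, knn.cv(train, cl, k = 3))
</pre>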


<h3>Value</h3>

<p>
factor of classifications of the training set. <code>doubt</code> will be returned as <code>NA</code>.</p>
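<p>
A minimal sketch of the <code>doubt</code> behaviour, using the same iris
training data as the example below (this code is not part of the original
page): with <code>k = 5</code> and <code>l = 4</code>, a row is classified
only when at least 4 of its 5 nearest neighbours agree, and is returned as
<code>NA</code> otherwise.
</p>

<pre>
library(class)
data(iris3)
train &lt;- rbind(iris3[,,1], iris3[,,2], iris3[,,3])
cl &lt;- factor(c(rep("s",50), rep("c",50), rep("v",50)))

res &lt;- knn.cv(train, cl, k = 5, l = 4)
sum(is.na(res))                 # number of rows left in doubt
table(res, cl, useNA = "ifany") # NAs appear as a separate row
</pre>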

<h3>References</h3>

<p>
Ripley, B. D. (1996)
<em>Pattern Recognition and Neural Networks.</em> Cambridge University Press.
</p>
<p>
Venables, W. N. and Ripley, B. D. (2002)
<em>Modern Applied Statistics with S.</em> Fourth edition. Springer.
</p>


<h3>See Also</h3>

<p>
<code><a href="knn.html">knn</a></code>
</p>


<h3>Examples</h3>

<pre>
data(iris3)
train &lt;- rbind(iris3[,,1], iris3[,,2], iris3[,,3])
cl &lt;- factor(c(rep("s",50), rep("c",50), rep("v",50)))
knn.cv(train, cl, k = 3, prob = TRUE)
attributes(.Last.value)
</pre>
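<p>
Since <code>knn.cv</code> returns a leave-one-out prediction for every
training row, it can be compared directly with <code>cl</code> to estimate
the misclassification rate. The follow-up below is a sketch rather than part
of the original example; it reuses <code>train</code> and <code>cl</code>
from above, and the exact numbers may vary slightly because ties are broken
at random.
</p>

<pre>
res &lt;- knn.cv(train, cl, k = 3, prob = TRUE)
mean(res != cl)          # leave-one-out error rate
head(attr(res, "prob"))  # proportion of votes for the winning class
</pre>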



<hr><div align="center">[Package <em>class</em> version 7.2-44 <a href="00Index.html">Index</a>]</div>

</body></html>
