From: nohau (nohau), Board: DataMining
Title: Several important concepts in Chapter 7
Posted at: Nanjing University Lily BBS (Thu Jan 2 15:58:28 2003)
Below is a list of the important concepts in Chapter 7.
I think these concepts form the core of Chapter 7, and grasping them accurately is the foundation for the next steps of study.
Chapter 7
Concepts:
7.2
True error rate
True error: the true error of hypothesis h with respect to target concept c and
distribution D is the probability that h will misclassify an instance drawn at
random according to D.
Note:
The true error depends on the unknown distribution D.
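In symbols (using the error_D notation the chapter introduces), this is
    \mathrm{error}_D(h) \equiv \Pr_{x \in D}[\, c(x) \neq h(x) \,]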
Training error rate
Training error: the fraction of training examples misclassified by h.
Note:
1. The training error can be observed by the learner directly, while the true
error cannot.
2. The main question of learning complexity is: "how probable is it that the
observed training error for h gives a misleading estimate of the true error?"
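One quantitative handle on question 2 is the Hoeffding bound, which (if I
remember the agnostic-learning part of the chapter correctly) states that for a
single fixed hypothesis h and m independently drawn training examples,
    \Pr[\, \mathrm{error}_D(h) > \mathrm{error}_{train}(h) + \epsilon \,] \leq e^{-2 m \epsilon^2}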
Sample error rate
Sample error (defined in chapter 5): the sample error of a hypothesis with
respect to some sample S of instances drawn from X is the fraction of S that
it misclassifies.
Note:
If S is the set of training examples, the sample error is the training error.
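In symbols, with S a sample of n instances and 1[.] my shorthand for the 0/1
indicator of a misclassification,
    \mathrm{error}_S(h) \equiv \frac{1}{n} \sum_{x \in S} \mathbf{1}[\, c(x) \neq h(x) \,]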
PAC-learnable
Consider some class C of possible target concepts and a learner L using
hypothesis space H. If L satisfies two conditions, then C is PAC-learnable by
L using H. First, L must, with arbitrarily high probability (1 - δ), output a
hypothesis having arbitrarily low error ε. Second, it must do so efficiently,
in time that grows at most polynomially with 1/ε and 1/δ.
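Spelled out a bit more fully (my paraphrase of the textbook definition, with n
the size of an instance and size(c) the encoding length of the target
concept): C is PAC-learnable by L using H if, for every c in C, every
distribution D over X, and every ε, δ with 0 < ε < 1/2 and 0 < δ < 1/2, L
outputs with probability at least (1 - δ) a hypothesis h in H with
    \mathrm{error}_D(h) \leq \epsilon
in time polynomial in 1/ε, 1/δ, n, and size(c).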
7.3
Sample complexity
Sample complexity: the growth in the number of required training examples with
problem size, which is called the sample complexity of the learning problem.
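For the case treated in this section, a finite hypothesis space H together
with a consistent learner (defined just below), the chapter's central
sample-complexity bound is, as I understand it,
    m \geq \frac{1}{\epsilon} \left( \ln|H| + \ln \frac{1}{\delta} \right)
that is, this many randomly drawn examples suffice for the learner to be
probably (with probability 1 - δ) approximately (within error ε) correct.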
Consistent learner
Consistent learner:
A learner is called a consistent learner if it outputs hypotheses that
perfectly fit the training data, whenever possible.
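In the notation of the earlier chapters (D_train is my shorthand for the set
of training examples), a consistent learner outputs some h with
    \forall \langle x, c(x) \rangle \in D_{train}: \; h(x) = c(x)
equivalently, a hypothesis whose training error is zero.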
ε-exhausted
ε-exhausted: the version space VS_{H,D} is said to be ε-exhausted with respect
to c and D if every hypothesis h in VS_{H,D} has error less than ε with
respect to c and D.
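The key theorem of this section (attributed to Haussler, 1988, if I remember
right) bounds the probability that the version space is not ε-exhausted after
m independently drawn examples, for finite H:
    \Pr[\, VS_{H,D} \text{ is not } \epsilon\text{-exhausted} \,] \leq |H| \, e^{-\epsilon m}
Requiring this probability to be at most δ and solving for m gives exactly the
sample-complexity bound m ≥ (1/ε)(ln|H| + ln(1/δ)) quoted above.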
--
join us!
※ Source: Nanjing University Lily BBS http://bbs.nju.edu.cn [FROM: 192.11.236.114]