: Overfitting (over-learning) means going down a dead end: instead of reasoning
: from the general nature of things, it dwells on very special cases, so what
: comes out does not generalize.
【 In the post by GzLi (笑梨): 】


helloboy (hello) wrote (Sat Jan  4 15:06:50 2003):

I don't follow. What is "too general"? Can you give me an example?
【 In the post by ihappy (drunken new year): 】
: It's definitely wrong.
: In most overfitting cases, it is because the learner is TOO GENERAL
: compared with the training data.
: 【 In the post by helloboy (hello): 】


sagayao (才子佳人) wrote (Sun Jan  5 13:57:35 2003):

ihappy, you are definitely wrong. Overfitting is not because it is TOO general;
on the contrary, it is NOT general enough.

Overfitting: the model over fits the training set. 

The algorithm learns too much from the current training set. As the example
in the first post shows, an algorithm that fits the data extremely well
probably has the problem of OVERFITTING. The model is too adaptive to the
training set: it may not work well on other data not used for the training.

Overfitting is a problem because it doesn't perform well on new test data.
On the training set the learning scheme performs well; on the test set,
however, the performance drops dramatically. Real-world problems involve data
the model has never seen, so a model with an overfitting problem cannot be
used to solve real-world problems.

In order to avoid overfitting, when building the training set, the data should
be representative of the concept space. Also, when building the classifier,
domain knowledge can be used to adjust the algorithm.
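To make this concrete, here is a minimal sketch (my illustration, not from the
thread; it assumes only NumPy) of a learner that is too adaptive to the
training set: a degree-9 polynomial drives the training error to nearly zero
on ten noisy points drawn from a linear concept, while a plain degree-1 fit
recovers the general trend and typically does far better on unseen data.

```python
# A minimal sketch of overfitting: a degree-9 polynomial memorizes ten
# noisy training points, while a degree-1 fit captures the general trend.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.1, 10)   # noisy samples of y = 2x
x_test = np.linspace(0, 1, 50)
y_test = 2 * x_test                              # the true, general concept

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```

The degree-9 model has enough freedom to pass through every training point,
noise included, which is exactly the "too adaptive" behaviour described above.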

【 In the post by ihappy: 】
: It's definitely wrong.
: In most overfitting cases, it is because the learner is TOO GENERAL
: compared with the training data.
: 【 In the post by helloboy (hello): 】


ihappy (drunken new year) wrote (Sun Jan  5 23:22:39 2003):

No, sagayao, you are definitely wrong.

Trust me, my major is machine learning, hehe.

In machine learning, we always talk about the capacity of a learning machine.
We never say the training set is too general or specific. It is the learning
machine whose capacity is too general COMPARED to the training set.

Your idea about the definition of overfitting is correct. But you are not
familiar with the jargon. That's all.

【 In the post by sagayao (才子佳人): 】
: ihappy, you are definitely wrong. Overfitting is not because it is TOO gene..
: 
: Overfitting: the model over fits the training set. 
: 
: The algorithm learns too much from the current training set. As the example..
: it may not work well on other data not used for the training. 
: 
: Overfitting is a problem because it doesn't perform well on new test dat..
: a model with an overfitting problem cannot be used to solve real-world probl..
: 
: In order to avoid overfitting, when building the training set, the data should
: be representative of the concept space. Also, when building the classifier,..
: 
: 【 In the post by ihappy: 】
: 
: --
: (remaining quote omitted ... ...)


GzLi (笑梨) wrote (Mon Jan  6 09:52:32 2003):

Perhaps giving a definition of the generalization of a LM (learning machine)
on the board would be a good idea, to make it clear to others what you mean.
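For reference, one standard textbook form of that definition: for a hypothesis
$h$, a loss $L$, and examples drawn from a distribution $D$, the generalization
error and the empirical (training) error are

$$ R(h) = \mathbb{E}_{(x,y)\sim D}\big[L(h(x), y)\big], \qquad \hat{R}_n(h) = \frac{1}{n}\sum_{i=1}^{n} L(h(x_i), y_i), $$

and overfitting is the situation where $\hat{R}_n(h)$ is small but the gap
$R(h) - \hat{R}_n(h)$ is large.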

【 In the post by ihappy (drunken new year): 】
: No, sagayao, you are definitely wrong.
: Trust me, my major is machine learning, hehe.
: In machine learning, we always talk about the capacity of a learning machine.
: We never say the training set is too general or specific. It is the learning
: machine whose capacity is too general COMPARED to the training set.
: Your idea about the definition of overfitting is correct. But you are not
: familiar with the jargon. That's all.
: 【 In the post by sagayao (才子佳人): 】


txytxy (nils) wrote (Mon Jan  6 10:44:09 2003):

【 In the post by ihappy: 】

: No, sagayao, you are definitely wrong.

: Trust me, my major is machine learning, hehe.

This doesn't seem to have anything to do with your major.

: In machine learning, we always talk about the capacity of a learning machi..
: We never say the training set is too general or specific. It is the learning
: machine, whose capacity is too general COMPARED to the training set.

: Your idea about the definition of overfitting is correct. But you are not
: familiar with the jargon. That's all.

Capacity is measured by the VC dimension: the larger the VC dimension, the
more concepts the machine can recognize, and the more general the hypothesis
itself is. So I think you are right, but I also don't see anything wrong with
what sagayao said.
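For reference, the classical VC bound behind this point (one standard form,
with $d$ the VC dimension, $n$ the sample size, and confidence $1-\delta$):
with probability at least $1-\delta$,

$$ R(h) \;\le\; \hat{R}_n(h) + \sqrt{\frac{d\left(\ln\frac{2n}{d} + 1\right) + \ln\frac{4}{\delta}}{n}}, $$

so a larger capacity $d$ loosens the bound: the machine can fit more training
sets, but the guarantee on unseen data gets weaker.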


: 【 In the post by sagayao (才子佳人): 】

: (remaining quote omitted ...)



GzLi (笑梨) wrote (Mon Jan  6 11:53:13 2003):

Overfitting is a crucial problem in designing a learning machine, and the
concept is defined on the machine, not on the dataset.

【 In the post by txytxy (nils): 】
: 
: 【 In the post by ihappy: 】
: This doesn't seem to have anything to do with your major.
: Capacity is measured by the VC dimension: the larger the VC dimension, the
: more concepts the machine can recognize, and the more general the hypothesis..
: 


sagayao (才子佳人) wrote:

No, iHappy, you are definitely wrong. hehe...

Your major is machine learning, and my research covers data mining and
textual data mining (computational linguistics). lol....

I never said the training set is too general. I said, "Overfitting is not
because it is TOO general; on the contrary, it is NOT general enough." In this
sentence, "it" means the machine learner.

And in "Overfitting: the model over fits the training set.", the model means
the machine learner.

"The algorithm learns too much from the current training set." The algorithm
means the machine learner.

"An algorithm that fits the data extremely well probably has the problem of
OVERFITTING" means the algorithm has the overfitting problem, not the data.

"Overfitting is a problem because it doesn't perform well on new test data."
Here, the word "it" means the machine learner.

Our disagreement is: you say the reason for overfitting is that the learner is
too general; I say it is NOT too general.

My statement is: in overfitting, the learner is NOT general enough; instead,
the learner is too adaptive to the training data, so it creates a model which
cannot fit other, unseen data. The learner focuses too much on *this* training
set; it does not learn the GENERAL concept conveyed by the data.

A model is general if it can be used in many situations. For example, the
famous E = mc^2 is general, since it describes the relation between matter and
energy everywhere. If some other, more complex equation fits a few
observations very well but is not general enough to cover everything, seen and
unseen, I bet we cannot win the Nobel Prize with our non-general discovery.

About the jargon: I don't think I am unfamiliar with it. Actually, we were
trained by different professors, we use different textbooks, and we read
different papers. Jargon is not a merit of scholarship; better not to use it.
hehe...:))
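That "too adaptive" learner can be caricatured in a few lines (a hypothetical
toy illustration, not from the thread): a lookup table that memorizes every
training example gets a perfect training score, yet has learned nothing
general and falls back to guessing on unseen inputs.

```python
# A caricature of an over-adaptive learner: it memorizes the training set
# verbatim, so training accuracy is perfect, but it generalizes not at all.
class MemorizingLearner:
    def fit(self, xs, ys):
        self.table = dict(zip(xs, ys))  # store every example exactly

    def predict(self, x):
        # Perfect recall on seen inputs; a blind guess on anything unseen.
        return self.table.get(x, 0)

# Toy concept: the label is x mod 2 (the GENERAL rule a learner should find).
train_x = [1, 2, 3, 4, 5, 6]
train_y = [x % 2 for x in train_x]
test_x = [7, 8, 9, 10]

learner = MemorizingLearner()
learner.fit(train_x, train_y)

train_acc = sum(learner.predict(x) == x % 2 for x in train_x) / len(train_x)
test_acc = sum(learner.predict(x) == x % 2 for x in test_x) / len(test_x)
print(f"train accuracy: {train_acc:.0%}, test accuracy: {test_acc:.0%}")  # 100% vs 50%
```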


※ Edited by sagayao on Jan  7 06:59:13 2003 [FROM: 128.180.98.152]
※ Edited by sagayao on Jan  7 07:00:39 2003 [FROM: 128.180.98.152]

ihappy (drunken new year) wrote (Tue Jan  7 06:56:55 2003):

OK, now the problem is not overfitting itself, which is interesting :-)

The problem is 'general', since this word is not a rigorously defined term.
Although what we each say about overfitting is actually the same, when we say
'general', we mean totally opposite things.

Well, an old example is from language theory: the more general the language
class is, the more restricted the language. It is silly to continue debating.

Anyway, what we say about overfitting is essentially the same.

Now that the spring term has started, I have less time.

【 In the post by sagayao (才子佳人): 】
: No, iHappy, you are definitely wrong. hehe...
: Your major is machine learning, and my research covers data mining and
: textual data mining (computational linguistics). lol....
: 
: I never said the training set is too general. I said, "Overfitting is not
: because it is TOO general; on the contrary, it is NOT general enough." In
: this sentence, "it" means the machine learner.
: 
: And in "Overfitting: the model over fits the training set.", the model means
: the machine learner.
: 
: "The algorithm learns too much from the current training set." The
: algorithm means the machine learner.
: 
: "An algorithm that fits the data extremely well probably has the problem of
: OVERFITTING" means the algorithm has the overfitting problem, not the data.
: 
: "Overfitting is a problem because it doesn't perform well on new test
: data." Here, "it" means the machine learner.
: 
: (remaining quote omitted ... ...)

