Code search: Boosting

Found about 146 source-code results matching "Boosting"

Code results: 146
www.eeworm.com/read/482389/6624063

c boost-main.c

/****************************************************************************** boost-main.c - main driver program for experiments with boosting largely pilfered from ripper-main.c ****************
www.eeworm.com/read/429426/1948678

py ensemble.py

# Description: Demonstrates the use of boosting and bagging from orngEnsemble module # Category: classification, ensembles # Classes: BoostedLearner, BaggedLearner # Uses: lymphograph
www.eeworm.com/read/480116/6677149

index

ada Fitting Stochastic Boosting Models addtest Add a test set to ada pairs.ada Pairwise Plots and Variable Importance Plot for
www.eeworm.com/read/429426/1948744

py domain13.py

# Description: Adds two new numerical attributes to iris data set, and tests through cross validation if this helps in boosting classification accuracy # Category: modelling # Uses: iris
www.eeworm.com/read/367675/2837917

txt 253.txt

From: WbAI (wbAI), Board: DataMining Title: Can anyone discuss the feasibility of using boosting for text classification? Posted on: Nanjing University Lily BBS (Sun Oct 13 20:19:14 2002) In principle, boosting can be applied to any classifier, but for text classification, with its very large number of features, the method seems likely to be extremely inefficient, perhaps even infeasible. Can anyone discuss how boosting ...
www.eeworm.com/read/367675/2838050

txt 338.txt

From: jeff814 (mimi), Board: DataMining Title: Question: why do the boosting experiments come out as **?? Posted on: Nanjing University Lily BBS (Thu Oct 17 09:20:10 2002) I am using boosting + decision trees for classification, hoping for better performance than a plain decision tree. But in practice, as the number of iterations increases, the weight of each hypothesis hi keeps shrinking. That is, boosting + decision trees ...
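The behavior the poster describes is expected in standard AdaBoost: each hypothesis weight is alpha = 0.5 * ln((1 - eps) / eps), where eps is the weak learner's weighted error on the current (reweighted) training distribution. As boosting concentrates weight on hard examples, eps tends to drift toward 0.5 and alpha shrinks toward 0. A minimal sketch of that formula (the function name and sample error rates are illustrative, not from the post):

```python
import math

def hypothesis_weight(error):
    """AdaBoost weight for a weak hypothesis with weighted error `error`.

    alpha = 0.5 * ln((1 - error) / error): positive while error < 0.5,
    and shrinking toward 0 as the weighted error approaches 0.5.
    """
    return 0.5 * math.log((1.0 - error) / error)

# As the reweighted distribution gets harder, error rises and alpha falls.
for eps in (0.10, 0.25, 0.40, 0.49):
    print(f"error={eps:.2f}  alpha={hypothesis_weight(eps):.3f}")
```

Shrinking alpha alone is therefore not a sign of failure; the combined classifier can still improve as long as each round's weighted error stays below 0.5.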
www.eeworm.com/read/397106/8067722

m locboost.m

function [D, P, theta, phi] = LocBoost(features, targets, Iterations, region) % Classify using the local boosting algorithm % Inputs: % features - Train features % targets - Train targets %
www.eeworm.com/read/429426/1948716

py ensemble3.py

# Description: Bagging and boosting with k-nearest neighbors # Category: modelling # Uses: promoters.tab # Classes: orngTest.crossValidation, orngEnsemble.BaggedLearner, orngEnsemble.
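The snippet above pairs bagging with k-nearest neighbors via the (now historical) orngEnsemble module. The core idea is independent of that library: fit each base learner on a bootstrap resample and combine by majority vote. A self-contained sketch with a toy 1-NN base learner (all names and the toy data are illustrative, not taken from ensemble3.py):

```python
import random
from collections import Counter

def nn_predict(train, x):
    """1-nearest-neighbour prediction over a list of (features, label) pairs."""
    _, label = min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return label

def bagged_predict(data, x, n_learners=25, seed=0):
    """Majority vote of 1-NN learners, each fit on a bootstrap resample."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_learners):
        # Bootstrap: draw len(data) points with replacement.
        sample = [rng.choice(data) for _ in data]
        votes.append(nn_predict(sample, x))
    return Counter(votes).most_common(1)[0][0]

# Toy data: two well-separated 2-D clusters.
data = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(bagged_predict(data, (0.05, 0.1)))
```

Note that plain AdaBoost-style boosting is a poorer fit for k-NN than bagging, since the base learner must respond to per-example weights, which vanilla k-NN does not.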
www.eeworm.com/read/191902/8417279

m locboost.m

function [D, P, theta, phi] = LocBoost(features, targets, params, region) % Classify using the local boosting algorithm % Inputs: % features - Train features % targets - Train targets % par