The latest support vector machine toolbox; having it at hand is very convenient. Still to do:
1. Find time to write a proper list of things to do!
2. Documentation.
3. Support vector regression.
4. Automated model selection.

REFERENCES
==========
[1] V. N. Vapnik, "The Nature of Statistical Learning Theory", Springer-Verlag, New York, ISBN 0-387-94559-8, 1995.
[2] J. C. Platt, "Fast training of support vector machines using sequential minimal optimization", in Advances in Kernel Methods - Support Vector Learning, (Eds) B. Scholkopf, C. Burges and A. J. Smola, MIT Press, Cambridge, Massachusetts, chapter 12, pp. 185-208, 1999.
[3] T. Joachims, "Estimating the Generalization Performance of a SVM Efficiently", LS-8 Report 25, Universitat Dortmund, Fachbereich Informatik, 1999.
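As an illustrative sketch only, here is a kernel SVM classifier in Python with scikit-learn rather than the MATLAB toolbox described above; scikit-learn's SVC is itself trained with an SMO-type decomposition solver (via libsvm), in the spirit of reference [2]. All data and parameter choices below are assumptions for the example.

```python
# Illustrative only: a kernel SVM classifier in Python/scikit-learn,
# not the MATLAB toolbox described in this entry.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy nonlinear problem: two concentric rings.
n = 400
radius = np.r_[rng.uniform(0.0, 1.0, n // 2), rng.uniform(1.5, 2.5, n // 2)]
angle = rng.uniform(0.0, 2.0 * np.pi, n)
X = np.c_[radius * np.cos(angle), radius * np.sin(angle)]
y = np.r_[np.zeros(n // 2), np.ones(n // 2)]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# RBF-kernel SVM; scikit-learn's SVC uses an SMO-style solver (libsvm).
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```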
Uploaded: 2013-12-16
Uploader: 亚亚娟娟123
Polynomial fit functions
========================
regressionObject.cls contains a class that provides an easy way to add polynomial regression functionality to any application. Whether you need plain linear regression or a very high-degree fit, it does not matter: the class performs well and scales smoothly with the complexity of your problem.
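For readers unfamiliar with the idea, here is a minimal polynomial-regression sketch in Python/NumPy; it only illustrates the least-squares fit-and-evaluate workflow on made-up data and is not the regressionObject.cls API.

```python
# Minimal polynomial-regression sketch (NumPy), illustrating the idea only;
# this is not the regressionObject.cls class described above.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-3.0, 3.0, 60)
y = 0.5 * x**3 - x + rng.normal(scale=1.0, size=x.size)  # noisy cubic

degree = 3
coeffs = np.polyfit(x, y, degree)   # least-squares fit of the coefficients
y_hat = np.polyval(coeffs, x)       # evaluate the fitted polynomial

rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print("fitted coefficients (highest degree first):", coeffs)
print("training RMSE:", rmse)
```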
Tags: regressionObject Polynomial functions contains
Uploaded: 2015-04-06
Uploader: rocwangdp
The Support Vector Machine is a powerful methodology for solving nonlinear classification and regression problems. This is a MATLAB version.
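To illustrate the regression side of the method, the following is a hedged Python/scikit-learn sketch of epsilon-support-vector regression with an RBF kernel; it stands in for, and is not, the MATLAB code in this entry, and the data and parameters are assumptions.

```python
# Illustrative support vector regression (epsilon-SVR) sketch in scikit-learn,
# not the MATLAB implementation described above.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))[:, None]
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=X.shape[0])

reg = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma="scale")
reg.fit(X, y)

X_test = np.linspace(0.0, 2.0 * np.pi, 5)[:, None]
print("predictions:", reg.predict(X_test))
print("true values:", np.sin(X_test).ravel())
```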
Tags: classification methodology nonlinear Machines
Uploaded: 2015-06-08
Uploader: bruce
The Bayesian Committee Machine (BCM) is an approximation method for large-scale Gaussian process regression. The code is for MATLAB (Version 1.0, November 2005).
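The core idea can be sketched as follows: train exact GP regressors on disjoint subsets of the data and combine their predictions with the BCM precision-weighted rule. The Python/NumPy code below is a simplified per-test-point illustration under assumed kernel and noise settings, not the MATLAB Version 1.0 code described above.

```python
# Simplified, per-test-point sketch of the Bayesian Committee Machine idea:
# exact GP regression on data subsets, combined with the BCM rule.
# Assumed kernel (unit-amplitude RBF) and noise level; not the original code.
import numpy as np

def rbf(A, B, length=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_predict(X, y, Xs, noise=0.1, length=1.0):
    """Exact GP posterior mean and variance at test inputs Xs."""
    K = rbf(X, X, length) + noise**2 * np.eye(len(X))
    Ks = rbf(X, Xs, length)
    Kinv = np.linalg.inv(K)
    mean = Ks.T @ Kinv @ y
    var = 1.0 - np.sum(Ks * (Kinv @ Ks), axis=0)  # prior variance k(x*,x*) = 1
    return mean, var

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, (600, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=600)
Xs = np.linspace(-3, 3, 7)[:, None]

# Split the data into M modules, predict with each, then combine.
M = 6
prior_var = 1.0
means, variances = zip(*(gp_predict(Xc, yc, Xs)
                         for Xc, yc in zip(np.array_split(X, M),
                                           np.array_split(y, M))))

# BCM combination: sum the module precisions, subtract (M-1) prior precisions.
precision = sum(1.0 / v for v in variances) - (M - 1) / prior_var
bcm_var = 1.0 / precision
bcm_mean = bcm_var * sum(m / v for m, v in zip(means, variances))
print("BCM mean:", bcm_mean)
print("true    :", np.sin(Xs).ravel())
```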
Tags: approximation large-scale Committee Bayesian
Uploaded: 2015-09-14
Uploader: caiiicc
Gaussian processes are a nonparametric learning method that can be used quite naturally for regression as well as for classification. This program implements classification with Gaussian processes.
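For illustration only, here is a Gaussian-process classifier in Python/scikit-learn on a synthetic two-class problem; it is an assumed stand-in, not this entry's program.

```python
# Illustrative Gaussian-process classification in scikit-learn; a sketch of the
# same idea, not the program described above.
from sklearn.datasets import make_moons
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gpc.fit(X_train, y_train)
print("test accuracy:", gpc.score(X_test, y_test))
print("class probabilities for first test point:", gpc.predict_proba(X_test[:1]))
```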
Uploaded: 2015-10-22
Uploader: gxf2016
In this paper we propose to reduce the textural components by modelling the coefficients of a wedgelet-based regression tree instead of the original pixel intensities.
Tags: coefficients components modelling the
Uploaded: 2015-10-22
Uploader: gxmm
The subroutines glkern.f and lokern.f use an efficient and fast algorithm for automatically adaptive nonparametric regression estimation with a kernel method. Roughly speaking, the method estimates the regression function by locally averaging the observations; analogously, one can estimate low-order derivatives of the regression function.
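The local-averaging idea can be illustrated in a few lines of Python/NumPy with a Nadaraya-Watson estimator; this sketch fixes the bandwidth by hand and is only an illustration of the principle, not the Fortran subroutines' automatically adaptive procedure.

```python
# Nadaraya-Watson kernel regression: a local weighted average of the observations.
# Principle only; glkern.f/lokern.f additionally choose the bandwidth
# automatically, which this sketch does not.
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.3):
    """Gaussian-kernel local averaging of y_train around each query point."""
    weights = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (weights @ y_train) / weights.sum(axis=1)

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0.0, 4.0, 300))
y = np.sin(2.0 * x) + rng.normal(scale=0.2, size=x.size)

x_query = np.linspace(0.2, 3.8, 5)
print("estimate:", nadaraya_watson(x, y, x_query))
print("truth   :", np.sin(2.0 * x_query))
```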
Tags: automatically subroutines and algorithm
Uploaded: 2015-11-25
Uploader: luke5347
The Support Vector Machine is a small-sample method based on statistical learning theory. It is a new approach to highly nonlinear classification and regression problems, and it copes particularly well with small samples and nonlinearity.
Tags: method statistic learning Support
Uploaded: 2014-12-02
Uploader: zukfu
PCA and PLS. Aims: to give some insight into the bilinear factor models Principal Component Analysis (PCA) and Partial Least Squares (PLS) regression, focusing on the mathematics and numerical aspects rather than the hows and whys of data-analysis practice. For the latter part it is assumed (but not absolutely necessary) that the reader is already familiar with these methods. Some preliminary experience with linear/matrix algebra is also assumed.
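As a quick, hedged illustration of the two bilinear factor models in Python/scikit-learn (library calls on synthetic data, not the tutorial's own derivations):

```python
# Illustration of the two bilinear factor models discussed: PCA (unsupervised
# scores/loadings of X) and PLS regression (factors chosen to covary with y).
# A sketch only, with made-up data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n, p = 100, 8
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

pca = PCA(n_components=3).fit(X)
print("PCA explained variance ratio:", pca.explained_variance_ratio_)

pls = PLSRegression(n_components=3).fit(X, y)
print("PLS R^2 on training data:", pls.score(X, y))
```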
Tags: Component Principal Analysis bilinear
Uploaded: 2016-02-07
Uploader: zuozuo1215
My implementation of an incremental random-hidden-node neural network algorithm. Its key feature is a guaranteed approximation property; it is also fast and gives good results, so it can serve as a point of comparison in academic studies. At present it is only suitable for benchmark regression problems. For details see G.-B. Huang, L. Chen and C.-K. Siew, "Universal Approximation Using Incremental Constructive Feedforward Networks with Random Hidden Nodes", IEEE Transactions on Neural Networks, vol. 17, no. 4, pp. 879-892, 2006.
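Below is a minimal Python/NumPy sketch of the incremental constructive scheme described in the cited paper: one random sigmoid hidden node is added at a time and its output weight is fitted to the current residual. This is a reconstruction for illustration, not the uploaded algorithm itself.

```python
# Minimal NumPy sketch of an incremental constructive network with random
# hidden nodes (I-ELM-style), as described in the cited Huang et al. paper.
# Illustration only; not the uploaded code.
import numpy as np

rng = np.random.default_rng(6)
X = rng.uniform(-1.0, 1.0, (500, 1))
y = np.sin(3.0 * X).ravel()

residual = y.copy()
nodes, max_nodes = [], 200
for _ in range(max_nodes):
    a = rng.uniform(-1.0, 1.0, X.shape[1])   # random input weights
    b = rng.uniform(-1.0, 1.0)               # random bias
    h = 1.0 / (1.0 + np.exp(-(X @ a + b)))   # sigmoid hidden-node output
    beta = (residual @ h) / (h @ h)          # output weight minimizing residual
    residual -= beta * h                     # update residual error
    nodes.append((a, b, beta))

print("training RMSE after %d random nodes: %.4f"
      % (len(nodes), np.sqrt(np.mean(residual**2))))
```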
Tags: incremental written neural network algorithm
Uploaded: 2016-09-18
Uploader: litianchu