Code search: classifier

Found about 4,824 source-code results matching "classifier"
www.eeworm.com/read/421949/10675986

u_lindemo.m

echo off
%LINDEMO demonstration for using linear SVM classifier.
echo on; clc
%LINDEMO demonstration for using linear SVM classifier.
%#########################################################
www.eeworm.com/read/421949/10675988

c_clademo.m

echo off
% CLADEMO demonstration for using a constructed SVM classifier to classify
% input patterns
echo on;
%
%
% NOTICE: please first run any of the first three demonstrations before
%
www.eeworm.com/read/421949/10676010

svmclass.m

function [Labels, DecisionValue]= SVMClass(Samples, AlphaY, SVs, Bias, Parameters, nSV, nLabel)
% Usages:
% [Labels, DecisionValue]= SVMClass(Samples, AlphaY, SVs, Bias);
% [Labels, DecisionValu
www.eeworm.com/read/349725/10802033

svmclass.m

function [Labels, DecisionValue]= SVMClass(Samples, AlphaY, SVs, Bias, Parameters, nSV, nLabel)
% Usages:
% [Labels, DecisionValue]= SVMClass(Samples, AlphaY, SVs, Bias);
% [Labels, DecisionValu
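The SVMClass snippet above computes labels and decision values from support vectors, their signed coefficients, and a bias. A minimal NumPy sketch of that decision rule for a linear kernel (the names mirror the MATLAB arguments; this is an illustration of the math, not the toolbox's actual implementation):

```python
import numpy as np

def svm_class(samples, alpha_y, svs, bias):
    """Linear-kernel SVM decision: f(x) = sum_i alpha_i*y_i*<sv_i, x> + b.

    samples: (n, d) points to classify
    alpha_y: (m,) signed coefficients alpha_i * y_i per support vector
    svs:     (m, d) support vectors
    bias:    scalar offset b
    Returns (labels, decision_values) with labels in {-1, +1}.
    """
    decision = samples @ svs.T @ alpha_y + bias
    labels = np.where(decision >= 0, 1, -1)
    return labels, decision
```

With two support vectors at (1, 0) and (-1, 0) and coefficients ±0.5, this reproduces a vertical separating line through the origin.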
www.eeworm.com/read/418756/10928173

adademo.m

function MOV=adademo
% ADADEMO AdaBoost demo
% ADADEMO runs AdaBoost on a simple two dimensional classification
% problem.
% Written by Andrea Vedaldi - 2006
% http://vision.ucla.edu/~vedaldi
do_
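ADADEMO runs AdaBoost on a toy 2-D problem. A compact sketch of discrete AdaBoost with axis-aligned decision stumps as weak learners, assuming labels in {-1, +1} (this illustrates the algorithm itself, not Vedaldi's demo code):

```python
import numpy as np

def fit_stump(X, y, w):
    """Best axis-aligned threshold stump under sample weights w."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        j, t, pol, err = fit_stump(X, y, w)
        err = max(err, 1e-12)               # avoid log(inf) on a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)      # up-weight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def ada_predict(ensemble, X):
    score = np.zeros(len(X))
    for alpha, j, t, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)
```

The weight update is the standard exponential reweighting; each round the weak learner is forced to focus on the examples the ensemble currently gets wrong.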
www.eeworm.com/read/418695/10935170

classd.m

%CLASSD Classify data using a given classifier
%
% labels = classd(D)
%
% Finds the labels of the classified dataset D (typically the result
% of a mapping or classification A*W). For each object
www.eeworm.com/read/418695/10935190

baggingc.m

%BAGGINGC Bootstrapping and aggregation of classifiers
%
% W = baggingc(A,classf,n,cclassf,T)
%
% Computation of a stabilized version of a classifier by
% bootstrapping and aggregation ('bagging
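BAGGINGC stabilizes a base classifier by training it on bootstrap resamples and aggregating the votes. A generic NumPy sketch of that scheme, with a pluggable base learner and majority voting over two-class labels in {-1, +1} (an illustration of bagging, not PRTools' code):

```python
import numpy as np

def bagging_fit(X, y, fit_fn, n_rounds=25, seed=0):
    """Train fit_fn on n_rounds bootstrap resamples of (X, y).

    fit_fn(X, y) must return a predictor: a function mapping an
    (m, d) array to m predicted labels.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    models = []
    for _ in range(n_rounds):
        idx = rng.integers(0, n, size=n)   # resample with replacement
        models.append(fit_fn(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """Majority vote of the bootstrap models (labels in {-1, +1})."""
    votes = np.sum([m(X) for m in models], axis=0)
    return np.where(votes >= 0, 1, -1)
```

Any unstable base learner benefits most; the bootstrap resampling decorrelates the individual models so the vote averages out their variance.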
www.eeworm.com/read/418695/10935194

rbnc.m

%RBNC Radial basis neural net classifier
%
% W = rbnc(A,n)
%
% A feedforward neural network classifier with one hidden layer with
% at most n radial basis units is computed for the labeled dataset
www.eeworm.com/read/418695/10935205

knnc.m

%KNNC K-Nearest Neighbor Classifier
%
% [W,k,e] = knnc(A,k)
%
% Computation of the k-nearest neighbor classifier for the dataset A.
% Default k: optimize leave-one-out error e. W is a mapping and
%
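KNNC's default behavior is to pick k by minimizing the leave-one-out error. A small NumPy sketch of that selection loop, using brute-force Euclidean distances (an illustration of the idea, not PRTools' implementation):

```python
import numpy as np

def knn_predict(X_train, y_train, X, k):
    """Majority-vote k-NN with Euclidean distances."""
    d2 = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d2, axis=1)[:, :k]
    out = []
    for row in nn:
        labels, counts = np.unique(y_train[row], return_counts=True)
        out.append(labels[np.argmax(counts)])
    return np.array(out)

def choose_k_loo(X, y, k_max=None):
    """Return (best_k, loo_error): the k minimizing leave-one-out error."""
    n = len(y)
    k_max = k_max or n - 1
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # a point may not vote for itself
    order = np.argsort(d2, axis=1)
    best_k, best_err = 1, np.inf
    for k in range(1, k_max + 1):
        errs = 0
        for i in range(n):
            labels, counts = np.unique(y[order[i, :k]], return_counts=True)
            if labels[np.argmax(counts)] != y[i]:
                errs += 1
        if errs / n < best_err:
            best_k, best_err = k, errs / n
    return best_k, best_err
```

Excluding each point from its own neighbor list is what makes this leave-one-out rather than resubstitution error, which would trivially favor k = 1.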
www.eeworm.com/read/418695/10935254

parzenc.m

%PARZENC Optimisation of the Parzen classifier
%
% [W,h,e] = parzenc(A)
%
% Computation of the optimum smoothing parameter h for the Parzen
% classifier between the classes in the dataset A. The l
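PARZENC builds class-conditional density estimates from Gaussian kernels and tunes the smoothing width h. A minimal sketch of the underlying Parzen-window classifier for a fixed h, assuming Gaussian kernels and equal class priors (the toolbox's h-optimization is omitted here):

```python
import numpy as np

def parzen_classify(X_train, y_train, X, h=1.0):
    """Assign each row of X to the class with the highest Parzen
    density estimate: p_c(x) is proportional to
    mean_i exp(-||x - x_i||^2 / (2 h^2)) over class-c training points."""
    classes = np.unique(y_train)
    scores = np.empty((len(X), len(classes)))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]
        d2 = ((X[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores[:, j] = np.exp(-d2 / (2 * h * h)).mean(axis=1)
    return classes[np.argmax(scores, axis=1)]
```

Small h makes the decision boundary hug the training points; large h smooths it toward a nearest-mean rule, which is exactly why h is worth optimizing.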