Code search: classifier

Found about 4,824 source-code results matching "classifier"

Code results (4,824)
www.eeworm.com/read/344640/11870075

u_rbfdemo.m

echo off
% RBFDEMO demonstration for using nonlinear SVM classifier
% with a RBF kernel.
echo on; clc
% RBFDEMO demonstration for using nonlinear SVM classifier
% with a RBF kernel.
%#####
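The demo above pairs an SVM with a Gaussian RBF kernel. As a minimal sketch of that kernel outside MATLAB (Python rather than the toolbox code; the function name and `gamma` parameterisation are my own):

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    """Gaussian RBF kernel: exp(-gamma * ||x - z||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

# Identical points score 1.0; the value decays smoothly with distance.
print(rbf_kernel([0.0, 0.0], [0.0, 0.0]))  # 1.0
print(rbf_kernel([0.0], [2.0], gamma=0.5) < rbf_kernel([0.0], [1.0], gamma=0.5))  # True
```

Plugging this kernel into an SVM lets a linear decision rule in kernel space separate classes that are not linearly separable in the input space.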
www.eeworm.com/read/344640/11870082

u_lindemo.m

echo off
%LINDEMO demonstration for using linear SVM classifier.
echo on; clc
%LINDEMO demonstration for using linear SVM classifier.
%#########################################################
www.eeworm.com/read/256798/11971866

knn_map.m

%KNN_MAP Map a dataset on a K-NN classifier
%
% F = KNN_MAP(A,W)
%
% INPUT
%   A  Dataset
%   W  k-NN classifier trained by KNNC
%
% OUTPUT
%   F  Posterior probabilities
%
% DESCRIPTION
% Maps t
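KNN_MAP returns posterior probabilities, which for k-NN are simply the per-class fractions among the k nearest neighbours. A minimal Python sketch of that estimate (illustrative only, not the toolbox implementation):

```python
from collections import Counter

def knn_posteriors(X, y, x, k):
    """Posterior estimate per class: the fraction of the k nearest
    neighbours of x (squared Euclidean distance) carrying that label."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(x, p)), i)
                   for i, p in enumerate(X))
    counts = Counter(y[i] for _, i in dists[:k])
    return {lab: counts.get(lab, 0) / k for lab in set(y)}

X = [[0.0], [0.1], [0.9], [1.0]]
y = ["A", "A", "B", "B"]
print(knn_posteriors(X, y, [0.2], k=3))  # A: 2/3, B: 1/3
```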
www.eeworm.com/read/342711/12005117

svmclass.m

function [y,dfce] = svmclass(X,model)
% SVMCLASS Support Vector Machines Classifier.
%
% Synopsis:
%  [y,dfce] = svmclass( X, model )
%
% Description:
%  [y,dfce] = svmclass( X, model ) classifies inp
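An SVM classifier like this evaluates the decision value dfce = Σᵢ αᵢ yᵢ k(x, svᵢ) + b over the support vectors and labels by its sign. A hedged Python sketch of that rule (the `model` dict layout here is illustrative, not the toolbox's model struct):

```python
def svm_decision(x, model):
    """Kernel SVM decision value: sum_i alpha_i * y_i * k(x, sv_i) + b."""
    k = model["kernel"]
    dfce = sum(a * y * k(x, sv)
               for a, y, sv in zip(model["alpha"], model["sv_labels"], model["sv"]))
    return dfce + model["b"]

def svm_classify(x, model):
    """Predicted label is the sign of the decision value."""
    return 1 if svm_decision(x, model) >= 0 else -1

# Toy linear-kernel model separating x[0] > 0 from x[0] < 0.
linear = lambda u, v: sum(a * b for a, b in zip(u, v))
model = {"alpha": [1.0, 1.0], "sv_labels": [1, -1],
         "sv": [[1.0], [-1.0]], "b": 0.0, "kernel": linear}
print(svm_classify([2.0], model))   # 1
print(svm_classify([-2.0], model))  # -1
```

Swapping `linear` for an RBF kernel turns the same evaluation loop into a nonlinear classifier.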
www.eeworm.com/read/342008/12046789

classd.m

%CLASSD Classify data using a given classifier
%
% labels = classd(D)
%
% Finds the labels of the classified dataset D (typically the result
% of a mapping or classification A*W). For each object
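For each object, classd picks the label of the class with the highest classifier output. A minimal Python sketch of that per-row argmax (function and argument names are my own):

```python
def classd(D, labels):
    """For each row of classifier outputs, return the label of the
    column with the highest value (confidence / posterior)."""
    out = []
    for row in D:
        best = max(range(len(labels)), key=lambda j: row[j])
        out.append(labels[best])
    return out

D = [[0.9, 0.1],   # strongly class "A"
     [0.3, 0.7]]   # class "B"
print(classd(D, ["A", "B"]))  # ['A', 'B']
```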
www.eeworm.com/read/342008/12046840

baggingc.m

%BAGGINGC Bootstrapping and aggregation of classifiers
%
% W = baggingc(A,classf,n,cclassf,T)
%
% Computation of a stabilized version of a classifier by
% bootstrapping and aggregation ('bagging
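Bagging trains the base classifier on bootstrap resamples of the data and combines the results, here by majority vote. A minimal Python sketch under those assumptions (names and the 1-NN base learner are my own choices, not the toolbox defaults):

```python
import random
from collections import Counter

def bagging_predict(X, y, x_new, train, n_rounds=25, seed=0):
    """Train `train` on n_rounds bootstrap resamples of (X, y) and
    classify x_new by majority vote over the resulting classifiers."""
    rng = random.Random(seed)
    m = len(X)
    votes = []
    for _ in range(n_rounds):
        idx = [rng.randrange(m) for _ in range(m)]  # sample with replacement
        clf = train([X[i] for i in idx], [y[i] for i in idx])
        votes.append(clf(x_new))
    return Counter(votes).most_common(1)[0][0]

def train_1nn(Xb, yb):
    """1-NN base learner -- deliberately unstable, which is where
    bagging helps most."""
    def predict(x):
        dists = [(sum((a - b) ** 2 for a, b in zip(x, p)), lab)
                 for p, lab in zip(Xb, yb)]
        return min(dists)[1]
    return predict

X = [[0.0], [0.2], [1.0], [1.2]]
y = ["A", "A", "B", "B"]
print(bagging_predict(X, y, [0.1], train_1nn))  # A
```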
www.eeworm.com/read/342008/12046852

rbnc.m

%RBNC Radial basis neural net classifier
%
% W = rbnc(A,n)
%
% A feedforward neural network classifier with one hidden layer with
% at most n radial basis units is computed for the labeled dataset
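The forward pass of such a network is one layer of radial basis (Gaussian) units followed by a linear output. A hedged Python sketch of that forward pass with hand-picked weights (rbnc itself *trains* these; everything here is illustrative):

```python
import math

def rbf_net_forward(x, centers, sigma, weights, bias):
    """One hidden layer of Gaussian radial basis units followed by a
    linear output; classify by the sign of the output."""
    hidden = [math.exp(-sum((a - c) ** 2 for a, c in zip(x, ctr))
                       / (2 * sigma ** 2))
              for ctr in centers]
    out = bias + sum(w * h for w, h in zip(weights, hidden))
    return 1 if out >= 0 else -1

# One unit centred on each class; weights chosen by hand for the sketch.
centers = [[0.0], [1.0]]
print(rbf_net_forward([0.1], centers, sigma=0.5, weights=[1.0, -1.0], bias=0.0))  # 1
```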
www.eeworm.com/read/342008/12046871

knnc.m

%KNNC K-Nearest Neighbor Classifier
%
% [W,k,e] = knnc(A,k)
%
% Computation of the k-nearest neighbor classifier for the dataset A.
% Default k: optimize leave-one-out error e. W is a mapping and
%
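Choosing k by leave-one-out error, as knnc's default does, means classifying each training point with that point held out and keeping the k with the fewest mistakes. A minimal Python sketch of that selection loop (illustrative, not the toolbox code):

```python
from collections import Counter

def knn_predict(X, y, x, k):
    """Plain k-NN majority vote on the labelled points (X, y)."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(x, p)), i)
                   for i, p in enumerate(X))
    votes = Counter(y[i] for _, i in dists[:k])
    return votes.most_common(1)[0][0]

def choose_k_loo(X, y, k_values):
    """Pick the k with the lowest leave-one-out error."""
    best = None
    for k in k_values:
        errors = 0
        for i in range(len(X)):
            X_rest = X[:i] + X[i + 1:]   # hold out point i
            y_rest = y[:i] + y[i + 1:]
            if knn_predict(X_rest, y_rest, X[i], k) != y[i]:
                errors += 1
        e = errors / len(X)
        if best is None or e < best[1]:
            best = (k, e)
    return best  # (k, leave-one-out error)

X = [[0.0], [0.1], [0.2], [1.0], [1.1], [1.2]]
y = ["A", "A", "A", "B", "B", "B"]
print(choose_k_loo(X, y, [1, 3]))  # (1, 0.0)
```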
www.eeworm.com/read/342008/12046971

parzenc.m

%PARZENC Optimisation of the Parzen classifier
%
% [W,h,e] = parzenc(A)
%
% Computation of the optimum smoothing parameter h for the Parzen
% classifier between the classes in the dataset A. The l
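A Parzen classifier assigns x to the class with the largest kernel density estimate, with h as the smoothing parameter that parzenc optimises. A minimal Python sketch with a Gaussian window and a *fixed* h (the optimisation step is omitted; names are my own):

```python
import math

def parzen_classify(x, X, y, h):
    """Classify x by the class with the largest Gaussian-window
    density estimate, smoothing parameter h."""
    scores = {}
    for p, lab in zip(X, y):
        sq = sum((a - b) ** 2 for a, b in zip(x, p))
        scores[lab] = scores.get(lab, 0.0) + math.exp(-sq / (2 * h * h))
    return max(scores, key=scores.get)

X = [[0.0], [0.2], [1.0], [1.2]]
y = ["A", "A", "B", "B"]
print(parzen_classify([0.1], X, y, h=0.5))  # A
```

Small h makes the estimate spiky and overfit-prone; large h oversmooths, which is why parzenc tunes it from the data.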
www.eeworm.com/read/342008/12046989

rsubc.m

%RSUBC Random Subspace Classifier
%
% W = rsubc(A,classf,r,n,cclassf,T)
%
% Computation of a combined classifier by selecting n random subsets
% of r features. For each of these subsets the base c
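The random subspace method trains each base classifier on a random subset of r of the features and combines the votes. A minimal Python sketch under those assumptions (the nearest-mean base learner and all names are illustrative, not the toolbox defaults):

```python
import random
from collections import Counter

def rsubc_predict(X, y, x_new, train, r, n, seed=0):
    """Train n base classifiers, each on a random subset of r features,
    and classify x_new by majority vote."""
    rng = random.Random(seed)
    d = len(X[0])
    votes = []
    for _ in range(n):
        feats = rng.sample(range(d), r)                 # random subspace
        Xs = [[row[j] for j in feats] for row in X]     # project data
        clf = train(Xs, y)
        votes.append(clf([x_new[j] for j in feats]))
    return Counter(votes).most_common(1)[0][0]

def train_nearest_mean(Xb, yb):
    """Nearest-mean base learner: one centroid per class."""
    groups = {}
    for p, lab in zip(Xb, yb):
        groups.setdefault(lab, []).append(p)
    centroids = {lab: [sum(c) / len(c) for c in zip(*pts)]
                 for lab, pts in groups.items()}
    def predict(x):
        return min(centroids,
                   key=lambda lab: sum((a - b) ** 2
                                       for a, b in zip(x, centroids[lab])))
    return predict

# Feature 2 is pure noise; the subspace votes still agree on the answer.
X = [[0.0, 0.0, 5.0], [0.1, 0.2, -5.0], [1.0, 1.0, 5.0], [1.1, 0.9, -5.0]]
y = ["A", "A", "B", "B"]
print(rsubc_predict(X, y, [0.05, 0.1, 0.0], train_nearest_mean, r=2, n=15))  # A
```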