Code search results

Found about 33,766 code results matching Algorithm

weaklearn_train.m

function [i_star,theta_star,qdef_star,r_star]=WeakLearn_train(data,D)
% Implements the weak learner in the RankBoost paper.
%
% Y. Freund, R. Iyer, and R. Schapire, "An efficient boosting algorithm for
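
For orientation, RankBoost's weak learner typically scans features and candidate thresholds for the {0,1} ranking rule with the largest weighted correlation |r|. The sketch below is not the WeakLearn_train code above: it assumes the pair weights D have already been reduced to a per-example potential vector pot, uses observed feature values as candidate thresholds, and omits the default score qdef for missing values.

function [i_star,theta_star,r_star] = weak_ranker_sketch(data,pot)
% Minimal threshold weak ranker in the spirit of RankBoost (illustrative only).
% data : m-by-n matrix of feature values, one example per row
% pot  : m-by-1 potential vector derived from the pair weights
% Maximises |r| with r = sum_x pot(x)*h(x), where h(x) = 1 if data(x,i) > theta.
[~,n] = size(data);
r_star = 0; i_star = 1; theta_star = -Inf;
for i = 1:n
    thetas = unique(data(:,i));            % candidate thresholds
    for t = 1:numel(thetas)
        h = double(data(:,i) > thetas(t)); % {0,1} weak ranking
        r = pot' * h;                      % weighted correlation
        if abs(r) > abs(r_star)
            r_star = r; i_star = i; theta_star = thetas(t);
        end
    end
end
end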

km.m

function [clusters] = km(X,N,centers)
% [clusters] = km(X,N,centers)
% [clusters] = km(X,N)
%
% Function for determining the clusters using the K-means algorithm.
%
% Input parameters:
% -
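
As a rough picture of what a routine like this computes (not the km.m above), a bare-bones K-means loop in MATLAB might look as follows; the function name, the random initialisation from data rows, and the fixed iteration budget are all assumptions.

function clusters = kmeans_sketch(X,N)
% Bare-bones K-means clustering (illustrative only).
% X : m-by-d data matrix, one observation per row
% N : number of clusters
% clusters : m-by-1 vector of cluster indices
m = size(X,1);
C = X(randperm(m,N),:);                       % initial centres from random rows
for iter = 1:100                              % fixed iteration budget for brevity
    D2 = sum(X.^2,2) + sum(C.^2,2)' - 2*X*C'; % squared distances to each centre
    [~,clusters] = min(D2,[],2);              % assign points to nearest centre
    for k = 1:N                               % recompute centres as member means
        if any(clusters == k)
            C(k,:) = mean(X(clusters == k,:),1);
        end
    end
end
end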

aismain.m

function [] = AISMAIN
%
% Function AISMAIN Demonstration
% Runs a Demo for the following immune tools:
% 1) CLONALG (Basic Clonal Selection Algorithm)-----CLONALG.doc
%
%
% Secondary Functions:
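
For background, the basic clonal selection loop clones the highest-affinity antibodies, hypermutates the clones (more strongly for lower-ranked antibodies), and keeps an improving clone. The sketch below is a generic illustration of that loop, not the CLONALG tool the demo runs; every name and parameter is an assumption.

function best = clonalg_sketch(f,dim,popsize,ngen)
% Generic clonal selection loop (illustrative only).
% f : fitness function to maximise, called as f(x) with x a row vector
P = rand(popsize,dim);                              % initial antibody pool
for g = 1:ngen
    fit = arrayfun(@(i) f(P(i,:)), (1:popsize)');
    [fit,order] = sort(fit,'descend');              % rank antibodies by affinity
    P = P(order,:);
    for i = 1:popsize
        nclones = max(1, round(popsize/i));         % better antibodies clone more
        beta    = 0.1 * i / popsize;                % worse antibodies mutate more
        clones  = repmat(P(i,:),nclones,1) + beta*randn(nclones,dim);
        cfit    = arrayfun(@(k) f(clones(k,:)), (1:nclones)');
        [cbest,k] = max(cfit);
        if cbest > fit(i)                           % keep an improving clone
            P(i,:) = clones(k,:);
        end
    end
end
fit = arrayfun(@(i) f(P(i,:)), (1:popsize)');
[~,ibest] = max(fit);
best = P(ibest,:);
end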

opf_slvr.m

function code = opf_slvr(alg)
%OPF_SLVR  Which OPF solver is used by alg.
%   code = opf_slvr(alg) returns a solver code given an algorithm code.
%   The codes are:
%       0 - 'constr' from Optimiz

sgalab_contents.m

% /*M-FILE SCRIPT SGALAB_contents MMM SGALAB */
%
%
% /*==================================================================================================
% Simple Genetic Algorithm Laboratory Tool
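
To make the "simple genetic algorithm" concrete, one generation (roulette-wheel selection, single-point crossover, bit-flip mutation) on binary chromosomes can be sketched as below; this is a generic illustration rather than SGALAB code, and the names and rates are assumptions.

function newpop = sga_generation_sketch(pop,fitness,pc,pm)
% One generation of a simple GA on binary chromosomes (illustrative only).
% pop     : n-by-L matrix of 0/1 genes
% fitness : n-by-1 vector of non-negative fitness values
% pc, pm  : crossover and mutation probabilities, e.g. 0.8 and 0.01
[n,L] = size(pop);
cum = cumsum(fitness) / sum(fitness);               % roulette-wheel selection
idx = arrayfun(@(r) find(cum >= r, 1), rand(n,1));
newpop = pop(idx,:);
for i = 1:2:n-1                                     % single-point crossover on pairs
    if rand < pc
        cut = randi(L-1);
        tmp                   = newpop(i,  cut+1:end);
        newpop(i,  cut+1:end) = newpop(i+1,cut+1:end);
        newpop(i+1,cut+1:end) = tmp;
    end
end
flip = rand(n,L) < pm;                              % bit-flip mutation
newpop(flip) = 1 - newpop(flip);
end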

kmeanlbg.m

function [x,esq,j] = kmeanlbg(d,k)
%KMEANLBG Vector quantisation using the Linde-Buzo-Gray algorithm [X,ESQ,J]=(D,K)
%
%Inputs:
% D  contains data vectors (one per row)
% K  is number of centres re
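
The Linde-Buzo-Gray method usually grows a codebook by splitting the current centres and refining the result with K-means passes until the requested size is reached; under those assumptions (and not reproducing the kmeanlbg.m above) a minimal sketch is:

function x = lbg_sketch(d,k)
% Minimal Linde-Buzo-Gray codebook training (illustrative only).
% d : data vectors, one per row
% k : number of centres requested
eps_split = 0.01;                             % perturbation used when splitting
x = mean(d,1);                                % start from the global mean
while size(x,1) < k
    x = [x*(1+eps_split); x*(1-eps_split)];   % split every centre into a pair
    for pass = 1:10                           % K-means refinement of the codebook
        D2 = sum(d.^2,2) + sum(x.^2,2)' - 2*d*x';
        [~,nearest] = min(D2,[],2);
        for c = 1:size(x,1)
            if any(nearest == c)
                x(c,:) = mean(d(nearest == c,:),1);
            end
        end
    end
end
x = x(1:k,:);                                 % crude trim when k is not a power of two
end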

readme

Backpropagation learning:
  bp_innerloop   The backpropagation learning algorithm, used in each of the demos below.
XOR Demo:
  bpxor.m        Learning the XOR function.
  XorPats.m      Input patterns for
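
For reference, a single backpropagation update for a one-hidden-layer sigmoid network (forward pass, output and hidden error signals, gradient-descent weight changes) looks roughly like this; it is a generic illustration of the technique, not bp_innerloop itself, and all names are assumptions. Driven repeatedly over the four XOR patterns it will learn the XOR function, which is what the demo above does.

function [W1,W2] = backprop_step_sketch(W1,W2,x,t,lr)
% One backpropagation update for a one-hidden-layer sigmoid net (illustrative only).
% W1 : hidden weights, nhid-by-(nin+1), last column is the bias
% W2 : output weights, nout-by-(nhid+1)
% x  : input column vector, t : target column vector, lr : learning rate
sig = @(a) 1 ./ (1 + exp(-a));
h = sig(W1 * [x; 1]);                            % forward pass: hidden layer
y = sig(W2 * [h; 1]);                            % forward pass: output layer
dy = (y - t) .* y .* (1 - y);                    % output error signal
dh = (W2(:,1:end-1)' * dy) .* h .* (1 - h);      % hidden error signal
W2 = W2 - lr * dy * [h; 1]';                     % gradient-descent updates
W1 = W1 - lr * dh * [x; 1]';
end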

contents.m

% Genetic Optimization Toolbox
%
% Main interface
%   ga.m            The Genetic Algorithm
%   initializega.m  Initialization function for float and binary
%                   repres
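
As a hint of what an initialization routine for float and binary representations does (not initializega.m itself), here is a minimal sketch under assumed argument conventions:

function pop = init_population_sketch(popsize,bounds,rep)
% Random initial GA population (illustrative only).
% popsize : number of individuals
% bounds  : L-by-2 matrix of [lower upper] bounds, one row per gene
% rep     : 'float' or 'binary'
L = size(bounds,1);
switch rep
    case 'float'    % real genes drawn uniformly within the bounds
        pop = bounds(:,1)' + rand(popsize,L) .* (bounds(:,2) - bounds(:,1))';
    case 'binary'   % 0/1 genes; only the number of rows in bounds is used
        pop = double(rand(popsize,L) > 0.5);
    otherwise
        error('Unknown representation: %s', rep);
end
end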

_str2pat.c

/*  File   : _str2pat.c
    Author : Richard A. O'Keefe.
    Updated: 2 June 1984
    Defines: _pat_lim, _pat_vec[], _str2pat()
    Searching in this package is done by an algorithm due to R.

lms.m

function [h,y] = lms(x,d,delta,N)
% LMS Algorithm for Coefficient Adjustment
% ----------------------------------------
% [h,y] = lms(x,d,delta,N)
%   h = estimated FIR filter
%   y = output
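
The LMS coefficient-adjustment loop is short enough to sketch in full: filter the most recent N input samples, compare the output against the desired signal, and step the coefficients along the input scaled by delta. This follows the signature in the snippet but is a reconstruction under assumed conventions, not the listed lms.m.

function [h,y] = lms_sketch(x,d,delta,N)
% LMS adaptive FIR filter (illustrative only).
% x     : input signal,   d : desired signal (same length)
% delta : step size,      N : number of FIR coefficients
% h     : estimated filter, y : filter output
x = x(:); d = d(:);
M = length(x);
h = zeros(N,1);
y = zeros(M,1);
for n = N:M
    xn   = x(n:-1:n-N+1);            % most recent N input samples
    y(n) = h' * xn;                  % filter output
    e    = d(n) - y(n);              % error against the desired signal
    h    = h + 2*delta*e*xn;         % LMS coefficient update
end
end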