Code search: optimization
Found about 10,000 source code results matching "optimization"
www.eeworm.com/read/446239/7582984
m trainpso.m
%TRAINPSO Particle Swarm Optimization backpropagation.
%
% Syntax
%
% [net,tr,Ac,El] = trainpso(net,Pd,Tl,Ai,Q,TS,VV,TV)
% info = trainpso(code)
%
% Description
%
% TRAINPSO is a
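The TRAINPSO entry above applies particle swarm optimization to network training. The core PSO loop can be sketched as follows; this is a minimal, generic version in Python, not the toolbox's actual MATLAB implementation, and every parameter value and name below is illustrative:

```python
import random

def pso(cost, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Initialize particle positions in [-1, 1]^dim with zero velocities.
    pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best position so far
    pbest_cost = [cost(p) for p in pos]
    g = pbest_cost.index(min(pbest_cost))
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]  # swarm's best so far

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost
```

For neural-network training, `cost` would evaluate the network's error for a flattened weight vector; here it is left generic.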
www.eeworm.com/read/443209/7636197
m bg_pso.m
%% Tuning of a PID controller using Bacterial Foraging oriented by Particle Swarm Optimization
%
%
%My work has been accepted in GECCO 2008 as a Graduate Student Workshop paper. I
%have used this technique
www.eeworm.com/read/438673/7728490
makefile
# DO NOT DELETE THIS LINE -- make depend depends on it.
# Edit the lines below to point to any needed include and link paths
# Or to change the compiler's optimization flags
CC = g++
COMPILEFLAGS =
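The makefile above leaves `COMPILEFLAGS` empty and invites the reader to point it at include/link paths and optimization flags. A hedged example of how those lines are commonly filled in for g++ (the flags and paths below are illustrative, not taken from the original project):

```make
CC = g++
# -O2 enables standard optimizations; -g keeps debug symbols alongside them.
COMPILEFLAGS = -O2 -g
INCLUDES = -I/usr/local/include    # example include path
LDFLAGS  = -L/usr/local/lib        # example link path

%.o: %.cpp
	$(CC) $(COMPILEFLAGS) $(INCLUDES) -c $< -o $@
```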
www.eeworm.com/read/299178/7881322
m optimget.m
function o = optimget(options,name,default,flag)
%OPTIMGET Get OPTIM OPTIONS parameters.
% VAL = OPTIMGET(OPTIONS,'NAME') extracts the value of the named parameter
% from optimization options s
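OPTIMGET's pattern, looking up a named option and falling back to a default when the entry is missing or empty, can be sketched in Python. This is a rough analogue for illustration, not MATLAB's actual implementation:

```python
def optim_get(options, name, default=None):
    """Return options[name] if present and non-empty, else default.

    Mirrors the OPTIMGET pattern: a missing or empty entry
    falls back to the caller-supplied default.
    """
    value = options.get(name) if options else None
    return default if value in (None, "", []) else value
```

For example, a solver could call `optim_get(opts, "TolX", 1e-4)` and receive the user's tolerance when one was set, or `1e-4` otherwise.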
www.eeworm.com/read/296909/8072797
m cantilever_beam_norec.m
% Exercise 4.31: Design of a cantilever beam (non-recursive formulation)
% (For a detailed explanation see section 4.5.4, pp. 163-165)
% Boyd & Vandenberghe "Convex Optimization"
% (a figure is genera
www.eeworm.com/read/143706/12849695
m contents.m
% Netlab Toolbox
% Version 3.2.1 31-Oct-2001
%
% conffig - Display a confusion matrix.
% confmat - Compute a confusion matrix.
% conjgrad - Conjugate gradients optimization.
% consist - Ch
www.eeworm.com/read/143706/12849776
m graddesc.m
function [x, options, flog, pointlog] = graddesc(f, x, options, gradf, ...
varargin)
%GRADDESC Gradient descent optimization.
%
% Description
% [X, OPTIONS, FLOG, POINTLOG] = GRADDESC(F, X, OPTIONS
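GRADDESC above performs plain gradient descent. The idea can be sketched in a few lines of Python; this minimal version uses a fixed step size, while Netlab's MATLAB routine also supports momentum and line search through its options vector, so the details here are illustrative:

```python
def graddesc(f, x, gradf, lr=0.1, max_iter=100, tol=1e-8):
    # Fixed-step gradient descent with a function-value convergence
    # test; returns the final point and its objective value.
    fx = f(x)
    for _ in range(max_iter):
        g = gradf(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
        fx_new = f(x)
        converged = abs(fx - fx_new) < tol
        fx = fx_new
        if converged:
            break
    return x, fx
```

Minimizing the quadratic `f(x) = sum(x_i^2)` with gradient `2x` converges quickly under this scheme for any step size below 1.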
www.eeworm.com/read/244097/12888220
c resample_mmx.c
// MMX optimizations from Michael Niedermayer (michaelni@gmx.at) (under GPL)
/* optimization TODO / NOTES
movntq is slightly faster (0.5% with the current test.c benchmark)
(but that's just te