Code search: 决策思维 (decision thinking)

Found about 689 source-code matches for 「决策思维」

Code results: 689
www.eeworm.com/read/398934/7908758

mainfun.m (MATLAB)

clear; clc;
load shengchenshuju;
T = size(r1, 1);
%%%%%%%%%%%% k-fold cross-validation %%%%%%%%%%%%%
k = 4; s = 4;
tdata = r1(s:k:T, :);   % test data, including the already-classified decision-attribute values
r1(s:k:T, :) = [];
traindata = r1;
%%%%%%%%%% GA: genetic algorithm %%%%%%%%%%
NIND = 40;
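The `s:k:T` slicing above (hold out every k-th row, starting at row s, as the test fold) can be sketched in Python. The array contents and shapes below are toy stand-ins, not the original shengchenshuju data.

```python
import numpy as np

def kth_fold_split(data, k=4, s=4):
    """Hold out every k-th row starting at index s-1 (MATLAB s:k:T is 1-based)."""
    idx = np.arange(s - 1, len(data), k)   # rows of the held-out test fold
    mask = np.ones(len(data), dtype=bool)
    mask[idx] = False
    return data[mask], data[idx]           # (train, test)

data = np.arange(20).reshape(10, 2)        # toy stand-in for r1
train, test = kth_fold_split(data, k=4, s=4)
```

With 10 rows, k = 4, s = 4 this holds out rows 4 and 8 (1-based), leaving 8 rows for training.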
www.eeworm.com/read/405069/11472163

s6_3_q4.asv (MATLAB autosave)

clear all; close all; clc
% training inputs [1 1; 1 -1; -1 1; -1 -1]
train_patterns = [0.8 0.9; 0.83 -0.5; -0.7 0.6; -0.92 -0.75]';
train_targets = [0 1 1 0]';
params = [2 1e-8 0.3];
% in 'left-to-right, bottom-to-top' order,
% take the decision regions in turn
www.eeworm.com/read/175689/5343307

detreeexp2_5.m (MATLAB)

% set up global variables
global WM Model_Year bad_d numobs grpname
global x y j quadclass tree
% create a grid plot with x and y axes to show decision-tree misclassifications
gscatter(x, y, grpname, 'crk', 'odv')
xlabel('Vehicle weight (tons)');
ylabel('Miles per gallon');
hold on;
plot(WM(bad
www.eeworm.com/read/175689/5343322

detreeexp1_5.m (MATLAB)

% set up global variables
global meas species bad_d numobs grpname
global x y j quadclass tree
% create a grid plot with x and y axes to show decision-tree misclassifications
gscatter(x, y, grpname, 'gmb', 'svp')
xlabel('Sepal length');
ylabel('Sepal width');
hold on;
plot(meas(bad_
www.eeworm.com/read/428780/1953981

detreeexp2_5.m (MATLAB)

% set up global variables
global WM Model_Year bad_d numobs grpname
global x y j quadclass tree
% create a grid plot with x and y axes to show decision-tree misclassifications
gscatter(x, y, grpname, 'crk', 'odv')
xlabel('Vehicle weight (tons)');
ylabel('Miles per gallon');
hold on;
plot(WM(bad
www.eeworm.com/read/428780/1953996

detreeexp1_5.m (MATLAB)

% set up global variables
global meas species bad_d numobs grpname
global x y j quadclass tree
% create a grid plot with x and y axes to show decision-tree misclassifications
gscatter(x, y, grpname, 'gmb', 'svp')
xlabel('Sepal length');
ylabel('Sepal width');
hold on;
plot(meas(bad_
www.eeworm.com/read/368694/9681387

dw1213_1.cpp (C++)

#include #include
const int N = 10;
const int P = 1000;
void main() {
    int value[N];        // the coin denominations
    int i;               // stage variable
    int S;               // state variable
    int j;               // decision variable
    int number[N][P+1];  // the optimal solution in each state
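The variable names (stage, state, decision) point at a classic dynamic-programming formulation of coin counting. A minimal Python sketch of that idea follows, with the stage dimension collapsed into a one-dimensional state table and illustrative denominations that are not from the original file.

```python
def min_coins(value, P):
    """number[S] = fewest coins needed to make amount S, or None if unreachable."""
    INF = P + 1
    number = [0] + [INF] * P                 # state S ranges over 0..P
    for S in range(1, P + 1):                # state variable
        for v in value:                      # decision: which coin to use last
            if v <= S and number[S - v] + 1 < number[S]:
                number[S] = number[S - v] + 1
    return [n if n <= P else None for n in number]

best = min_coins([1, 5, 10, 25], 63)         # illustrative denominations
```

For example, `best[63]` is 6 (25 + 25 + 10 + 1 + 1 + 1).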
www.eeworm.com/read/332284/12764915

main.m (MATLAB)

% ================ main ======================
clear, clc, close all;
v = input(['Please choose a demo program:\n 0 Exit\n 1 Fisher method\n 2 Perceptron criterion\n', ...
    ' 3 Least-squares criterion\n 4 Fast nearest neighbor\n 5 Edited and condensed nearest neighbor\n', ...
    ' 6 Binary decision tree\n 7
www.eeworm.com/read/405068/11472327

s6_3_q4.m (MATLAB)

clear all; close all; clc
% training inputs [1 1; 1 -1; -1 1; -1 -1]
train_patterns = [0.5 0.7; 0.8 -0.5; -0.7 0.8; -0.9 -0.85]';
train_targets = [0 1 1 0]';
params = [2 1e-8 0.3];
% in 'left-to-right, bottom-to-top' order,
% take each of the decision regions in turn
www.eeworm.com/read/215643/15055368

generate_decision_tree.m (MATLAB)

function Result = Generate_decision_tree(DataName, WhereSen, ForecastSen, attributName, i, j)
% DataName is the name of the data table; ForecastSen is the name of the attribute to predict;
% attributName is the set of currently available attribute names; i marks the node
% position and is a global variable.
% WhereSen is the name of the filter clause, i.e. the filter conditions from the
% decision-tree root down to this node; j is used to
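The signature suggests a recursive tree builder that threads the root-to-node filter condition (WhereSen) through the recursion. A minimal Python sketch of that pattern, with a naive split choice and all names and the toy data hypothetical:

```python
from collections import Counter

def generate_decision_tree(rows, target, attributes, where=()):
    """rows: list of dicts; where: filter conditions accumulated from root to node."""
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1 or not attributes:
        return Counter(labels).most_common(1)[0][0]   # leaf: majority label
    attr = attributes[0]                              # naive split choice
    tree = {}
    for val in {r[attr] for r in rows}:
        subset = [r for r in rows if r[attr] == val]
        tree[(attr, val)] = generate_decision_tree(
            subset, target, attributes[1:], where + ((attr, val),))
    return tree

# toy usage with hypothetical weather rows
rows = [{'outlook': 'sun', 'play': 'no'}, {'outlook': 'rain', 'play': 'yes'}]
tree = generate_decision_tree(rows, 'play', ['outlook'])
```

A real implementation would pick the split attribute by an information measure rather than taking the first remaining attribute.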