📄 neuralnet.m
function [Y_compute, Y_prob] = NeuralNet(para, X_train, Y_train, X_test, Y_test, num_class)
Y_compute = zeros(size(Y_test)); Y_prob = zeros(size(Y_test));
if isempty(X_train)
fprintf('Error: The training set is empty!\n');
return;
end
class_set = GetClassSet(Y_train); % Distinct class labels found in the training set
% Parse the option string; defaults: 10 hidden units, 1 output, weight decay 0.2, 10 training cycles.
p = str2num(char(ParseParameter(para, {'-NHidden';'-NOut'; '-Alpha'; '-NCycles'}, {'10';'1';'0.2';'10'})));
% Now set up and train the MLP
nhidden = p(1);
nout = p(2);
alpha = p(3); % Weight decay
ncycles = p(4); % Number of training cycles.
[num_data, num_feature] = size(X_train);
% Set up MLP network
net = mlp(num_feature, nhidden, nout, 'logistic', alpha);
options = zeros(1, 18); % Netlab options vector
options(1) = 1; % Print out error values during training
options(14) = ncycles; % Maximum number of training iterations
% Train using quasi-Newton.
target = (Y_train == class_set(1)); % Binary targets: 1 for the first class, 0 otherwise
net = netopt(net, options, X_train, target, 'quasinew');
% Forward propagate the test set and threshold the logistic output at 0.5.
Ypred = mlpfwd(net, X_test);
Y_compute = class_set(1) * (Ypred >= 0.5) + class_set(2) * (Ypred < 0.5);
% Probability assigned to the predicted class.
Y_prob = Ypred .* (Ypred >= 0.5) + (1 - Ypred) .* (Ypred < 0.5);
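
For reference, a minimal usage sketch follows. It assumes Netlab and the package's ParseParameter/GetClassSet helpers are on the MATLAB path, that class labels are numeric, and that the option string takes the flag/value form implied by the flags parsed above; the data here is a synthetic placeholder, not part of the original package.

% Usage sketch (assumptions: Netlab and the ParseParameter/GetClassSet helpers are on the path;
% the option-string format is inferred from the flags parsed in NeuralNet; data is synthetic).
X_train = randn(100, 5);                       % 100 training points, 5 features
Y_train = [ones(50, 1); 2 * ones(50, 1)];      % two numeric class labels (1 and 2)
X_test  = randn(40, 5);
Y_test  = [ones(20, 1); 2 * ones(20, 1)];
para = '-NHidden 10 -NOut 1 -Alpha 0.2 -NCycles 50';
[Y_compute, Y_prob] = NeuralNet(para, X_train, Y_train, X_test, Y_test, 2);
accuracy = mean(Y_compute == Y_test);          % fraction of test labels recovered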