%NEURC Automatic neural network classifier
%
%   W = NEURC(A,UNITS)
%
% INPUT
%   A      Dataset
%   UNITS  Number of units
%          Default: 0.2 x size of the smallest class in A.
%
% OUTPUT
%   W      Trained feed-forward neural network mapping
%
% DESCRIPTION
% Automatically trained feed-forward neural network classifier with UNITS
% units in a single hidden layer. Training, by LMNC, is stopped when the
% performance on an artificially generated tuning set of 1000 samples per
% class (based on k-nearest neighbour interpolation) does not improve anymore.
%
% NEURC always tries three random initialisations, with fixed random seeds,
% and returns the best result according to the tuning set. This is done in
% order to obtain a reproducible result.
%
% If UNITS is NaN it is optimised by REGOPTC. This may take a long
% computing time and is often not significantly better than the default.
%
% Uses the Mathworks' neural network toolbox.
%
% SEE ALSO
% MAPPINGS, DATASETS, LMNC, BPXNC, GENDATK, REGOPTC

% Copyright: R.P.W. Duin, r.p.w.duin@prtools.org
% Faculty EWI, Delft University of Technology
% P.O. Box 5031, 2600 GA Delft, The Netherlands

% $Id: neurc.m,v 1.9 2008/07/03 09:11:44 duin Exp $

function argout = neurc (a,units)

  prtrace(mfilename);

  mapname = 'Automatic Neural Classifier';
  n_attempts = 3;                 % Try three different random initialisations.

  if (nargin < 2)
    units = [];
  end

  if (nargin < 1) | (isempty(a))
    w = mapping(mfilename,units);
    argout = setname(w,mapname);
    return
  end

  [m,k] = size(a);

  if isempty(units)
    cs = classsizes(a);
    units = ceil(0.2*min(cs));
  end

  if (~ismapping(units))

    if isnan(units)               % Optimise complexity parameter: number of neurons.
      defs = {[]};
      parmin_max = [1,30];
      w = regoptc(a,mfilename,{units},defs,[1],parmin_max,testc([],'soft'),0);
      return
    end

    islabtype(a,'crisp');
    isvaldfile(a,1,2);            % At least 1 object per class, 2 classes.
    a = testdatasize(a);
    %a = setprior(a,getprior(a));

    % Second parameter is not a mapping: train a network.
    % Reproducibility: always use the same seeds.
    rand('seed',1); randn('seed',1);

    opt_err = inf;
    opt_mapping = [];

    % Try a number of random initialisations.
    s = sprintf('%i neural network initialisations: ',n_attempts);
    prwaitbar(n_attempts,s);
    for attempt = 1:n_attempts
      prwaitbar(n_attempts,attempt,[s int2str(attempt)]);
      prwarning(4,'training with initialisation %d of %d',attempt,n_attempts);
      t = gendatk(a,1000,2,1);    % Create tuning set based on training set.
      w = lmnc(a,units,inf,[],t); % Find LMNC mapping.
      e = t*w*testc;              % Calculate classification error on tuning set.
      if (e < opt_err)
        % If this is the best attempt so far, store it.
        opt_mapping = w;
        opt_err = e;
      end
    end
    prwaitbar(0);

    % Output is the best network found.
    argout = setname(opt_mapping,mapname);

  else

    nodatafile(a);

    % Second parameter is a mapping: execute.
    w = units;
    data = getdata(w);

    if (length(data) > 1)
      % "Old" neural network - network is second parameter: unpack.
      weights = data{1};
      pars = data{2};
      numlayers = length(pars);

      output = a;                 % Output of first layer: dataset.
      for j = 1:numlayers-1
        % Number of inputs (n_in) and outputs (n_out) of neurons in layer J.
        n_in = pars(j);
        n_out = pars(j+1);

        % Calculate output of layer J+1. Note that WEIGHTS contains both
        % weights (multiplied by previous layer's OUTPUT) and biases
        % (multiplied by ONES).
        this_weights = reshape(weights(1:(n_in+1)*n_out),n_in+1,n_out);
        output = sigm([output,ones(m,1)]*this_weights);

        % Remove the weights of this layer.
        weights(1:(n_in+1)*n_out) = [];
      end
    else
      % "New" neural network: unpack and simulate using the toolbox.
      net = data{1};
      output = sim(net,+a')';
    end

    % 2-class case, therefore 1 output: 2nd output is 1 - 1st output.
    if (size(output,2) == 1)
      output = [output (1-output)];
    end

    % Output is the mapped dataset.
    argout = setdat(a,output,w);

  end

return
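
% EXAMPLE
% A minimal usage sketch. The data generator (GENDATB), the class sizes and
% the number of hidden units below are illustrative choices, and PRTools is
% assumed to be on the path:
%
%   a = gendatb([50 50]);   % two-class banana-shaped training set
%   w = neurc(a,5);         % train a network with 5 hidden units
%   e = a*w*testc;          % apparent (training set) classification error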