
predicterror.m
Category: Artificial neural network source code (MATLAB)
function [e, W, A, P, g] = predictError(H, Y, l, options, U)
% [e, W, A, P, g] = predictError(H, Y, l, options, U)
%
% Calculates the predicted error on a future test set for a linear
% network of design H with output training points Y and using either
% local or global ridge regression with regularisation parameter(s) l.
% Uses a number of alternative methods:
%
%   options = MSE: Mean Square Error of training set
%   options = UEV: Unbiased Estimate of Variance
%   options = FPE: Final Prediction Error
%   options = GCV: Generalised Cross-Validation
%   options = BIC: Bayesian Information Criterion
%   options = LOO: Leave-One-Out cross-validation
%
% The options string can contain one or more of these substrings
% (separated by commas or spaces) in which case the result will
% be a list (row vector) of the corresponding error estimates.
%
% Inputs
%
%   H       design matrix                (p-by-m)
%   Y       output training data         (p-by-k)
%   l       regularisation parameter(s)  (real or vector length m)
%   options error prediction method(s)   (string)
%   U       alternative smoothing metric (m-by-m)
%
% Outputs
%
%   e       predicted error              (row vector)
%   W       A * H' * Y                   (m-by-k)
%   A       inv(H' * H + L)              (m-by-m)
%   P       I - H * A * H'               (p-by-p)
%   g       p - trace(P)                 (real)

% no model to begin with
Model = '';

% process options
if nargin > 3

  % initialise
  i = 1;
  [arg, i] = getNextArg(options, i);

  % scan through arguments
  while ~isempty(arg)
    if strcmpi(arg, 'mse')
      % use mean square error (over the training set)
      Model = [Model 'm'];
    elseif strcmpi(arg, 'uev')
      % use unbiased estimate of variance
      Model = [Model 'u'];
    elseif strcmpi(arg, 'fpe')
      % use final prediction error
      Model = [Model 'f'];
    elseif strcmpi(arg, 'gcv')
      % use GCV (generalised cross-validation)
      Model = [Model 'g'];
    elseif strcmpi(arg, 'bic')
      % use Bayesian information criterion
      Model = [Model 'b'];
    elseif strcmpi(arg, 'loo')
      % use leave-one-out cross-validation
      Model = [Model 'o'];
    else
      fprintf('predictError: use MSE, UEV, FPE, GCV, BIC or LOO\n')
      error('predictError: bad option')
    end

    % get next argument
    [arg, i] = getNextArg(options, i);
  end
end

% default model if none specified in options
if isempty(Model)
  Model = 'g';
end
if nargin < 5
  U = 1;         % default metric will be eye(m)
end

% initialise
[p, m] = size(H);
[p, k] = size(Y);
if length(l) == 1
  L = diag(l * ones(m,1));
elseif length(l) == m
  L = diag(l);
else
  error('predictError: wrongly sized regularisation parameter')
end
[u1, u2] = size(U);
if u1 == 1 && u2 == 1
  UU = L;
elseif u1 == m && u2 == m
  UU = U' * L * U;
else
  estr = sprintf('%d-by-%d', m, m);
  error(['predictError: U should be 1-by-1 or ' estr])
end

% preliminary calculations
HH = H' * H;
HY = H' * Y;
A = inv(HH + UU);
W = A * HY;
P = eye(p) - H * A * H';
PY = P * Y;
YPY = traceProduct(PY', PY);
g = p - trace(P);

% calculate errors for each method specified in options
e = [];
for model = Model

  % two cases
  if model == 'o'

    % special case of LOO
    dPPY = PY ./ dupCol(diag(P), k);
    e = [e  traceProduct(dPPY', dPPY) / p];

  else

    % value of factor
    if model == 'm'
      psi = 1;
    elseif model == 'u'
      psi = p / (p - g);
    elseif model == 'f'
      psi = (p + g) / (p - g);
    elseif model == 'g'
      psi = p^2 / (p - g)^2;
    else % (model == 'b')
      psi = (p + (log(p) - 1) * g) / (p - g);
    end

    % final calculation
    e = [e  psi * YPY / p];
  end
end
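For readers without MATLAB, the error-prediction formulas above can be sketched in NumPy. This is a hedged reimplementation, not the original code: the helper functions `getNextArg`, `traceProduct`, and `dupCol` are replaced by NumPy operations, the function name `predict_error` and its `methods` tuple argument are my own, and the optional smoothing metric `U` is omitted for brevity.

```python
import numpy as np

def predict_error(H, Y, lam, methods=("gcv",)):
    """Sketch of ridge-regression error prediction (MSE/UEV/FPE/GCV/BIC/LOO).

    H: (p, m) design matrix; Y: (p, k) training targets;
    lam: scalar (global ridge) or length-m vector (local ridge).
    """
    p, m = H.shape
    L = lam * np.eye(m) if np.isscalar(lam) else np.diag(np.asarray(lam))
    A = np.linalg.inv(H.T @ H + L)         # inv(H'H + L)
    P = np.eye(p) - H @ A @ H.T            # residual-forming matrix
    PY = P @ Y
    ypy = np.sum(PY ** 2)                  # trace(PY' * PY)
    g = p - np.trace(P)                    # effective number of parameters
    psi = {"mse": 1.0,
           "uev": p / (p - g),
           "fpe": (p + g) / (p - g),
           "gcv": p ** 2 / (p - g) ** 2,
           "bic": (p + (np.log(p) - 1) * g) / (p - g)}
    errs = []
    for meth in methods:
        if meth == "loo":
            # leave-one-out: divide each residual row by diag(P)
            d = PY / np.diag(P)[:, None]
            errs.append(np.sum(d ** 2) / p)
        else:
            errs.append(psi[meth] * ypy / p)
    return np.array(errs)
```

Since g > 0 whenever the fit has any effective parameters, the multiplicative factors are ordered MSE < UEV < FPE < GCV, so the criteria penalise model complexity increasingly strongly in that order.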
