
nlfa.m

From the package: Nonlinear factor analysis MATLAB simulation package
function [sources, net, params, status, fs] = nlfa(data, varargin)
% NLFA  Run nonlinear factor analysis
%
%  [results, x_reco] = NLFA(data, otherargs...)
%  runs the NLFA algorithm for given data (different samples in
%  different columns, different channels in different rows) and
%  returns a result structure containing the estimated sources, net,
%  parameters and status information.  This structure can be passed
%  back to the function as explained below to continue iteration.  The
%  reconstructed observations are returned in the optional second
%  output argument.
%
%  It is also possible to request more output arguments to get the
%  fields of the result structure separately as in
%  [sources, net, params, status, x_reco] = NLFA(data, otherargs...)
%
%  The additional arguments can be given as a structure with
%  the field name indicating the name of the argument, or as a
%  list as in NLFA(data, 'name1', val1, 'name2', val2, ...),
%  or as a combination of these as in
%  NLFA(data, struct, 'name1', val, 'name2', val2, ...).
%  In the latter case the values given in the list take precedence
%  over the ones specified in the structure.
%
% --------------------------------------------------------------------
%  ACCEPTED ARGUMENTS
%
%  The recognised arguments are as follows:
%
%  'sources'            The values of sources returned by a previous
%                       run of the algorithm.  This option can be used
%                       to continue a previous simulation.
%  'initsources'        Initial values for the sources to be used.
%  'searchsources'      Number of sources to use, to be initialised by
%                       linear PCA from the data.
%
%  'net'                The values of network weights returned by a
%                       previous run of the algorithm.  This option
%                       can be used to continue a previous simulation.
%  'hidneurons'         Number of hidden neurons to use in the MLP.
%  'nonlin'             Nonlinear activation function to use in the
%                       MLP.  Default value 'tanh' is currently the
%                       only supported value.
%
%  'params'             Structure of hyperparameter values returned by
%                       a previous run of the algorithm.  This option
%                       can be used to continue a previous simulation.
%
%  'status'             Structure of status information returned by a
%                       previous run of the algorithm.  This option
%                       can be used to continue a previous simulation.
%  'iters'              Number of iterations to run the algorithm.
%                       The default value is 300.
%                       Use value 0 to only evaluate the reconstructed
%                       observations and cost function value.
%  'approximation'      Nonlinearity approximation scheme to use.
%                       Supported values are 'hermite' (default) and
%                       'taylor'.
%  'updatealg'          Update algorithm.  Supported values are
%                       'old', 'grad' and 'conjgrad' (default).
%  'cgreset'            When to reset the conjugate gradient method.
%                       Possible values are -1 (at the beginning of
%                       each call to NLFA), 0 (never),
%                       positive integer n (every n iterations).
%                       The default value is 200.
%  'verbose'            Print out a summary of all simulation
%                       parameters at the beginning of simulation.
%                       The default value is 1 (yes), other possible
%                       values are 0 (no) and 2 (always).  With 1
%                       (yes), the value is reset in the returned
%                       structure so further calls will not reprint
%                       the summary.
%  'epsilon'            Epsilon for stopping criterion.  The default
%                       value is 1e-6.
%
% --------------------------------------------------------------------
%  RELATIONS BETWEEN ARGUMENTS
%
%  Exactly one of the arguments 'sources', 'initsources' and
%  'searchsources' must be set.
%
%  Either 'net' or 'hidneurons' must be set.  If 'net' is set,
%  'hidneurons' is not honoured.
%
% --------------------------------------------------------------------
%  STOPPING CRITERIA
%
%  The iteration is stopped if the designated number of iterations is
%  reached, or the total number of iterations is greater than 400 and
%  a) the cost function increases for 10 iterations in a row; or
%  b) the cost function decreases by less than epsilon (see above)
%  on each iteration for 200 iterations in a row.
%
% --------------------------------------------------------------------
%  EXAMPLES
%
%  results = NLFA(data, 'searchsources', 5, 'hidneurons', 30);
%            Extract 5 nonlinear factors from data using an MLP with
%            30 hidden neurons and PCA initialisation.
%
%  [results, fs] = NLFA(data, result, 'iters', 500);
%            Continue the previous simulation for 500 more iterations,
%            taking also the reconstructions of the observations as
%            output variable fs.
%
%  [sources, net, params, status] = ...
%      NLFA(data, 'initsources', my_s, 'hidneurons', 30, 'iters', 50);
%            Extract nonlinear factors from data using an MLP with
%            30 hidden neurons and custom initialisation given by my_s
%            using 50 iterations of the algorithm.
%
% Copyright (C) 1999-2004 Antti Honkela, Harri Valpola,
% and Xavier Giannakopoulos.
%
% This package comes with ABSOLUTELY NO WARRANTY; for details
% see License.txt in the program package.  This is free software,
% and you are welcome to redistribute it under certain conditions;
% see License.txt for details.

% Constants
PCAEPSILON = 1e-10;

% Read the arguments
if (length(varargin) == 1) & (isstruct(varargin{1})),
  args = varargin{1};
elseif (mod(length(varargin), 2) ~= 0),
  if ~(isstruct(varargin{1})),
    error('Keyword arguments should appear in pairs');
  else
    args = varargin{1};
    for k=2:2:length(varargin),
      if ~ischar(varargin{k})
        error('Keyword argument names must be strings.');
      end
      eval(sprintf('args.%s = varargin{k+1};', varargin{k}));
    end
  end
else
  args = struct(varargin{:});
end

if ~isfield(args, 'status'),
  [status.iters, args] = getargdef(args, 'iters', 300);
  [status.approximation, args] = getargdef(args, 'approximation', 'hermite');
  [status.updatealg, args] = getargdef(args, 'updatealg', 'conjgrad');
  [status.cgreset, args] = getargdef(args, 'cgreset', 200);
  [status.epsilon, args] = getargdef(args, 'epsilon', 1e-6);
  [status.verbose, args] = getargdef(args, 'verbose', 1);
  [status.debug, args] = getargdef(args, 'debug', 0);
  status.kls = [];
  status.cputime = [cputime];

  % How many iterations to wait before starting to update these values
  status.updatenet = 0;
  status.updatesrcvars = 0;
  if strcmp(status.updatealg, 'old'),
    [status.updatesrcs, args] = getargdef(args, 'updatesrcs', -50);
    [status.updateparams, args] = getargdef(args, 'updateparams', -100);
  else
    [status.updatesrcs, args] = getargdef(args, 'updatesrcs', -20);
    [status.updateparams, args] = getargdef(args, 'updateparams', -100);
  end
  status.t0 = 1;
else
  status = args.status;
  args = rmfield(args, 'status');
  [status.iters, args] = getargdef(args, 'iters', status.iters);
  [status.approximation, args] = getargdef(args, 'approximation', ...
                                           status.approximation);
  [status.updatealg, args] = getargdef(args, 'updatealg', status.updatealg);
  [status.cgreset, args] = getargdef(args, 'cgreset', status.cgreset);
  [status.epsilon, args] = getargdef(args, 'epsilon', status.epsilon);
  [status.verbose, args] = getargdef(args, 'verbose', status.verbose);
  [status.debug, args] = getargdef(args, 'debug', status.debug);
end

if status.verbose,
  if isempty(status.kls),
    fprintf('Starting an NLFA simulation with parameters:\n');
  else
    fprintf('Continuing an NLFA simulation with parameters:\n');
    fprintf('Number of iterations so far: %d\n', length(status.kls));
  end
  fprintf('Number of signals: %d\n', size(data, 1));
  fprintf('Number of samples: %d\n', size(data, 2));
  switch status.approximation,
   case 'taylor'
    fprintf('Using the Taylor approximation for the nonlinearity.\n');
   case 'hermite'
    fprintf('Using the Gauss-Hermite approximation for the nonlinearity.\n');
   otherwise
    fprintf('Using an unknown approximation for the nonlinearity.\n');
  end
  switch status.updatealg,
   case 'old'
    fprintf('Using the old heuristic NLFA update algorithm.\n');
   case 'conjgrad'
    fprintf('Using the conjugate gradient update algorithm.\n');
   otherwise
    fprintf('Using a gradient update algorithm with line searches.\n');
  end
end

% Initialise the sources
if ~isfield(args, 'sources'),
  if isfield(args, 'initsources'),
    if status.verbose,
      fprintf('Using %d pre-initialised sources.\n', ...
              size(args.initsources, 1));
    end
    sources = probdist(...
        diag(1 ./ std(args.initsources, 0, 2)) * args.initsources, ...
        .0001 * ones(size(args.initsources)));
    args = rmfield(args, 'initsources');
  else
    if ~isfield(args, 'searchsources')
      error('Either sources, initsources or searchsources must be set!')
    end
    % Use PCA to get initial values for the sources
    [pcasources, pcaV, pcaD] = basicpca(data, args.searchsources);
    I = find(pcaD(1:args.searchsources) > PCAEPSILON);
    if max(I) < args.searchsources,
      warning('NFA:PCADimTooLarge', ...
              'Latent dimensionality of the data is smaller than requested number of sources, decreasing the number of sources');
    end
    sources = probdist(...
        diag(sqrt(1./pcaD(I))) * pcasources(I, :), ...
        .0001 * ones(size(pcasources(I, :))));
    if status.verbose,
      fprintf('Using %d sources initialised with PCA.\n', size(sources, 1));
    end
    args = rmfield(args, 'searchsources');
  end
else
  if status.verbose,
    fprintf('Using %d previously used sources.\n', size(args.sources, 1));
  end
  sources = args.sources;
  args = rmfield(args, 'sources');
end

% Initialise the network
if ~isfield(args, 'net'),
  if ~isfield(args, 'hidneurons'),
    error('Either net or hidneurons must be set!')
  end
  if status.verbose,
    fprintf('Initialising a new MLP network with %d hidden neurons.\n', ...
            args.hidneurons);
  end
  if strcmp(status.updatealg, 'old'),
    if isfield(args, 'nonlin'),
      net = createnet_alpha(size(sources, 1), args.hidneurons, ...
                            size(data, 1), args.nonlin, 1, 1, 1, 1, .1, .1);
      args = rmfield(args, 'nonlin');
    else
      net = createnet_alpha(size(sources, 1), args.hidneurons, ...
                            size(data, 1), 'tanh', 1, 1, 1, 1, .1, .1);
    end
  else
    if isfield(args, 'nonlin'),
      net = createnet(size(sources, 1), args.hidneurons, size(data, 1), ...
                      args.nonlin, 1, 1, 1, 1);
      args = rmfield(args, 'nonlin');
    else
      net = createnet(size(sources, 1), args.hidneurons, size(data, 1), ...
                      'tanh', 1, 1, 1, 1);
    end
  end
  net.b2 = probdist(mean(data, 2), net.b2.var);
  args = rmfield(args, 'hidneurons');
else
  if status.verbose,
    fprintf('Using a previously used MLP network with %d hidden neurons.\n', ...
            size(args.net.w1, 1));
  end
  net = args.net;
  args = rmfield(args, 'net');
end

if ~isfield(args, 'params'),
  params.net.w2var = probdist(zeros(1, size(net.b1, 1)), ...
                              .5 / size(data, 1) * ones(1, size(net.b1, 1)));
  params.noise = probdist(.5 * log(.1) * ones(size(data, 1), 1), ...
                          .5 / size(data, 2) * ones(size(data, 1), 1));
  params.src = probdist(zeros(size(sources, 1), 1), ...
                        .5 / size(data, 2) * ones(size(sources, 1), 1));
  params.hyper.net.w2var = nlfa_inithyper(0, .1, 0, .1);
  params.hyper.net.b1 = nlfa_inithyper(0, .1, 0, .1);
  params.hyper.net.b2 = nlfa_inithyper(0, .1, 0, .1);
  params.hyper.noise = nlfa_inithyper(0, .1, 0, .1);
  params.hyper.src = nlfa_inithyper(0, .1, 0, .1);
  params.prior.net.w2var = nlfa_initprior(0, log(100), 0, log(100));
  params.prior.net.b1 = nlfa_initprior(0, log(100), 0, log(100));
  params.prior.net.b2 = nlfa_initprior(0, log(100), 0, log(100));
  params.prior.noise = nlfa_initprior(0, log(100), 0, log(100));
  params.prior.src = nlfa_initprior(0, log(100), 0, log(100));
else
  params = args.params;
  args = rmfield(args, 'params');
end

% Warn about any arguments that were not consumed above
otherargs = fieldnames(args);
if length(otherargs) > 0,
  fprintf('Warning: nlfa: unused arguments:\n');
  fprintf(' %s\n', otherargs{:});
end

if status.verbose == 1,
  status.verbose = 0;
end

% Do the actual thing...
[sources, net, params, status, fs] = ...
    nlfa_iter(data, sources, net, params, status);

% If only one return value is expected, pack everything into it
if nargout < 3,
  val.sources = sources;
  val.net = net;
  val.params = params;
  val.status = status;
  sources = val;
  net = fs;
end


function [val, args] = getargdef(args, name, default)
% GETARGDEF  Pop field NAME from struct ARGS, or return DEFAULT.

if isfield(args, name),
  % val = args.(name);
  eval(sprintf('val = args.%s;', name));
  args = rmfield(args, name);
else
  val = default;
end


function hyper = nlfa_inithyper(mm, mv, vm, vv)

hyper.mean = probdist(mm, mv);
hyper.var = probdist(vm, vv);


function prior = nlfa_initprior(mm, mv, vm, vv)

prior.mean.mean = probdist(mm, 0);
prior.mean.var = probdist(mv, 0);
prior.var.mean = probdist(vm, 0);
prior.var.var = probdist(vv, 0);
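For reference, a minimal end-to-end session following the documented interface might look as below. This is a sketch, not part of the original file: the synthetic data generation is purely illustrative, and it assumes the rest of the NLFA package (`probdist`, `basicpca`, `createnet`, `nlfa_iter`, ...) is on the MATLAB path.

```matlab
% Illustrative only: build toy observations from 2 latent sources,
% mixed through an elementwise nonlinearity into 10 channels.
s = randn(2, 500);                        % true latent sources
A = randn(10, 2);                         % random mixing matrix
x = A * tanh(s) + 0.1 * randn(10, 500);   % nonlinear mixture plus noise

% Extract 2 nonlinear factors with a 20-hidden-neuron MLP,
% PCA initialisation, 100 iterations
results = nlfa(x, 'searchsources', 2, 'hidneurons', 20, 'iters', 100);

% Continue the same simulation for 100 more iterations, also
% retrieving the reconstructed observations fs
[results, fs] = nlfa(x, results, 'iters', 100);
```

Note that the second call passes the previous result structure back in, which is the continuation mechanism described under ACCEPTED ARGUMENTS.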
