
learn_dhmm_annealed_broken.m

The uploaded file is a hidden Markov model example routine for the MATLAB environment.
function [hmm, LL] = learn_dhmm_simple(data, hmm, varargin)
% LEARN_DHMM_SIMPLE Find the ML/MAP params of an HMM with discrete outputs using EM
%
% [hmm, LL] = learn_dhmm_simple(data, hmm, ...)
%
% Input
%   data{m} is the m'th training sequence.
%   hmm is a structure created with mk_rnd_dhmm.
%
% Output
%   hmm is a structure containing the learned params
%   LL(i) is the log likelihood at iteration i
%
% Optional arguments are passed as name/value pairs, e.g.,
%    learn_dhmm(data, hmm, 'max_iter', 30)
% Defaults are in [brackets]
%
% max_iter - max num iterations [100]
% thresh - threshold for stopping EM (relative change in log-lik must drop below this) [1e-4]
% verbose - 1 means display the log lik at each iteration [0]
% dirichlet - equivalent sample size of a uniform Dirichlet prior applied to obsmat [0]
%
% anneal - do deterministic annealing? [0]
% anneal_rate - rate at which inverse temperature is increased (should be between 1.1 and 1.5) [1.2]
% init_beta - initial inverse temperature (should be between 0.1 and 0.5) [0.5]
%
% rate=1.2 init=0.5 takes 5 iterations to reach beta=1
% rate=1.2 init=0.1 takes 14 iterations to reach beta=1

% defaults
max_iter = 100;
thresh = 1e-4;
verbose = 0;
dirichlet = 0;
anneal = 0;
anneal_rate = 1.2;
init_beta = 0.5;

% parse optional name/value arguments
if nargin >= 3
  args = varargin;
  for i=1:2:length(args)
    switch args{i}
     case 'max_iter', max_iter = args{i+1};
     case 'thresh', thresh = args{i+1};
     case 'verbose', verbose = args{i+1};
     case 'dirichlet', dirichlet = args{i+1};
     case 'anneal', anneal = args{i+1};
     case 'anneal_rate', anneal_rate = args{i+1};
     case 'init_beta', init_beta = args{i+1};
    end
  end
end

% With annealing, the log likelihood need not increase monotonically when the
% temperature changes, so the increase check in em_converged is disabled.
% Plain EM should increase the log likelihood every iteration, so keep the check on.
if anneal
  check_score_increases = 0;
else
  check_score_increases = 1;
end

if anneal
  % schedule taken from Ueda and Nakano, "Deterministic Annealing EM algorithm",
  % Neural Networks 11 (1998): 271-282, p276
  b = [];
  temp = [];
  i = 1;
  b(i) = init_beta;
  temp(i) = 1/b(i);
  while b(i) < 1
    i = i + 1;
    b(i) = b(i-1)*anneal_rate;
    temp(i) = 1/b(i);
  end
  if temp(end) < 1, temp(end) = 1; end % cap the final stage at beta = 1 (standard EM)
  temp_schedule = temp;
else
  temp_schedule = 1;
end

previous_loglik = -inf;
loglik = 0;
converged = 0;
num_iter = 1;
LL = [];

if ~iscell(data)
  data = num2cell(data, 2); % each row gets its own cell
end
numex = length(data);

startprob = hmm.startprob;
endprob = hmm.endprob;
transmat = hmm.transmat;
obsmat = hmm.obsmat;

% outer loop over the temperature schedule; each stage runs EM until convergence
for anneal_iter = 1:length(temp_schedule)
  temp = temp_schedule(anneal_iter);
  converged = 0;
  inner_iter = 1;
  previous_loglik = -inf;

  while (inner_iter <= max_iter) & ~converged
    % E step: expected sufficient statistics at the current temperature
    [loglik, exp_num_trans, exp_num_visits1, exp_num_emit, exp_num_visitsT] = ...
        compute_ess_dhmm(startprob, transmat, obsmat, data, dirichlet, temp);

    if verbose
      if anneal
        fprintf(1, 'iteration %d, loglik = %f, temp = %5.3f\n', num_iter, loglik, temp);
      else
        fprintf(1, 'iteration %d, loglik = %f\n', num_iter, loglik);
      end
    end
    num_iter = num_iter + 1;
    inner_iter = inner_iter + 1;

    % M step: renormalise the expected counts
    startprob = normalise(exp_num_visits1);
    endprob = normalise(exp_num_visitsT);
    obsmat = mk_stochastic(exp_num_emit);
    transmat = mk_stochastic(exp_num_trans);

    converged = em_converged(loglik, previous_loglik, thresh, check_score_increases);
    previous_loglik = loglik;
    LL = [LL loglik];
  end

end

hmm.startprob = startprob;
hmm.endprob = endprob;
hmm.transmat = transmat;
hmm.obsmat = obsmat;
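
For reference, a minimal usage sketch follows. It assumes that mk_rnd_dhmm and the helpers called above (compute_ess_dhmm, normalise, mk_stochastic, em_converged) come from the same HMM toolbox and are on the MATLAB path; the signature mk_rnd_dhmm(Q, O) and the training sequences are illustrative assumptions, not part of the original file.

% Illustrative only: two short symbol sequences over an alphabet of size O.
Q = 3;                                   % number of hidden states
O = 5;                                   % number of discrete output symbols
data = { [1 2 3 5 4 2 1], [5 5 4 3 2 1 1 2] };

hmm0 = mk_rnd_dhmm(Q, O);                % assumed constructor for a random initial model

% plain EM
[hmm_em, LL_em] = learn_dhmm_simple(data, hmm0, 'max_iter', 50, 'verbose', 1);

% deterministic annealing EM: start at beta = 0.5, multiply beta by 1.2 each stage
[hmm_dae, LL_dae] = learn_dhmm_simple(data, hmm0, 'anneal', 1, ...
    'anneal_rate', 1.2, 'init_beta', 0.5, 'max_iter', 50, 'verbose', 1);

plot(LL_em); hold on; plot(LL_dae);      % compare log-likelihood trajectories

With init_beta = 0.5 and anneal_rate = 1.2 the inverse-temperature schedule is beta = 0.5, 0.6, 0.72, 0.864, then capped at 1, i.e. the five stages mentioned in the header comment.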
