learn_dhmm_simple.m
function [hmm, LL] = learn_dhmm_simple(data, hmm, varargin)
% LEARN_DHMM_SIMPLE Find the ML/MAP params of an HMM with discrete outputs using EM
%
% [hmm, LL] = learn_dhmm_simple(data, hmm, ...)
%
% Input
% data{m} is the m'th training sequence.
% hmm is a structure created with mk_rnd_dhmm.
%
% Output
% hmm is a structure containing the learned params
% LL(i) is the log likelihood at iteration i
%
% Optional arguments are passed as name/value pairs, e.g.,
% learn_dhmm(data, hmm, 'max_iter', 30)
% Defaults are in [brackets]
%
% max_iter - max num iterations [100]
% thresh - threshold for stopping EM (relative change in log-lik must drop below this) [1e-4]
% verbose - 1 means display the log lik at each iteration [0]
% dirichlet - equivalent sample size of a uniform Dirichlet prior applied to obsmat [0]
%
% anneal - do deterministic annealing? [0]
% anneal_fully - force temperature to go all the way to 0? [0]
% anneal_rate - rate at which inverse temperature is increased (should be between 1.1 and 1.5) [1.2]
% init_beta - initial inverse temperature (should be between 0.01 and 0.5) [0.1]

% defaults for the optional arguments
max_iter = 100;
thresh = 1e-4;
verbose = 0;
dirichlet = 0;
anneal = 0;
anneal_fully = 0;
anneal_rate = 1.2;
init_beta = 0.1;

% parse the name/value pairs
if nargin >= 3
  args = varargin;
  for i=1:2:length(args)
    switch args{i}
      case 'max_iter', max_iter = args{i+1};
      case 'thresh', thresh = args{i+1};
      case 'verbose', verbose = args{i+1};
      case 'dirichlet', dirichlet = args{i+1};
      case 'anneal', anneal = args{i+1};
      case 'anneal_rate', anneal_rate = args{i+1};
      case 'anneal_fully', anneal_fully = args{i+1};
      case 'init_beta', init_beta = args{i+1};
    end
  end
end

if anneal
  % schedule taken from Ueda and Nakano, "Deterministic Annealing EM algorithm",
  % Neural Networks 11 (1998): 271-282, p276
  b = []; temp = [];
  i = 1;
  b(i) = init_beta;
  temp(i) = 1/b(i);
  while b(i) < 1
    i = i + 1;
    b(i) = b(i-1)*anneal_rate;
    temp(i) = 1/b(i);
  end
  temp_schedule = temp;
end

previous_loglik = -inf;
loglik = 0;
converged = 0;
num_iter = 1;
LL = [];

if ~iscell(data)
  data = num2cell(data, 2); % each row gets its own cell
end
numex = length(data);

startprob = hmm.startprob;
endprob = hmm.endprob;
transmat = hmm.transmat;
obsmat = hmm.obsmat;

while (num_iter <= max_iter) && ~converged
  if anneal && num_iter <= length(temp_schedule)
    temp = temp_schedule(num_iter);
  else
    temp = 1;
  end

  % E step
  [loglik, exp_num_trans, exp_num_visits1, exp_num_emit, exp_num_visitsT] = ...
      compute_ess_dhmm(startprob, transmat, obsmat, data, dirichlet, temp);

  if verbose
    if anneal
      fprintf(1, 'iteration %d, loglik = %f, temp = %5.3f\n', num_iter, loglik, temp);
    else
      fprintf(1, 'iteration %d, loglik = %f\n', num_iter, loglik);
    end
  end
  num_iter = num_iter + 1;

  % M step
  startprob = normalise(exp_num_visits1);
  endprob = normalise(exp_num_visitsT);
  obsmat = mk_stochastic(exp_num_emit);
  transmat = mk_stochastic(exp_num_trans);

  converged = em_converged(loglik, previous_loglik, thresh);
  if anneal_fully
    % must cool all the way and do at least 3 steps at temp=1 before finishing
    if num_iter <= length(temp_schedule)+3
      converged = 0;
    end
  end
  previous_loglik = loglik;
  LL = [LL loglik];
end

hmm.startprob = startprob;
hmm.endprob = endprob;
hmm.transmat = transmat;
hmm.obsmat = obsmat;
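A minimal usage sketch follows. The exact signature of `mk_rnd_dhmm` (here assumed to take the number of hidden states and the number of discrete output symbols) is an assumption; only the optional-argument names of `learn_dhmm_simple` itself are taken from the header comment above.

```matlab
% Usage sketch (hedged): mk_rnd_dhmm's argument order is assumed, not
% confirmed by this file; adjust to the toolbox version you have.
Q = 2;  % assumed: number of hidden states
O = 4;  % assumed: number of discrete output symbols
hmm0 = mk_rnd_dhmm(Q, O);             % random initial parameters

% two training sequences of symbol indices in 1..O
data = {[1 3 2 2 4], [2 2 1 4 3 3]};

[hmm, LL] = learn_dhmm_simple(data, hmm0, 'max_iter', 50, 'verbose', 1);

plot(LL);  % EM guarantees the log-likelihood is non-decreasing per iteration
```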