<html><head> <meta HTTP-EQUIV="Content-Type" CONTENT="text/html;charset=ISO-8859-1"> <title>emgmm.m</title><link rel="stylesheet" type="text/css" href="../../stpr.css"></head><body><table border=0 width="100%" cellpadding=0 cellspacing=0><tr valign="baseline"><td valign="baseline" class="function"><b class="function">EMGMM</b><td valign="baseline" align="right" class="function"><a href="../../probab/estimation/index.html" target="mdsdir"><img border = 0 src="../../up.gif"></a></table> <p><b>Expectation-Maximization Algorithm for Gaussian mixture model.</b></p> <hr><div class='code'><code><span class=help> </span><br>
<span class=help> <span class=help_field>Synopsis:</span></span><br>
<span class=help>  model = emgmm(X)</span><br>
<span class=help>  model = emgmm(X,options)</span><br>
<span class=help>  model = emgmm(X,options,init_model)</span><br>
<span class=help></span><br>
<span class=help> <span class=help_field>Description:</span></span><br>
<span class=help>  This function implements the Expectation-Maximization (EM) algorithm</span><br>
<span class=help>  [<a href="../../references.html#Schles68" title = "" >Schles68</a>][<a href="../../references.html#DLR77" title = "" >DLR77</a>], which computes the maximum-likelihood estimate of the</span><br>
<span class=help>  parameters of the Gaussian mixture model (GMM). The EM algorithm is an</span><br>
<span class=help>  iterative procedure which monotonically increases the log-likelihood of</span><br>
<span class=help>  the current estimate until it reaches a local optimum.</span><br>
<span class=help></span><br>
<span class=help>  The number of components of the GMM is given in options.ncomp</span><br>
<span class=help>  (default 2).</span><br>
<span class=help></span><br>
<span class=help>  The following three stopping conditions are used:</span><br>
<span class=help>   1. The improvement of the log-likelihood is less than a given</span><br>
<span class=help>      threshold</span><br>
<span class=help>        logL(t+1) - logL(t) < options.eps_logL</span><br>
<span class=help>   2. The squared change of the estimated posterior probabilities is</span><br>
<span class=help>      less than a given threshold</span><br>
<span class=help>        ||alpha(t+1) - alpha(t)||^2 < options.eps_alpha</span><br>
<span class=help>   3. The number of iterations exceeds a given threshold</span><br>
<span class=help>        t >= options.tmax</span><br>
<span class=help></span><br>
<span class=help>  The type of the estimated covariance matrices is optional:</span><br>
<span class=help>   options.cov_type = 'full'      full covariance matrix (default)</span><br>
<span class=help>   options.cov_type = 'diag'      diagonal covariance matrix</span><br>
<span class=help>   options.cov_type = 'spherical' spherical covariance matrix</span><br>
<span class=help></span><br>
<span class=help>  The initial model (estimate) is selected:</span><br>
<span class=help>   1. randomly (options.init = 'random')</span><br>
<span class=help>   2. using K-means (options.init = 'cmeans')</span><br>
<span class=help>   3. using the user-specified init_model.</span><br>
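<span class=help></span><br>
<span class=help>  The core iteration can be sketched in plain MATLAB as follows. This is</span><br>
<span class=help>  a minimal illustration of one EM pass with full covariance matrices,</span><br>
<span class=help>  not the toolbox source; X, Mean, Cov and Prior have the shapes</span><br>
<span class=help>  described below.</span><br>
<span class=help></span><br>
<span class=help>   % E-step: posteriors Alpha(c,i) = P(component c | x_i).</span><br>
<span class=help>   [dim,n] = size(X); ncomp = size(Mean,2);</span><br>
<span class=help>   Alpha = zeros(ncomp,n);</span><br>
<span class=help>   for c = 1:ncomp</span><br>
<span class=help>     dX = X - repmat(Mean(:,c),1,n); C = Cov(:,:,c);</span><br>
<span class=help>     Alpha(c,:) = Prior(c)*exp(-0.5*sum(dX.*(C\dX),1)) ...</span><br>
<span class=help>                  / sqrt((2*pi)^dim*det(C));</span><br>
<span class=help>   end</span><br>
<span class=help>   Alpha = Alpha ./ repmat(sum(Alpha,1),ncomp,1);</span><br>
<span class=help></span><br>
<span class=help>   % M-step: re-estimate the parameters from the posteriors.</span><br>
<span class=help>   for c = 1:ncomp</span><br>
<span class=help>     w = Alpha(c,:);</span><br>
<span class=help>     Prior(c)  = sum(w)/n;</span><br>
<span class=help>     Mean(:,c) = X*w'/sum(w);</span><br>
<span class=help>     dX = X - repmat(Mean(:,c),1,n);</span><br>
<span class=help>     Cov(:,:,c) = (dX.*repmat(w,dim,1))*dX'/sum(w);</span><br>
<span class=help>     % 'diag' keeps only diag(Cov(:,:,c)); 'spherical' replaces it</span><br>
<span class=help>     % by mean(diag(Cov(:,:,c)))*eye(dim).</span><br>
<span class=help>   end</span><br>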
<span class=help></span><br>
<span class=help> <span class=help_field>Input:</span></span><br>
<span class=help>  X [dim x num_data] Data sample.</span><br>
<span class=help></span><br>
<span class=help>  options [struct] Control parameters:</span><br>
<span class=help>   .ncomp [1x1] Number of components of the GMM (default 2).</span><br>
<span class=help>   .tmax [1x1] Maximal number of iterations (default inf).</span><br>
<span class=help>   .eps_logL [1x1] Minimal improvement in log-likelihood (default 0).</span><br>
<span class=help>   .eps_alpha [1x1] Minimal change of Alphas (default 0).</span><br>
<span class=help>   .cov_type [1x1] Type of estimated covariance matrices (see above).</span><br>
<span class=help>   .init [string] 'random' use a random initial model (default);</span><br>
<span class=help>                  'cmeans' use K-means to find the initial model.</span><br>
<span class=help>   .verb [1x1] If 1 then info is displayed (default 0).</span><br>
<span class=help></span><br>
<span class=help>  init_model [struct] Initial model:</span><br>
<span class=help>   .Mean [dim x ncomp] Mean vectors.</span><br>
<span class=help>   .Cov [dim x dim x ncomp] Covariance matrices.</span><br>
<span class=help>   .Prior [1 x ncomp] Weights of mixture components.</span><br>
<span class=help>   .Alpha [ncomp x num_data] (optional) Distribution of hidden state.</span><br>
<span class=help>   .t [1x1] (optional) Counter of iterations.</span><br>
<span class=help></span><br>
<span class=help> <span class=help_field>Output:</span></span><br>
<span class=help>  model [struct] Estimated Gaussian mixture model:</span><br>
<span class=help>   .Mean [dim x ncomp] Mean vectors.</span><br>
<span class=help>   .Cov [dim x dim x ncomp] Covariance matrices.</span><br>
<span class=help>   .Prior [1 x ncomp] Weights of mixture components.</span><br>
<span class=help>   .logL [1 x t] Log-likelihood of the estimate in each iteration.</span><br>
<span class=help>   .t [1x1] Number of iterations.</span><br>
<span class=help>   .options [struct] Copy of the used options.</span><br>
<span class=help>   .exitflag [int] 0 ... maximal number of iterations was exceeded;</span><br>
<span class=help>                   1 or 2 ... EM has converged; indicates which stopping</span><br>
<span class=help>                   condition was used (see above).</span><br>
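<span class=help></span><br>
<span class=help>  Since EM is only guaranteed to reach a local optimum, a common remedy</span><br>
<span class=help>  is to run it several times and keep the best model. A minimal sketch</span><br>
<span class=help>  (the number of trials is arbitrary; X is a data sample as above):</span><br>
<span class=help></span><br>
<span class=help>   best = [];</span><br>
<span class=help>   for trial = 1:5</span><br>
<span class=help>     % new random initial model in each trial</span><br>
<span class=help>     model = emgmm(X, struct('ncomp',2,'init','random'));</span><br>
<span class=help>     if isempty(best) || model.logL(end) > best.logL(end)</span><br>
<span class=help>       best = model;  % keep the run with the highest log-likelihood</span><br>
<span class=help>     end</span><br>
<span class=help>   end</span><br>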
<span class=help></span><br>
<span class=help> <span class=help_field>Example:</span></span><br>
<span class=help>  Note: if the EM algorithm does not converge, run it again from a</span><br>
<span class=help>  different initial model (see the sketch above).</span><br>
<span class=help></span><br>
<span class=help>  EM is used to estimate the parameters of a mixture of 2 Gaussians:</span><br>
<span class=help>   true_model = struct('Mean',[-2 2],'Cov',[1 0.5],'Prior',[0.4 0.6]);</span><br>
<span class=help>   sample = gmmsamp(true_model, 100);</span><br>
<span class=help>   estimated_model = emgmm(sample.X,struct('ncomp',2,'verb',1));</span><br>
<span class=help></span><br>
<span class=help>   figure; ppatterns(sample.X);</span><br>
<span class=help>   h1=pgmm(true_model,struct('color','r'));</span><br>
<span class=help>   h2=pgmm(estimated_model,struct('color','b'));</span><br>
<span class=help>   legend([h1(1) h2(1)],'Ground truth', 'ML estimation');</span><br>
<span class=help>   figure; hold on; xlabel('iterations'); ylabel('log-likelihood');</span><br>
<span class=help>   plot( estimated_model.logL );</span><br>
<span class=help></span><br>
<span class=help> <span class=also_field>See also </span><span class=also></span><br>
<span class=help><span class=also>  <a href = "../../probab/estimation/mlcgmm.html" target="mdsbody">MLCGMM</a>, <a href = "../../probab/estimation/mmgauss.html" target="mdsbody">MMGAUSS</a>, <a href = "../../probab/pdfgmm.html" target="mdsbody">PDFGMM</a>, <a href = "../../probab/gmmsamp.html" target="mdsbody">GMMSAMP</a>.</span><br>
<span class=help></span><br>
</code></div> <hr> <b>Source:</b> <a href= "../../probab/estimation/list/emgmm.html">emgmm.m</a> <p><b class="info_field">About: </b> Statistical Pattern Recognition Toolbox<br> (C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac<br> <a href="http://www.cvut.cz">Czech Technical University Prague</a><br> <a href="http://www.feld.cvut.cz">Faculty of Electrical Engineering</a><br> <a href="http://cmp.felk.cvut.cz">Center for Machine Perception</a><br> <p><b class="info_field">Modifications: </b> <br> 26-jul-07, VF, inconsistent parameter names 'ncomp' and 'num_gauss' removed<br> 26-may-2004, VF, initialization by K-means added<br> 1-may-2004, VF<br> 19-sep-2003, VF<br> 16-mar-2003, VF<br></body></html>