readme
The following kinds of potentials are supported:
- dpot: discrete
- upot: utility
- mpot: Gaussian in moment form
- cpot: Gaussian in canonical form
- cgpot: conditional (mixture) Gaussian, a list of mpots/cpots
- scgpot: stable conditional Gaussian, a list of scgcpots
- scgcpot: just used by scgpot

Many of these are described in the following book

@book{Cowell99,
  author = "R. G. Cowell and A. P. Dawid and S. L. Lauritzen and D. J. Spiegelhalter",
  title = "Probabilistic Networks and Expert Systems",
  year = 1999,
  publisher = "Springer"
}

CPD_to_pot converts P(Z|A,B,...) to phi(A,B,...,Z).

A table is like a dpot, except it is a structure, not an object.
Code that uses tables is faster but less flexible.

-----------

A potential is a joint probability distribution on a set of nodes,
which we call the potential's domain (which is always sorted).
A potential supports the operations of multiplication and
marginalization.

If the nodes are discrete, the potential can be represented as a table
(multi-dimensional array). If the nodes are Gaussian, the potential
can be represented as a quadratic form. If there are both discrete and
Gaussian nodes, we use a table of quadratic forms. For details on the
Gaussian case, see below.

For discrete potentials, the 'sizes' field specifies the number of
values each node in the domain can take on. For continuous potentials,
the 'sizes' field specifies the block-size of each node.

If some of the nodes are observed, extra complications arise. We
handle the discrete and continuous cases differently. Suppose the
domain is [X Y], with sizes [6 2], where X is observed to have value x.
In the discrete case, the potential will have many zeros in it
(T(X,:) will be 0 for all X ~= x), which can be inefficient. Instead,
we set sizes to [1 2], to indicate that X has only one possible value
(namely x). For continuous nodes, we set sizes = [0 2], to indicate
that X no longer appears in the mean vector or covariance matrix (we
must avoid 0s in Sigma, lest it be uninvertible).
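The discrete case can be sketched as follows. BNT itself is MATLAB; this is
a hypothetical Python/numpy re-implementation (the function name
enter_discrete_evidence is invented for illustration), showing how the
observed node's size shrinks to 1 instead of zeroing out entries:

```python
import numpy as np

def enter_discrete_evidence(T, dim, value):
    """Shrink a discrete potential table along an observed dimension.

    Rather than setting T[X, :] = 0 for all X != x (wasteful), keep
    only the slice consistent with the evidence, so the observed
    node's size in the resulting table becomes 1.
    """
    # np.take with a list keeps the dimension, now of size 1.
    return np.take(T, [value], axis=dim)

# Domain [X Y] with sizes [6 2]; X observed to have value x = 3
# (0-indexed here, whereas MATLAB/BNT would use 1-indexing).
T = np.random.rand(6, 2)
phi = enter_discrete_evidence(T, dim=0, value=3)
print(phi.shape)  # (1, 2): sizes are now [1 2]
```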
When a potential is created, we assume the sizes of the nodes have
been adjusted to include the evidence. This is so that the evidence
can be incorporated at the outset, and thereafter the inference
algorithms can ignore it.

------------

A Gaussian potential can be represented in terms of its moment
characteristics (mu, Sigma, logp), or in terms of its canonical
characteristics (g, h, K). Although the moment characteristics are
more familiar, it turns out that canonical characteristics are more
convenient for the junction tree algorithm, for the same kinds of
reasons why backwards inference in an LDS uses the information form
of the Kalman filter (see Murphy (1998a) for a discussion).

When working with *conditional* Gaussian potentials, the method
proposed by Lauritzen (1992), and implemented here, requires
converting from canonical to moment form before marginalizing the
discrete variables, and converting back from moment to canonical form
before multiplying/dividing. A new algorithm, due to Lauritzen and
Jensen (1999), works exclusively in moment form, and hence is more
numerically stable. It can also handle 0s in the covariance matrix,
i.e., deterministic relationships between cts variables. However, it
has not yet been implemented, since it requires major changes to the
jtree algorithm.

In Murphy (1998b) we extend Lauritzen (1992) to handle vector-valued
nodes. This means the vectors and matrices become block vectors and
matrices. This manifests itself in the code as in the following
example. Suppose we have a potential on nodes dom=[3,4,7] with block
sizes=[2,1,3]. Then nodes 3 and 7 correspond to blocks 1,3, which
correspond to indices 1,2,4,5,6.

>> find_equiv_posns([3 7], dom) = [1,3]
>> block([1,3], blocks) = [1,2,4,5,6]

For more details, see

- "Filtering and Smoothing in Linear Dynamical Systems using the
  Junction Tree Algorithm", K. Murphy, 1998a. UCB Tech Report.
- "Inference and learning in hybrid Bayesian networks", K. Murphy.
  UCB Technical Report CSD-98-990, 1998b.
- "Propagation of probabilities, means and variances in mixed
  graphical association models", S. L. Lauritzen, 1992, JASA
  87(420):1098--1108.
- "Causal probabilistic networks with both discrete and continuous
  variables", K. G. Olesen, 1993. PAMI 3(15). This discusses
  implementation details.
- "Stable local computation with Conditional Gaussian distributions",
  S. Lauritzen and F. Jensen, 1999. Univ. Aalborg Tech Report
  R-99-2014. www.math.auc.dk/research/Reports.html.
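The block-index bookkeeping in the example above can be sketched in
Python. The real find_equiv_posns and block are MATLAB functions in
BNT; these are hypothetical re-implementations (1-indexed, to match
the MATLAB convention and the worked example):

```python
def find_equiv_posns(vals, dom):
    """Return the 1-indexed positions in dom of each element of vals."""
    return [dom.index(v) + 1 for v in vals]

def block(block_nums, block_sizes):
    """Expand 1-indexed block numbers into the scalar indices they cover.

    Block b covers indices start(b) .. start(b) + block_sizes[b-1] - 1,
    where start(b) = 1 + (sum of the sizes of the preceding blocks).
    """
    starts = [1]
    for sz in block_sizes[:-1]:
        starts.append(starts[-1] + sz)
    ixs = []
    for b in block_nums:
        s = starts[b - 1]
        ixs.extend(range(s, s + block_sizes[b - 1]))
    return ixs

dom = [3, 4, 7]
blocks = [2, 1, 3]
print(find_equiv_posns([3, 7], dom))  # [1, 3]
print(block([1, 3], blocks))          # [1, 2, 4, 5, 6]
```

Nodes 3 and 7 sit at positions 1 and 3 of the sorted domain, and with
block sizes [2,1,3] those blocks expand to scalar indices 1,2 and
4,5,6, reproducing the example.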