
📄 script.m

📁 Gaussian belief propagation code in MATLAB.
% This is a test script to show that the Gaussian belief propagation
% algorithm (CANONGBP) computes the correct marginals. The script will
% only work on small models, since the CANONMARGS function quickly gets
% out of hand for anything larger. The script below runs on an
% undirected graphical model that looks like this:
%
%   1 -- 2 -- 3 -- 5 -- 6
%             |    |
%             4    7
%
% The graphical model specifies a joint distribution p(X,Y) over the hidden
% states X and the observations Y. The precision matrix blocks Lxx and Lxy
% specify the interactions among the Xs and between the Xs and Ys,
% respectively. The inference objective is to compute the posterior
% marginals p(Xi|Y) and p(Xi,Xj|Y), where i and j are neighbouring nodes.
% Note that the belief propagation routine CANONGBP can also run on graphs
% with cycles, but it is only guaranteed to produce the correct marginal
% distributions on tree structures using the default message passing
% schedule.
%
% The code developed here is mostly based on the presentation in:
%
%     * Mark A. Paskin. Exploiting Locality in Probabilistic
%       Inference. Ph.D. Thesis, University of California, Berkeley,
%       2004.
%
% Other supporting material includes:
%
%     * Yair Weiss and William T. Freeman. Correctness of belief
%       propagation in Gaussian graphical models of arbitrary
%       topology. Neural Computation, Vol. 13, 2001, pp. 2173-2200.
%     * Jason K. Johnson. Estimation of GMRFs by recursive cavity
%       modeling. Technical report, MIT, 2004.
%
% This code is free for non-profit use. Distribution of this code requires
% express permission from the author.
%
%   Peter Carbonetto
%   University of British Columbia
%   http://www.cs.ubc.ca/~pcarbo
%   October 7, 2005

% Script parameters.
n     = 7;      % Number of vertices in the graph (n > 1).
mu    = [0 0]'; % Mean of observations.
S     = [1 0;   % Variance of observations.
         0 1];
pNode = -0.5;   % Inverse variance between Xi and Yi, for all i.
pEdge = -0.4;   % Inverse variance between Xi and Xj, for all (i,j).

% Build the adjacency matrix of the tree pictured above.
fprintf('Creating the network using canonical parameterization.\n');
A = sparse(zeros(n,n));
A(1,2) = 1;
A(2,3) = 1;
A(2,4) = 1;
A(3,5) = 1;
A(5,6) = 1;
A(5,7) = 1;

% Do not change anything after this point.
% ---------------------------------------

% Get the number of edges in the graph.
m = full(sum(sum(A)));

% Number the edges.
[is js] = find(A);
E       = zeros(m,2);
for u = 1:m
  i      = is(u);
  j      = js(u);
  E(u,:) = [i j];
  A(i,j) = u;
  A(j,i) = u;
end

G.A = A;
G.E = E;
clear A E

% Get the number of dimensions of the hidden states and observations.
F = length(mu);

% Generate the edge potentials.
Lxx = cell(m,1);
for u = 1:m
  Lxx{u} = diag(pEdge * ones(F,1));
end

% Generate the node potentials and observations.
Lxy = cell(n,1);
y   = zeros(F,n);
for i = 1:n
  Lxy{i} = diag(pNode * ones(F,1));
  y(:,i) = round(mu + norm_rnd(S)*10)/10;
end

% Compute the exact marginals using the canonical parameterization.
fprintf('Computing marginals.\n');
[b bc] = canonmargs(G,Lxx,Lxy,y);

% Run belief propagation.
fprintf('Running belief propagation.\n');
[B Bc] = canongbp(G,Lxx,Lxy,y,1);

% Compare the exact marginals with the belief propagation estimates.
fprintf('\n');
fprintf('Node marginals. On the right is the belief propagation estimate.\n');
for i = 1:n
  fprintf('Mean of x%d:\n', i);
  disp([b(i).mu B(i).mu]);
end

fprintf('\n');
fprintf('Edge marginals. On the right is the belief propagation estimate.\n');
for u = 1:m
  fprintf('Mean of (x%d,x%d):\n', G.E(u,1), G.E(u,2));
  disp([bc(u).mu Bc(u).mu]);
end
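The script above calls three functions that are not defined in this file: norm_rnd (draws the random observation noise), canonmargs (exact marginals computed directly from the canonical parameterization), and canongbp (Gaussian belief propagation). They are presumably distributed with the rest of this package, and the script will not run without them. If norm_rnd happens to be the only missing piece, the following is a minimal stand-in, written under the assumption that norm_rnd(S) returns a single zero-mean multivariate normal draw with covariance S (the script adds mu itself); it is a sketch of the assumed interface, not the author's implementation.

% norm_rnd.m -- hypothetical stand-in for the missing helper; assumes the
% interface norm_rnd(S) = one zero-mean sample with covariance S.
function x = norm_rnd (S)
  R = chol(S);                   % upper-triangular factor with R'*R = S
  x = R' * randn(size(S,1),1);   % map standard normal draws to covariance S

Because the default graph is a tree, the two columns printed for every node and edge marginal should agree (up to numerical precision); that agreement is the correctness check this script performs.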
