cross_entropy.m

From "Three speech recognition algorithms using the HMM approach" · M code · 16 lines

function kl = cross_entropy(p, q, symmetric)
% CROSS_ENTROPY Compute the Kullback-Leibler divergence between two discrete probability distributions.
% kl = cross_entropy(p, q, symmetric)
%
% Note: despite its name, this function returns the KL divergence D(p||q),
% not the cross entropy H(p,q).
% If symmetric = 1, compute the symmetrized (Jeffreys) divergence
% (D(p||q) + D(q||p))/2. Default: symmetric = 0.

tiny = exp(-700);                  % guard against log(0) and 0/0 when an entry is zero
if nargin < 3, symmetric = 0; end
p = p(:);                          % force column vectors so .* and ./ line up
q = q(:);
if symmetric
  kl = (sum(p .* log((p+tiny)./(q+tiny))) + sum(q .* log((q+tiny)./(p+tiny))))/2;
else
  kl = sum(p .* log((p+tiny)./(q+tiny)));
end
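To illustrate the computation outside MATLAB, here is a minimal Python sketch of the same formula (an illustrative port, not part of the original package); the `tiny` offset plays the same role of guarding `log(0)`:

```python
import math

def kl_divergence(p, q, symmetric=False):
    """KL divergence D(p||q) between two discrete distributions,
    mirroring the MATLAB function above (hypothetical helper name)."""
    tiny = math.exp(-700)  # guards against log(0) when an entry is zero
    kl = sum(pi * math.log((pi + tiny) / (qi + tiny)) for pi, qi in zip(p, q))
    if symmetric:
        # symmetrized (Jeffreys) version: (D(p||q) + D(q||p)) / 2
        rev = sum(qi * math.log((qi + tiny) / (pi + tiny)) for pi, qi in zip(p, q))
        kl = (kl + rev) / 2
    return kl

p = [0.5, 0.5]
q = [0.25, 0.75]
print(kl_divergence(p, q))  # D(p||q) = 0.5*ln(2) + 0.5*ln(2/3) = 0.5*ln(4/3)
```

For identical distributions the divergence is zero, and in general `D(p||q) != D(q||p)` unless the symmetric flag is used.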
