cross_entropy.m
From "the MIT Artificial Intelligence Toolbox" · M code · 14 lines
function kl = cross_entropy(p, q, symmetric)
% CROSS_ENTROPY Compute the Kullback-Leibler divergence between two discrete prob. distributions
% kl = cross_entropy(p, q, symmetric)
%
% If symmetric = 1, we compute the symmetric version. Default: symmetric = 0;

tiny = exp(-700);
if nargin < 3, symmetric = 0; end
if symmetric
  kl = (sum(p .* log((p+tiny)./(q+tiny))) + sum(q .* log((q+tiny)./(p+tiny))))/2;
else
  kl = sum(p .* log((p+tiny)./(q+tiny)));
end
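For readers without MATLAB, a rough Python equivalent may help clarify what the M-file computes: despite the name `cross_entropy`, it returns the KL divergence KL(p || q) (or its symmetrized average), using a tiny additive constant so that zero-probability entries do not produce log(0). The function below is an illustrative sketch, not part of the original toolbox.

```python
import math

def cross_entropy(p, q, symmetric=False):
    """KL divergence between two discrete distributions p and q.

    Mirrors the M-file above: `tiny` guards against log(0) when an
    entry of p or q is exactly zero. Illustrative translation only.
    """
    tiny = math.exp(-700)  # ~1e-304, same smoothing constant as the M-file
    kl_pq = sum(pi * math.log((pi + tiny) / (qi + tiny)) for pi, qi in zip(p, q))
    if symmetric:
        # Symmetrized version: average of KL(p||q) and KL(q||p)
        kl_qp = sum(qi * math.log((qi + tiny) / (pi + tiny)) for pi, qi in zip(p, q))
        return (kl_pq + kl_qp) / 2
    return kl_pq
```

For example, `cross_entropy([0.5, 0.5], [0.5, 0.5])` is 0, while diverging distributions give a strictly positive value; note that the plain (non-symmetric) form is not symmetric in its arguments.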