<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"><html xmlns:mwsh="http://www.mathworks.com/namespace/mcode/v1/syntaxhighlight.dtd">   <head>      <meta http-equiv="Content-Type" content="text/html; charset=utf-8">         <!--This HTML is auto-generated from an M-file. To make changes, update the M-file and republish this document.      -->      <title>Entropic Coding and Compression</title>      <meta name="generator" content="MATLAB 7.4">      <meta name="date" content="2008-10-15">      <meta name="m-file" content="index">      <link rel="stylesheet" href="style.css" type="text/css">   </head>   <body>      <div class="content">         <h1>Entropic Coding and Compression</h1>         <introduction>            <p>This numerical tour studies source coding using an arithmetic coder.</p>         </introduction>         <h2>Contents</h2>         <div>            <ul>               <li><a href="#1">Installing toolboxes and setting up the path</a></li>               <li><a href="#8">Entropic Coding</a></li>            </ul>         </div>         <h2>Installing toolboxes and setting up the path<a name="1"></a></h2>         <p>You need to download the <a href="../toolbox_general.zip">general purpose toolbox</a> and the <a href="../toolbox_signal.zip">signal toolbox</a>.         </p>         <p>Unzip these toolboxes in your working directory, so that you have <tt>toolbox_general/</tt> and <tt>toolbox_signal/</tt> in your directory.         </p>         <p><b>For Scilab users:</b> you must replace the Matlab comment character '%' with its Scilab counterpart '//'.         </p>         <p><b>Recommendation:</b> create a text file named, for instance, <tt>numericaltour.sce</tt> (in Scilab) or <tt>numericaltour.m</tt> (in Matlab) holding all the Scilab/Matlab commands you want to execute. Then simply run <tt>exec('numericaltour.sce');</tt> (in Scilab) or <tt>numericaltour;</tt> (in Matlab) to execute the commands.
</p>         <p>Execute this line only if you are using Matlab.</p><pre class="codeinput">getd = @(p)path(path,p); <span class="comment">% Scilab users must *not* execute this</span></pre><p>Then you can add these toolboxes to the path.</p><pre class="codeinput"><span class="comment">% add the toolbox directories to the path</span>
getd(<span class="string">'toolbox_signal/'</span>);
getd(<span class="string">'toolbox_general/'</span>);</pre><h2>Entropic Coding<a name="8"></a></h2>         <p>Entropic coding converts a vector <tt>x</tt> of integers into a binary stream <tt>y</tt>. The entries of this binary stream are packed in groups of 8 bits, so that each <tt>y(i)</tt> is in [0,255]. Entropic coding exploits the redundancy in the statistical distribution of the entries of <tt>x</tt> to reduce the size of <tt>y</tt> as much as possible. The lower bound for the number of bits <tt>p</tt> of <tt>y</tt> is the Shannon bound <tt>p=-sum_i h(i)*log2(h(i))</tt>, where <tt>h(i)</tt> is the probability of occurrence of symbol <tt>i</tt> in <tt>x</tt>.         </p>         <p>First we generate a simple binary signal <tt>x</tt> so that the first symbol (token 1) has a probability of appearance of <tt>p</tt>.         </p><pre class="codeinput"><span class="comment">% probability of the first symbol</span>
p = 0.1;
<span class="comment">% size</span>
n = 512;
<span class="comment">% signal with tokens 1,2</span>
x = (rand(n,1)&gt;p)+1;</pre><p>One can check the probabilities by computing the empirical histogram.</p><pre class="codeinput">h = hist(x, [1 2]);
h = h/sum(h);
disp(strcat([<span class="string">'Empirical p='</span> num2str(h(1)) <span class="string">'.'</span>]));</pre><pre class="codeoutput">Empirical p=0.11133.</pre><p>An adaptive arithmetic coder automatically estimates the probability distribution and generates a code that is close to the Shannon entropy bound.
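</p>

<p>As a quick cross-check of the Shannon bound quoted above (this Python sketch is an illustration added here, not part of the Matlab tour), one can evaluate <tt>-sum_i h(i)*log2(h(i))</tt> for the binary source with <tt>p = 0.1</tt>; it matches the entropy value reported by <tt>exo1</tt> below.</p>

```python
import math

def shannon_entropy(h):
    # Shannon bound -sum_i h(i)*log2(h(i)), in bits per symbol
    return -sum(p * math.log2(p) for p in h if p > 0)

# binary source: one symbol with probability 0.1, the other with 0.9
H = shannon_entropy([0.1, 0.9])
print(round(H, 3))  # -> 0.469
```

<p>On a short signal the adaptive coder stays somewhat above this bound, as the output of <tt>exo1</tt> below shows.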
</p><pre class="codeinput"><span class="comment">% probability distribution</span>
h = [p 1-p];
<span class="comment">% coding</span>
y = perform_arith_fixed(x,h);
<span class="comment">% decoding</span>
x1 = perform_arith_fixed(y,h,n);
<span class="comment">% check that everything is fine</span>
disp(strcat([<span class="string">'Decoding error (should be 0)='</span> num2str(norm(x-x1)) <span class="string">'.'</span>]));</pre><pre class="codeoutput">Decoding error (should be 0)=0.</pre><p><i>Exercise 1:</i> (the solution is <a href="../private/coding_entropic/exo1.m">exo1.m</a>) Compare the number of bits per symbol generated by the arithmetic coder with the Shannon bound.         </p><pre class="codeinput">exo1;</pre><pre class="codeoutput">Entropy=0.469, arithmetic=0.541.</pre><p>We can generate a more complex integer signal.</p><pre class="codeinput">n = 4096;
<span class="comment">% this is an example of probability distribution</span>
q = 10;
h = 1:q; h = h/sum(h);
<span class="comment">% draw according to the distribution h</span>
x = rand_discr(h, n);
<span class="comment">% check that we have the correct distribution</span>
h1 = hist(x, 1:q)/n;
clf;
subplot(2,1,1);
bar(h); axis(<span class="string">'tight'</span>);
set_graphic_sizes([], 20);
title(<span class="string">'True distribution'</span>);
subplot(2,1,2);
bar(h1); axis(<span class="string">'tight'</span>);
set_graphic_sizes([], 20);
title(<span class="string">'Empirical distribution'</span>);</pre><img vspace="5" hspace="5" src="index_01.png"> <p><i>Exercise 2:</i> (the solution is <a href="../private/coding_entropic/exo2.m">exo2.m</a>) Encode signals of increasing size <tt>n</tt>, and check how close the coding rate <tt>length(y)/n</tt> gets to the optimal Shannon bound.
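</p>

<p>For intuition, here is a plain-Python stand-in (an illustration added here, not part of the toolbox) for what <tt>rand_discr</tt> does: inverse-CDF sampling from the distribution <tt>h</tt>. It also evaluates the Shannon bound of this distribution, which is the target that the coding rate in Exercise 2 approaches.</p>

```python
import math
import random
from itertools import accumulate

def shannon_bound(h):
    # Shannon bound -sum_i h(i)*log2(h(i)), in bits per symbol
    return -sum(p * math.log2(p) for p in h if p > 0)

def draw_discrete(h, n, seed=0):
    # draw n samples from {1, ..., len(h)} by inverse-CDF lookup
    rng = random.Random(seed)
    cdf = list(accumulate(h))
    return [next((i + 1 for i, c in enumerate(cdf) if u <= c), len(h))
            for u in (rng.random() for _ in range(n))]

q, n = 10, 4096
h = [i / 55 for i in range(1, q + 1)]  # 55 = 1 + 2 + ... + 10
x = draw_discrete(h, n)
h1 = [x.count(s) / n for s in range(1, q + 1)]  # empirical histogram
H = shannon_bound(h)  # about 3.10 bits per symbol
```

<p>Running <tt>exo2</tt> produces the figure below.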
</p><pre class="codeinput">exo2;</pre><img vspace="5" hspace="5" src="index_02.png"> <p class="footer"><br>            Copyright &copy; 2008 Gabriel Peyre<br></p>      </div>   </body></html>
