<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<!-- saved from url=(0050)http://www.yobology.info/harbin/part1/part_1_2.htm -->
<HTML><HEAD><TITLE>Part1_2</TITLE>
<META http-equiv=content-type content="text/html; charset=shift_jis">
<META content="MSHTML 6.00.2900.2180" name=GENERATOR></HEAD>
<BODY text=black vLink=black aLink=black link=black bgColor=#223344>
<TABLE
style="FONT-WEIGHT: normal; FONT-SIZE: 14pt; FONT-STYLE: normal; FONT-FAMILY: 'Times New Roman'; BACKGROUND-COLOR: rgb(204,204,204); TEXT-DECORATION: none"
width=799 bgColor=#cccccc border=0>
<TBODY>
<TR
style="BORDER-RIGHT: 4px ridge; BORDER-TOP: 4px ridge; BORDER-LEFT: 4px ridge; BORDER-BOTTOM: 4px ridge; BACKGROUND-COLOR: rgb(204,204,204)">
<TD
style="PADDING-RIGHT: 8px; PADDING-LEFT: 8px; FONT-SIZE: 16pt; PADDING-BOTTOM: 8px; LINE-HEIGHT: 150%; PADDING-TOP: 8px; FONT-FAMILY: 'Times New Roman'"
borderColor=#0099ff borderColorLight=#0099ff width=779
borderColorDark=#0099ff height=78>
<P align=right><I><A
href="http://www.yobology.info/harbin/part1/index.htm">index</A></I></P>
<P align=center>Part 1 </P>
<P align=center>Chapter 2 Water Pouring
Theorem<BR><BR></P></TD></TR>
<TR>
<TD
style="PADDING-RIGHT: 8px; PADDING-LEFT: 8px; FONT-SIZE: 14pt; PADDING-BOTTOM: 8px; LINE-HEIGHT: 150%; PADDING-TOP: 8px; FONT-FAMILY: 'Times New Roman'; BACKGROUND-COLOR: rgb(204,204,204)"
borderColor=#0099ff borderColorLight=#0099ff width=779
borderColorDark=#0099ff height=3103>
<P>§ 1 Entropy of analog information source<BR>    Quantize the analog
signal<I> x</I> into small intervals of width <IMG
src="Part1_2.files/part_1_2_htm_eqn24720.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 176 1\delta -->. Then each quantized result <IMG
src="Part1_2.files/part_1_2_htm_eqn25460.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 192 1x_{i}--> carries the amount of information</P>
<P align=center><IMG src="Part1_2.files/part_1_2_htm_eqn24376.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 208 1-\log _{2}\: \varphi (x_{i}\, )\, \delta =-\log _{2}\: \varphi (x_{i})-\log _{2}\delta \; \quad bit-->,</P>
<P>where <IMG src="Part1_2.files/part_1_2_htm_eqn24959.gif" border=0
NAMO_EQN__><!--NAMO_EQN__ 176 1\varphi (\; )--> denotes the p.d.f. that
<I>x</I> follows. Supposing that the sampled and quantized sequence is i.i.d.
(independent and identically distributed), the entropy of the sequence is
given by</P>
<P align=center><IMG src="Part1_2.files/part_1_2_htm_eqn27419.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 208 1E=-\sum _{i=-\infty }^{\infty }\delta \: \varphi (x_{i})\log _{2}\varphi (x_{i})-\log _{2}\delta \quad bit\slash sample--></P>
<P>Note that the second term on the right-hand side diverges as <IMG
src="Part1_2.files/part_1_2_htm_eqn28498.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 192 1\delta \to 0-->. But since this term does
not depend on <IMG src="Part1_2.files/part_1_2_htm_eqn4613.gif" border=0
NAMO_EQN__><!--NAMO_EQN__ 192 1\varphi (\; )-->, let us discard it and
define the entropy of an analog signal by</P>
<P align=center><IMG src="Part1_2.files/part_1_2_htm_eqn28878.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 208 1E=-\int _{-\infty }^{\infty }\: \varphi (x)\log _{2}\varphi (x)\; dx\quad bit\slash sample--></P>
<P><SPAN style="FONT-SIZE: 12pt"><FONT color=#660000>Note: For uniform
distribution in [-D/2,D/2], </FONT></SPAN></P>
<P align=center><SPAN style="FONT-SIZE: 12pt"><FONT color=#660000><IMG
src="Part1_2.files/part_1_2_htm_eqn31568.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 176 1E=\log _{2}D\quad bit\slash symbol--></FONT></SPAN></P>
<P><SPAN style="FONT-SIZE: 12pt"><FONT
color=#660000>                    For the
Gaussian distribution, <IMG src="Part1_2.files/part_1_2_htm_eqn32039.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 160 1\varphi (x)=1\slash \sqrt{2\pi \sigma ^{2}}\: \: exp[-x^{2}\slash (2\sigma ^{2}\: )]-->,</FONT></SPAN></P>
<P align=center><SPAN style="FONT-SIZE: 12pt"><FONT color=#660000><IMG
src="Part1_2.files/part_1_2_htm_eqn32309.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 176 1E=\log _{2}\sqrt{2\pi e\sigma ^{2}}\quad bit\slash sample--></FONT></SPAN></P>
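<P align=left>Both closed forms are easy to check numerically, since the defining integral converges quickly. A minimal sketch in pure Python (trapezoid-rule integration; the parameter values D = 4 and σ = 1.5 are illustrative):</P>

```python
import math

def differential_entropy(pdf, lo, hi, n=100_000):
    """Approximate E = -integral pdf(x) * log2(pdf(x)) dx over [lo, hi]
    with the trapezoid rule."""
    h = (hi - lo) / n
    def f(x):
        p = pdf(x)
        return -p * math.log2(p) if p > 0 else 0.0
    return h * (0.5 * (f(lo) + f(hi)) + sum(f(lo + i * h) for i in range(1, n)))

# Uniform distribution on [-D/2, D/2]: E = log2(D)
D = 4.0
uniform = lambda x: 1.0 / D if abs(x) <= D / 2 else 0.0
print(differential_entropy(uniform, -D / 2, D / 2))   # ~ log2(4) = 2

# Gaussian distribution: E = log2(sqrt(2*pi*e*sigma^2))
sigma = 1.5
gauss = lambda x: math.exp(-x * x / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)
print(differential_entropy(gauss, -12 * sigma, 12 * sigma))
print(math.log2(math.sqrt(2 * math.pi * math.e * sigma**2)))  # closed form
```

<P align=left>The Gaussian integral is truncated at ±12σ, where the tail contribution is negligible.</P>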
<P align=left>§ 2 Channel capacity<BR>    The mutual entropy under
AWGN is derived as</P>
<P align=left><IMG src="Part1_2.files/part_1_2_htm_eqn30944.gif" border=0
NAMO_EQN__><!--NAMO_EQN__ 192 1H(Y)-H(Y\mid X)=\, -\int \, \{ \int \, g(y-x)\varphi (x)dx\cdot \, log_{2}[\int \, g(y-x)\varphi (x)dx]\} dy+\int \, g(y)\, log_{2}g(y)\, dy-->,</P>
<P align=left>where <IMG src="Part1_2.files/part_1_2_htm_eqn32644.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 192 1g(\; )--> is the p.d.f. of the AWGN.
The maximum of this quantity over all <IMG
src="Part1_2.files/part_1_2_htm_eqn12533.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 176 1\varphi (\; )--> is called the channel
capacity, an inherent figure of merit of the channel. In
this AWGN model, the mutual entropy is maximized when <IMG
src="Part1_2.files/part_1_2_htm_eqn16998.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 176 1\varphi (\; )--> is Gaussian, and the channel
capacity takes the simple form</P>
<P align=center><IMG src="Part1_2.files/part_1_2_htm_eqn18330.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 208 1C=\frac{1}{2}\log _{2}(1+\frac{\sigma _{s}^{2}}{\sigma _{n}^{2}}\: )\quad bit\slash symbol-->,<BR><IMG
src="Part1_2.files/part_1_2_htm_eqn18727.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 192 1\sigma _{s}^{2}--> : mean square (average power) of the
signal,<BR><IMG src="Part1_2.files/part_1_2_htm_eqn18793.gif" border=0
NAMO_EQN__><!--NAMO_EQN__ 192 1\sigma _{n}^{2}--> : mean square (average power) of the
noise.</P>
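<P align=left>In code this formula is a one-liner; a small sketch (the function name is ours):</P>

```python
import math

def awgn_capacity(sig_power, noise_power):
    """C = (1/2) * log2(1 + sigma_s^2 / sigma_n^2), in bit/symbol."""
    return 0.5 * math.log2(1.0 + sig_power / noise_power)

print(awgn_capacity(1.0, 1.0))    # SNR = 1  -> 0.5 bit/symbol
print(awgn_capacity(15.0, 1.0))   # SNR = 15 -> 2.0 bit/symbol
```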
<P align=left>For other p.d.f.s the mutual entropy cannot be expressed in
explicit form, but it is easily plotted by numerical calculation. The graph below
shows <IMG src="Part1_2.files/part_1_2_htm_eqn1746.gif" border=0
NAMO_EQN__><!--NAMO_EQN__ 192 12\times C--> (red) and the mutual
entropy for a uniform distribution (purple).</P>
<P align=center><IMG src="Part1_2.files/img40.gif" border=0></P>
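<P align=left>The purple curve can be reproduced by integrating the expression above numerically. A sketch for X uniform on [-A, A] with unit-variance Gaussian noise (the integration grid and limits are illustrative choices):</P>

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mutual_entropy_uniform(sig_power, noise_power, n=20_000):
    """h(Y) - h(Y|X) for Y = X + N, with X uniform on [-A, A] (A chosen so
    that the signal variance A^2/3 equals sig_power) and N Gaussian.
    Since N is independent of X, h(Y|X) = h(N) = (1/2) log2(2 pi e sn^2)."""
    A = math.sqrt(3.0 * sig_power)
    sn = math.sqrt(noise_power)
    lo, hi = -A - 10 * sn, A + 10 * sn
    h = (hi - lo) / n
    def p(y):  # p(y) = integral g(y - x) phi(x) dx = difference of Gaussian CDFs
        return (norm_cdf((y + A) / sn) - norm_cdf((y - A) / sn)) / (2 * A)
    def f(y):
        py = p(y)
        return -py * math.log2(py) if py > 0 else 0.0
    hY = h * (0.5 * (f(lo) + f(hi)) + sum(f(lo + i * h) for i in range(1, n)))
    return hY - 0.5 * math.log2(2 * math.pi * math.e * noise_power)

for snr in (1.0, 10.0, 100.0):
    gap = 0.5 * math.log2(1 + snr) - mutual_entropy_uniform(snr, 1.0)
    print(snr, gap)   # the uniform input always falls short of capacity
```

<P align=left>At high SNR the gap approaches the well-known shaping loss of (1/2) log2(πe/6) ≈ 0.255 bit/symbol.</P>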
<P align=left>It should be remarked that these quantities are defined in Euclidean
space. In digital multi-level PAM systems, as mentioned in the previous
chapter, the channel capacity should be estimated with the Hamming distance,
and the procedure is rather complicated (refer to the <A
href="http://www.yobology.info/harbin/part1/appendix.htm">Appendix</A>).</P>
<P align=left>§ 3 Water Pouring Theorem<BR>First, assume AWGN on
a flat baseband channel with bandwidth W Hz. Over this channel we can
transmit samples at a rate of 2W samples/sec, and the channel capacity is
given by</P>
<P align=center><IMG src="Part1_2.files/part_1_2_htm_eqn4461.gif" border=0
NAMO_EQN__><!--NAMO_EQN__ 208 1C=W\log _{2}(1+\frac{\sigma _{s}^{2}}{\sigma _{n}^{2}})\quad bit\slash sec--></P>
<P align=left><SPAN style="FONT-SIZE: 12pt"><FONT color=#660000>Note:
"<I>bit/sec</I>" above does not denote a physical symbol rate but the net
rate of information transfer.</FONT></SPAN></P>
<P align=center><IMG height=372 src="Part1_2.files/img1.gif" width=531
border=0></P>
<P align=left>The water pouring theorem answers the question of how to
realize the channel capacity under additive colored Gaussian noise. </P>
<P align=center><FONT color=black><IMG height=153
src="Part1_2.files/img3.gif" width=564 border=0></FONT></P>
<P align=left> </P>
<P align=center>THEOREM (water pouring)<BR>    The blue colored power spectrum
achieves the channel capacity.</P>
<P align=center><FONT color=black><FONT color=maroon><IMG height=396
src="Part1_2.files/img2.gif" width=743 border=0></FONT></FONT></P>
<P align=center>(proof)<BR>Segment the bandwidth into narrow sections
<BR>and distribute the signal power among them<BR>so that the total mutual
entropy<BR><IMG src="Part1_2.files/part_1_2_htm_eqn3226.gif" border=0
NAMO_EQN__><!--NAMO_EQN__ 208 1M=\sum _{i=1}^{W}\log _{2}(1+SNR_{i})\qquad SNR_{i}\: is\: \sigma _{s}^{2}\slash \sigma _{n}^{2}\: of\: i_{th}\: segment--><BR>is
maximized. </P>
<P align=left><SPAN style="FONT-SIZE: 12pt"><FONT color=maroon>Note: This
is a typical resource-allocation problem of maximizing the total
gain.</FONT></SPAN></P>
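<P align=left>The maximization in the proof sketch is the classic water-filling allocation: power is poured up to a common "water level" above each section's noise floor. A minimal sketch in Python (solving for the water level by bisection is our illustrative choice; all names are ours):</P>

```python
import math

def water_pouring(noise, total_power, iters=200):
    """Maximize M = sum log2(1 + P_i / N_i) subject to sum P_i = total_power
    and P_i >= 0.  At the optimum P_i = max(mu - N_i, 0), where the water
    level mu is found here by bisection."""
    lo, hi = min(noise), max(noise) + total_power
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        if sum(max(mu - n, 0.0) for n in noise) > total_power:
            hi = mu   # poured more than the budget: level too high
        else:
            lo = mu
    power = [max(mu - n, 0.0) for n in noise]
    M = sum(math.log2(1.0 + p / n) for p, n in zip(power, noise))
    return power, M

power, M = water_pouring([0.1, 0.2, 0.4, 0.8], 1.0)
print([round(p, 3) for p in power], round(M, 3))
```

<P align=left>With these example noise levels and unit total power, the noisiest section receives no power at all.</P>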
<P align=left>§ 4 Simple numerical example<BR>    Consider a case
of two sections under the power constraint <IMG
src="Part1_2.files/part_1_2_htm_eqn18120.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 176 1P_{1}+P_{2}=1-->. The blue curves in the graphs
below show the mutual entropy as <I>P<SPAN
style="FONT-SIZE: 10pt">1</SPAN></I> varies. The black lines show the result of the
unified PAM system, i.e.,</P>
<P align=center><IMG src="Part1_2.files/part_1_2_htm_eqn31100.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 208 12\log _{2}(1+\frac{P_{1}+P_{2}}{N_{1}+N_{2}})--></P>
<P align=left>The blue curves are very flat around the
optimum, except for extremely lopsided power distributions. In most cases an equal
power distribution gives performance close to the optimum.</P>
<P align=left><FONT color=#660000><SPAN style="FONT-SIZE: 12pt">Note: To
realize the blue curves with OFDM, we must select the best pair of
bit/symbol and error rate for each sub-channel.</SPAN></FONT></P>
<P align=center><IMG height=682 src="Part1_2.files/img41.gif" width=780
border=0></P>
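<P align=left>The flatness around the optimum is easy to check with a direct sweep over P1. A sketch with hypothetical noise levels N1, N2 (the values behind the graphs are not stated in the text):</P>

```python
import math

N1, N2 = 0.05, 0.2   # hypothetical noise levels for the two sections

def mutual(P1):
    """Total mutual entropy when P1 goes to section 1 and 1 - P1 to section 2."""
    return math.log2(1 + P1 / N1) + math.log2(1 + (1 - P1) / N2)

best = max(mutual(i / 1000) for i in range(1001))
print(mutual(0.5), best)   # the equal split is within about 0.02 bit of the optimum
```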
<P align=left> </P>
<P align=left>§ 5 Cases of the distorted channel<BR>    Let
<IMG src="Part1_2.files/part_1_2_htm_eqn14105.gif" border=0 NAMO_EQN__><!--NAMO_EQN__ 192 1a_{i}--> denote the attenuation of the
<I>i </I>th frequency section; then the water pouring principle can be
applied similarly as</P>
<P align=center><IMG src="Part1_2.files/part_1_2_htm_eqn15154.gif"
border=0 NAMO_EQN__><!--NAMO_EQN__ 208 1M=\sum _{i=1}^{W}\log _{2}\: (1+\frac{a_{i}^{2}P_{i}}{N_{i}}\: )=\sum ^{W}_{i=1}\log _{2}(1+\frac{P_{i}}{N_{i}\slash a_{i}^{2}})--></P>
<P align=left>Here, let us apply water pouring to a case, such as a metal
cable, in which the signal power decays rapidly toward high frequencies.</P>
<P align=center><I>W=8</I>
<I>Ni = 1/1000</I> for all
<I>i</I></P>
<P align=center> </P>
<P align=center><IMG src="Part1_2.files/img17.gif" border=0></P>
<P align=center><IMG src="Part1_2.files/img43.gif" border=0></P>
<P align=center><IMG src="Part1_2.files/img44.gif" border=0></P>
<P align=left>Case A : Total signal power = 8</P>
<UL>
<UL>
<P align=left>Total channel capacity for optimum distribution = 28.449
bit<BR>Total channel capacity for uniform distribution = 28.031
bit<BR>PAM with Equalizer = 14.638 bit</P>
<P align=left> </P></UL></UL>
<P align=center><IMG src="Part1_2.files/img45.gif" border=0></P>
<P align=center><IMG src="Part1_2.files/img46.gif" border=0></P>
<P align=left>Case B : Total signal power = 1</P>
<UL>
<P align=left>Total channel capacity for optimum distribution =
15.431 bit<BR>Total channel capacity for uniform distribution =
13.317 bit<BR>PAM with Equalizer = 3.199 bit</P></UL>
<P align=left> </P>
<P align=center><IMG src="Part1_2.files/img47.gif" border=0></P>
<P align=center><IMG src="Part1_2.files/img48.gif" border=0></P>
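<P align=left>The computations behind Cases A and B can be sketched in the same way, treating each section as AWGN with effective noise Ni/ai². The attenuation profile below is a hypothetical exponential decay, so these numbers are not meant to reproduce the figures above:</P>

```python
import math

W, N, total_power = 8, 1e-3, 8.0          # Case A style: total signal power = 8
a = [2.0 ** -i for i in range(W)]         # hypothetical attenuation, halving per section
eff = [N / ai ** 2 for ai in a]           # water pouring sees N_i / a_i^2

# Bisection for the water level mu, with P_i = max(mu - eff_i, 0)
lo, hi = min(eff), max(eff) + total_power
for _ in range(200):
    mu = 0.5 * (lo + hi)
    if sum(max(mu - e, 0.0) for e in eff) > total_power:
        hi = mu
    else:
        lo = mu
P = [max(mu - e, 0.0) for e in eff]
M_opt = sum(math.log2(1 + p / e) for p, e in zip(P, eff))
M_uni = sum(math.log2(1 + (total_power / W) / e) for e in eff)
print(M_opt, M_uni)   # the optimum distribution beats the uniform one
```

<P align=left>With this steep attenuation, the two most attenuated sections fall above the water level and receive no power.</P>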
<P align=left> </P></TD></TR></TBODY></TABLE>
<P> </P></BODY></HTML>