chap2.tex
Concrete_Mathematics_2nd_Ed_TeX_Source_Code (TeX source)
\enddisplay
Now we can work on this last sum and try to express it in terms of~$S_n$.
If we succeed, we obtain an equation whose solution is the sum we seek.
For example, let's use this approach to find the sum of a
\g If it's geometric, there should be a geometric proof.
\par\medskip\bigskip\noindent\qquad\unitlength=1pt
\beginpicture(54,108)(0,-108)
\put(0,0){\line(0,-1){108}}
\put(36,0){\line(0,-1){108}}
\put(48,0){\line(0,-1){36}}
\put(54,-12){\line(0,-1){96}}
\put(0,0){\line(1,0){48}}
\put(48,0){\line(0,-1){36}}
\put(0,-108){\line(1,0){54}}
\put(36,-36){\line(1,0){18}}
\put(48,-12){\line(1,0){6}}
\put(0,-5){\line(3,1){15}}
\put(0,-10){\line(3,1){30}}
\multiput(0,-15)(0,-5){19}{\line(3,1){36}}
\put(6,-108){\line(3,1){30}}
\put(21,-108){\line(3,1){15}}
\put(39.5,-108){\line(-1,3){3.5}}
\put(44.5,-108){\line(-1,3){8.5}}
\put(49.5,-108){\line(-1,3){13.5}}
\put(54,-106.5){\line(-1,3){18}}
\put(54,-91.5){\line(-1,3){18}}
\put(54,-76.5){\line(-1,3){13.5}}
\put(54,-61.5){\line(-1,3){8.5}}
\put(54,-46.5){\line(-1,3){3.5}}
\put(50,-36){\line(1,3){4}}
\put(48,-24){\line(1,3){4}}
\put(48,-33){\line(1,3){6}}
\put(38,0){\line(3,-1){10}}
\put(46,-36){\line(-3,1){10}}
\multiput(36,-4.0952)(0,-4.7619){6}{\line(3,-1){12}}
\endpicture\g
general {\it"geometric progression"},
\begindisplay
S_n=\sum_{0\le k\le n}ax^k\,.
\enddisplay
The general perturbation scheme in \eq(|perturb-setup|) tells us that
\begindisplay
S_n+ax^{n+1}=ax^0+\sum_{0\le k\le n}ax^{k+1}\,,
\enddisplay
and the sum on the right is $x\sum_{0\le k\le n}ax^k=xS_n$ by the distributive
law. Therefore $S_n+ax^{n+1}=a+xS_n$, and we can solve for $S_n$ to obtain
\begindisplay
\sum_{k=0}^n ax^k={a-ax^{n+1}\over 1-x},\qquad\hbox{for $x\ne1$}.
\eqno\eqref|geom-sum|
\enddisplay
(When $x=1$, the sum is of course simply $(n+1)a$.) The right-hand side
\g Ah yes, this formula was drilled into me in high school.\g
can be remembered as the first term included in the sum minus the first
term excluded (the term after the last), divided by $1$~minus
the term ratio.

That was almost too easy. Let's try the perturbation technique on a slightly
more difficult sum,
\begindisplay
S_n=\sum_{0\le k\le n}k\,2^k\,.
\enddisplay
In this case we have $S_0=0$, $S_1=2$, $S_2=10$, $S_3=34$, $S_4=98$;
what is the general formula? According to~\eq(|perturb-setup|) we have
\begindisplay
S_n+(n+1)2^{n+1}=\sum_{0\le k\le n}(k+1)2^{k+1}\,;
\enddisplay
so we want to express the right-hand sum in terms of $S_n$. Well, we
can break it into two sums with the help of the associative law,
\begindisplay
\sum_{0\le k\le n}k\,2^{k+1}\;+\;\sum_{0\le k\le n}2^{k+1}\,,
\enddisplay
and the first of the remaining sums is $2S_n$. The other sum is a
geometric progression, which equals $(2-2^{n+2})/(1-2)=2^{n+2}-2$
by \eq(|geom-sum|). Therefore we have $S_n+(n+1)2^{n+1}=2S_n+2^{n+2}-2$,
and algebra yields
\begindisplay
\sum_{0\le k\le n}k\,2^k=(n-1)2^{n+1}+2\,.
\enddisplay
Now we understand why $S_3=34$: It's $32+2$, not $2\cdot17$.

A similar derivation with $x$ in place of~$2$
would have given us the equation $S_n+(n+1)x^{n+1}=xS_n+(x-x^{n+2})/(1-x)$;
hence we can deduce that
\begindisplay
\sum_{k=0}^n kx^k={x-(n+1)x^{n+1}+nx^{n+2}\over(1-x)^2}\,,
\qquad\hbox{for $x\ne1$.}
\eqno\eqref|k-x-to-the-k|
\enddisplay
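As a quick check, setting $x={1\over2}$ and $n=2$ makes the left side
$0+{1\over2}+{2\over4}=1$, while the right side is
\begindisplay
{{1\over2}-3\cdot{1\over8}+2\cdot{1\over16}\over\bigl(1-{1\over2}\bigr)^2}
={1/4\over1/4}=1\,,
\enddisplay
so the two sides agree.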
It's interesting to note that we could have derived this closed form in a
completely different way, by using elementary techniques of
differential "calculus". If we start
with the equation
\begindisplay
\sum_{k=0}^n x^k={1-x^{n+1}\over 1-x}
\enddisplay
and take the "derivative" of both sides with respect to~$x$, we get
\begindisplay
\sum_{k=0}^n kx^{k-1}={(1{-}x)\bigl(-(n{+}1)x^n\bigr)+1{-}x^{n+1}\over(1-x)^2}
={1-(n{+}1)x^n+nx^{n+1}\over(1-x)^2}\,,
\enddisplay
because the derivative of a sum is the sum of the derivatives of its terms.
We will see many more connections between calculus and discrete mathematics
in later chapters.

\beginsection 2.4 Multiple Sums

The terms of a sum might be specified by two or more indices, not just by one.
"!sums, multiple"
For example, here's a "double sum" of nine terms, governed
\g Oh no, a nine-term governor.\g
by two indices $j$ and~$k$:
\g\vskip19pt Notice that this doesn't mean to sum over all\/ ${j\ge1}$ and all\/ ${k\le3}$.\g
\begindisplay
\sum_{1\le j,k\le3}a_jb_k=\vtop{\halign{$#\hfil$\cr
  a_1b_1+a_1b_2+a_1b_3\cr
  \quad{}+a_2b_1+a_2b_2+a_2b_3\cr
  \quad{}+a_3b_1+a_3b_2+a_3b_3\,.\cr}}
\enddisplay
We use the same notations and methods for such sums as we do for sums
with a single index. Thus, if $P(j,k)$ is a "property" of $j$ and~$k$, the
sum of all terms $a_{j,k}$ such that $P(j,k)$ is true can be written in two
ways, one of which uses "Iverson's convention" and sums over {\it all\/} pairs
of integers $j$ and~$k$:
\begindisplay
\sum_{P(j,k)}a_{j,k}=\sum_{j,k}a_{j,k}\,\bigl[P(j,k)\bigr]\,.
\enddisplay
Only one $\sum$ sign is needed, although there is more than one index of
summation; $\sum$ denotes a sum over all combinations of indices that apply.
We also have occasion to use two $\sum$'s, when we're talking about a
sum of sums. For example,
\begindisplay
\sum_j\sum_k a_{j,k}\,\bigl[P(j,k)\bigr]
\enddisplay
is an abbreviation for
\begindisplay
\sum_j\biggl(\sum_k a_{j,k}\,\bigl[P(j,k)\bigr]\biggr)\,,
\enddisplay
which is the sum, over all integers $j$, of $\sum_k a_{j,k}\,\bigl[P(j,k)\bigr]$,
\g Multiple\/ {\textsl\char6}'s are evaluated right to left (inside-out).\g
the latter being the sum over all integers~$k$ of all terms $a_{j,k}$ for
which $P(j,k)$ is true. In such cases we say that the
double sum is ``summed first on~$k$.\qback''
A sum that depends on more than one index can be summed first on any one
of its indices.

In this regard we have a basic law called {\it "interchanging the order of
summation"}, which generalizes the associative law \eq(|assoc-law|) we
saw earlier:
\begindisplay
\sum_j\sum_k a_{j,k}\bigl[P(j,k)\bigr]=\sum_{P(j,k)}a_{j,k}=\sum_k\sum_j a_{j,k}\bigl[P(j,k)\bigr]\,.
\eqno\eqref|interch|
\enddisplay
The middle term of this law is a sum over two indices. On the left,
$\sum_j\sum_k$ stands for summing first on~$k$, then on~$j$. On the right,
$\sum_k\sum_j$ stands for summing first on~$j$, then on~$k$. In practice when
we want to evaluate a double sum in closed form, it's usually easier to
sum it first on one index rather than on the other; we get to choose whichever is
more convenient.
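For instance, if $P(j,k)$ says that both $j$ and~$k$ lie between 1 and~2,
then \eq(|interch|) merely says that the four terms $a_{1,1}$, $a_{1,2}$,
$a_{2,1}$, $a_{2,2}$ can be added by rows or by columns:
\begindisplay
(a_{1,1}+a_{1,2})+(a_{2,1}+a_{2,2})=\sum_{1\le j,k\le2}a_{j,k}
=(a_{1,1}+a_{2,1})+(a_{1,2}+a_{2,2})\,.
\enddisplay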
Sums of sums are no reason to panic,
\g Who's panicking? I~think this rule is fairly obvious compared to
some of the stuff in Chapter~1.\g
but they can appear confusing to a beginner, so let's do some more examples.
The nine-term sum we began with
provides a good illustration of the manipulation of double sums,
because that sum can actually be simplified, and the simplification
process is typical of what we can do with $\sum\sum$'s:
\begindisplay
\sum_{1\le j,k\le3}a_jb_k&=\sum_{j,k}a_jb_k\[1\le j,k\le3]
 =\sum_{j,k}a_jb_k\[1\le j\le3]\[1\le k\le3]\cr
&=\sum_j\sum_k a_jb_k\[1\le j\le3]\[1\le k\le3]\cr
&=\sum_j a_j\[1\le j\le3]\sum_k b_k\[1\le k\le3]\cr
&=\sum_j a_j\[1\le j\le3]\biggl(\sum_k b_k\[1\le k\le3]\biggr)\cr
&=\biggl(\sum_j a_j\[1\le j\le3]\biggr)\biggl(\sum_k b_k\[1\le k\le3]\biggr)\cr
&=\biggl(\sum_{j=1}^3a_j\biggr)\biggl(\sum_{k=1}^3 b_k\biggr)\,.\cr
\enddisplay
The first line here denotes a sum of nine terms in no particular order.
The second line groups them in threes,
$(a_1b_1+a_1b_2+a_1b_3)+(a_2b_1+a_2b_2+a_2b_3) +(a_3b_1+a_3b_2+a_3b_3)$.
The third line uses the distributive law to factor out the $a$'s, since
$a_j$ and $\[1\le j\le3]$ do not depend on~$k$; this gives
$a_1(b_1+b_2+b_3)+a_2(b_1+b_2+b_3)+a_3(b_1+b_2+b_3)$.
The fourth line is the same as the third, but with a redundant pair of
parentheses thrown in so that the fifth line won't look so mysterious.
The fifth line factors out the $(b_1+b_2+b_3)$ that occurs for each
value of~$j$: $(a_1+a_2+a_3)(b_1+b_2+b_3)$. The last line is just another
way to write the previous line. This method of derivation can be used
to prove a {\it general "distributive law"},
\begindisplay
\sum\twoconditions{j\in J}{k\in K}a_jb_k=\biggl(\,\sum_{j\in J}a_j\biggr)\biggl(\,\sum_{k\in K}b_k\biggr)\,,
\eqno\eqref|gen-dist-law|
\enddisplay
valid for all sets of indices $J$ and $K$.
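For example, taking $a_j=j$ and $b_k=k$ with $J=K=\{1,2,\ldots,n\}$ expresses
a double sum of $n^2$ products as the square of a sum we already know how to
evaluate:
\begindisplay
\sum_{1\le j,k\le n}jk=\biggl(\,\sum_{j=1}^n j\biggr)\biggl(\,\sum_{k=1}^n k\biggr)
=\Bigl({n(n+1)\over2}\Bigr)^{\!2}.
\enddisplay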
The basic law \eq(|interch|) for interchanging the order of summation
has many variations, which arise when we want to restrict the ranges of the indices
instead of summing over all integers $j$ and~$k$. These variations come
in two flavors, "vanilla" and "rocky road". First, the vanilla version:
\begindisplay
\sum_{j\in J}\sum_{k\in K}a_{j,k}=\sum\twoconditions{j\in J}{k\in K}a_{j,k}=\sum_{k\in K}\sum_{j\in J}a_{j,k}\,.
\eqno\eqref|vanilla|
\enddisplay
This is just another way to write \eq(|interch|), since the
Iversonian $\[j\in J,k\in K]$
factors into $\[j\in J]\[k\in K]$. The vanilla-flavored law applies
whenever the ranges of $j$ and~$k$ are independent of each other.

The rocky-road formula for interchange is a little trickier. It applies when the
range of an inner sum depends on the index variable of the outer sum:
\begindisplay
\sum_{j\in J}\,\sum_{k\in K(j)}a_{j,k}=\sum_{k\in K'}\,\sum_{j\in J'(k)}a_{j,k}\,.
\eqno
\enddisplay
Here the sets $J$, $K(j)$, $K'$, and $J'(k)$ must be related in such a way
that
\begindisplay
\[j\in J]\bigl[k\in K(j)\bigr]=\[k\in K']\bigl[j\in J'(k)\bigr]\,.
\enddisplay
A factorization like this is always possible in principle, because we can let
$J=K'$ be the set of all integers and $K(j)=J'(k)$ be the basic property
$P(j,k)$ that governs a double sum. But there are important special cases
where the sets $J$, $K(j)$, $K'$, and $J'(k)$ have a simple form.
These arise frequently in applications. For example, here's a particularly
useful factorization:
\begindisplay
\[1\le j\le n]\[j\le k\le n]=\[1\le j\le k\le n]=\[1\le k\le n]\[1\le j\le k]\,.
\eqno
\enddisplay
This Iversonian equation allows us to write
"!factorization of summation conditions"
\begindisplay
\sum_{j=1}^n\sum_{k=j}^n a_{j,k}=\sum_{1\le j\le k\le n} a_{j,k}=\sum_{k=1}^n\sum_{j=1}^k a_{j,k}\,.
\eqno\eqref|rocky-road|
\enddisplay
One of these two
\g(Now is a good time to do warmup exercises |triple| and |iverson-sum|.)
"!food"
\smallskip(Or to check out the Snickers bar languishing in the freezer.)\g
sums of sums is usually easier to evaluate than the other; we can use
\thiseq\ to switch from the hard one to the easy one.

Let's apply these ideas to a useful example. Consider the array
\begindisplay
\left[\matrix{a_1 a_1 & a_1 a_2 & a_1 a_3 & \ldots & a_1 a_n \cr
a_2 a_1 & a_2 a_2 & a_2 a_3 & \ldots & a_2 a_n \cr
a_3 a_1 & a_3 a_2 & a_3 a_3 & \ldots & a_3 a_n \cr
\vdots  & \vdots  & \vdots  & \ddots & \vdots  \cr
a_n a_1 & a_n a_2 & a_n a_3 & \ldots & a_n a_n \cr}\right]
\enddisplay
of $n^2$ products $a_ja_k$. Our goal will be to find a simple formula for
\def\uppertriangle{@   \vbox{\hrule\kern-.2pt\hbox{\smallln\char'100\kern-.2pt\vrule\kern-.2pt}}}
\def\lowertriangle{@   \vbox{\hbox{\kern-.2pt\vrule\kern-.2pt\smallln\char'100}\kern-.2pt\hrule}}
\begindisplay
S_{\uppertriangle}=\sum_{1\le j\le k\le n}a_ja_k\,,
\enddisplay
the sum of all elements on or above the main diagonal of this array.
"!triangular array"
Because $a_ja_k=a_ka_j$, the array is symmetrical about its main diagonal;
therefore $S_{\uppertriangle}$ will be approximately half the sum
of {\it all\/} the elements (except for a fudge
\g Does rocky road have fudge in it?\g
factor that takes account of the main diagonal).

Such considerations motivate the following manipulations. We have
\begindisplay
S_{\uppertriangle}=\sum_{1\le j\le k\le n}a_ja_k
 =\sum_{1\le k\le j\le n}a_ka_j
 =\sum_{1\le k\le j\le n}a_ja_k
 =S_{\lowertriangle}\,,
\enddisplay
because we can rename $(j,k)$ as $(k,j)$. Furthermore, since
\begindisplay
\[1\le j\le k\le n]+\[1\le k\le j\le n]=\[1\le j,k\le n]+\[1\le j=k\le n]\,,
\enddisplay
we have
\begindisplay
2S_{\uppertriangle}=S_{\uppertriangle}+S_{\lowertriangle}
 =\sum_{1\le j,k\le n}a_ja_k\;+\;\sum_{1\le j=k\le n}a_ja_k\,.
\enddisplay
The first sum is $\bigl(\sum_{j=1}^n a_j\bigr)\bigl(\sum_{k=1}^n a_k\bigr)=\bigl(\sum_{k=1}^n a_k\bigr)^2$, by the general distributive law
\eq(|gen-dist-law|). The second sum is $\sum_{k=1}^n a_k^2$. Therefore
we have
\begindisplay
S_{\uppertriangle}=\sum_{1\le j\le k\le n}a_ja_k
 =\half \Biggl(\biggl(\sum_{k=1}^n a_k\biggr)^{\!2} +\sum_{k=1}^n a_k^2\Biggr)\,,
\eqno\eqref|upper-triangle|
\enddisplay
an expression for the upper triangular sum in terms of simpler single sums.

Encouraged by such success, let's look at another double sum:
\begindisplay
S=\sum_{1\le j<k\le n}(a_k-a_j)(b_k-b_j)\,.
\enddisplay
Again we have symmetry when $j$ and $k$ are interchanged:
\begindisplay
S=\sum_{1\le k<j\le n}(a_j-a_k)(b_j-b_k)
 =\sum_{1\le k<j\le n}(a_k-a_j)(b_k-b_j)\,.
\enddisplay
So we can add $S$ to itself, making use of the identity
\begindisplay
\[1\le j<k\le n]+\[1\le k<j\le n]=\[1\le j,k\le n]-\[1\le j=k\le n]
