$$ \newcommand{\RR}{\mathbb{R}} \newcommand{\QQ}{\mathbb{Q}} \newcommand{\CC}{\mathbb{C}} \newcommand{\NN}{\mathbb{N}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\FF}{\mathbb{F}} % ALTERNATE VERSIONS % \newcommand{\uppersum}[1]{{\textstyle\sum^+_{#1}}} % \newcommand{\lowersum}[1]{{\textstyle\sum^-_{#1}}} % \newcommand{\upperint}[1]{{\textstyle\smallint^+_{#1}}} % \newcommand{\lowerint}[1]{{\textstyle\smallint^-_{#1}}} % \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\uppersum}[1]{U_{#1}} \newcommand{\lowersum}[1]{L_{#1}} \newcommand{\upperint}[1]{U_{#1}} \newcommand{\lowerint}[1]{L_{#1}} \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\partitions}[1]{\mathcal{P}_{#1}} \newcommand{\sampleset}[1]{\mathcal{S}_{#1}} \newcommand{\erf}{\operatorname{erf}} $$

21  Switching Limits

Highlights of this Chapter: we consider the delicate problem of switching the order of a limit and an infinite sum. We prove a theorem - the Dominated Convergence Theorem for Sums - that provides a condition under which this interchange is allowed, and explore a couple of consequences for double summations. This Dominated Convergence Theorem is the first of several analogous theorems that will play an important role in what follows.

The fact that an infinite series is defined as a limit - precisely the limit of partial sums - has been of great utility so far, as all of our techniques for dealing with series fundamentally rest on limit theorems for sequences!

\[\sum_{k\geq 0} a_k :=\lim_{N\to\infty}\sum_{k=0}^N a_k\]

But once we start to deal with multiple series at a time, this can present newfound difficulties. Indeed, it’s rather common in practice to end up with an infinite sequence of infinite series.

For example, imagine that a function \(f(x)\) is defined by a power series \(f(x)=\sum_{k\geq 0}a_kx^k\). If \(a\in\RR\) is some point in its domain, how could we hope to test continuity of \(f\) at \(a\)? Using the sequence definition of continuity, all we need to do is choose a sequence \(x_n\to a\) and attempt to evaluate \(\lim f(x_n)\). But for each \(n\) we know \(f(x_n)\) is defined as an infinite series! Thus, we are forced to deal with taking a limit of series - a limit of limits.

\[\lim_n f(x_n) = \lim_n \sum_{k\geq 0} a_kx_n^k = \lim_n \lim_N \sum_{k=0}^N a_kx_n^k\]

There’s an intuitive urge to just switch the order of the limits - equivalently, to “pull the limit inside the sum”. But such an operation is not always justified. It’s easy to come up with examples of limits that cannot be switched:

\[\lim_n\lim_m \frac{n}{n+m}=\lim_n\left(\lim_m \frac{n}{n+m}\right)=\lim_n 0=0\] \[\lim_m\lim_n\frac{n}{n+m} = \lim_m\left(\lim_n \frac{n}{n+m}\right)=\lim_m 1=1\]
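The disagreement above is easy to witness numerically. Here is a quick sketch (my own illustration in Python, not part of the argument) evaluating \(n/(n+m)\) with one variable large and the other fixed:

```python
# Numerical sketch (my own illustration, not part of the text) of the two
# iterated limits of n/(n+m): the answer depends on which variable grows first.

def f(n, m):
    return n / (n + m)

# Fix n and let m grow: n/(n+m) -> 0, so lim_n lim_m f = lim_n 0 = 0.
print([f(5, m) for m in (10, 1000, 100000)])   # shrinking toward 0

# Fix m and let n grow: n/(n+m) -> 1, so lim_m lim_n f = lim_m 1 = 1.
print([f(n, 5) for n in (10, 1000, 100000)])   # climbing toward 1
```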

Even worse (for us), this behavior can manifest even when dealing with series:

Example 21.1 \[\begin{align*} 1 &= \frac{1}{2}+\frac{1}{2}\\ &=\frac{1}{4}+\frac{1}{4}+\frac{1}{4}+\frac{1}{4}\\ &=\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8} \end{align*}\]

Taking the limit of each term and adding the results gives \[1=0+0+0+\cdots +0 = 0\]

This is nonsense! And the nonsense arises from implicitly exchanging two limits. To make this precise, one may define for each \(n\) the terms \[a_n(k)=\begin{cases} 1/2^n & 0\leq k<2^n\\ 0 & \mathrm{else} \end{cases} \]

Then each of the rows above is the sum \(1=\sum_{k\geq 0}a_n(k)\) for \(n=1,2,3\). Since this sequence of sums is constant, its limit is certainly \(1\); but the sum of the termwise limits is zero:

\[1=\lim_n \sum_{k\geq 0}a_n(k) \neq \sum_{k\geq 0}\lim_n a_n(k)=0\]
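One can watch this failure numerically. The sketch below (my own illustration) computes the rows of Example 21.1 and the termwise limits:

```python
# A quick check (my own illustration) of Example 21.1: each row sums to 1,
# yet every fixed term a_n(k) = 1/2^n tends to 0 as n -> infinity.

def a(n, k):
    return 1 / 2**n if 0 <= k < 2**n else 0.0

# Each row is a (finite) sum of 2^n copies of 1/2^n, totaling exactly 1 ...
print([sum(a(n, k) for k in range(2**n)) for n in range(1, 8)])

# ... but for any fixed k, the terms a_n(k) shrink to 0, so the
# "sum of the termwise limits" is 0, not 1.
print([a(n, 3) for n in (5, 10, 20)])
```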

So, it’s hopefully clear that to be able to use series in realistic contexts, we are in desperate need of a theorem which tells us when we can interchange limits and summations.

21.1 Dominated Convergence (Tannery’s Theorem)

Because limit interchange is so fundamental to analysis, there are many theorems of this sort, of varying strengths and complexities. The one we will visit here is usually called Tannery’s theorem (named for Jules Tannery, an analyst at the end of the 1800s). With the luxury of hindsight, we now realize Tannery’s theorem is a special case of a much more general result called Dominated Convergence, of which we will meet other special cases in the chapters to come. As such, I will call it by its more descriptive and general name throughout.

First, let’s set the stage precisely. For each \(n\), we have an infinite series \(s_n\), and we are interested in the limit \(\lim_n s_n\) (here, we will always write subscripts on the limit as multiple variables are involved!) For each fixed \(n\), the series \(s_n\) is an infinite sum, over some summation index \(k\):

\[s_n=\sum_{k\geq 0}a_k(n)\]

where for each \(k\) we write the term as \(a_k(n)\) to remember that it also depends on \(n\) (the notation \(a_{k,n}\) is also perfectly acceptable). We seek a theorem that gives us conditions under which we can take the term-wise limit, that is, when

\[\lim_n \sum_{k\geq 0}a_k(n)=\sum_{k\geq 0}\lim_n a_k(n)\]

Dominated convergence assures us that such a switch is justified so long as the entire process - all of the \(a_k(n)\) - is bounded by a convergent series.

Theorem 21.1 (Dominated Convergence for Series) For each \(k\) let \(a_k(n)\) be a function of \(n\), and assume the following:

  • For each \(k\), the limit \(\lim_n a_k(n)\) exists.
  • For each \(n\), \(\sum_k a_k(n)\) is convergent.
  • There is an \(M_k\) with \(|a_k(n)|\leq M_k\) for all \(n\).
  • \(\sum M_k\) is convergent.

Then \(\sum_k\lim_n a_k(n)\) is convergent, and \[\lim_n\sum_k a_k(n)=\sum_k\lim_n a_k(n)\]
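Before the proof, here is a small numerical sketch (my own example, not from the text) where every hypothesis holds - the terms \(a_k(n)=\frac{n}{n+1}\cdot\frac{1}{2^k}\) are dominated by \(M_k=1/2^k\) - and the interchange visibly succeeds:

```python
# A concrete sketch (my own example, not from the theorem) where all the
# hypotheses hold: a_k(n) = (n/(n+1)) * (1/2)^k is dominated by M_k = (1/2)^k,
# and sum(M_k) = 2 converges.

def a(k, n):
    return (n / (n + 1)) * 0.5**k

def series(n, terms=60):
    # partial sum approximating sum_k a_k(n); the tail past 60 terms is tiny
    return sum(a(k, n) for k in range(terms))

# lim_n sum_k a_k(n): the sums approach 2 as n grows ...
print([series(n) for n in (1, 10, 1000)])

# ... matching sum_k lim_n a_k(n) = sum_k (1/2)^k = 2, as the theorem predicts.
print(sum(0.5**k for k in range(60)))
```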

Proof. Write \(a_k:=\lim_n a_k(n)\), which exists by hypothesis. First, we show that \(\sum_k a_k\) converges. Since \(|a_k(n)|\leq M_k\) for all \(n\), this bound persists in the limit, so \(|a_k|=\lim_n |a_k(n)|\leq M_k\). Thus, by comparison we see \(\sum_k |a_k|\) converges, and hence so does \(\sum_k a_k\).

Now, the main event. Let \(\epsilon>0\). To show that \(\lim_n \sum_k a_k(n)=\sum_k a_k\) we will show that there is some \(N\) beyond which these two sums always differ by less than \(\epsilon\).

Since \(\sum_k M_k\) converges, by the Cauchy criterion there is some \(L\) where \[\sum_{k\geq L}M_k<\frac{\epsilon}{3}\]

For arbitrary \(n\), we compute

\[\begin{align*} \left|\sum_{k\geq 0} a_k(n)-\sum_{k\geq 0}a_k\right| &= \left|\sum_{k< L} (a_k(n)-a_k)+\sum_{k\geq L}a_k(n)-\sum_{k\geq L}a_k\right|\\ &\leq \left|\sum_{k< L} (a_k(n)-a_k)\right|+\left|\sum_{k\geq L}a_k(n)\right|+\left|\sum_{k\geq L}a_k\right|\\ &\leq \sum_{k< L}|a_k(n)-a_k| +\sum_{k\geq L}|a_k(n)|+\sum_{k\geq L}|a_k|\\ &\leq \sum_{k< L}|a_k(n)-a_k|+ 2\sum_{k\geq L}M_k\\ & < \sum_{k< L}|a_k(n)-a_k|+\frac{2\epsilon}{3} \end{align*}\]

That is, for an arbitrary \(n\) we can bound the difference essentially in terms of the first \(L\) terms: the rest contribute uniformly less than \(2\epsilon/3\). But for each of these \(L\) terms, we know that \(a_k(n)\to a_k\), so we can find an \(N\) making that difference as small as we like. For each \(k<L\), choose \(N_k\) such that \(|a_k(n)-a_k|<\epsilon/3L\) whenever \(n>N_k\), and then take

\[N=\max\{N_0,N_1,\ldots N_{L-1}\}\]

Now, for any \(n>N\) we are guaranteed that \(|a_k(n)-a_k|<\epsilon/3L\) for each \(k<L\), and thus that

\[\sum_{k<L}|a_k(n)-a_k|< L\frac{\epsilon}{3L}=\frac{\epsilon}{3}\]

Combining with the above, we now have for all \(n>N\), \[\left|\sum_{k\geq 0} a_k(n)-\sum_{k\geq 0}a_k\right|<\epsilon\] as required.

And here is a direct generalization to limits of functions (which are, after all, defined in terms of sequences!):

Theorem 21.2 (Dominated Convergence for Function Limits) For each \(k\), let \(f_k(x)\) be a function of \(x\) on a domain \(D\). For a fixed \(a\in\RR\), assume there is some interval \(I\subset D\) containing \(a\) such that:

  • For each \(k\), \(\lim_{x\to a}f_k(x)\) exists.
  • \(\sum_k f_k(x)\) is convergent for each \(x\in I\).
  • There is an \(M_k\) with \(|f_k(x)|\leq M_k\) for all \(x\in I\).
  • \(\sum M_k\) is convergent.

Then, the sum \(\sum_k\lim_{x\to a}f_k(x)\) is convergent and

\[\lim_{x\to a}\sum_k f_k(x)=\sum_k\lim_{x\to a}f_k(x)\]

Proof. Let \(x_n\subset I\) be an arbitrary sequence with \(x_n\to a\) and \(x_n\neq a\). We assumed \(\lim_{x\to a} f_k(x)\) exists. As \(x_n\to a\), it follows by definition that \(\lim_n f_k(x_n)=\lim_{x\to a}f_k(x)\), so this limit also exists, and so (1) holds. Additionally for each fixed \(n\), \(\sum_k f_k(x_n)\) is convergent, as \(x_n\in I\) and we assumed convergence for each \(x\in I\).

As we assumed \(M_k\) bounds \(|f_k(x)|\) for all \(x\in I\) it also does so for all \(x_n\) in our sequence, so (3) and (4) are satisfied for the original dominated convergence, Theorem 21.1. Thus, we may conclude that the series \(\sum_k \lim_n f_k(x_n)\) is convergent, and that \[\lim_n\sum_k f_k(x_n)=\sum_k \lim_n f_k(x_n)=\sum_k \lim_{x\to a}f_k(x)\]

Because \(x_n\) was arbitrary, this applies for all sequences \(x_n\to a\) with \(x_n\neq a\). Thus, the overall limit \(\lim_{x\to a}\sum_k f_k(x)\) exists, and is equal to this common value

\[\lim_{x\to a}\sum_k f_k(x)=\sum_k \lim_{x\to a}f_k(x)\]

There is a natural version of this theorem for products as well (though we will not need it in this course, I will state it here anyway).

Theorem 21.3 (\(\bigstar\) Dominated Convergence for Products) For each \(k\) let \(a_k(n)\) be a function of \(n\), and assume the following:

  • For each \(k\), the limit \(\lim_n a_k(n)\) exists.
  • For each \(n\), \(\prod_{k\geq 0} (1+a_k(n))\) is convergent.
  • There is an \(M_k\) with \(|a_k(n)|\leq M_k\) for all \(n\).
  • \(\sum M_k\) is convergent.

Then \(\prod_{k\geq 0}\lim_n (1+a_{k}(n))\) is convergent, and \[\lim_n\prod_{k\geq 0} (1+a_k(n))=\prod_{k\geq 0}(1+\lim_n a_k(n))\]

Exercise 21.1 Use Dominated Convergence to prove that

\[\frac{1}{2}=\lim_n \left[\frac{1+2^n}{2^n\cdot 3 +4}+\frac{1+2^n}{2^n\cdot 3^2 +4^2}+\frac{1+2^n}{2^n\cdot 3^3 +4^3}+\cdots\right]\]

  • Write in summation notation, and give a formula for the terms \(a_k(n)\)
  • Show that \(\lim_n a_k(n) =\frac{1}{3^k}\)
  • Show that for all \(n\), \(|a_k(n)|\leq \frac{2}{3^k}\)

Use these facts to show that the hypotheses of dominated convergence hold true, and then use the theorem to help you take the limit.
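As a numerical sanity check before attempting the proof (my own sketch, using the exercise’s terms \(a_k(n)\)), one can evaluate partial sums of the bracketed series for large \(n\):

```python
# A numerical sanity check (my own, using the exercise's setup) that the
# bracketed sum tends to 1/2, with terms a_k(n) = (1+2^n)/(2^n*3^k + 4^k).

def a(k, n):
    return (1 + 2**n) / (2**n * 3**k + 4**k)

def bracket(n, terms=80):
    # partial sum over k = 1, 2, ..., terms; the tail is geometrically small
    return sum(a(k, n) for k in range(1, terms + 1))

print([bracket(n) for n in (1, 5, 50)])  # approaching 1/2 as n grows
```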

21.2 Application: Continuity of Power Series

We will find several applications for dominated convergence during our study of calculus, proving analogs for both derivatives and integrals in later chapters. But our most immediate application is to the problem of continuity of power series originally posed at the beginning of this section: we can now easily prove that every power series is continuous on the interior of its interval of convergence.

Theorem 21.4 (Continuity within Radius of Convergence) Let \(f(x)=\sum_k a_kx^k\) be a power series with radius of convergence \(r\). Then if \(|x|<r\), \(f\) is continuous at \(x\).

Proof. Without loss of generality take \(x>0\), and let \(x_n\) be an arbitrary sequence in \((-r,r)\) converging to \(x\). We aim to show that \(f(x_n)\to f(x)\).

As \(x<r\) choose some \(y\) with \(x<y<r\) (perhaps, \(y=(x+r)/2\)). Since \(x_n\to x\) there is some \(N\) past which \(|x_n|\) is always less than \(y\) (take \(\epsilon = y-x\) and apply the definition of \(x_n\to x\), noting \(2x-y>-y\) since \(x>0\)). As truncating the terms of the sequence before this does not change its limit, we may without loss of generality assume that \(|x_n|<y\) for all \(n\). Thus, we may define \(M_k = |a_k| y^k\), and we are in a situation to verify the hypotheses of Dominated Convergence:

  • Since \(x_n\to x\), we have \(a_kx_n^k\to a_kx^k\) by the limit theorems.
  • For each \(n\), \(f(x_n)=\sum_k a_k x_n^k\) is convergent as \(x_n\) is within the radius of convergence.
  • \(M_k=|a_k|y^k\) bounds \(|a_kx_n^k|\) for all \(n\), as \(|x_n|<y\).
  • \(\sum_k M_k=\sum_k |a_k|y^k\) converges, as power series converge absolutely at points strictly within the radius of convergence, and \(y<r\).

Applying the theorem, we see \[\lim_n f(x_n)=\lim_n\sum_k a_kx_n^k=\sum_k \lim_n a_kx_n^k=\sum_k a_kx^k=f(x)\]

Thus for arbitrary \(x_n\to x\) we have \(f(x_n)\to f(x)\), so \(f\) is continuous at \(x\).
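As a concrete illustration (my own, not from the text), take the geometric series \(f(x)=\sum_k x^k\), which has radius of convergence \(r=1\) and sums to \(1/(1-x)\); we can watch \(f(x_n)\to f(x)\) numerically along a sequence \(x_n\to 1/2\):

```python
# Sketch (my own illustration) of Theorem 21.4 using the geometric series
# f(x) = sum_k x^k, which has radius of convergence r = 1 and sums to 1/(1-x).

def f(x, terms=200):
    # partial sum standing in for the power series (here a_k = 1 for every k)
    return sum(x**k for k in range(terms))

# A sequence x_n -> 1/2 staying inside (-1, 1):
xs = [0.5 + (-1)**n / (10 + n) for n in range(1, 6)]

print([f(x) for x in xs])  # values tending toward f(1/2)
print(f(0.5))              # close to 1/(1 - 1/2) = 2
```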

21.3 Application: Double Sums

Another useful application of dominated convergence is to switching the order of a double sum. A double sequence is a map \(\NN\times\NN\to \RR\), where we write \(a_{m,n}\) for the value \(a(m,n)\). Such sequences, like \(n/(n+m)\), occurred in our original example about switching limits above.

Given a double sequence, one may want to define a double sum

\[\sum_{m,n\geq 0}a_{m,n}\]

But, how should one do this? Because we have two indices, there are two possible orders we could attempt to compute this sum:

\[\sum_{n\geq 0}\sum_{m\geq 0}a_{m,n} \hspace{0.5cm}\textrm{or}\hspace{0.5cm}\sum_{m\geq 0}\sum_{n\geq 0}a_{m,n}\]

Definition 21.1 (Double Sum) Given a double sequence \(a_{m,n}\) its double sum \(\sum_{m,n\geq 0}a_{m,n}\) is defined if both orders of iterated summation converge, and are equal. In this case, the value of the double sum is defined to be their common value:

\[\sum_{m,n\geq 0}a_{m,n}:=\sum_{n\geq 0}\sum_{m\geq 0}a_{m,n} =\sum_{m\geq 0}\sum_{n\geq 0}a_{m,n}\]

We should be worried from previous experience that in general these two things need not be equal, so the double sum may not exist! Indeed, we can make this worry precise, by seeing that to relate one to the other is really an exchange of order of limits:

\[\sum_{m\geq 0}=\lim_M \sum_{0\leq m\leq M}\hspace{1cm}\sum_{n\geq 0}=\lim_N \sum_{0\leq n\leq N}\]

And so, expanding the above with these definitions (and using the limit laws to pull a limit out of a finite sum) we see

\[\sum_{n\geq 0}\sum_{m\geq 0}a_{m,n}=\lim_N \sum_{0\leq n\leq N}\left(\lim_M \sum_{0\leq m\leq M}a_{m,n}\right)\] \[=\lim_N\lim_M\left( \sum_{0\leq n\leq N}\sum_{0\leq m\leq M} a_{m,n}\right)=\lim_N\lim_M \sum_{\begin{smallmatrix}0\leq m\leq M\\ 0\leq n\leq N\end{smallmatrix}}a_{m,n}\]

Where in the final line we have put both indices under a single sum to indicate that it is a finite sum, and the order does not matter. Doing the same with the other order yields the exact same finite sum, but with the order of limits reversed:

\[\sum_{m\geq 0}\sum_{n\geq 0}a_{m,n}=\lim_M\lim_N \sum_{\begin{smallmatrix}0\leq m\leq M\\ 0\leq n\leq N\end{smallmatrix}}a_{m,n}\]

Because this is an exchange-of-limits-problem, we can hope to provide conditions under which it is allowed using Tannery’s theorem.

Theorem 21.5 Let \(a_{m,n}\) be a double sequence, and assume that either \[\sum_{m\geq 0}\sum_{n\geq 0}|a_{m,n}|\hspace{1cm}\textrm{or}\hspace{1cm}\sum_{n\geq 0}\sum_{m\geq 0}|a_{m,n}|\] converges. Then the double sum \[\sum_{m,n\geq 0}a_{m,n}\] also converges (meaning both orders of iterated sum converge, and are equal).
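As a quick numerical illustration (my own example, not part of any proof), take the absolutely convergent double sequence \(a_{m,n}=1/(2^m3^n)\); both iterated sums agree, just as the theorem predicts:

```python
# A quick numerical check (my own example) of Theorem 21.5: the double
# sequence a_{m,n} = 1/(2^m * 3^n) is absolutely convergent, and both
# iterated sums agree.

def a(m, n):
    return 1 / (2**m * 3**n)

M = N = 40  # truncation level; the neglected tails are geometrically small

n_outer = sum(sum(a(m, n) for m in range(M)) for n in range(N))  # sum over m first
m_outer = sum(sum(a(m, n) for n in range(N)) for m in range(M))  # sum over n first

# Both orders land near (sum_m 1/2^m) * (sum_n 1/3^n) = 2 * (3/2) = 3.
print(n_outer, m_outer)
```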

Exercise 21.2 (Cauchy’s Double Summation Formula) Use Dominated Convergence to prove the double summation formula (Theorem 21.5): without loss of generality, assume that \(\sum_{m\geq 0}\sum_{n\geq 0}|a_{m,n}|\) converges, and use this to show that both orders of iterated sum converge and are equal \[\sum_{m\geq 0}\sum_{n\geq 0}a_{m,n}=\sum_{n\geq 0}\sum_{m\geq 0}a_{m,n}\]

Hint: Assuming \(\sum_{m\geq 0}\sum_{n\geq 0}|a_{m,n}|\) converges, set \(M_m=\sum_{n\geq 0}|a_{m,n}|\) and show the various hypotheses of Dominated convergence apply

Exercise 21.3 (Applying the Double Sum) Since switching the order of limits involves commuting terms that are arbitrarily far apart, techniques like double summation allow one to prove many identities that are rather difficult to show directly. We will make a crucial use of this soon, in understanding exponential functions. But here is a first example:

For any \(k\in\NN\) and any \(z\) with \(|z|<1\), prove the following equality of infinite sums:

\[\frac{z^{1+k}}{1-z}+\frac{(z^2)^{1+k}}{1-z^2}+\frac{(z^3)^{1+k}}{1-z^3}+\cdots =\frac{z^{1+k}}{1-z^{1+k}}+\frac{z^{2+k}}{1-z^{2+k}}+\frac{z^{3+k}}{1-z^{3+k}}+\cdots\]

Hint: first write each side as a summation: \[\sum_{n\geq 1}\frac{z^{n(k+1)}}{1-z^n}=\sum_{m\geq 1}\frac{z^{m+k}}{1-z^{m+k}}\]

Then setting \(a_{m,n}=z^{n(m+k)}\), show that Cauchy summation applies to the double sum \(\sum_{m,n\geq 1}a_{m,n}\), and compute the sum in each order, arriving at the claimed equality.
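A numerical check of the claimed identity can build confidence before the proof. The sketch below is my own, with the illustrative choices \(z=0.3\) and \(k=2\) (any \(|z|<1\) should behave similarly):

```python
# A numerical check (my own sketch) of the claimed identity, with the
# illustrative choices z = 0.3 and k = 2.

z, k = 0.3, 2
terms = 200  # both tails decay geometrically, so 200 terms is plenty

lhs = sum(z**(n * (k + 1)) / (1 - z**n) for n in range(1, terms))
rhs = sum(z**(m + k) / (1 - z**(m + k)) for m in range(1, terms))

print(lhs, rhs)  # the two sums agree to many decimal places
```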