$$ \newcommand{\RR}{\mathbb{R}} \newcommand{\QQ}{\mathbb{Q}} \newcommand{\CC}{\mathbb{C}} \newcommand{\NN}{\mathbb{N}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\FF}{\mathbb{F}} % ALTERNATE VERSIONS % \newcommand{\uppersum}[1]{{\textstyle\sum^+_{#1}}} % \newcommand{\lowersum}[1]{{\textstyle\sum^-_{#1}}} % \newcommand{\upperint}[1]{{\textstyle\smallint^+_{#1}}} % \newcommand{\lowerint}[1]{{\textstyle\smallint^-_{#1}}} % \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\uppersum}[1]{U_{#1}} \newcommand{\lowersum}[1]{L_{#1}} \newcommand{\upperint}[1]{U_{#1}} \newcommand{\lowerint}[1]{L_{#1}} \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\partitions}[1]{\mathcal{P}_{#1}} \newcommand{\sampleset}[1]{\mathcal{S}_{#1}} \newcommand{\erf}{\operatorname{erf}} $$

19  Convergence

Highlights of this Chapter: Finding the value of a series explicitly is difficult, so we develop some theory to determine convergence without explicitly finding the limit. Our main tool is comparison, which is built using the Monotone Convergence Theorem; and in particular comparison with a geometric series - the Ratio Test. Along the way to developing this theory we study a few important special series:

  • We prove the harmonic series \(\sum\frac{1}{n}\) diverges.
  • In contrast, we prove that the sum of reciprocal squares \(\sum\frac{1}{n^2}\) converges. In the final project we will show its value is \(\pi^2/6\).

In this section, we build up some technology to prove the convergence (and divergence) of series, without explicitly being able to compute the limit of partial sums. Such results will prove incredibly useful, as in the future we will encounter many theorems of the form if \(\sum a_n\) converges, then… and we will need a method of proving convergence to continue.

19.1 The Cauchy Criterion

For sequences, after some work we were able to find a definition equivalent to the original notion of convergence, which did not mention the precise value of the limit. This is exactly the sort of thing we seek for our investigation into series, so we carry it over directly here:

Definition 19.1 (Cauchy Criterion) A series \(\sum a_n\) satisfies the Cauchy criterion if for every \(\epsilon>0\) there is an \(N\) such that for any \(n\geq m>N\) we have \[\left|\sum_{k=m}^n a_k\right|<\epsilon\]

Exercise 19.1 Prove a series satisfies the Cauchy criterion if and only if its sequence of partial sums is a Cauchy sequence.

Because we know that being convergent and being Cauchy are equivalent, every series that satisfies the Cauchy criterion is convergent; and conversely, if a series does not satisfy it, that series must diverge. We use this second observation to construct an easy-to-apply test for divergence:

Corollary 19.1 (Divergence Test) If a series \(\sum a_n\) converges, then \(\lim a_n=0\). Equivalently, if \(a_n\not\to 0\) then \(\sum a_n\) diverges.

Proof. Let’s apply the Cauchy criterion with \(n=m\), so the sum contains the single term \(a_m\). This says for all \(\epsilon>0\) there is some \(N\) where for \(m>N\) we have \[\left|\sum_{k=m}^{m}a_k\right|=|a_m|<\epsilon\]

But making \(|a_m|<\epsilon\) for all \(m>N\) is exactly the definition of \(a_m\to 0\).

This is useful mostly to immediately rule out the possibility that certain series converge. For instance, it tells us that \(\sum (1+\frac{1}{n})\) must diverge, as the terms approach \(1\), not zero. But when the terms do approach zero it’s not very helpful: there are many series with \(a_n\to 0\) which converge, and many which diverge. To distinguish between these, we need to build up some more powerful tools.
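Both behaviors are easy to see numerically. The sketch below (the specific cutoffs are arbitrary choices of this illustration, not part of the text) checks that the terms of \(\sum(1+\frac{1}{n})\) stay near \(1\), while the harmonic terms \(\frac{1}{n}\) do approach zero even though the partial sums keep growing.

```python
# Illustration: the divergence test rules out sum(1 + 1/n), but is silent
# about sum(1/n), whose terms vanish while its partial sums grow.
term = 1 + 1/10**6
assert abs(term - 1) < 1e-5      # terms -> 1, not 0: the series diverges

harmonic_term = 1/10**6
assert harmonic_term < 1e-5      # terms -> 0 ...

H = sum(1/k for k in range(1, 100001))
assert H > 10                    # ... yet the partial sums pass 10 already
```

Of course no finite computation proves divergence; the point is only that the divergence test gives no information once \(a_n\to 0\).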

19.1.1 Absolute Convergence

Below we will develop several theorems that apply exclusively to series of positive terms. That may seem at first to be a significant obstacle, as many series involve both addition and subtraction! So, we take some time here to assuage such worries, and provide a means of probing a general series using information about its nonnegative counterpart.

Definition 19.2 (Absolute Convergence) A series \(\sum a_n\) converges absolutely if the associated series of absolute values \(\sum |a_n|\) is convergent.

Of course, such a definition is only useful if facts about the nonnegative series imply facts about the original. Happily, that is the case.

Theorem 19.1 (Absolute Convergence Implies Convergence) Every absolutely convergent series is a convergent series.

Proof. Let \(\sum a_n\) be absolutely convergent. Then \(\sum |a_n|\) converges, and its partial sums satisfy the Cauchy criterion. This means for any \(\epsilon>0\) we can find an \(N\) where for all \(n\geq m>N\) \[|a_m|+|a_{m+1}|+\cdots+|a_n|<\epsilon\]

But, by the triangle inequality we know that \[|a_m+a_{m+1}+\cdots+a_n|\leq |a_m|+|a_{m+1}|+\cdots+|a_n|\] Thus, our original series \(\sum a_k\) satisfies the Cauchy criterion, as \[\left|\sum_{k=m}^na_k\right|<\epsilon\] And, since Cauchy is equivalent to convergence, this implies \(\sum a_k\) is a convergent series.
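A concrete check of the theorem, on a geometric example chosen for this illustration (it is not from the text): for \(a_n=(-1/2)^n\), the absolute series \(\sum (1/2)^n\) converges to \(2\), and the theorem promises the signed series converges too - here to \(\frac{1}{1+1/2}=\frac{2}{3}\).

```python
# a_n = (-1/2)**n: absolutely convergent, hence convergent.
terms = [(-0.5)**n for n in range(60)]
abs_sum = sum(abs(t) for t in terms)    # partial sums of sum |a_n| -> 2
signed_sum = sum(terms)                 # partial sums of sum a_n -> 2/3
assert abs(abs_sum - 2) < 1e-12
assert abs(signed_sum - 2/3) < 1e-12
```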

19.2 Comparison

One of the most useful convergence tests for a series is comparison. This lets us show that a series we care about (that may be hard to compute with) converges or diverges by comparing it to a simpler series - much like the squeeze theorem did for us with sequences. This theorem gives less information than the squeeze theorem (it doesn’t give us the exact value of the series we are interested in), but it is also easier to use (it only requires a bound on one side, not an upper and lower bound with the same limit).

Theorem 19.2 (Comparison For Series) Let \(\sum a_n\) and \(\sum b_n\) be two series of nonnegative terms, with \(0\leq a_n\leq b_n\).

  • If \(\sum b_n\) converges, then \(\sum a_n\) converges.
  • If \(\sum a_n\) diverges, then \(\sum b_n\) diverges.

The proof is just a rehashing of our old friend, Monotone Convergence.

Proof. We prove the first of the two claims, and leave the second as an exercise. First note that if \(x_n\geq 0\) then the sequence of partial sums \(s_n=\sum_{k=0}^n x_k\) is monotone increasing: by definition \(s_{n}=s_{n-1}+x_n\) and \(x_n\geq 0\), so \(s_{n}\geq s_{n-1}\) for all \(n\).

Thus, the partial sums of \(\sum a_n\) and \(\sum b_n\) are monotone sequences. If \(\sum b_n\) converges, we know by the Monotone Convergence Theorem that its limit \(\beta\) is the supremum of the partial sums, so for all \(n\) \[\sum_{k=0}^n b_k \leq \beta\] But, since \(a_k\leq b_k\) for all \(k\), the same is true of the partial sums \[\sum_{k=0}^n a_k\leq \sum_{k=0}^n b_k\] Stringing these inequalities together, we see that the partial sums of \(\sum a_k\) are bounded above by \(\beta\). Since they are also monotone (as sums of nonnegative terms), the Monotone Convergence Theorem assures us that \(\sum a_k\) converges, as claimed.

Exercise 19.2 Let \(\sum a_n\) and \(\sum b_n\) be two series of nonnegative terms, with \(0\leq a_n\leq b_n\). Prove that if \(\sum a_n\) diverges, then \(\sum b_n\) diverges.

The comparison test is incredibly useful: two of the most famous series it lets us understand are left as exercises below.

Exercise 19.3 Prove that \(\sum \frac{1}{n^2}\) converges. Hint: compare with \(\frac{1}{(n-1)n}\), which telescopes.

Exercise 19.4 Show the harmonic series \(\sum \frac{1}{n}\) diverges, by comparing it with the partial sums of \[1, 1/2, 1/4, 1/4, 1/8, 1/8, 1/8, 1/8, 1/16,...\]
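A numerical companion to these two exercises (an illustration only, not a proof; the cutoffs are arbitrary). For Exercise 19.3, since \(\frac{1}{n^2}\leq\frac{1}{(n-1)n}\) for \(n\geq 2\) and the latter telescopes, every partial sum of \(\sum\frac{1}{n^2}\) stays below \(2\). For Exercise 19.4, grouping \(\frac{1}{n}\) into blocks of length \(2^j\), each block sums to at least \(\frac{1}{2}\), so the partial sum at \(n=2^k\) is at least \(1+\frac{k}{2}\).

```python
# Exercise 19.3: comparison with the telescoping series bounds the partial sums.
N = 10**5
recip_squares = 1 + sum(1/n**2 for n in range(2, N + 1))
telescoped = 1 + sum(1/((n - 1)*n) for n in range(2, N + 1))  # = 2 - 1/N exactly
assert recip_squares < telescoped < 2

# Exercise 19.4: the block lower bound H_{2^k} >= 1 + k/2 forces divergence.
k = 15
H = sum(1/n for n in range(1, 2**k + 1))
assert H >= 1 + k/2
```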

19.3 The Ratio Test

We saw in the last chapter that geometric series - where the ratio of consecutive terms is constant - are particularly easy to sum. Now that we have comparison, we can leverage this to provide a powerful convergence test for a much larger collection of series: those whose consecutive ratios are constant in the limit.

Theorem 19.3 (The Ratio Test) Let \(\sum a_n\) be a series, and assume that the sequence of consecutive ratios converges, \[\lim \left|\frac{a_{n+1}}{a_n}\right|=\alpha\] Then \(\sum a_n\) converges if \(\alpha<1\), and diverges if \(\alpha>1\).

Proof. We prove the convergence claim for \(\alpha<1\) here, and leave the divergence for \(\alpha>1\) as an exercise.

Assume that \(\lim \left|\frac{a_{n+1}}{a_n}\right|=\alpha<1\), fix some \(r\) with \(\alpha<r<1\), and let \(N\) be such that for all \(n\geq N\) we have \[\left|\frac{a_{n+1}}{a_n}\right|<r\] (such an \(N\) exists by taking \(\epsilon=r-\alpha\) in the convergence hypothesis). Thus, for all \(n\geq N\) we know \(|a_{n+1}|<r|a_n|\), and so inductively \(|a_{N+n}|\leq r^n|a_N|\). Summing, we see that \[\sum_{k=0}^n |a_{N+k}|\leq\sum_{k=0}^n r^k|a_N|\] Thus, starting from the \(N^{th}\) term, our series is bounded above by a multiple of a geometric series! And, since we know geometric series with \(0<r<1\) converge, we can use comparison to see that \(\sum_{k\geq N}|a_k|\) is convergent.

But the first finitely many terms of a series cannot affect whether or not it converges, so we see that

\[\sum_{k\geq 0}|a_k|\textrm{ is convergent}\]

This is the definition of \(\sum a_k\) being absolutely convergent, and thus \(\sum a_k\) is itself convergent.
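The proof can be watched in action numerically. The example \(a_n=\frac{n}{2^n}\) is an assumption of this sketch (it is not from the text): its consecutive ratios \(\frac{n+1}{2n}\) approach \(\frac{1}{2}<1\), so the tail is dominated by a geometric series and the partial sums converge (in fact, to \(2\)).

```python
# Ratio test in action on a_n = n / 2**n: ratios -> 1/2, so the series converges.
def a(n):
    return n / 2**n

ratio = a(200) / a(199)      # (200/199) * (1/2), already close to 1/2
assert abs(ratio - 0.5) < 0.01

partial = sum(a(n) for n in range(1, 200))
assert abs(partial - 2) < 1e-9   # the sum of the full series is 2
```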

Exercise 19.5 Prove that if \(\lim \left|\frac{a_{n+1}}{a_n}\right|>1\), the series \(\sum a_n\) diverges.

Note that this test does not tell us anything when \(\alpha=1\): it only says that the terms of our series shrink more slowly than those of any geometric series with \(r<1\), but more quickly than those of one with \(r>1\). There is plenty of room for both behaviors in this gap:

Example 19.1 The series \(\sum \frac{1}{n}\) diverges, but its limiting ratio is \[\lim \left|\frac{\frac{1}{n+1}}{\frac{1}{n}}\right|=\lim\left|\frac{n}{n+1}\right|=1\]

But, the series \(\sum \frac{1}{n^2}\) converges, with the same limiting ratio: \[\lim \left|\frac{\frac{1}{(n+1)^2}}{\frac{1}{n^2}}\right|=\lim\left|\frac{n^2}{(n+1)^2}\right|=1\]
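A quick numeric check of the two ratios in Example 19.1 (the choice of \(n\) is arbitrary): both are already indistinguishable from \(1\), even though one series diverges and the other converges.

```python
# Both limiting ratios in Example 19.1 equal 1: the ratio test is inconclusive.
n = 10**6
ratio_harmonic = (1/(n + 1)) / (1/n)         # n/(n+1)
ratio_squares = (1/(n + 1)**2) / (1/n**2)    # n^2/(n+1)^2
assert abs(ratio_harmonic - 1) < 1e-5
assert abs(ratio_squares - 1) < 1e-5
```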

Remark 19.1. There is an even more general version of the ratio test where we don’t assume that \(|a_{n+1}/a_n|\) converges, but only that it is eventually strictly bounded above by \(1\). Precisely, all that’s actually required is \(\lim_{N}\sup\left\{\left|\tfrac{a_{n+1}}{a_n}\right|\mid n\geq N\right\}<1\).

Exercise 19.6 Prove that the following series converges: \[\sum_{n\geq 0}\frac{1}{n!}\]

19.4 \(\bigstar\) Other Convergence Tests

Because series are ubiquitous throughout mathematics, there are many more convergence theorems that have been developed than we have the time to cover here. Though we will not need them in our course, I list two of the most popular (following the ratio test) below for reference.

Theorem 19.4 (The Root Test) Let \(\sum a_n\) be a series, and assume that the sequence of \(n^{th}\) roots converges, \[\lim \sqrt[n]{|a_n|}=\alpha\] Then \(\sum a_n\) converges if \(\alpha<1\), and diverges if \(\alpha>1\).
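A numeric look at the root test, on the example \(a_n=\frac{n^3}{2^n}\) (this series is an assumption of the sketch, not from the text): the \(n^{th}\) roots \(\sqrt[n]{|a_n|}=\frac{n^{3/n}}{2}\) approach \(\frac{1}{2}<1\), so the series converges.

```python
# Root test illustration: n-th roots of a_n = n**3 / 2**n approach 1/2.
n = 500
a_n = n**3 / 2**n          # still representable as a float at this n
root = a_n ** (1/n)        # equals n**(3/n) / 2, tending to 1/2
assert abs(root - 0.5) < 0.05
```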

The following test shows up in a Calculus II course; though we are not ready to rigorously discuss it yet as it requires integration. Once we gain some ability with integrals, this will allow us to leverage our abilities with the Fundamental Theorem to prove new facts about series.

Theorem 19.5 (The Integral Test) If \(f\) is a continuous, positive, decreasing function on \([1,\infty)\) and \(a_n=f(n)\) is defined by evaluating \(f\) at integer values, then the sum \(\sum a_n\) converges if and only if the integral \(\int_1^\infty f(x)dx\) converges.
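The comparison underlying this test can be previewed numerically (an illustration under the assumption \(f(x)=\frac{1}{x}\), which is positive and decreasing): since \(\frac{1}{n}\geq\int_n^{n+1}\frac{dx}{x}\), the partial sums of the harmonic series dominate \(\ln(N+1)\), and diverge along with the integral.

```python
import math

# Each term 1/n bounds the integral of 1/x over [n, n+1] from above,
# so the harmonic partial sums dominate log(N + 1).
N = 10**5
H = sum(1/n for n in range(1, N + 1))
assert H >= math.log(N + 1)
```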

19.5 \(\bigstar\) Conditionally Convergent Series

Definition 19.3 A series converges conditionally if it converges, but is not absolutely convergent.

Such series caused much trouble in the foundations of analysis, as they can exhibit rather strange behavior. We met one such series in the introduction, the alternating sum of \(1/n\) which seemed to converge to different values depending on the order we added its terms. Here we begin an investigation into such phenomena.

19.5.1 Alternating Series

Definition 19.4 (Alternating Series) An alternating series is a series of the form \(\sum (-1)^n a_n\) for \((a_n)\) a sequence of nonnegative terms. That is, the terms alternate in sign.

Theorem 19.6 (Alternating Series Test) If \(\sum (-1)^na_n\) is alternating, then it converges if \(a_n\) decreases monotonically with limit zero.

Before jumping in, it’s helpful to take a look at a few partial sums to start. For example, \(s_4\):

\[s_4=a_0-a_1+a_2-a_3+a_4=(a_0-a_1)+(a_2-a_3)+a_4\]

Grouping the terms of this finite sum like so shows that \(s_4\) is a sum of nonnegative numbers (since \(a_n\) is decreasing, \(a_n-a_{n+1}\geq 0\)): thus \(s_4\geq 0\).

\[s_4=a_0-a_1+a_2-a_3+a_4=a_0-(a_1-a_2)-(a_3-a_4)\] This grouping shows \(s_4\) is equal to \(a_0\) minus a bunch of nonnegative terms: thus \(s_4\leq a_0\). This extends directly:

Exercise 19.7 Let \(s_n=\sum_{k=0}^n (-1)^ka_k\) be an alternating series with \(a_n\to 0\) monotonically. Prove by induction that

  • All the partial sums \(s_n\) are nonnegative.
  • All partial sums are bounded above by the first term \(a_0\).

Corollary 19.2 Starting the sum at \(N\) instead of \(0\), the same argument shows that \(\left|\sum_{k=N}^n (-1)^ka_k\right|\leq |a_N|\) for all \(n\geq N\).

What other patterns can we notice? Increasing from \(s_4\) to \(s_6\) we see \[s_6=a_0-a_1+a_2-a_3+a_4-a_5+a_6\] \[=s_4-a_5+a_6=s_4-(a_5-a_6)\] Thus \(s_6\leq s_4\). A similar look at \(s_3\) and \(s_5\) shows \[s_5=a_0-a_1+a_2-a_3+a_4-a_5=s_3+(a_4-a_5)\] So \(s_5\geq s_3\)! This is a sort of pattern we’ve seen before, where it’s helpful to look at the even versus odd subsequences individually:

Exercise 19.8 Let \(s_n=\sum_{k=0}^n (-1)^ka_k\) be an alternating series with \(a_n\to 0\) monotonically, and prove by induction that

  • The even subsequence is monotone decreasing
  • The odd subsequence is monotone increasing

Because each of these subsequences is monotone and bounded (by the previous exercise) they converge via monotone convergence. Now, all we need to see is they converge to the same limit to assure convergence of the entire series, by Theorem 11.2.

Proposition 19.1 Let \(s_n=\sum_{k=0}^n (-1)^ka_k\) be an alternating series with \(a_n\to 0\) monotonically. Then \(s_n\) converges.

Proof. Let \(e_n=s_{2n}\) and \(o_n=s_{2n+1}\) be the even and odd subsequences respectively, and note that \(o_n=e_n-a_{2n+1}\). Then, since we know the subsequence \(a_{2n+1}\) converges to zero (as \(a_n\to 0\), so all subsequences have the same limit) we can apply the limit theorems and see \[\lim o_n = \lim (e_n-a_{2n+1})=\lim e_n-\lim a_{2n+1}=\lim e_n\] So, the odd and even subsequences do have the same limit, as required.
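The proof has a clean numerical picture. Taking the concrete choice \(a_n=\frac{1}{n+1}\) (so the \(s_n\) are partial sums of the alternating harmonic series; this specific example is an assumption of the illustration), the even partial sums decrease, the odd ones increase, and both close in on the same limit, here \(\ln 2\).

```python
import math

# Even/odd partial sums of sum (-1)**n / (n+1): two monotone subsequences
# squeezing in on a common limit, ln 2.
terms = [(-1)**n / (n + 1) for n in range(2001)]
partials = []
s = 0.0
for t in terms:
    s += t
    partials.append(s)

evens = partials[::2]    # e_n = s_{2n}: monotone decreasing
odds = partials[1::2]    # o_n = s_{2n+1}: monotone increasing
assert all(evens[i] >= evens[i + 1] for i in range(len(evens) - 1))
assert all(odds[i] <= odds[i + 1] for i in range(len(odds) - 1))
assert abs(evens[-1] - math.log(2)) < 1e-3
assert abs(odds[-1] - math.log(2)) < 1e-3
```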

19.5.2 Properties of Conditionally Convergent Series

First we look at the main example of a conditionally convergent series.

Example 19.2 \(\sum \frac{(-1)^n}{n}\) is conditionally convergent:

  • It converges, by the alternating series test.
  • But it is not absolutely convergent, as \(\sum \frac{1}{n}\) diverges by Exercise 19.4.

This series is famous from the introduction to our course, where we saw that its value when summed is the natural logarithm of \(2\), but that this value changes when we reorder the terms! This is a general behavior of conditionally convergent series; one hint of why is that the sums of their positive and negative terms separately diverge to \(\pm\infty\).

Theorem 19.7 If \(\sum a_k\) is conditionally convergent, let \(p_k\) be the subsequence of all positive terms of \(a_k\) and \(n_k\) be the subsequence of all negative terms. Then \[\sum p_k\to\infty\hspace{1cm}\sum n_k\to-\infty\]

For an absolutely convergent series, this cannot happen: the sum of all the positive terms converges, as does the sum of all the negative terms.
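A numeric glimpse of Theorem 19.7 for the series \(\sum\frac{(-1)^n}{n}\) (the cutoff \(2^{20}\) is an arbitrary choice of this sketch): its positive terms alone are \(\frac{1}{2},\frac{1}{4},\frac{1}{6},\dots\), which is half the harmonic series, so their partial sums grow without bound.

```python
# Positive terms of sum (-1)**n / n, n >= 1: these occur at even n, and
# sum to (1/2) * (harmonic series), which diverges.
N = 2**20
positive_part = sum(1/n for n in range(2, N + 1, 2))
assert positive_part > 5     # already past 5; it eventually exceeds any bound
```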

Exercise 19.9 Prove that if \(\sum a_n\) is absolutely convergent, then its subseries of positive terms and its subseries of negative terms both converge.