25 Power Series
Highlights of this Chapter: we prove two marvelous results about power series. First, we show that they are differentiable (and derive a formula for their derivative). Second, we prove a formula for how well a function can be approximated by polynomials built from its derivatives at a single point, which in the limit gives a power series representation of the function.
25.1 Differentiating Power Series
The goal of this section is to prove that power series are differentiable, and that we can differentiate them term by term. That is, we seek to prove

$$\left(\sum_{n=0}^\infty a_n x^n\right)^\prime = \sum_{n=1}^\infty n a_n x^{n-1}.$$
Because a derivative is defined as a limit, this process of bringing the derivative inside the sum is really an exchange of limits, and we know the tool for that: Dominated Convergence.
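Before developing the theory, here is a quick numerical sanity check of the claim above, as a minimal sketch: the exponential series $\sum_{n\ge 0} x^n/n!$, the sample point, the step size, and the truncation level are all choices of this illustration, not part of the text. We compare a difference quotient of a high partial sum against the term-by-term derivative series.

```python
from math import factorial

def partial_sum(x, N=60):
    # Partial sum of the power series sum_{n>=0} x^n / n!
    return sum(x**n / factorial(n) for n in range(N))

def termwise_derivative(x, N=60):
    # Partial sum of the term-by-term derivative: sum_{n>=1} n * x^(n-1) / n!
    return sum(n * x**(n - 1) / factorial(n) for n in range(1, N))

x, h = 0.7, 1e-6
difference_quotient = (partial_sum(x + h) - partial_sum(x)) / h

print(difference_quotient)       # approximately 2.01375...
print(termwise_derivative(x))    # agrees to about six digits
```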
25.1.1 Dominated Convergence
The crux of differentiating a power series is to be able to bring the derivative inside the sum. Because derivatives are limits, we can use dominated convergence to understand when we can switch sums and limits. One crucial step here is the Mean Value Theorem.
Theorem 25.1 Let $f_k$ be a sequence of functions defined on an interval $I$, and assume:

- For each $k$, $f_k$ is differentiable at all $x \in I$.
- For each $x \in I$, the sum $\sum_k f_k(x)$ is convergent.
- There is an $M_k$ with $|f_k^\prime(x)| \leq M_k$, for all $x \in I$.
- The sum $\sum_k M_k$ is convergent.

Then, the sum $\sum_k f_k$ is differentiable on $I$, and its derivative may be computed term by term:

$$\left(\sum_k f_k\right)^\prime = \sum_k f_k^\prime.$$
Proof. Recall the limit definition of the derivative (Definition 22.1):

$$\left(\sum_k f_k\right)^\prime(x) = \lim_{h\to 0}\frac{\sum_k f_k(x+h) - \sum_k f_k(x)}{h}.$$
And now, rewriting the limit of partial sums as an infinite sum, we see

$$\left(\sum_k f_k\right)^\prime(x) = \lim_{h\to 0}\sum_k \frac{f_k(x+h) - f_k(x)}{h}.$$
If we are justified in switching the limit and the sum via Theorem 21.2, this becomes

$$\sum_k \lim_{h\to 0}\frac{f_k(x+h) - f_k(x)}{h} = \sum_k f_k^\prime(x),$$

which is exactly what we want. Thus, all we need to do is justify that the conditions of Theorem 21.2 are satisfied, for the terms

$$g_k(h) = \frac{f_k(x+h) - f_k(x)}{h}.$$
Step 1: Show $\lim_{h\to 0} g_k(h)$ exists for each $k$. This limit is precisely the definition of $f_k^\prime(x)$, which exists since each $f_k$ is assumed to be differentiable.

Step 2: Show $\sum_k g_k(h)$ converges for each $h \neq 0$. Both $\sum_k f_k(x+h)$ and $\sum_k f_k(x)$ converge by hypothesis, and $\sum_k g_k(h)$ is just their difference divided by $h$, so it converges as well.

Step 3: Find an $M_k$ with $|g_k(h)| \leq M_k$ for all $h$. Here is where the Mean Value Theorem enters: for each $h$ there is a $c$ between $x$ and $x + h$ with $g_k(h) = \frac{f_k(x+h) - f_k(x)}{h} = f_k^\prime(c)$, and so $|g_k(h)| = |f_k^\prime(c)| \leq M_k$ by hypothesis.

Step 4: Show $\sum_k M_k$ converges. This is exactly our final assumption.

Having verified all four conditions, Theorem 21.2 lets us exchange the limit and the sum, which completes the proof.
25.1.2 Term-by-Term Differentiation
Now, we will attempt to apply dominated convergence for derivatives to a power series. Should this work, we will find the derivative can be calculated via term-by-term differentiation:

$$\left(\sum_{n=0}^\infty a_n x^n\right)^\prime = \sum_{n=1}^\infty n a_n x^{n-1}.$$
So, let’s begin by investigating this series: can we figure out when it converges?
Proposition 25.1 Let $\sum_n a_n x^n$ be a power series with radius of convergence $R$. Then its term-by-term derivative $\sum_n n a_n x^{n-1}$ is also a power series with radius of convergence $R$.
Proof. Say we computed the radius of convergence of $\sum_n a_n x^n$ via the ratio test, so that $R = \lim_n \left|\frac{a_n}{a_{n+1}}\right|$. The term-by-term derivative $\sum_n n a_n x^{n-1}$ is a power series with coefficients $n a_n$, so the ratio test gives its radius of convergence as $\lim_n \left|\frac{n a_n}{(n+1) a_{n+1}}\right|$.

Simplifying this fraction and breaking into components gives

$$\lim_n \left|\frac{n a_n}{(n+1) a_{n+1}}\right| = \lim_n \frac{n}{n+1} \cdot \lim_n \left|\frac{a_n}{a_{n+1}}\right|.$$

We can compute the limit of the first term here directly, as $\lim_n \frac{n}{n+1} = 1$. Thus the radius of convergence of the derivative series is $1 \cdot R = R$, the same as that of the original.
(A small note: for all the series we will encounter, the radius of convergence can be computed easily via the ratio test; if it could not, we would need a more involved argument above to justify that first line.)
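To see this numerically, here is a small sketch; the coefficients $a_n = 2^n$ (for which $R = 1/2$) are an arbitrary choice made for this illustration. The two ratios approach the same limit, reflecting the equality of the two radii of convergence.

```python
# Illustrative coefficients a_n = 2**n, whose power series has radius of convergence 1/2.
a = lambda n: 2.0**n

for n in [1, 10, 100, 1000]:
    original   = abs(a(n) / a(n + 1))                   # ratio test for sum a_n x^n
    derivative = abs(n * a(n) / ((n + 1) * a(n + 1)))   # ratio test for sum n a_n x^(n-1)
    print(n, original, derivative)
# Both columns tend to 0.5, so the derivative series has the same radius of convergence.
```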
Now that we know our proposed derivative actually makes sense (converges), it’s time to show we are actually justified in exchanging the sum limit and the derivative limit, using Dominated Convergence.
Theorem 25.2 (Differentiation of Power Series) Let $f(x) = \sum_{n=0}^\infty a_n x^n$ be a power series with radius of convergence $R$. Then $f$ is differentiable at every $x$ with $|x| < R$, and its derivative is given by term-by-term differentiation:

$$f^\prime(x) = \sum_{n=1}^\infty n a_n x^{n-1}.$$
Proof. The terms on the right are the term-by-term derivatives of the terms on the left, so our goal is to apply Theorem 25.1 to the functions $f_n(x) = a_n x^n$.

To start, let $x_0$ be any point with $|x_0| < R$, and fix an $r$ with $|x_0| < r < R$; we verify the four requirements of Theorem 25.1 on the interval $I = (-r, r)$.

Requirement 1: For each $n$, the function $f_n(x) = a_n x^n$ is differentiable at every $x \in I$, with $f_n^\prime(x) = n a_n x^{n-1}$.

Requirement 2: For each $x \in I$ we have $|x| < r < R$, so the power series $\sum_n a_n x^n$ is convergent.

Requirement 3: There is an $M_n$ with $|f_n^\prime(x)| \leq M_n$ for all $x \in I$: since $|x| < r$, we may take $M_n = n |a_n| r^{n-1}$.

Requirement 4: The sum $\sum_n M_n = \sum_n n |a_n| r^{n-1}$ is convergent: by Proposition 25.1 the series $\sum_n n a_n x^{n-1}$ has radius of convergence $R$, and power series converge absolutely at points strictly inside their radius of convergence, in particular at $x = r < R$.
Thus, all the requirements are satisfied, and dominated convergence allows us to switch the order of the sum with differentiation.
Example 25.1 We know the geometric series converges to $\frac{1}{1-x}$ for $|x| < 1$:

$$\frac{1}{1-x} = \sum_{n=0}^\infty x^n.$$

Differentiating both sides, with Theorem 25.2 applied to the right-hand side, gives a power series for the derivative on the same interval:

$$\frac{1}{(1-x)^2} = \sum_{n=1}^\infty n x^{n-1}.$$
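A quick numerical check of this example, as a sketch (the sample point $x = 0.3$ and the truncation levels are arbitrary choices): partial sums of $\sum_{n\ge 1} n x^{n-1}$ should approach $1/(1-x)^2$ for $|x| < 1$.

```python
x = 0.3
target = 1 / (1 - x)**2      # the derivative of 1/(1-x) at x = 0.3

for N in [5, 10, 20, 40]:
    partial = sum(n * x**(n - 1) for n in range(1, N + 1))
    print(N, partial, abs(partial - target))
# The error shrinks rapidly as N grows, as expected inside the radius of convergence.
```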
The fact that power series are differentiable on their entire radius of convergence puts a strong constraint on which sort of functions can ever be written as the limit of such a series.
Example 25.2 The absolute value function $|x|$ cannot be written as a power series on any open interval containing $0$: a power series is differentiable at every point inside its radius of convergence, but $|x|$ is not differentiable at $0$.
25.2 Power Series Representations
Definition 25.1 A power series representation of a function $f$ on an interval $I$ is a power series $\sum_n a_n x^n$ which converges to $f(x)$ at every $x \in I$.
How could one try to track down a power series representation of a given function? Power series, being limits of polynomials, are actually pretty constrained objects: it turns out, with a little thought, that for a given function there is only one series that could possibly work.
Theorem 25.3 (Candidate Series Representation) Let $f$ be a function with a power series representation $f(x) = \sum_{n=0}^\infty a_n x^n$ on some interval about $0$. Then the coefficients are completely determined by the derivatives of $f$ at zero:

$$a_n = \frac{f^{(n)}(0)}{n!}.$$
Proof. Let $f(x) = \sum_{n=0}^\infty a_n x^n$ on some interval about $0$, and write out the first few terms:

$$f(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \cdots$$

Now, we know the first coefficient of this series immediately: evaluating at $x = 0$ kills every term except the constant, so $a_0 = f(0)$.

Since power series may be differentiated term by term (Theorem 25.2),

$$f^\prime(x) = a_1 + 2 a_2 x + 3 a_3 x^2 + \cdots$$

and evaluating at $x = 0$ gives $a_1 = f^\prime(0)$.

Continuing in this way, the second derivative will have a multiple of $a_2$ as its constant term:

$$f^{\prime\prime}(x) = 2 a_2 + 3 \cdot 2\, a_3 x + 4 \cdot 3\, a_4 x^2 + \cdots$$

And evaluating the equality at $x = 0$ gives $f^{\prime\prime}(0) = 2 a_2$, so $a_2 = \frac{f^{\prime\prime}(0)}{2}$.

This pattern continues indefinitely, as differentiating $n$ times turns the term $a_n x^n$ into the constant $n!\, a_n$, while every higher term still carries a power of $x$.

As the constant term of $f^{(n)}(x)$ is the only one to survive evaluation at $0$, we find $f^{(n)}(0) = n!\, a_n$, that is, $a_n = \frac{f^{(n)}(0)}{n!}$.

In each case there was no choice to be made, so long as $f$ had a power series representation at all: the coefficients are forced upon us by the derivatives of $f$ at zero.
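The bookkeeping in this proof can be replayed symbolically. The sketch below uses sympy, with an arbitrary polynomial standing in for a (truncated) power series; the particular coefficients are chosen only for illustration. Repeated differentiation and evaluation at $0$ recovers each coefficient, exactly as in the proof.

```python
import sympy as sp

x = sp.symbols('x')
coeffs = [3, -1, 4, 1, -5, 9]                  # arbitrary a_0, ..., a_5 for illustration
f = sum(a * x**n for n, a in enumerate(coeffs))

for n, a in enumerate(coeffs):
    recovered = sp.diff(f, x, n).subs(x, 0) / sp.factorial(n)
    print(n, a, recovered)                     # the recovered value equals a_n each time
```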
Definition 25.2 (Taylor Series) For any smooth function $f$, the Taylor series of $f$ (based at $0$) is the power series

$$\sum_{n=0}^\infty \frac{f^{(n)}(0)}{n!} x^n.$$
In the limit as $N \to \infty$, the Taylor polynomials $T_N(x) = \sum_{n=0}^N \frac{f^{(n)}(0)}{n!} x^n$ become the full Taylor series: by Theorem 25.3, this is the only power series that could possibly represent $f$.

We’ve seen, for example, that the geometric series really is a power series representation of $\frac{1}{1-x}$ on $(-1, 1)$; but for a general smooth function, all we have so far is a candidate.

So the next natural step is to study this representation: does it actually converge to $f$?
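As a preview of this convergence question, here is a numerical sketch using the sine function, whose derivatives at $0$ cycle through $0, 1, 0, -1$; that specific example, and the sample point, are choices of this illustration. The Taylor polynomials visibly approach the function as the degree grows.

```python
from math import factorial, sin

def taylor_sin(x, N):
    # Taylor polynomial of sin at 0 through degree N (odd N): sum of (-1)^k x^(2k+1)/(2k+1)!
    return sum((-1)**k * x**(2*k + 1) / factorial(2*k + 1)
               for k in range(N // 2 + 1))

x = 2.0
for N in [1, 3, 5, 9, 15]:
    print(N, taylor_sin(x, N), sin(x))
# The Taylor polynomials rapidly approach sin(2) = 0.9092974... as the degree grows.
```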
25.2.1 Taylor’s Error Formula
Our next goal is to understand how to create power series that converge to specific functions, and more importantly to prove that our series actually do what we want! To do so, we are going to need some tools relating a function’s derivatives to its values. Rolle’s Theorem / the Mean Value Theorem does this for the first derivative, and so we present a generalization here, the polynomial mean value theorem, which does so for higher derivatives.
Theorem 25.4 (Generalized Rolle’s Theorem) Let $f$ be $n$ times differentiable on the interval $[a, b]$, and assume $f$ has $n + 1$ distinct zeros in $[a, b]$. Then there is a point $c \in (a, b)$ with $f^{(n)}(c) = 0$.
Proof. Because $f$ vanishes at $n + 1$ distinct points, Rolle’s Theorem applies on each of the $n$ intervals between consecutive zeros, producing $n$ distinct points where $f^\prime$ vanishes.

Continuing in this way, we get a collection of $n - 1$ points where $f^{\prime\prime}$ vanishes, then $n - 2$ points where the third derivative vanishes, and after $n$ applications of Rolle’s Theorem a single point $c$ with $f^{(n)}(c) = 0$.
Corollary 25.1 (A polynomial Mean Value Theorem) Let $f$ be $n$ times differentiable on $[a, b]$, and let $p$ be a polynomial of degree at most $n$ which agrees with $f$ at $n + 1$ distinct points of $[a, b]$. Then there is a point $c \in (a, b)$ where $f^{(n)}(c) = p^{(n)}(c)$.
Proof. Define the function $g = f - p$. Then $g$ is $n$ times differentiable and vanishes at $n + 1$ distinct points of $[a, b]$, so by the Generalized Rolle’s Theorem there is a $c \in (a, b)$ with $g^{(n)}(c) = 0$; that is, $f^{(n)}(c) = p^{(n)}(c)$.
Theorem 25.5 (Taylor’s Error Formula) Let $f$ be $N + 1$ times differentiable on an interval containing $0$, and let $T_N(x) = \sum_{n=0}^N \frac{f^{(n)}(0)}{n!} x^n$ be its degree-$N$ Taylor polynomial.

Then for any fixed $x$ in this interval, the error between $f$ and $T_N$ is

$$f(x) - T_N(x) = \frac{f^{(N+1)}(c)}{(N+1)!}\, x^{N+1}$$

for some $c$ between $0$ and $x$.
Proof. Fix a point $x \neq 0$ in the interval, and consider the degree-$N$ Taylor polynomial $T_N$ of $f$.

We need to modify $T_N$ slightly so that it actually agrees with $f$ at the point $x$: define

$$p(t) = T_N(t) + \lambda t^{N+1},$$

where the constant $\lambda$ is chosen so that $p(x) = f(x)$, and set $g = f - p$.

Since $T_N$ shares its value and first $N$ derivatives with $f$ at $0$, and the extra term $\lambda t^{N+1}$ vanishes to order $N + 1$ there, we have $g(0) = g^\prime(0) = \cdots = g^{(N)}(0) = 0$; by the choice of $\lambda$ we also have $g(x) = 0$. Applying Rolle’s Theorem repeatedly (first to $g$ on the interval between $0$ and $x$, then to $g^\prime$, and so on) produces a point $c$ between $0$ and $x$ with $g^{(N+1)}(c) = 0$.

As $T_N^{(N+1)} = 0$ and the $(N+1)$-st derivative of $\lambda t^{N+1}$ is the constant $(N+1)!\, \lambda$, this says $f^{(N+1)}(c) = (N+1)!\, \lambda$, so $\lambda = \frac{f^{(N+1)}(c)}{(N+1)!}$. Plugging this into $f(x) = p(x) = T_N(x) + \lambda x^{N+1}$ gives exactly the claimed formula.
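Here is a numerical sketch of the error formula in action, using the sine function (chosen for this illustration because all of its derivatives are bounded by $1$, so the unknown value $f^{(N+1)}(c)$ can be crudely bounded): the actual error should never exceed $\frac{|x|^{N+1}}{(N+1)!}$.

```python
from math import factorial, sin

def taylor_sin(x, N):
    # Taylor polynomial of sin at 0 through degree N (odd N)
    return sum((-1)**k * x**(2*k + 1) / factorial(2*k + 1)
               for k in range(N // 2 + 1))

x = 1.5
for N in [1, 3, 5, 7, 9]:
    actual_error = abs(sin(x) - taylor_sin(x, N))
    error_bound  = abs(x)**(N + 1) / factorial(N + 1)   # uses |f^(N+1)(c)| <= 1 for sine
    print(N, actual_error, error_bound, actual_error <= error_bound)
# Every row prints True: the actual error sits under the bound, and both tend to zero.
```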
25.2.2 Series Based at $a$
All of our discussion (and indeed, everything we will need about power series for our course) dealt with defining a power series based on derivative information at zero. But of course, this was an arbitrary choice: one could do exactly the same thing based at any point $a$, using powers of $(x - a)$ and the derivatives of $f$ at $a$.
Theorem 25.6 Let $f(x) = \sum_{n=0}^\infty c_n (x - a)^n$ be a power series based at the point $a$, with radius of convergence $R$. Then $f$ is differentiable on $(a - R, a + R)$, with

$$f^\prime(x) = \sum_{n=1}^\infty n c_n (x - a)^{n-1},$$

and the coefficients are determined by the derivatives of $f$ at $a$: $c_n = \frac{f^{(n)}(a)}{n!}$.
Exercise 25.1 Prove this.
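As a concrete illustration of a series based at a point other than zero (a sketch; the use of $\log$ and the base point $a = 1$ are choices of this example, not the text’s), recall that the Taylor series of $\log x$ based at $1$ is $\sum_{n\ge 1} \frac{(-1)^{n+1}}{n}(x - 1)^n$. Its partial sums approach $\log x$ for $x$ near $1$.

```python
from math import log

def log_series(x, N, a=1.0):
    # Partial sum of the Taylor series of log based at a = 1:
    #   sum_{n=1}^{N} (-1)^(n+1) (x - a)^n / n
    return sum((-1)**(n + 1) * (x - a)**n / n for n in range(1, N + 1))

x = 1.4
for N in [2, 5, 10, 20, 40]:
    print(N, log_series(x, N), log(x))
# The partial sums converge to log(1.4) = 0.3364722...; this series has radius 1 about a = 1.
```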
25.3 Smoothness & Analyticity
The above theorem is extremely useful for calculational purposes: it tells us how to find the power series of a derivative. But it also provides a window into the special nature of power series themselves. For, not only did we learn that a power series is differentiable, but we learned that its derivative is another power series (Theorem 25.2) with the same radius of convergence (Proposition 25.1). Since it’s a power series, we can apply Theorem 25.2 again to find its derivative, which is another power series, and so on.
Thus a power series isn’t only differentiable, but can be differentiated over and over again! Recall that such functions are called smooth.
Proposition 25.2 (Power Series are Smooth Functions) Let $f(x) = \sum_{n=0}^\infty a_n x^n$ be a power series with radius of convergence $R$. Then $f$ is infinitely differentiable (smooth) on $(-R, R)$.
Proof. Let $f(x) = \sum_n a_n x^n$ have radius of convergence $R$. By Theorem 25.2, $f$ is differentiable on $(-R, R)$, and its derivative $\sum_n n a_n x^{n-1}$ is again a power series, with the same radius of convergence by Proposition 25.1. Applying Theorem 25.2 to this new series shows $f^{\prime\prime}$ exists on $(-R, R)$ and is again a power series with radius of convergence $R$; continuing inductively, $f^{(k)}$ exists on all of $(-R, R)$ for every $k$.
This is a much stronger requirement, which lets us recognize much more finely when a function cannot be written as a power series:
Example 25.3 Consider the function
This is a function which is continuous and differentiable, but not smooth: some higher derivative fails to exist. Since power series are smooth on their entire interval of convergence (Proposition 25.2), such a function cannot be written as a power series on any interval containing the problem point.
However, the ability to be represented by a power series is even stricter than being smooth, motivating the definition of analytic functions which pervades much of advanced analysis.
25.3.1 Analytic Functions
Definition 25.3 (Analytic Functions) An analytic function is a function $f$ such that, at every point $a$ of its domain, there is a power series based at $a$ which converges to $f$ on some open interval containing $a$.
Corollary 25.2 (The exponential is analytic) The function $\exp$ is analytic: its Taylor series $\sum_{n=0}^\infty \frac{x^n}{n!}$ converges on the entire real line, and by Taylor’s Error Formula it converges to $\exp(x)$ at every point.
Corollary 25.3 (Sine and Cosine are Analytic) As you’ll prove on the final project, these functions have power series that converge on the entire real line, and further work (in the project, via complex exponentials; or alternatively with the Taylor Error formula) shows the limits equal the sine and cosine at all points. Thus these functions are analytic.
In both of these examples, we needed only a single power series to verify analyticity, as it converged everywhere! But for functions whose power series have limited radii of convergence, one may need to use many power series to cover the entire domain of the function.
Exercise 25.2 (The function
But then show that, at every point $a$ of its domain, there is a power series based at $a$ which converges to the function on some open interval about $a$.

Thus, while we need infinitely many power series to fully cover the graph of this function, it is nonetheless analytic.
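To make this phenomenon concrete with a standalone illustration (the function $1/x$ here is merely an example of this behavior, not necessarily the one in the exercise): the Taylor series of $1/x$ based at $a$ is the geometric-type series $\sum_{n\ge 0} \frac{(-1)^n}{a^{n+1}}(x - a)^n$, which converges only for $|x - a| < a$. No single such series covers all of $(0, \infty)$, yet every point has one that works nearby.

```python
def inv_series(x, a, N):
    # Partial sum of the Taylor series of 1/x based at a:
    #   sum_{n=0}^{N} (-1)^n (x - a)^n / a^(n+1), valid only for |x - a| < a
    return sum((-1)**n * (x - a)**n / a**(n + 1) for n in range(N + 1))

a = 1.0
for x in [1.5, 2.5]:      # 1.5 lies inside the interval (0, 2); 2.5 lies outside it
    print(x, [round(inv_series(x, a, N), 4) for N in (5, 10, 20)], 1 / x)
# At x = 1.5 the partial sums settle toward 1/1.5; at x = 2.5 they blow up,
# so the series based at a = 1 can never reach that part of the domain.
```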
In fact, probably every smooth function you have ever heard of is analytic: it’s hard to imagine what could go wrong. Somehow you can take infinitely many derivatives, but in the end the error term does not go to zero?
It’s a surprising fact of real analysis, with very wide implications, that there exist smooth but non-analytic functions.
Exercise 25.3 Consider the function
Show that
Hint: compute the derivative via right and left hand limits. We know the left hand limit is always zero, so you just need to show the right hand limit is zero for each derivative…
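Here is a numerical sketch of why such an argument works, assuming the standard example $f(x) = e^{-1/x}$ for $x > 0$ and $f(x) = 0$ for $x \leq 0$ (this specific formula is an assumption of the illustration, not necessarily the exact function in the exercise). The derivatives at $0$ from the right are controlled by limits of the form $x^{-k} e^{-1/x}$ as $x \to 0^+$, and these all tend to $0$; so every Taylor coefficient at $0$ vanishes, even though $f$ is strictly positive to the right of $0$.

```python
from math import exp

def f(x):
    # The standard smooth-but-not-analytic candidate (an assumed illustration).
    return exp(-1.0 / x) if x > 0 else 0.0

# The right-hand quantities controlling the derivatives at 0: x^(-k) * e^(-1/x).
for k in [1, 2, 5, 10]:
    for x in [0.2, 0.05, 0.02, 0.01]:
        print(k, x, f(x) / x**k)
    print()
# For each k, the values tend to 0 as x shrinks, so f^(k)(0) = 0 for every k: the
# Taylor series of f at 0 is identically zero, yet f(x) > 0 for every x > 0.
```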