Series of Functions

12.1 Preliminaries

Definition 12.1.1: Let $\sum_{j=1}^\infty f_j$ be a series of real-valued functions, each of which is defined on a set $S$ in $\mathbb{R}^n$. For each $\mathbf{x}$ in $S$, let $F_k(\mathbf{x}) = \sum_{j=1}^k f_j(\mathbf{x})$. Then $\{F_k\}$ is a sequence of functions defined on $S$ and is called the sequence of partial sums of the series.

i) If $\{F_k\}$ converges pointwise to a function $F$ on $S$, then we say the series $\sum_{j=1}^\infty f_j$ converges pointwise to $F$ on $S$. We write $\sum_{j=1}^\infty f_j=F$ [pointwise].

ii) If $\{F_k\}$ converges uniformly to a function $F$ on $S$, then we say that the series $\sum_{j=1}^\infty f_j$ converges uniformly to $F$ on $S$. We write $\sum_{j=1}^\infty f_j=F$ [uniformly].
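A standard illustration (my own, not from the text): for the geometric series $\sum_{j=0}^\infty x^j$ on $S=(-1,1)$, the partial sums are $F_k(x)=\sum_{j=0}^k x^j=(1-x^{k+1})/(1-x)$, so $\sum_{j=0}^\infty x^j = 1/(1-x)$ [pointwise] on $(-1,1)$. The convergence is not uniform on $(-1,1)$, since $$\sup_{|x|<1}\left| \frac{1}{1-x}-F_k(x)\right| = \sup_{|x|<1}\frac{\left| x\right| ^{k+1}}{1-x}=\infty,$$ but it is uniform on $[-r,r]$ for any $0<r<1$, where the supremum is at most $r^{k+1}/(1-r)\rightarrow 0$.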


Theorem 12.1.1 Cauchy's Criterion for Uniform Convergence: Let $\sum_{j=1}^\infty f_j$ be a series of bounded, real-valued functions on $S$ and let $\{F_k\}$ denote the sequence of partial sums of $\sum_{j=1}^\infty f_j$. The series $\sum_{j=1}^\infty f_j$ converges uniformly on $S$ if and only if $\{F_k\}$ is uniformly Cauchy on $S$.


12.2 Uniform Convergence

Theorem 12.2.1: If each function $f_j$ is continuous on $S$ and if $\sum_{j=1}^\infty f_j = F$ [uniformly], then $F$ is continuous on $S$.
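The uniformity hypothesis cannot be dropped. A standard example (my own, not from the text): on $[0,1]$ let $f_j(x)=x^{j-1}(1-x)$, so that $F_k(x)=(1-x)\sum_{j=1}^k x^{j-1}=1-x^k$. Each $f_j$ is continuous, yet $\sum_{j=1}^\infty f_j$ converges pointwise to the function $F$ with $F(x)=1$ for $0\le x<1$ and $F(1)=0$, which is discontinuous at $x=1$; by Theorem 12.2.1 the convergence cannot be uniform on $[0,1]$.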


Theorem 12.2.2 Dini's Theorem: Suppose that $\{f_k\}$ is a monotone sequence of continuous functions on a compact set $S$ in $\mathbb{R}^n$ that converges pointwise to a continuous function $f$. Then the convergence must be uniform.


Corollary 12.2.3: Suppose that $\sum_{j=1}^\infty f_j$ is a series of continuous functions on a compact set $S$ in $\mathbb{R}^n$, that the sequence $\{F_k\}$ of partial sums is monotone, and that $\{F_k\}$ converges pointwise to a continuous function $F$ on $S$. Then $$\sum_{j=1}^\infty f_j = F\quad \mbox{[uniformly]}$$


Theorem 12.2.4: Let $\{f_k\}$ be a sequence of differentiable functions on $(a,b)$ satisfying the following conditions.

i) For some $c_0$ in $(a,b)$, $\{f_k(c_0)\}$ converges.

ii) For each $k$, $f_k'$ is bounded on $(a,b)$.

iii) The sequence $\{f_k'\}$ converges uniformly to some function $g$ on $(a,b)$.

Then $\{f_k\}$ converges uniformly to some differentiable function $f$ on $(a,b)$ and $f'=g$.


Corollary 12.2.5: Suppose that $\sum_{j=1}^\infty f_j$ is a series of functions, each of which is differentiable on $(a,b)$, satisfying the following conditions.

i) For some $c_0$ in $(a,b)$, $\sum_{j=1}^\infty f_j(c_0)$ converges.

ii) For each $k$, the function $F_k'=\sum_{j=1}^k f_j'$ is bounded on $(a,b)$.

iii) The derived series $\sum_{j=1}^\infty f_j'$ converges uniformly to some function $g$.

Then $\sum_{j=1}^\infty f_j$ converges uniformly to some differentiable function $F$ on $(a,b)$ and $F'=g$ on $(a,b)$.


Theorem 12.2.6: Suppose that $\sum_{j=1}^\infty f_j$ is a series of functions each of which is integrable on $[a,b]$. Suppose also that the series $\sum_{j=1}^\infty f_j=F$ [uniformly] on $[a,b]$. Then $F$ is integrable on $[a,b]$. For $x$ in $[a,b]$, define $$G_k(x)=\sum_{j=1}^k \int_a^x f_j(t)dt$$ and $$G(x)=\int_a^x F(t)dt$$

Then $\lim_{k\rightarrow \infty} G_k=G$ [uniformly] on $[a,b]$. In particular, $$\int_a^b F(x) dx = \sum_{j=1}^\infty \int_a^b f_j(x)dx$$
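As a numerical sketch of term-by-term integration (my own illustration, not from the text): integrating the geometric series $\sum_{j=0}^\infty t^j = 1/(1-t)$ from $0$ to $x$ gives $\sum_{j=1}^\infty x^j/j = -\ln(1-x)$ for $|x|<1$, which a partial sum confirms:

```python
import math

def log_series(x, terms=200):
    """Partial sum of sum_{j>=1} x^j / j, obtained by integrating the
    geometric series sum_{j>=0} t^j = 1/(1-t) term by term on [0, x]."""
    return sum(x ** j / j for j in range(1, terms + 1))

# For x = 1/2 the series should approach -ln(1 - 1/2) = ln 2.
approx = log_series(0.5)
exact = math.log(2)
```

The tail of the series at $x=1/2$ is bounded by a geometric tail, so 200 terms already agree with $\ln 2$ to machine precision.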


12.3 Tests for Uniform Convergence

Theorem 12.3.1 Weierstrass's $M$-Test: Suppose that $\sum_{j=1}^\infty f_j$ is a series of bounded, real-valued functions on a set $S$ in $\mathbb{R}^n$. Assume that there is a convergent series $\sum_{j=1}^\infty M_j$ of positive numbers such that, for all $j$ in $\mathbb{N}$, $|| f_j ||_\infty\le M_j$. Then the series $\sum_{j=1}^\infty f_j$ converges uniformly and absolutely on $S$.
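For example (not from the text): the series $\sum_{j=1}^\infty \cos(jx)/j^2$ on $\mathbb{R}$ satisfies $\| f_j\|_\infty\le 1/j^2$, and $\sum_{j=1}^\infty 1/j^2$ converges, so the $M$-test gives uniform and absolute convergence on all of $\mathbb{R}$; by Theorem 12.2.1 the sum is a continuous function.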


Theorem 12.3.2 Abel's Test for Uniform Convergence: Let $S$ be a subset of $\mathbb{R}^n$. Suppose that $\{f_j\}$ and $\{g_j\}$ are sequences of bounded, real-valued functions on $S$ such that

i) The series $\sum_{j=1}^\infty f_j$ is uniformly convergent on $S$.

ii) The sequence $\{g_j\}$ is monotone and uniformly bounded on $S$.

Then $\sum_{j=1}^\infty f_jg_j$ converges uniformly on $S$.


12.4 Power Series

Definition 12.4.1: A (real) power series about the point $x_0$ in $\mathbb{R}$ is a series of the form $\sum_{j=0}^\infty a_j(x-x_0)^j$, where the coefficients $a_j$ are real numbers.


Definition 12.4.2: Given a power series $\sum_{j=0}^\infty a_jx^j$, let $$\lambda = \limsup_{j\rightarrow\infty} \left| a_j\right| ^{1/j}$$

If $\lambda$ is finite and positive, define $R=1/\lambda$; if $\lambda=0$, define $R=\infty$; if $\lambda=\infty$, define $R=0$. The $R$ thus defined is called the radius of convergence of $\sum_{j=0}^\infty a_jx^j$. The set of points where the series converges is called the interval of convergence of the series.
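For example (not from the text): for $\sum_{j=1}^\infty x^j/j$ we have $\lambda=\limsup_{j\rightarrow\infty} (1/j)^{1/j}=1$ (since $j^{1/j}\rightarrow 1$), so $R=1$. The series diverges at $x=1$ (the harmonic series) and converges at $x=-1$ (the alternating harmonic series), so the interval of convergence is $[-1,1)$.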


Theorem 12.4.1: Let $\sum_{j=0}^\infty a_jx^j$ be a power series with radius of convergence $R$.

i) If $0<R<\infty$, then $\sum_{j=0}^\infty a_jx^j$ converges absolutely for $\left| x\right| <R$ and diverges for $\left| x\right| >R$.

ii) If $R=\infty$, then $\sum_{j=0}^\infty a_jx^j$ converges absolutely for all real numbers $x$.

iii) If $R=0$, then $\sum_{j=0}^\infty a_jx^j$ converges (trivially) for $x=0$ and diverges for all other values of $x$.


Theorem 12.4.2: Let $\sum_{j=0}^\infty a_jx^j$ be a power series with radius of convergence $R\ne 0$. If $0<r<R$, then $\sum_{j=0}^\infty a_jx^j$ converges uniformly and absolutely to a continuous function on $[-r,r]$.


Corollary 12.4.3: The power series $\sum_{j=0}^\infty a_jx^j$ converges uniformly and absolutely to a continuous function on any compact interval $[a,b]$ contained in $(-R,R)$.


Theorem 12.4.4 Abel's Theorem: If $R> 0$ is the radius of convergence of the power series $\sum_{j=0}^\infty a_jx^j$ and if the power series converges at $x=R$ (or at $x=-R$), then it converges uniformly on $[0,R]$ (or on $[-R,0]$).
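A standard application (my own illustration): $\sum_{j=1}^\infty (-1)^{j-1}x^j/j$ has $R=1$, equals $\ln(1+x)$ on $(-1,1)$, and converges at $x=1$ by the alternating series test. Abel's Theorem gives uniform convergence on $[0,1]$, so the sum is continuous there and $$\sum_{j=1}^\infty \frac{(-1)^{j-1}}{j}=\lim_{x\rightarrow 1^-}\ln(1+x)=\ln 2.$$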


Theorem 12.4.5: Let $\sum_{j=0}^\infty a_jx^j$ be a power series with radius of convergence $R\ne 0$. Let $F(x)=\sum_{j=0}^\infty a_jx^j$ for $x$ in the interval of convergence. Then $F$ is differentiable on $(-R,R)$ and $F'(x)=\sum_{j=0}^\infty (j+1)a_{j+1}x^j$. The derived series also has radius of convergence $R$ and converges uniformly on any compact subset of $(-R,R)$.


Theorem 12.4.6: Let $\sum_{j=0}^\infty a_jx^j$ be a power series with radius of convergence $R\ne 0$. For $x$ in $(-R,R)$, let $F(x)=\sum_{j=0}^\infty a_jx^j$. Then the function $F$ has derivatives of all orders on the interval $(-R,R)$. The $k$th derivative of $F$ is given by $$F^{(k)}(x)=\sum_{j=0}^\infty \frac{(j+k)!}{j!} a_{j+k}x^j$$
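As a consistency check (my own example): for $F(x)=e^x=\sum_{j=0}^\infty x^j/j!$ we have $a_{j+k}=1/(j+k)!$, so $$F^{(k)}(x)=\sum_{j=0}^\infty \frac{(j+k)!}{j!}\cdot\frac{x^j}{(j+k)!}=\sum_{j=0}^\infty \frac{x^j}{j!}=e^x,$$ in agreement with $(e^x)'=e^x$.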

Finally, we confirm that if the derived series converges at an endpoint of the interval of convergence, then the original power series must also converge there.


Theorem 12.4.7: Suppose that $\sum_{j=0}^\infty a_jx^j$ has radius of convergence $R$ with $0<R<\infty$. If the series $\sum_{j=0}^\infty (j+1)a_{j+1}x^j$ converges at $R$ (or at $-R$), then $\sum_{j=0}^\infty a_jx^j$ also converges at $R$ (or at $-R$).


Theorem 12.4.8: Let $\sum_{j=0}^\infty a_jx^j$ be a power series with radius of convergence $R\ne 0$ and interval of convergence $I$. For $x$ in $I$, let $F(x)=\sum_{j=0}^\infty a_jx^j$. The series $\sum_{j=1}^\infty a_{j-1}x^j/j$ also has radius of convergence $R$ and $$\int_0^x F(t) dt = \sum_{j=1}^\infty \frac{a_{j-1}x^j}{j}\ \mbox{[uniformly]}$$ on any compact interval $[a,b]$ that contains $0$ and is contained in $I$.
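A standard application (my own illustration): $F(t)=1/(1+t^2)=\sum_{j=0}^\infty (-1)^jt^{2j}$ has $R=1$, and term-by-term integration gives $\arctan x = \sum_{j=0}^\infty (-1)^jx^{2j+1}/(2j+1)$ for $|x|<1$. The integrated series also converges at $x=1$ by the alternating series test, so Abel's Theorem extends the identity to that endpoint: $\pi/4=\sum_{j=0}^\infty (-1)^j/(2j+1)$.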


Theorem 12.4.9: If $\sum_{j=0}^\infty a_jx^j$ has radius of convergence $R$ with $0<R<\infty$ and if $\sum_{j=0}^\infty a_jx^j$ converges at $R$ (or at $-R$), then the integrated series $\sum_{j=1}^\infty a_{j-1}x^j/j$ also converges at $R$ (or at $-R$).


Theorem 12.4.10: Let $\sum_{j=0}^\infty a_j(x-x_0)^j$ be a power series about $x_0$ with radius of convergence $R\ne 0$ and interval of convergence $I$.

i) The series $\sum_{j=0}^\infty a_j(x-x_0)^j$ converges uniformly to a continuous function $F$ on any compact subset of $I$.

ii) The function $F$ has derivatives of all orders on the interval $(x_0-R,x_0+R)$ given by $$F^{(k)}(x)=\sum_{j=0}^\infty \frac{(j+k)!}{j!} a_{j+k}(x-x_0)^j$$

This series converges uniformly to $F^{(k)}$ on any compact subset of the interior $I^0$ of $I$. If the $k$th derived series converges at an endpoint of $I$, then all the series representing the lower-order derivatives of $F$ converge there also.

iii) $$\int_{x_0}^x F(t)dt = \sum_{j=0}^\infty a_j \int_{x_0}^x (t-x_0)^j dt = \sum_{j=1}^\infty \frac{a_{j-1}(x-x_0)^j}{j}\quad\mbox{[uniformly]}$$ on any compact interval in $I$. If the original power series $\sum_{j=0}^\infty a_j(x-x_0)^j$ converges at an endpoint of $I$, then the integrated series converges there also.


Lemma 1: For all $k$ in $\mathbb{N}$, 

i) $$\sum_{j=0}^k \frac{1}{(2j)!(2k-2j)!} = \frac{1}{(2k)!} \sum_{j=0}^k C(2k,2j)$$

ii) $$\sum_{j=1}^k \frac{1}{(2j-1)!(2k-2j+1)!} = \frac{1}{(2k)!} \sum_{j=1}^k C(2k,2j-1)$$


Lemma 2: For all $k$ in $\mathbb{N}$, $$\sum_{j=0}^k C(2k,2j)=\sum_{j=1}^k C(2k,2j-1)=2^{2k-1}$$
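Lemma 2 is easy to spot-check numerically (a sketch of my own, not from the text): the even-index and odd-index binomial coefficients of row $2k$ each sum to half of the full row sum $2^{2k}$.

```python
from math import comb

def even_binomial_sum(k):
    """Sum of C(2k, 2j) over j = 0..k (even-index binomial coefficients)."""
    return sum(comb(2 * k, 2 * j) for j in range(k + 1))

def odd_binomial_sum(k):
    """Sum of C(2k, 2j-1) over j = 1..k (odd-index binomial coefficients)."""
    return sum(comb(2 * k, 2 * j - 1) for j in range(1, k + 1))

# Both sums equal 2^(2k-1), half of the full row sum 2^(2k).
for k in range(1, 8):
    assert even_binomial_sum(k) == odd_binomial_sum(k) == 2 ** (2 * k - 1)
```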


12.5 The Taylor Series Representation of Functions

Theorem 12.5.1: If $f$ can be represented by a power series of the form $\sum_{j=0}^\infty a_j(x-x_0)^j$, then $a_j=f^{(j)}(x_0)/j!$ for all $j$.


Corollary 12.5.2: If $\sum_{j=0}^\infty a_j(x-x_0)^j$ converges to $0$ for every $x$ in an interval $[a,b]$, then $a_j=0$ for all $j$.


Definition 12.5.1: Let $f$ be a function that has derivatives of all orders on an interval $I$ and let $x_0$ be a point of $I$. The corresponding power series $$\sum_{j=0}^\infty \frac{f^{(j)}(x_0)}{j!}(x-x_0)^j$$ is called the Taylor series of $f$ about the point $x_0$.

Theorem 12.5.3: Let $f$ be a function that has derivatives of all orders on an interval $[a,b]$ and let $x_0$ be a point in $[a,b]$. Then $f$ can be represented by a power series on $[a,b]$, that is, $f(x)=\sum_{j=0}^\infty f^{(j)}(x_0)(x-x_0)^j/j!$ [uniformly] on $[a,b]$, if and only if the sequence $\{p_k\}$ of Taylor polynomials of $f$ converges uniformly to $f$ on $[a,b]$. Equivalently, $\lim_{k\rightarrow \infty} R_k (x_0;x)=0$ [uniformly] on $[a,b]$.
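Convergence of the Taylor series alone is not enough: the standard example $f(x)=e^{-1/x^2}$ for $x\ne 0$, with $f(0)=0$, has $f^{(j)}(0)=0$ for all $j$, so its Taylor series about $0$ converges everywhere (to the zero function) but represents $f$ only at $x=0$. The remainder condition $\lim_{k\rightarrow\infty} R_k(x_0;x)=0$ in Theorem 12.5.3 is exactly what rules this out.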


Theorem 12.5.4 Gauss's Convergence Test: Let $\sum_{j=1}^\infty a_j$ be a series for which $\left| a_{j+1}/a_j\right|$ can be written in the form $$\left| \frac{a_{j+1}}{a_j}\right| = 1 - \frac{c}{j}+\frac{d_j}{j^2}$$ where $c$ is a constant and $\{d_j\}$ is a bounded sequence.

i) If $c>1$, then $\sum_{j=1}^\infty a_j$ converges absolutely.

ii) If $c\le 1$, then $\sum_{j=1}^\infty a_j$ either converges conditionally or diverges; it does not converge absolutely.
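For example (my own illustration), consider $\sum_{j=1}^\infty \binom{2j}{j}/4^j$, whose terms are positive. Here $$\left| \frac{a_{j+1}}{a_j}\right| = \frac{2j+1}{2j+2}=1-\frac{1/2}{j}+\frac{d_j}{j^2},\qquad d_j=\frac{j}{2(j+1)},$$ so $c=1/2\le 1$ with $\{d_j\}$ bounded; since the terms are positive, the series diverges. Note that the ratio test is inconclusive here, since the ratio tends to $1$.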


Theorem 12.6.1: The differentiable function $f$ solves the initial value problem 

i) $f(x_0)=y_0$

ii) $f'(x)=F(x,f(x))$ for all $x$ in $I$.

if and only if it solves the integral equation

$$f(x)=y_0 + \int_{x_0}^x F(t,f(t))dt$$


Lemma: For $k$ in $\mathbb{N}$ and for $x$ in $I$, $$\left| f_k(x) - f_{k-1}(x)\right| \le \frac{\| F\|_\infty M^{k-1} \left| x-x_0\right|^k}{k!}$$


Theorem 12.6.2 Picard: If $F$ satisfies the Lipschitz condition (12.12), then the sequence $\{f_k\}$ constructed above converges uniformly to a solution $f$ of the integral equation $$f(x)=y_0 + \int_{x_0}^x F(t,f(t))dt$$ on the interval $I=[x_0-r,x_0+r]$. The function $f$ also solves the initial value problem $y'=F(x,y)$, $y(x_0)=y_0$.
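A minimal numerical sketch of the iteration (my own, not from the text, with the integral replaced by a trapezoid-rule approximation on a fixed grid): for $y'=y$, $y(0)=1$ the iterates are the Taylor partial sums of $e^x$, so they approach $e$ at $x=1$.

```python
import math

def picard(F, x0, y0, x, steps=40, iterations=12):
    """Picard iteration f_k(x) = y0 + int_{x0}^x F(t, f_{k-1}(t)) dt,
    with the integral approximated by the trapezoid rule."""
    n = steps
    h = (x - x0) / n
    ts = [x0 + i * h for i in range(n + 1)]
    f = [y0] * (n + 1)  # f_0 is the constant function y0
    for _ in range(iterations):
        vals = [F(t, y) for t, y in zip(ts, f)]
        new_f = [y0]
        acc = 0.0
        for i in range(1, n + 1):
            acc += 0.5 * (vals[i - 1] + vals[i]) * h  # trapezoid panel
            new_f.append(y0 + acc)
        f = new_f
    return f[-1]

# y' = y, y(0) = 1 has solution e^x; the iterates converge toward e at x = 1.
approx = picard(lambda t, y: y, 0.0, 1.0, 1.0)
```

The grid size and iteration count are arbitrary choices for the sketch; the trapezoid rule introduces an $O(h^2)$ discretization error on top of the Picard convergence, so the result matches $e$ only to a few decimal places.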



Reference

Douglass, Introduction to Mathematical Analysis.