# Limit points of real sequences

Let’s start by recalling an important theorem of real analysis:

THEOREM. A necessary and sufficient condition for the convergence of a real sequence is that it is bounded and has a unique limit point.

As a consequence of the theorem, a sequence with a unique limit point is divergent whenever it is unbounded. An example of such a sequence is $u_n = \frac{n}{2}(1+(-1)^n),$ whose initial values are $0, 2, 0, 4, 0, 6, 0, 8, \dots$ $$(u_n)$$ is an unbounded sequence whose unique limit point is $$0$$: the odd-indexed terms all vanish, while the even-indexed terms $$u_{2p} = 2p$$ tend to $$\infty$$.
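As a quick numerical check (a Python sketch, the helper name `u` is mine), the odd-indexed terms vanish while the even-indexed terms grow without bound:

```python
# Terms of u_n = (n/2) * (1 + (-1)^n): 0 for odd n, n for even n.
def u(n: int) -> int:
    return (n * (1 + (-1) ** n)) // 2

print([u(n) for n in range(1, 11)])  # [0, 2, 0, 4, 0, 6, 0, 8, 0, 10]
```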

Let’s now look at sequences having more complicated sets of limit points.

### A sequence whose set of limit points is the set of natural numbers

Consider the sequence $$(v_n)$$ whose initial terms are $1, 1, 2, 1, 2, 3, 1, 2, 3, 4, 1, 2, 3, 4, 5, \dots$ $$(v_n)$$ is defined by $v_n = n - \frac{k(k+1)}{2} \text{ for } \frac{k(k+1)}{2} \lt n \le \frac{(k+1)(k+2)}{2}, \ k \ge 0.$ $$(v_n)$$ is well defined, as the sequence $$(\frac{k(k+1)}{2})_{k \ge 0}$$ is strictly increasing with first term equal to $$0$$, so every $$n \ge 1$$ belongs to exactly one of those intervals. $$(v_n)$$ is a sequence of natural numbers. As the points of $$\mathbb N$$ are isolated in $$\mathbb R$$, we have $$V \subseteq \mathbb N$$, where $$V$$ denotes the set of limit points of $$(v_n)$$. Conversely, take $$m \in \mathbb N$$. For $$k + 1 \ge m$$, we have $$\frac{k(k+1)}{2} + m \le \frac{(k+1)(k+2)}{2}$$, hence $v_{\frac{k(k+1)}{2} + m} = m,$ which proves that $$m$$ appears infinitely often and is therefore a limit point of $$(v_n)$$. Finally, the set of limit points of $$(v_n)$$ is the set of natural numbers.
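The closed form is easy to check numerically; here is a Python sketch (the helper name `v` is mine) that regenerates the initial terms and verifies that every $$m$$ reappears at the positions $$\frac{k(k+1)}{2}+m$$:

```python
# Compute v_n from the closed form: v_n = n - k(k+1)/2,
# where k is the unique integer with k(k+1)/2 < n <= (k+1)(k+2)/2.
def v(n: int) -> int:
    k = 0
    while (k + 1) * (k + 2) // 2 < n:
        k += 1
    return n - k * (k + 1) // 2

print([v(n) for n in range(1, 16)])
# [1, 1, 2, 1, 2, 3, 1, 2, 3, 4, 1, 2, 3, 4, 5]

# every m <= k+1 shows up at position k(k+1)/2 + m
print(all(v(k * (k + 1) // 2 + m) == m
          for k in range(1, 10) for m in range(1, k + 2)))  # True
```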

# A power series converging everywhere on its circle of convergence defining a non-continuous function

Consider a complex power series $$\displaystyle \sum_{k=0}^\infty a_k z^k$$ with radius of convergence $$0 \lt R \lt \infty$$ and suppose that for every $$w$$ with $$\vert w \vert = R$$, $$\displaystyle \sum_{k=0}^\infty a_k w^k$$ converges.

We provide an example where the function $\displaystyle f(z) = \sum_{k=0}^\infty a_k z^k$ defined by the power expansion at the origin fails to be continuous on the closed disk $$\vert z \vert \le R$$.

The function $$f$$ is constructed as an infinite sum $\displaystyle f(z) = \sum_{n=1}^\infty f_n(z)$ with $$f_n(z) = \frac{\delta_n}{a_n-z}$$ where $$(\delta_n)_{n \in \mathbb N}$$ is a sequence of positive real numbers and $$(a_n)$$ a sequence of complex numbers of modulus larger than one and converging to one. Let $$f_n^{(r)}(z)$$ denote the sum of the first $$r$$ terms in the power series expansion of $$f_n(z)$$ and $$\displaystyle f^{(r)}(z) \equiv \sum_{n=1}^\infty f_n^{(r)}(z)$$.

We’ll prove that:

1. If $$\sum_n \delta_n \lt \infty$$ then $$\sum_{n=1}^\infty f_n^{(r)}(z)$$ converges and $$f(z) = \lim\limits_{r \to \infty} \sum_{n=1}^\infty f_n^{(r)}(z)$$ for $$\vert z \vert \le 1$$ and $$z \neq 1$$.
2. If $$a_n=1+i \epsilon_n$$ and $$\sum_n \delta_n/\epsilon_n < \infty$$ then $$\sum_{n=1}^\infty f_n^{(r)}(1)$$ converges and $$f(1) = \lim\limits_{r \to \infty} \sum_{n=1}^\infty f_n^{(r)}(1)$$.
3. If $$\delta_n/\epsilon_n^2 \to \infty$$ then $$f(z)$$ is unbounded on the disk $$\vert z \vert \le 1$$.

First, let’s recall this corollary of Lebesgue’s dominated convergence theorem:

Let $$(u_{n,i})_{(n,i) \in \mathbb N \times \mathbb N}$$ be a double sequence of complex numbers. Suppose that $$u_{n,i} \to v_i$$ for all $$i$$ as $$n \to \infty$$, and that $$\vert u_{n,i} \vert \le w_i$$ for all $$n$$ and $$i$$, with $$\sum_i w_i < \infty$$. Then for all $$n$$ the series $$\sum_i u_{n,i}$$ is absolutely convergent and $$\lim_n \sum_i u_{n,i} = \sum_i v_i$$.
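As a toy illustration of the corollary (my example, not from the original text): take $$u_{n,i} = x_n^i$$ with $$x_n = \frac{1}{2} - \frac{1}{n} \to \frac{1}{2}$$, so $$v_i = (\frac{1}{2})^i$$ and $$\vert u_{n,i} \vert \le (\frac{1}{2})^i =: w_i$$ with $$\sum_i w_i = 2$$. The corollary predicts $$\lim_n \sum_i u_{n,i} = \sum_i v_i = 2$$:

```python
# u_{n,i} = x_n^i with x_n = 1/2 - 1/n, dominated by w_i = (1/2)^i.
# Truncating at 200 terms is numerically exact here since |x_n| <= 1/2.
def row_sum(n: int, terms: int = 200) -> float:
    x = 0.5 - 1.0 / n
    return sum(x ** i for i in range(terms))

print(row_sum(10))      # 1/(1 - 0.4) ~ 1.6667
print(row_sum(10**6))   # close to the predicted limit 2
```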

# A uniformly but not normally convergent function series

Consider a series of functions $$\displaystyle \sum f_n$$, where the $$f_n$$ are defined on a set $$S$$ with values in $$\mathbb R$$ or $$\mathbb C$$. It is known that if $$\displaystyle \sum f_n$$ is normally convergent, then $$\displaystyle \sum f_n$$ is uniformly convergent.

The converse is not true and we provide two counterexamples.

Consider first the sequence of functions $$(g_n)$$ defined on $$\mathbb R$$ by:
$g_n(x) = \begin{cases} \frac{\sin^2 x}{n} & \text{for } x \in (n \pi, (n+1) \pi)\\ 0 & \text{else} \end{cases}$ The series $$\displaystyle \sum \Vert g_n \Vert_\infty$$ diverges, as $$\Vert g_n \Vert_\infty = \frac{1}{n}$$ for all $$n \in \mathbb N$$ and the harmonic series $$\sum \frac{1}{n}$$ diverges. However, the series $$\displaystyle \sum g_n$$ converges uniformly: for each $$x \in \mathbb R$$ the sum $$\displaystyle \sum g_n(x)$$ has at most one nonzero term, and $\vert R_n(x) \vert = \left\vert \sum_{k=n+1}^\infty g_k(x) \right\vert \le \frac{1}{n+1}.$

For our second example, we consider the sequence of functions $$(f_n)$$ defined on $$[0,1]$$ by $$f_n(x) = (-1)^n \frac{x^n}{n}$$. For $$x \in [0,1]$$, $$\displaystyle \sum (-1)^n \frac{x^n}{n}$$ is an alternating series whose terms decrease in absolute value to $$0$$. According to the Leibniz test, the series converges, and the classical remainder estimate gives $\displaystyle \left\vert \sum_{k=1}^\infty (-1)^k \frac{x^k}{k} - \sum_{k=1}^m (-1)^k \frac{x^k}{k} \right\vert \le \frac{x^{m+1}}{m+1} \le \frac{1}{m+1}$ for $$m \ge 1$$, which proves that $$\displaystyle \sum (-1)^n \frac{x^n}{n}$$ converges uniformly on $$[0,1]$$.

However, the convergence is not normal, as $$\sup\limits_{x \in [0,1]} \left\vert f_n(x) \right\vert = \frac{1}{n}$$ and the harmonic series diverges.
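Since $$\displaystyle \sum_{k=1}^\infty (-1)^k \frac{x^k}{k} = -\ln(1+x)$$ on $$[0,1]$$, the uniform remainder bound from the second example can be checked numerically; here is a Python sketch (the closed form is standard, the grid of sample points is my choice):

```python
import math

# m-th partial sum of sum_{k>=1} (-1)^k x^k / k, whose sum is -ln(1+x)
def partial(x: float, m: int) -> float:
    return sum((-1) ** k * x ** k / k for k in range(1, m + 1))

m = 20
# worst error over a grid of [0,1]; the Leibniz bound says it is <= 1/(m+1)
worst = max(abs(-math.log(1 + j / 100.0) - partial(j / 100.0, m))
            for j in range(101))
print(worst <= 1 / (m + 1))  # True
```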

# Root test

The root test is a test for the convergence of a series $\sum_{n=1}^\infty a_n$ where each term is a real or complex number. The root test was developed first by Augustin-Louis Cauchy.

We denote $l = \limsup\limits_{n \to \infty} \sqrt[n]{\vert a_n \vert},$ a non-negative real number, possibly equal to $$\infty$$. The root test states that:

• if $$l < 1$$ then the series converges absolutely;
• if $$l > 1$$ then the series diverges.

The root test is inconclusive when $$l = 1$$.

### A case where $$l=1$$ and the series diverges

The harmonic series $$\displaystyle \sum_{n=1}^\infty \frac{1}{n}$$ is divergent. However $\sqrt[n]{\frac{1}{n}} = \frac{1}{n^{\frac{1}{n}}}=e^{- \frac{1}{n} \ln n}$ and $$\limsup\limits_{n \to \infty} \sqrt[n]{\frac{1}{n}} = 1$$ as $$\lim\limits_{n \to \infty} \frac{\ln n}{n} = 0$$.

### A case where $$l=1$$ and the series converges

Consider the series $$\displaystyle \sum_{n=1}^\infty \frac{1}{n^2}$$. We have $\sqrt[n]{\frac{1}{n^2}} = \frac{1}{n^{\frac{2}{n}}}=e^{- \frac{2}{n} \ln n}$ Therefore $$\limsup\limits_{n \to \infty} \sqrt[n]{\frac{1}{n^2}} = 1$$, while the series $$\displaystyle \sum_{n=1}^\infty \frac{1}{n^2}$$ is convergent as we have seen in the ratio test article.
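A quick numerical look at both cases (a Python sketch, helper name mine):

```python
# Root-test quantities for a_n = 1/n and a_n = 1/n^2: both n-th roots
# tend to 1, although the first series diverges and the second converges.
def nth_root_of_term(a_n: float, n: int) -> float:
    return a_n ** (1.0 / n)

for n in (10, 10**3, 10**6):
    print(n, nth_root_of_term(1 / n, n), nth_root_of_term(1 / n**2, n))
# both columns approach 1 as n grows
```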

# Ratio test

The ratio test is a test for the convergence of a series $\sum_{n=1}^\infty a_n$ where each term is a real or complex number and is nonzero when $$n$$ is large. The test is sometimes known as d’Alembert’s ratio test.

Suppose that $\lim\limits_{n \to \infty} \left\vert \frac{a_{n+1}}{a_n} \right\vert = l$ The ratio test states that:

• if $$l < 1$$ then the series converges absolutely;
• if $$l > 1$$ then the series diverges.

What if $$l = 1$$? The test is inconclusive in that case.

### Cases where $$l=1$$ and the series diverges

Consider the harmonic series $$\displaystyle \sum_{n=1}^\infty \frac{1}{n}$$. We have $$\lim\limits_{n \to \infty} \left\vert \frac{a_{n+1}}{a_n} \right\vert = \lim\limits_{n \to \infty} \frac{n}{n+1} = 1$$. It is well known that the harmonic series diverges. Recall that one proof uses Cauchy’s condensation argument, based on the inequalities, valid for $$k \ge 1$$: $\sum_{n=2^k+1}^{2^{k+1}} \frac{1}{n} \ge \sum_{n=2^k+1}^{2^{k+1}} \frac{1}{2^{k+1}} = \frac{2^{k+1}-2^k}{2^{k+1}} = \frac{1}{2}$
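The block inequality above is easy to verify numerically (each block in fact approaches $$\ln 2$$); a Python sketch:

```python
# Check the block inequality: sum_{n=2^k+1}^{2^(k+1)} 1/n >= 1/2.
def block(k: int) -> float:
    return sum(1.0 / n for n in range(2**k + 1, 2**(k + 1) + 1))

print([round(block(k), 4) for k in range(1, 6)])
print(all(block(k) >= 0.5 for k in range(1, 15)))  # True
```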

An even simpler case is the series $$\displaystyle \sum_{n=1}^\infty 1$$.

### Cases where $$l=1$$ and the series converges

We also have $$\lim\limits_{n \to \infty} \left\vert \frac{a_{n+1}}{a_n} \right\vert = 1$$ for the infinite series $$\displaystyle \sum_{n=1}^\infty \frac{1}{n^2}$$. The series is however convergent, as for $$n \ge 1$$ we have:$0 \le \frac{1}{(n+1)^2} \le \frac{1}{n(n+1)} = \frac{1}{n} - \frac{1}{n+1}$ and the series $$\displaystyle \sum_{n=1}^\infty \left(\frac{1}{n} - \frac{1}{n+1} \right)$$ telescopes: its partial sums equal $$1 - \frac{1}{n+1}$$, hence it converges.

Another example is the alternating series $$\displaystyle \sum_{n=1}^\infty \frac{(-1)^n}{n}$$, which converges by the Leibniz test.
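Numerically, the ratio tends to $$1$$ in both the divergent and the convergent case (a Python sketch, helper names mine):

```python
# For both 1/n and 1/n^2 the ratio |a_{n+1}/a_n| tends to 1,
# yet sum 1/n diverges while sum 1/n^2 converges (its partial sums
# stay below 2 by the telescoping comparison).
def ratio(a, n: int) -> float:
    return abs(a(n + 1) / a(n))

n = 10**6
print(ratio(lambda k: 1.0 / k, n), ratio(lambda k: 1.0 / k**2, n))  # both ~ 1

partial_sum = sum(1.0 / k**2 for k in range(1, 10**5))
print(partial_sum)  # about 1.6449, below the telescoping bound 2
```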

# A continuous function with divergent Fourier series

It is known that for a piecewise continuously differentiable function $$f$$, the Fourier series of $$f$$ converges at all $$x \in \mathbb R$$ to $$\frac{f(x^-)+f(x^+)}{2}$$.

We describe Fejér’s example of a continuous function with divergent Fourier series. It is the even, $$(2 \pi)$$-periodic function $$f$$ defined on $$[0,\pi]$$ by: $f(x) = \sum_{p=1}^\infty \frac{1}{p^2} \sin \left[ (2^{p^3} + 1) \frac{x}{2} \right]$
According to the Weierstrass M-test, $$f$$ is continuous. We denote the Fourier series of $$f$$ by $\frac{1}{2} a_0 + (a_1 \cos x + b_1 \sin x) + \dots + (a_n \cos nx + b_n \sin nx) + \dots.$

As $$f$$ is even, the $$b_n$$ all vanish. If we denote, for all $$n, m \in \mathbb N$$:$\lambda_{n,m}=\int_0^{\pi} \sin \left[ (2m + 1) \frac{t}{2} \right] \cos nt \ dt \text{ and } \sigma_{n,m} = \sum_{k=0}^n \lambda_{k,m},$
we have:\begin{aligned} a_n &=\frac{1}{\pi} \int_{-\pi}^{\pi} f(t) \cos nt \ dt= \frac{2}{\pi} \int_0^{\pi} f(t) \cos nt \ dt\\ &= \frac{2}{\pi} \int_0^{\pi} \left(\sum_{p=1}^\infty \frac{1}{p^2} \sin \left[ (2^{p^3} + 1) \frac{t}{2} \right]\right) \cos nt \ dt\\ &=\frac{2}{\pi} \sum_{p=1}^\infty \frac{1}{p^2} \int_0^{\pi} \sin \left[ (2^{p^3} + 1) \frac{t}{2} \right] \cos nt \ dt\\ &=\frac{2}{\pi} \sum_{p=1}^\infty \frac{1}{p^2} \lambda_{n,2^{p^3-1}} \end{aligned} One can swap the $$\int$$ and $$\sum$$ signs as the series is normally convergent.

We now introduce for all $$n \in \mathbb N$$:$S_n = \frac{\pi}{2} \sum_{k=0}^n a_k = \sum_{p=1}^\infty \sum_{k=0}^n \frac{1}{p^2} \lambda_{k,2^{p^3-1}} =\sum_{p=1}^\infty \frac{1}{p^2} \sigma_{n,2^{p^3-1}}$

We will prove below that for all $$n,m \in \mathbb N$$ we have $$\sigma_{m,m} \ge \frac{1}{2} \ln m$$ and $$\sigma_{n,m} \ge 0$$. Assuming those inequalities for now, we get:$S_{2^{p^3-1}} \ge \frac{1}{p^2} \sigma_{2^{p^3-1},2^{p^3-1}} \ge \frac{1}{2p^2} \ln(2^{p^3-1}) = \frac{p^3-1}{2p^2} \ln 2$
As the right hand side diverges to $$\infty$$, we can conclude that $$(S_n)$$ diverges and consequently that the Fourier series of $$f$$ diverges at $$0$$.
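The two inequalities on $$\sigma_{n,m}$$ can also be checked numerically. Using the product-to-sum formula $$\sin a \cos b = \frac{1}{2}[\sin(a+b) + \sin(a-b)]$$ and $$\int_0^\pi \sin(bt)\,dt = \frac{1 - \cos b\pi}{b}$$, one gets the closed form $$\lambda_{n,m} = \frac{1}{2(m+n)+1} + \frac{1}{2(m-n)+1}$$ (my derivation, worth re-checking: the half-integer frequencies $$m \pm n + \frac{1}{2}$$ make the cosine terms vanish). A Python sketch:

```python
import math

# Closed form of lambda_{n,m} = int_0^pi sin((2m+1)t/2) cos(nt) dt,
# derived via the product-to-sum formula (assumption stated in the text above).
def lam(n: int, m: int) -> float:
    return 1.0 / (2 * (m + n) + 1) + 1.0 / (2 * (m - n) + 1)

def sigma(n: int, m: int) -> float:
    return sum(lam(k, m) for k in range(n + 1))

# sigma_{m,m} >= (1/2) ln m  and  sigma_{n,m} >= 0 on a small grid:
print(all(sigma(m, m) >= 0.5 * math.log(m) for m in range(1, 200)))    # True
print(all(sigma(n, m) >= 0 for m in range(1, 40) for n in range(121)))  # True
```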

# Radius of convergence of power series

We look here at the radius of convergence of the sum and product of power series.

Let’s recall that for a power series $$\displaystyle \sum_{n=0}^\infty a_n x^n$$ for which $$0$$ is not the only point of convergence, the radius of convergence is the unique $$R \in (0, \infty]$$ such that the series converges whenever $$\vert x \vert < R$$ and diverges whenever $$\vert x \vert > R$$.

Given two power series with radii of convergence $$R_1$$ and $$R_2$$, i.e.
\begin{align*}
f_1(x) &= \sum_{n=0}^\infty a_n x^n, \ \vert x \vert < R_1 \\
f_2(x) &= \sum_{n=0}^\infty b_n x^n, \ \vert x \vert < R_2
\end{align*}
the sum of the power series
\begin{align*}
f_1(x) + f_2(x) &= \sum_{n=0}^\infty a_n x^n + \sum_{n=0}^\infty b_n x^n \\
&= \sum_{n=0}^\infty (a_n + b_n) x^n
\end{align*}
and their Cauchy product
\begin{align*}
f_1(x) \cdot f_2(x) &= \left(\sum_{n=0}^\infty a_n x^n\right) \cdot \left(\sum_{n=0}^\infty b_n x^n \right) \\
&= \sum_{n=0}^\infty \left( \sum_{l=0}^n a_l b_{n-l}\right) x^n
\end{align*}
both have radii of convergence greater than or equal to $$\min \{R_1,R_2\}$$.

The radii can indeed be greater than $$\min \{R_1,R_2\}$$. Let’s give examples.
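To illustrate with a concrete choice of coefficients (mine, not necessarily the one used later in the article): take $$a_n = 1$$ and $$b_n = \frac{1}{2^n} - 1$$. Then $$R_1 = R_2 = 1$$, while the sum has coefficients $$a_n + b_n = \frac{1}{2^n}$$ and radius $$2$$. A crude root-test estimate in Python:

```python
# Estimate R = 1/|c_n|^(1/n) from a single large n (rough sketch).
def radius_estimate(c, n: int = 400) -> float:
    return 1.0 / abs(c(n)) ** (1.0 / n)

a = lambda n: 1.0
b = lambda n: 1.0 / 2**n - 1.0
# coefficients of f1 + f2 are a_n + b_n = 1/2^n exactly; computed
# directly to avoid floating-point cancellation
s = lambda n: 1.0 / 2**n

print(radius_estimate(a), radius_estimate(b), radius_estimate(s))
# ~1.0, ~1.0, ~2.0: the sum's radius exceeds min(R1, R2) = 1
```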

# A trigonometric series that is not a Fourier series (Lebesgue-integration)

We already provided here an example of a trigonometric series that is not the Fourier series of a Riemann-integrable function (namely the function $$\displaystyle x \mapsto \sum_{n=1}^\infty \frac{\sin nx}{\sqrt n}$$).

Applying an Abel transformation (as mentioned in the link above), one can see that the series $f(x)=\sum_{n=2}^\infty \frac{\sin nx}{\ln n}$ converges everywhere. We now prove that $$f$$ cannot be the Fourier series of a Lebesgue-integrable function. The proof is based on the fact that for a $$2 \pi$$-periodic function $$g$$, Lebesgue-integrable on $$[0,2 \pi]$$, the sum $\sum_{n=1}^\infty \frac{c_n-c_{-n}}{n}$ is convergent, where $$(c_n)_{n \in \mathbb Z}$$ are the complex Fourier coefficients of $$g$$: $c_n = \frac{1}{2 \pi} \int_0^{2 \pi} g(t)e^{-int} \ dt.$ As the series $$\displaystyle \sum_{n=2}^\infty \frac{1}{n \ln n}$$ is divergent, we will be able to conclude that the sequence defined by $\gamma_0=\gamma_1=\gamma_{-1} = 0, \, \gamma_n=- \gamma_{-n} = \frac{1}{\ln n} \ (n \ge 2)$ cannot be the sequence of Fourier coefficients of a Lebesgue-integrable function, hence that $$f$$ is not the Fourier series of any Lebesgue-integrable function.

# Playing with liminf and limsup

Let’s consider real sequences $$(a_n)_{n \in \mathbb N}$$ and $$(b_n)_{n \in \mathbb N}$$. We look at inequalities involving the limit superior and limit inferior of those sequences. The following inequalities hold:
\begin{aligned} & \liminf a_n + \liminf b_n \le \liminf (a_n+b_n)\\ & \liminf (a_n+b_n) \le \liminf a_n + \limsup b_n\\ & \liminf a_n + \limsup b_n \le \limsup (a_n+b_n)\\ & \limsup (a_n+b_n) \le \limsup a_n + \limsup b_n \end{aligned} Let’s prove for example the first inequality, recalling first that $\liminf\limits_{n \to \infty} a_n = \lim\limits_{n \to \infty} \left(\inf\limits_{m \ge n} a_m \right).$ For $$n \in \mathbb N$$, we have for all $$m \ge n$$ $\inf\limits_{k \ge n} a_k + \inf\limits_{k \ge n} b_k \le a_m + b_m,$ hence $\inf\limits_{k \ge n} a_k + \inf\limits_{k \ge n} b_k \le \inf\limits_{k \ge n} \left(a_k+b_k \right).$ The sequences $$(\inf\limits_{k \ge n} a_k)_{n \in \mathbb N}$$, $$(\inf\limits_{k \ge n} b_k)_{n \in \mathbb N}$$ and $$(\inf\limits_{k \ge n} (a_k+b_k))_{n \in \mathbb N}$$ are non-decreasing and converge to the respective limit inferiors, so letting $$n \to \infty$$ in the last inequality yields the desired inequality $\liminf a_n + \liminf b_n \le \liminf (a_n+b_n).$
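The first inequality can be strict. A quick check with $$a_n = (-1)^n$$ and $$b_n = (-1)^{n+1}$$ (my example; for such periodic sequences a tail minimum computes the liminf exactly):

```python
# a_n = (-1)^n and b_n = (-1)^(n+1): liminf a = liminf b = -1,
# while a_n + b_n = 0 for every n, so liminf(a+b) = 0 > -2.
N = 1000
a = [(-1) ** n for n in range(N)]
b = [(-1) ** (n + 1) for n in range(N)]

def tail_liminf(seq, tail: int = 100):
    # exact for eventually periodic sequences: min over a late tail
    return min(seq[-tail:])

print(tail_liminf(a) + tail_liminf(b))             # -2
print(tail_liminf([x + y for x, y in zip(a, b)]))  # 0
```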

# A trigonometric series that is not a Fourier series (Riemann-integration)

We’re looking here at trigonometric series $f(x) = a_0 + \sum_{n=1}^\infty (a_n \cos nx + b_n \sin nx)$ which are convergent but are not Fourier series. This means that the coefficients $$a_n$$ and $$b_n$$ cannot be written$\begin{array}{ll} a_n = \frac{1}{\pi} \int_0^{2 \pi} g(t) \cos nt \, dt & (n= 0, 1, \dots) \\ b_n = \frac{1}{\pi} \int_0^{2 \pi} g(t) \sin nt \, dt & (n= 1, 2, \dots) \end{array}$ where $$g$$ is any integrable function.

This raises the question of the type of integral used. We cover here an example based on the Riemann integral; I’ll cover a Lebesgue integral example later on.

We prove here that the function $f(x)= \sum_{n=1}^\infty \frac{\sin nx}{\sqrt{n}}$ is a convergent trigonometric series but is not a Fourier series.