# Counterexamples around series (part 2)

We follow up on the article Counterexamples around series (part 1), providing additional amusing series examples.

### If $$\sum u_n$$ converges and $$(u_n)$$ is non-increasing then $$u_n = o(1/n)$$?

This is true. Let’s prove it.
The hypotheses imply that $$(u_n)$$ converges to zero; as $$(u_n)$$ is non-increasing, it follows that $$u_n \ge 0$$ for all $$n \in \mathbb N$$. As $$\sum u_n$$ converges, the Cauchy criterion gives $\displaystyle \lim\limits_{n \to \infty} \sum_{k=\lceil n/2 \rceil}^{n} u_k = 0.$ Hence for $$\epsilon \gt 0$$, one can find $$N \in \mathbb N$$ such that $\epsilon \ge \sum_{k=\lceil n/2 \rceil}^{n} u_k \ge \frac{n}{2} u_n \ge 0$ for all $$n \ge N$$, the middle inequality holding because the sum has at least $$\frac{n}{2}$$ terms, each at least equal to $$u_n$$. Therefore $$0 \le n u_n \le 2 \epsilon$$ for $$n \ge N$$, which concludes the proof that $$u_n = o(1/n)$$.
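As a quick numerical illustration (a sketch of our own, not part of the proof), take the convergent non-increasing sequence $$u_n = 1/n^2$$ and watch $$n u_n$$ shrink:

```python
# Numerical sanity check (a sketch, not a proof): for the convergent,
# non-increasing sequence u_n = 1/n^2, the products n * u_n tend to 0.
def u(n):
    return 1.0 / n**2

products = [n * u(n) for n in (10, 100, 1000, 10000)]  # here n * u_n = 1/n
assert all(p > q for p, q in zip(products, products[1:]))  # decreasing
assert products[-1] < 1e-3
```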

### $$\sum u_n$$ convergent is equivalent to $$\sum u_{2n}$$ and $$\sum u_{2n+1}$$ convergent?

This is not true, as we can see by taking $$u_n = \frac{(-1)^n}{n}$$. The series $$\sum u_n$$ converges according to the alternating series test. However, for $$n \in \mathbb N$$ $\sum_{k=1}^n u_{2k} = \sum_{k=1}^n \frac{1}{2k} = \frac{1}{2} \sum_{k=1}^n \frac{1}{k}.$ Hence $$\sum u_{2n}$$ diverges, as the harmonic series diverges.
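A numerical sketch of this counterexample (our own illustration): the alternating partial sums stabilize near $$-\ln 2$$ while the even-indexed part grows without bound.

```python
import math

# Sketch: u_n = (-1)^n / n.  The alternating partial sums converge
# (to -ln 2), while the even-indexed part sum_{k<=N} 1/(2k) diverges.
def partial_alternating(N):
    return sum((-1)**n / n for n in range(1, N + 1))

def partial_even(N):
    return sum(1.0 / (2 * k) for k in range(1, N + 1))

assert abs(partial_alternating(100000) + math.log(2)) < 1e-4
assert partial_even(100000) > 5  # half the harmonic series: unbounded
```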

### $$\sum u_n$$ absolutely convergent is equivalent to $$\sum u_{2n}$$ and $$\sum u_{2n+1}$$ absolutely convergent?

This is true and the proof is left to the reader.

### If $$\sum u_n$$ is a positive convergent series then $$(\sqrt[n]{u_n})$$ is bounded?

This is true. If not, there would be a subsequence $$(u_{\phi(n)})$$ such that $$\sqrt[\phi(n)]{u_{\phi(n)}} \ge 2$$, which means $$u_{\phi(n)} \ge 2^{\phi(n)}$$ for all $$n \in \mathbb N$$ and implies that the sequence $$(u_n)$$ is unbounded, in contradiction with the convergence of the series $$\sum u_n$$, which forces $$(u_n)$$ to converge to zero.

### If $$(u_n)$$ is strictly positive with $$u_n = o(1/n)$$ then $$\sum (-1)^n u_n$$ converges?

It does not hold, as we can see with $u_n=\begin{cases} \frac{1}{n \ln n} & n \equiv 0 \pmod 2 \\ \frac{1}{2^n} & n \equiv 1 \pmod 2 \end{cases}$ Then for $$n \in \mathbb N$$ $\sum_{k=1}^{2n} (-1)^k u_k \ge \sum_{k=1}^n \frac{1}{2k \ln 2k} - \sum_{k=1}^{2n} \frac{1}{2^k} \ge \sum_{k=1}^n \frac{1}{2k \ln 2k} - 1.$ The series $$\sum \frac{1}{2k \ln 2k}$$ diverges, as can be proven using the integral test with the function $$x \mapsto \frac{1}{2x \ln 2x}$$; hence $$\sum (-1)^n u_n$$ also diverges.
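The following sketch (our own numerical illustration) shows the partial sums of $$\sum (-1)^n u_n$$ drifting upward, consistent with divergence:

```python
import math

# Sketch of the counterexample: u_n = 1/(n ln n) for even n, 1/2^n for odd n.
# The positive even-indexed part diverges (like a double logarithm), so the
# partial sums of sum (-1)^n u_n keep growing, although u_n = o(1/n).
def u(n):
    if n % 2 == 0:
        return 1.0 / (n * math.log(n))
    return 0.5 ** min(n, 1000)  # cap the exponent to avoid float underflow

def partial(N):
    return sum((-1)**k * u(k) for k in range(1, N + 1))

assert partial(100000) > partial(1000) > partial(10)  # no convergence in sight
```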

# Counterexamples around series (part 1)

Unless otherwise stated, $$(u_n)_{n \in \mathbb{N}}$$ and $$(v_n)_{n \in \mathbb{N}}$$ are two real sequences.

### If $$(u_n)$$ is non-increasing and converges to zero then $$\sum u_n$$ converges?

This is not true. A famous counterexample is the harmonic series $$\sum \frac{1}{n}$$, which doesn’t converge as $\displaystyle \sum_{k=p+1}^{2p} \frac{1}{k} \ge \sum_{k=p+1}^{2p} \frac{1}{2p} = 1/2,$ for all $$p \in \mathbb N$$.

### If $$u_n = o(1/n)$$ then $$\sum u_n$$ converges?

This does not hold, as can be seen by considering $$u_n=\frac{1}{n \ln n}$$ for $$n \ge 2$$. Indeed $$\int_2^x \frac{dt}{t \ln t} = \ln(\ln x) - \ln (\ln 2)$$ and therefore $$\int_2^\infty \frac{dt}{t \ln t}$$ diverges. We conclude that $$\sum \frac{1}{n \ln n}$$ diverges using the integral test. However $$n u_n = \frac{1}{\ln n}$$ converges to zero.

# Counterexamples on real sequences (part 3)

Let $$(u_n)$$ be a sequence of real numbers.

### If $$u_{2n}-u_n \le \frac{1}{n}$$ then $$(u_n)$$ converges?

This is wrong. The sequence
$u_n=\begin{cases} 0 & \text{for } n \notin \{2^k \ ; \ k \in \mathbb N\}\\ 1- 2^{-k} & \text{for } n= 2^k\end{cases}$
is a counterexample. If $$n \notin \{2^k \ ; \ k \in \mathbb N\}$$ then $$2n \notin \{2^k \ ; \ k \in \mathbb N\}$$ either, hence $$u_{2n}-u_n=0$$. For $$n = 2^k$$ $0 \le u_{2^{k+1}}-u_{2^k}=2^{-k}-2^{-k-1} \le 2^{-k} = \frac{1}{n}$ and $$\lim\limits_{k \to \infty} u_{2^k} = 1$$. $$(u_n)$$ does not converge, as $$0$$ and $$1$$ are both limit points.
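A numerical sketch of this counterexample (our own illustration):

```python
# Sketch of the counterexample: u_n = 1 - 2^{-k} when n = 2^k, else 0.
def u(n):
    if n & (n - 1) == 0:            # n is a power of two, n = 2^k
        k = n.bit_length() - 1
        return 1.0 - 2.0 ** (-k)
    return 0.0

# u_{2n} - u_n <= 1/n for every n >= 1 ...
assert all(u(2 * n) - u(n) <= 1.0 / n for n in range(1, 10000))
# ... yet 0 and 1 are both limit points of (u_n)
assert u(15) == 0.0 and abs(u(2**20) - 1.0) < 1e-5
```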

### If $$\lim\limits_{n} \frac{u_{n+1}}{u_n} =1$$ then $$(u_n)$$ has a finite or infinite limit?

This is not true. Let’s consider the sequence
$u_n=2+\sin(\ln n)$ Using the inequality $$\vert \sin p – \sin q \vert \le \vert p – q \vert$$
which is a consequence of the mean value theorem, we get $\vert u_{n+1} – u_n \vert = \vert \sin(\ln (n+1)) – \sin(\ln n) \vert \le \vert \ln(n+1) – \ln(n) \vert$ Therefore $$\lim\limits_n \left(u_{n+1}-u_n \right) =0$$ as $$\lim\limits_n \left(\ln(n+1) – \ln(n)\right) = 0$$. And $$\lim\limits_{n} \frac{u_{n+1}}{u_n} =1$$ because $$u_n \ge 1$$ for all $$n \in \mathbb N$$.

I now assert that the interval $$[1,3]$$ is the set of limit points of $$(u_n)$$. For the proof, it is sufficient to prove that $$[-1,1]$$ is the set of limit points of the sequence $$v_n=\sin(\ln n)$$. For $$y \in [-1,1]$$, we can pick $$x \in \mathbb R$$ such that $$\sin x =y$$. Let $$\epsilon \gt 0$$ and $$M \in \mathbb N$$; we can find an integer $$N \ge M$$ such that $$0 \lt \ln(n+1) - \ln(n) \lt \epsilon$$ for $$n \ge N$$. Select $$k \in \mathbb N$$ with $$x +2k\pi \gt \ln N$$ and $$N_\epsilon$$ with $$\ln N_\epsilon \in (x +2k\pi, x +2k\pi + \epsilon)$$. This is possible as $$(\ln n)_{n \in \mathbb N}$$ is an increasing sequence whose increments beyond $$N$$ are smaller than $$\epsilon$$, while the length of the interval $$(x +2k\pi, x +2k\pi + \epsilon)$$ is equal to $$\epsilon$$. We finally get $\vert v_{N_\epsilon} - y \vert = \vert \sin \left(\ln N_\epsilon \right) - \sin \left(x + 2k \pi \right) \vert \le \left(\ln N_\epsilon - (x +2k\pi)\right) \le \epsilon$ proving that $$y$$ is a limit point of $$(v_n)$$, and hence $$2+y$$ a limit point of $$(u_n)$$.
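A numerical sketch (our own illustration, not a proof) of both claims about $$u_n = 2 + \sin(\ln n)$$:

```python
import math

# Sketch: u_n = 2 + sin(ln n) has u_{n+1}/u_n -> 1 but no limit.
def u(n):
    return 2.0 + math.sin(math.log(n))

# the ratio is close to 1 for large n ...
assert abs(u(10**6 + 1) / u(10**6) - 1.0) < 1e-6
# ... yet the sequence keeps oscillating between values near 1 and near 3
values = [u(n) for n in range(2, 200000)]
assert min(values) < 1.01 and max(values) > 2.99
```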

# A function whose Maclaurin series converges only at zero

Let’s describe a real function $$f$$ whose Maclaurin series converges only at zero. For $$n \ge 0$$ we denote $$f_n(x)= e^{-n} \cos n^2x$$ and $f(x) = \sum_{n=0}^\infty f_n(x)=\sum_{n=0}^\infty e^{-n} \cos n^2 x.$ For $$k \ge 0$$, the $$k$$th derivative of $$f_n$$ is $f_n^{(k)}(x) = e^{-n} n^{2k} \cos \left(n^2 x + \frac{k \pi}{2}\right)$ and $\left\vert f_n^{(k)}(x) \right\vert \le e^{-n} n^{2k}$ for all $$x \in \mathbb R$$. Therefore $$\displaystyle \sum_{n=0}^\infty f_n^{(k)}(x)$$ is normally convergent and $$f$$ is an indefinitely differentiable function with $f^{(k)}(x) = \sum_{n=0}^\infty e^{-n} n^{2k} \cos \left(n^2 x + \frac{k \pi}{2}\right).$ Its Maclaurin series has only terms of even degree, and the absolute value of the term of degree $$2k$$ is $\left(\sum_{n=0}^\infty e^{-n} n^{4k}\right)\frac{\vert x \vert^{2k}}{(2k)!} > e^{-2k} (2k)^{4k}\frac{\vert x \vert^{2k}}{(2k)!} > \left(\frac{2k \vert x \vert}{e}\right)^{2k},$ where the first inequality keeps only the $$n=2k$$ term of the sum and the second uses $$(2k)! < (2k)^{2k}$$. The right-hand side is greater than $$1$$ for $$k \ge \frac{e}{2 \vert x \vert}$$. This means that for any nonzero $$x$$ the terms of the Maclaurin series for $$f$$ are unbounded, hence the series diverges.
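A rough numerical check (our own sketch): the absolute value of the degree-$$2k$$ term at $$x$$ is $$\left(\sum_n e^{-n} n^{4k}\right)\vert x \vert^{2k}/(2k)!$$, and truncating the inner sum already exhibits the rapid growth in $$k$$ at $$x = 0.1$$.

```python
import math

# Sketch: absolute value of the degree-2k Maclaurin term of f at x,
# with the inner sum over n truncated at nmax (a numerical approximation).
def term(k, x, nmax=400):
    s = sum(math.exp(-n) * float(n) ** (4 * k) for n in range(1, nmax))
    return s * abs(x) ** (2 * k) / math.factorial(2 * k)

# the terms blow up with k, so the series cannot converge at x = 0.1
assert term(10, 0.1) > term(5, 0.1) > term(2, 0.1)
assert term(10, 0.1) > 1.0
```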

# Raabe-Duhamel’s test

Raabe-Duhamel’s test (also named Raabe’s test) is a convergence test for a series $\sum_{n=1}^\infty a_n$ whose terms are real or complex numbers. It was developed by the Swiss mathematician Joseph Ludwig Raabe.

It states that if:

$\displaystyle \lim _{n\to \infty }\left\vert{\frac {a_{n}}{a_{n+1}}}\right\vert=1 \text{ and } \lim _{{n\to \infty }} n \left(\left\vert{\frac {a_{n}}{a_{{n+1}}}}\right\vert-1 \right)=R,$
then the series will be absolutely convergent if $$R > 1$$ and divergent if $$R < 1$$.

First, one can notice that Raabe-Duhamel’s test may be conclusive in cases where the ratio test isn’t. For instance, consider a real $$\alpha$$ and the series with terms $$u_n=\frac{1}{n^\alpha}$$. We have $\lim _{n\to \infty } \frac{u_{n+1}}{u_n} = \lim _{n\to \infty } \left(\frac{n}{n+1} \right)^\alpha = 1$ and therefore the ratio test is inconclusive. However $\frac{u_n}{u_{n+1}} = \left(\frac{n+1}{n} \right)^\alpha = 1 + \frac{\alpha}{n} + o \left(\frac{1}{n}\right)$ for $$n$$ around $$\infty$$ and $\lim _{{n\to \infty }} n \left(\frac {u_{n}}{u_{{n+1}}}-1 \right)=\alpha.$ Raabe-Duhamel’s test allows us to conclude that the series $$\sum u_n$$ diverges for $$\alpha \lt 1$$ and converges for $$\alpha \gt 1$$, as is well known.
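The limit above is easy to probe numerically; this sketch (our own illustration) evaluates the Raabe-Duhamel quantity for $$u_n = 1/n^\alpha$$:

```python
# Sketch of a numerical Raabe-Duhamel check for u_n = 1/n**alpha:
# n * (u_n / u_{n+1} - 1) should approach alpha.
def raabe(alpha, n):
    u = lambda m: 1.0 / m**alpha
    return n * (u(n) / u(n + 1) - 1.0)

assert abs(raabe(2.0, 10**6) - 2.0) < 1e-3   # R = 2 > 1: convergent series
assert abs(raabe(0.5, 10**6) - 0.5) < 1e-3   # R = 0.5 < 1: divergent series
```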

When $$R=1$$ in Raabe’s test, the series can be convergent or divergent. For example, the series above with terms $$u_n=\frac{1}{n^\alpha}$$ and $$\alpha=1$$ is the harmonic series, which is divergent.

On the other hand, the series with terms $$v_n=\frac{1}{n \log^2 n}$$ is convergent, as can be proved using the integral test. Namely $0 \le \frac{1}{n \log^2 n} \le \int_{n-1}^n \frac{dt}{t \log^2 t} \text{ for } n \ge 3$ and $\int_2^\infty \frac{dt}{t \log^2 t} = \left[-\frac{1}{\log t} \right]_2^\infty = \frac{1}{\log 2}$ is convergent, while $\frac{v_n}{v_{n+1}} = 1 + \frac{1}{n} +\frac{2}{n \log n} + o \left(\frac{1}{n \log n}\right)$ for $$n$$ around $$\infty$$ and therefore $$R=1$$ in Raabe-Duhamel’s test.

# Counterexamples around Cauchy condensation test

According to the Cauchy condensation test, for a non-negative, non-increasing sequence $$(u_n)_{n \in \mathbb N}$$ of real numbers, the series $$\sum_{n \in \mathbb N} u_n$$ converges if and only if the condensed series $$\sum_{n \in \mathbb N} 2^n u_{2^n}$$ converges.

The test can fail for a non-negative sequence that is not non-increasing. Let’s have a look at counterexamples.

### A sequence such that $$\sum_{n \in \mathbb N} u_n$$ converges and $$\sum_{n \in \mathbb N} 2^n u_{2^n}$$ diverges

Consider the sequence $u_n=\begin{cases} \frac{1}{n} & \text{ for } n \in \{2^k \ ; \ k \in \mathbb N\}\\ 0 & \text{ else} \end{cases}$ For $$n \in \mathbb N$$ we have $0 \le \sum_{k = 1}^n u_k \le \sum_{k = 1}^{2^n} u_k = \sum_{k = 1}^{n} \frac{1}{2^k} < 1,$ therefore $$\sum_{n \in \mathbb N} u_n$$ converges as its partial sums are positive and bounded above. However $\sum_{k=1}^n 2^k u_{2^k} = \sum_{k=1}^n 1 = n,$ so $$\sum_{n \in \mathbb N} 2^n u_{2^n}$$ diverges.

### A sequence such that $$\sum_{n \in \mathbb N} v_n$$ diverges and $$\sum_{n \in \mathbb N} 2^n v_{2^n}$$ converges

Consider the sequence $v_n=\begin{cases} 0 & \text{ for } n \in \{2^k \ ; \ k \in \mathbb N\}\\ \frac{1}{n} & \text{ else} \end{cases}$ We have $\sum_{k = 1}^{2^n} v_k = \sum_{k = 1}^{2^n} \frac{1}{k} - \sum_{k = 1}^{n} \frac{1}{2^k} > \sum_{k = 1}^{2^n} \frac{1}{k} -1$ which proves that the series $$\sum_{n \in \mathbb N} v_n$$ diverges, as the harmonic series is divergent. However for $$n \in \mathbb N$$, $$2^n v_{2^n} = 0$$ and $$\sum_{n \in \mathbb N} 2^n v_{2^n}$$ converges.
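Both counterexamples can be sketched numerically (our own illustration):

```python
# Sketch of both counterexamples: u_n = 1/n at powers of two (>= 2), else 0,
# and v_n with the roles swapped.
def u(n):
    return 1.0 / n if n >= 2 and n & (n - 1) == 0 else 0.0

def v(n):
    return 0.0 if n & (n - 1) == 0 else 1.0 / n

S_u = sum(u(n) for n in range(1, 100001))        # bounded partial sums
C_u = sum(2**k * u(2**k) for k in range(1, 21))  # condensed: adds 1 per term
assert S_u < 1.0 and C_u == 20.0
# for v the partial sums grow like the harmonic series, condensed terms vanish
assert sum(v(n) for n in range(1, 100001)) > 9
assert all(2**k * v(2**k) == 0.0 for k in range(1, 21))
```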

# Counterexamples around the Cauchy product of real series

Let $$\sum_{n = 0}^\infty a_n, \sum_{n = 0}^\infty b_n$$ be two series of real numbers. The Cauchy product $$\sum_{n = 0}^\infty c_n$$ is the series defined by $c_n = \sum_{k=0}^n a_k b_{n-k}$ According to the theorem of Mertens, if $$\sum_{n = 0}^\infty a_n$$ converges to $$A$$, $$\sum_{n = 0}^\infty b_n$$ converges to $$B$$ and at least one of the two series is absolutely convergent, their Cauchy product converges to $$AB$$. This can be summarized by the equality $\left( \sum_{n = 0}^\infty a_n \right) \left( \sum_{n = 0}^\infty b_n \right) = \sum_{n = 0}^\infty c_n$
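To illustrate Mertens’ theorem numerically (a sketch with series of our own choosing), take $$a_n = b_n = 1/n!$$, both absolutely convergent to $$e$$; the Cauchy product should sum to $$e^2$$.

```python
import math

# Sketch: finite truncation of the Cauchy product of two series.
def cauchy_product(a, b):
    n = min(len(a), len(b))
    return [sum(a[k] * b[m - k] for k in range(m + 1)) for m in range(n)]

# a_n = b_n = 1/n! converge absolutely to e, so the product sums to e^2
a = [1.0 / math.factorial(n) for n in range(25)]
c = cauchy_product(a, a)
assert abs(sum(c) - math.e**2) < 1e-9
```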

The assumption stating that at least one of the two series converges absolutely cannot be dropped, as shown by the example $\sum_{n = 0}^\infty a_n = \sum_{n = 0}^\infty b_n = \sum_{n = 0}^\infty \frac{(-1)^n}{\sqrt{n+1}}$ Those series converge according to the Leibniz test, as the sequence $$(1/\sqrt{n+1})$$ decreases monotonically to zero. However, the Cauchy product is defined by $c_n=\sum_{k=0}^n \frac{(-1)^k}{\sqrt{k+1}} \cdot \frac{(-1)^{n-k}}{\sqrt{n-k+1}} = (-1)^n \sum_{k=0}^n \frac{1}{\sqrt{(k+1)(n-k+1)}}$ As we have $$1 \le k+ 1 \le n+1$$ and $$1 \le n-k+ 1 \le n+1$$ for $$k = 0 \dots n$$, we get $$\frac{1}{\sqrt{(k+1)(n-k+1)}} \ge \frac{1}{n+1}$$. Summing those $$n+1$$ lower bounds gives $$\vert c_n \vert \ge 1$$, proving that the Cauchy product of $$\sum_{n = 0}^\infty a_n$$ and $$\sum_{n = 0}^\infty b_n$$ diverges, as its terms do not tend to zero.
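The bound $$\vert c_n \vert \ge 1$$ is easy to verify numerically (our own sketch):

```python
import math

# Sketch: Cauchy product of sum (-1)^n / sqrt(n+1) with itself.
# The terms c_n satisfy |c_n| >= 1, so the product series diverges.
def c(n):
    return (-1)**n * sum(1.0 / math.sqrt((k + 1) * (n - k + 1))
                         for k in range(n + 1))

assert all(abs(c(n)) >= 1.0 for n in range(50))
```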

The Cauchy product may also converge while the initial series both diverge. Let’s consider $\begin{cases} (a_n) = (2, 2, 2^2, \dots, 2^n, \dots)\\ (b_n) = (-1, 1, 1, 1, \dots) \end{cases}$ The series $$\sum_{n = 0}^\infty a_n, \sum_{n = 0}^\infty b_n$$ diverge. Their Cauchy product is the series defined by $c_n=\begin{cases} -2 & \text{ for } n=0\\ 0 & \text{ for } n>0 \end{cases}$ which is convergent.
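This collapse can be checked directly (our own numerical sketch):

```python
# Sketch: both factor series diverge, yet the Cauchy product collapses.
a = [2] + [2**n for n in range(1, 30)]   # 2, 2, 4, 8, ..., 2^n, ...
b = [-1] + [1] * 29                      # -1, 1, 1, 1, ...

c = [sum(a[k] * b[m - k] for k in range(m + 1)) for m in range(30)]
assert c[0] == -2 and all(cm == 0 for cm in c[1:])
```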

# Pointwise convergence not uniform on any interval

We provide in this article an example of a pointwise convergent sequence of real functions that doesn’t converge uniformly on any interval.

Let’s consider a sequence $$(a_p)_{p \in \mathbb N}$$ enumerating the set $$\mathbb Q$$ of rational numbers. Such a sequence exists as $$\mathbb Q$$ is countable.

Now let $$(g_n)_{n \in \mathbb N}$$ be the sequence of real functions defined on $$\mathbb R$$ by $g_n(x) = \sum_{p=1}^{\infty} \frac{1}{2^p} f_n(x-a_p)$ where $$f_n : x \mapsto \frac{n^2 x^2}{1+n^4 x^4}$$ for $$n \in \mathbb N$$.

### $$f_n$$ main properties

$$f_n$$ is a rational function whose denominator doesn’t vanish. Hence $$f_n$$ is indefinitely differentiable. As $$f_n$$ is an even function, we can study it only on $$[0,\infty)$$.

We have $f_n^\prime(x)= 2n^2x \frac{1-n^4x^4}{(1+n^4 x^4)^2}.$ $$f_n^\prime$$ vanishes at zero (like $$f_n$$), is positive on $$(0,\frac{1}{n})$$, vanishes at $$\frac{1}{n}$$ and is negative on $$(\frac{1}{n},\infty)$$. Hence $$f_n$$ has a maximum at $$\frac{1}{n}$$ with $$f_n(\frac{1}{n}) = \frac{1}{2}$$ and $$0 \le f_n(x) \le \frac{1}{2}$$ for all $$x \in \mathbb R$$.

Also for $$x \neq 0$$ $0 \le f_n(x) =\frac{n^2 x^2}{1+n^4 x^4} \le \frac{n^2 x^2}{n^4 x^4} = \frac{1}{n^2 x^2}$ consequently $0 \le f_n(x) \le \frac{1}{n} \text{ for } x \ge \frac{1}{\sqrt{n}}.$
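These properties of $$f_n$$ can be sketched numerically (our own illustration):

```python
# Sketch: f_n(x) = n^2 x^2 / (1 + n^4 x^4) peaks at x = 1/n with value 1/2.
def f(n, x):
    return (n**2 * x**2) / (1.0 + n**4 * x**4)

assert abs(f(10, 0.1) - 0.5) < 1e-12                    # maximum at x = 1/n
assert all(f(10, 0.01 * i) <= 0.5 + 1e-12 for i in range(1, 200))
assert f(7, 1.0 / 7**0.5) <= 1.0 / 7                    # bound for x >= 1/sqrt(n)
```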

### $$(g_n)$$ converges pointwise to zero

First, one can notice that $$g_n$$ is well defined. For $$x \in \mathbb R$$ and $$p \in \mathbb N$$ we have $$0 \le \frac{1}{2^p} f_n(x-a_p) \le \frac{1}{2^p} \cdot \frac{1}{2}=\frac{1}{2^{p+1}}$$ according to the previous paragraph. Therefore the series of functions $$\sum \frac{1}{2^p} f_n(x-a_p)$$ is normally convergent. $$g_n$$ is also continuous, as for all $$p \in \mathbb N$$ the map $$x \mapsto \frac{1}{2^p} f_n(x-a_p)$$ is continuous.

# Counterexample around infinite products

Let’s recall two theorems about infinite products $$\prod \ (1+a_n)$$. The first one deals with nonnegative terms $$a_n$$.

THEOREM 1 An infinite product $$\prod \ (1+a_n)$$ with nonnegative terms $$a_n$$ converges if and only if the series $$\sum a_n$$ converges.

The second is related to infinite products with complex terms.

THEOREM 2 The absolute convergence of the series $$\sum a_n$$ implies the convergence of the infinite product $$\prod \ (1+a_n)$$. Moreover $$\prod \ (1+a_n)$$ is not zero provided $$a_n \neq -1$$ for all $$n \in \mathbb N$$.

The converse of Theorem 2 is not true, as shown by the following counterexample.

We consider $$a_n=(-1)^n/(n+1)$$. For $$N \in \mathbb N$$ we have:
$\prod_{n=1}^N \ (1+a_n) = \begin{cases} \frac{1}{2} &\text{ for } N \text{ odd}\\ \frac{1}{2}(1+\frac{1}{N+1}) &\text{ for } N \text{ even} \end{cases}$ hence the infinite product $$\prod \ (1+a_n)$$ converges (to $$\frac{1}{2}$$) while the series $$\sum \left\vert a_n \right\vert = \sum \frac{1}{n+1}$$ diverges (it is the harmonic series with first term omitted).
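The two behaviors can be checked numerically (our own sketch):

```python
# Sketch: partial products of prod (1 + (-1)^n/(n+1)) settle at 1/2,
# while the partial sums of |a_n| = 1/(n+1) grow without bound.
def partial_product(N):
    p = 1.0
    for n in range(1, N + 1):
        p *= 1.0 + (-1)**n / (n + 1)
    return p

assert abs(partial_product(10001) - 0.5) < 1e-8         # N odd: exactly 1/2
assert sum(1.0 / (n + 1) for n in range(1, 10001)) > 8  # harmonic growth
```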

# Counterexamples around Lebesgue’s Dominated Convergence Theorem

Let’s recall Lebesgue’s Dominated Convergence Theorem. Let $$(f_n)$$ be a sequence of real-valued measurable functions on a measure space $$(X, \Sigma, \mu)$$. Suppose that the sequence converges pointwise to a function $$f$$ and is dominated by some integrable function $$g$$ in the sense that $\vert f_n(x) \vert \le g (x)$ for all $$n \in \mathbb N$$ and all $$x \in X$$.
Then $$f$$ is integrable and $\lim\limits_{n \to \infty} \int_X f_n(x) \ d \mu = \int_X f(x) \ d \mu$

### Let’s see what can happen if we drop the domination condition.

We consider the space $$\mathbb R$$ endowed with Lebesgue measure and for $$E \subseteq \mathbb R$$ we denote by $$\chi_E$$ the indicator function of $$E$$ defined by $\chi_E(x)=\begin{cases} 1 \text{ if } x \in E\\ 0 \text{ otherwise}\end{cases}$ For $$n \in \mathbb N$$, the function $$f_n=\frac{1}{2n}\chi_{(n^2-n,n^2+n)}$$ is measurable and we have $\int_{\mathbb R} \frac{1}{2n}\chi_{(n^2-n,n^2+n)}(x) \ dx = \int_{n^2-n}^{n^2+n} \frac{1}{2n} \ dx = 1$ The sequence $$(f_n)$$ converges uniformly (and therefore pointwise) to the zero function, as for all $$n \in \mathbb N$$ and all $$x \in \mathbb R$$ we have $$\vert f_n(x) \vert \le \frac{1}{2n}$$. Hence the conclusion of Lebesgue’s Dominated Convergence Theorem doesn’t hold for the sequence $$(f_n)$$: the integrals are all equal to $$1$$ while the integral of the limit is $$0$$.
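A small numerical sketch (our own illustration) of the two facts at play:

```python
# Sketch: f_n = (1/(2n)) * chi_{(n^2-n, n^2+n)}.  The integral is the height
# 1/(2n) times the interval length 2n, always 1, while sup|f_n| = 1/(2n) -> 0.
def integral_f(n):
    length = (n**2 + n) - (n**2 - n)  # interval length = 2n
    return length / (2 * n)           # height times length

assert all(integral_f(n) == 1.0 for n in range(1, 1000))
assert 1.0 / (2 * 1000) < 1e-3        # sup norm: uniform convergence to zero
```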

Let’s verify that the sequence $$(f_n)$$ is not dominated by any integrable function $$g$$. For $$p < q$$ integers, we have \begin{aligned} q^2-q-(p^2+p) &= q^2-p^2 -q-p\\ &= (q-p)(q+p) -q -p\\ &\ge (q+p) -q-p=0 \end{aligned} Hence for $$p \neq q$$ integers the intervals $$(p^2-p,p^2+p)$$ and $$(q^2-q,q^2+q)$$ are disjoint. Consequently for all $$x \in \mathbb R$$ the sum $$\sum_{n \in \mathbb N} f_n(x)$$ has at most one nonzero term and the function $$\sum_{n \in \mathbb N} f_n$$ is well defined. If $$g$$ dominates the sequence $$(f_n)$$, it satisfies $$0 \le \sum_{n \in \mathbb N} f_n \le g$$. But $\int_{\mathbb R} \sum_{n \in \mathbb N} f_n(x) \ dx = \sum_{n \in \mathbb N} \int_{\mathbb R} f_n(x) \ dx = \sum_{n \in \mathbb N} 1 = \infty$ and $$g$$ cannot be integrable.