Converse of fundamental theorem of calculus

The fundamental theorem of calculus asserts that for a continuous real-valued function \(f\) defined on a closed interval \([a,b]\), the function \(F\) defined for all \(x \in [a,b]\) by
\[F(x)=\int _{a}^{x}\!f(t)\,dt\] is uniformly continuous on \([a,b]\), differentiable on the open interval \((a,b)\) and \[
F^\prime(x) = f(x)\]
for all \(x \in (a,b)\).

The converse of the fundamental theorem of calculus (namely, that if \(F\) defined as above is differentiable on \((a,b)\) with \(F^\prime = f\), then \(f\) must be continuous) does not hold, as we see below.

Consider the function defined on the interval \([0,1]\) by \[
f(x)= \begin{cases}
2x\sin(1/x) - \cos(1/x) & \text{ for } x \neq 0 \\
0 & \text{ for } x = 0 \end{cases}\] \(f\) is Riemann integrable on \([0,1]\) as it is bounded there and continuous on \((0,1]\). For \(x \in [0,1]\), its integral function \(\int_0^x f(t)\,dt\) is \[
F(x)= \begin{cases}
x^2 \sin \left( 1/x \right) & \text{ for } x \neq 0 \\
0 & \text{ for } x = 0 \end{cases}\] (the value of the integral follows from the fundamental theorem of calculus applied on \([\epsilon, x]\) for \(\epsilon > 0\), letting \(\epsilon \to 0^+\)). \(F\) is differentiable on \([0,1]\). This is clear for \(x \in (0,1]\). \(F\) is also differentiable at \(0\) since for \(x \neq 0\) we have \[
\left\vert \frac{F(x) - F(0)}{x-0} \right\vert = \left\vert \frac{F(x)}{x} \right\vert = \left\vert x \sin(1/x) \right\vert \le \left\vert x \right\vert.\] Consequently \(F^\prime(0) = 0 = f(0)\).
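For the record, on \((0,1]\) the product and chain rules give \[
F^\prime(x) = 2x \sin(1/x) - \cos(1/x) = f(x),\] so that \(F^\prime = f\) on the whole interval \([0,1]\).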

However \(f\) is not continuous at \(0\) as it does not have a right limit at \(0\).
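Indeed, along the sequence \(x_n = \frac{1}{n \pi}\), which tends to \(0^+\), we have \[
f(x_n) = \frac{2}{n \pi} \sin(n \pi) - \cos(n \pi) = (-1)^{n+1},\] which oscillates between \(-1\) and \(1\).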

Four-element rings

A group with four elements is isomorphic to either the cyclic group \(\mathbb Z_4\) or to the Klein four-group \(\mathbb Z_2 \times \mathbb Z_2\). Those groups are commutative. Endowed with the usual additive and multiplicative operations, \(\mathbb Z_4\) and \(\mathbb Z_2 \times \mathbb Z_2\) are commutative rings.

Are all rings with four elements isomorphic to either \(\mathbb Z_4\) or \(\mathbb Z_2 \times \mathbb Z_2\)? The answer is negative. Let’s provide two additional examples of commutative rings with four elements that are isomorphic neither to \(\mathbb Z_4\) nor to \(\mathbb Z_2 \times \mathbb Z_2\).

The first one is the field \(\mathbb F_4\) with four elements. Being a field, it is a commutative ring, and it is not isomorphic to \(\mathbb Z_4\) or \(\mathbb Z_2 \times \mathbb Z_2\) as both of those rings have zero divisors, while a field has none. Indeed we have \(2 \cdot 2 = 0\) in \(\mathbb Z_4\) and \((1,0) \cdot (0,1)=(0,0)\) in \(\mathbb Z_2 \times \mathbb Z_2\).
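For readers who wish to see \(\mathbb F_4\) concretely, one standard construction is the quotient \[
\mathbb F_4 \cong \mathbb Z_2[x]/(x^2+x+1) = \{0, 1, \omega, \omega+1\}\] where \(\omega\) denotes the class of \(x\), so that \(\omega^2 = \omega + 1\). The polynomial \(x^2+x+1\) has no root in \(\mathbb Z_2\), hence is irreducible, and the quotient is therefore a field; for instance \(\omega \cdot (\omega+1) = \omega^2 + \omega = 1\).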

A second one is the ring \(R\) of the matrices \(\begin{pmatrix}
x & 0\\
y & x\end{pmatrix}\) where \(x,y \in \mathbb Z_2\). One can easily verify that \(R\) is a commutative subring of the ring \(M_2(\mathbb Z_2)\). It is not isomorphic to \(\mathbb Z_4\) as its characteristic is \(2\). It is not isomorphic to \(\mathbb Z_2 \times \mathbb Z_2\) either, as \(\begin{pmatrix}
0 & 0\\
1 & 0\end{pmatrix}\) is a non-zero solution of the equation \(X^2=0\), while \((0,0)\) is the only solution of that equation in \(\mathbb Z_2 \times \mathbb Z_2\).
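For the record, the commutativity of \(R\) can be checked directly: for \(x, x', y, y' \in \mathbb Z_2\), \[
\begin{pmatrix}
x & 0\\
y & x\end{pmatrix} \begin{pmatrix}
x' & 0\\
y' & x'\end{pmatrix} = \begin{pmatrix}
x x' & 0\\
x y' + x' y & x x'\end{pmatrix},\] an expression which is symmetric in the two factors, and \[
\begin{pmatrix}
0 & 0\\
1 & 0\end{pmatrix}^2 = \begin{pmatrix}
0 & 0\\
0 & 0\end{pmatrix}.\]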

One can prove that the four rings mentioned above are, up to isomorphism, the only commutative rings with identity having four elements.

Counterexamples around series (part 2)

We follow up on the article Counterexamples around series (part 1), providing additional examples and counterexamples around series.

If \(\sum u_n\) converges and \((u_n)\) is non-increasing then \(u_n = o(1/n)\)?

This is true. Let’s prove it.
Since \(\sum u_n\) converges, \((u_n)\) converges to zero; as \((u_n)\) is non-increasing, its limit is its infimum, hence \(u_n \ge 0\) for all \(n \in \mathbb N\). As \(\sum u_n\) converges, the Cauchy criterion gives \[
\displaystyle \lim\limits_{n \to \infty} \sum_{k=\lceil n/2 \rceil}^{n} u_k = 0.\] Hence for \(\epsilon \gt 0\), one can find \(N \in \mathbb N\) such that \[
\epsilon \ge \sum_{k=\lceil n/2 \rceil}^{n} u_k \ge \left(n - \lceil n/2 \rceil + 1\right) u_n \ge \frac{1}{2} (n u_n) \ge 0\] for all \(n \ge N\), as the sum contains at least \(n/2\) terms, each of them greater than or equal to \(u_n\). Therefore \(0 \le n u_n \le 2 \epsilon\) for all \(n \ge N\), which proves that \(\lim\limits_{n \to \infty} n u_n = 0\), i.e. \(u_n = o(1/n)\).
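As a side remark, this result provides yet another way to see that the harmonic series diverges: the sequence \((1/n)\) is non-increasing, yet \[
n \cdot \frac{1}{n} = 1 \not\to 0,\] so \(\sum \frac{1}{n}\) cannot converge.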

\(\sum u_n\) convergent is equivalent to \(\sum u_{2n}\) and \(\sum u_{2n+1}\) convergent?

This is not true, as we can see by taking \(u_n = \frac{(-1)^n}{n}\) for \(n \ge 1\). \(\sum u_n\) converges according to the alternating series test. However for \(n \in \mathbb N\) \[
\sum_{k=1}^n u_{2k} = \sum_{k=1}^n \frac{1}{2k} = \frac{1}{2} \sum_{k=1}^n \frac{1}{k}.\] Hence \(\sum u_{2n}\) diverges as the harmonic series diverges.
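It is worth noting that \(\sum u_{2n+1}\) diverges as well, since \(u_{2n+1} = -\frac{1}{2n+1}\). The convergence of \(\sum u_n\) comes from the cancellation between consecutive terms: \[
u_{2k} + u_{2k+1} = \frac{1}{2k} - \frac{1}{2k+1} = \frac{1}{2k(2k+1)},\] the general term of a convergent series.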

\(\sum u_n\) absolutely convergent is equivalent to \(\sum u_{2n}\) and \(\sum u_{2n+1}\) absolutely convergent?

This is true and the proof is left to the reader.
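As a hint for one possible route: indexing the sequence from \(0\), the partial sums of the three series are linked, for \(n \in \mathbb N\), by \[
\sum_{k=0}^{2n+1} \vert u_k \vert = \sum_{k=0}^{n} \vert u_{2k} \vert + \sum_{k=0}^{n} \vert u_{2k+1} \vert,\] and a series with non-negative terms converges if and only if its partial sums are bounded.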

If \(\sum u_n\) is a positive convergent series then \((\sqrt[n]{u_n})\) is bounded?

This is true. If it were not, there would exist a subsequence \((u_{\phi(n)})\) such that \(\sqrt[\phi(n)]{u_{\phi(n)}} \ge 2\), which means \(u_{\phi(n)} \ge 2^{\phi(n)}\) for all \(n \in \mathbb N\) and implies that the sequence \((u_n)\) is unbounded, in contradiction with the convergence of the series \(\sum u_n\), which forces \(u_n \to 0\).
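As a side remark, \((\sqrt[n]{u_n})\) need not tend to \(0\): for the convergent series \(\sum \frac{1}{n^2}\) we have \[
\sqrt[n]{\frac{1}{n^2}} = e^{-\frac{2 \ln n}{n}} \longrightarrow 1\] as \(n \to \infty\).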

If \((u_n)\) is strictly positive with \(u_n = o(1/n)\) then \(\sum (-1)^n u_n\) converges?

It does not hold, as we can see with \[
u_n=\begin{cases} \frac{1}{n \ln n} & n \equiv 0 [2] \\
\frac{1}{2^n} & n \equiv 1 [2] \end{cases}\] Then for \(n \in \mathbb N\) \[
\sum_{k=1}^{2n} (-1)^k u_k \ge \sum_{k=1}^n \frac{1}{2k \ln 2k} - \sum_{k=1}^{2n} \frac{1}{2^k} \ge \sum_{k=1}^n \frac{1}{2k \ln 2k} - 1.\] As \(\sum_k \frac{1}{2k \ln 2k}\) diverges (which can be proven using the integral test with the function \(x \mapsto \frac{1}{2x \ln 2x}\)), the partial sums above are unbounded and \(\sum (-1)^n u_n\) diverges.
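For completeness, let us verify that the hypotheses hold: \((u_n)\) is strictly positive and \[
n u_n = \begin{cases} \frac{1}{\ln n} & n \equiv 0 [2] \\
\frac{n}{2^n} & n \equiv 1 [2] \end{cases}\] Both expressions tend to \(0\), hence \(u_n = o(1/n)\).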

Group homomorphism versus ring homomorphism

A ring homomorphism is a function between two rings which respects both the addition and the multiplication. Let’s provide examples of functions between rings which respect the addition or the multiplication but not both.

An additive group homomorphism that is not a ring homomorphism

We consider the ring \(\mathbb R[x]\) of real polynomials and the derivation \[
\begin{array}{l|rcl}
D : & \mathbb R[x] & \longrightarrow & \mathbb R[x] \\
& P & \longmapsto & P^\prime \end{array}\] \(D\) is an additive homomorphism as for all \(P,Q \in \mathbb R[x]\) we have \(D(P+Q) = D(P) + D(Q)\). However, \(D\) does not respect the multiplication as \[
D(x^2) = 2x \neq 1 = D(x) \cdot D(x).\] More generally, \(D\) satisfies the Leibniz rule \[
D(P \cdot Q) = P \cdot D(Q) + Q \cdot D(P).\]
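Another way to see that \(D\) is not a ring homomorphism: a homomorphism of rings with identity maps the unit to the unit, while \[
D(1) = 0 \neq 1.\]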

A multiplicative group homomorphism that is not a ring homomorphism

The function \[
\begin{array}{l|rcl}
f : & \mathbb R & \longrightarrow & \mathbb R \\
& x & \longmapsto & x^2 \end{array}\] satisfies \(f(x \cdot y) = f(x) \cdot f(y)\) for all \(x, y \in \mathbb R\); in particular, its restriction to \(\mathbb R^* = \mathbb R \setminus \{0\}\) is a homomorphism of the multiplicative group \((\mathbb R^*, \cdot)\). However \(f\) does not respect the addition: for instance \(f(1+1) = 4 \neq 2 = f(1) + f(1)\).