Tag Archives: power-series

A positive smooth function with all derivatives vanishing at zero

Let’s consider the set \(\mathcal C^\infty(\mathbb R)\) of real smooth functions, i.e. functions that have derivatives of all orders on \(\mathbb R\).

Does a positive function \(f \in \mathcal C^\infty(\mathbb R)\) with all derivatives vanishing at zero exist?

Such a map \(f\) cannot be expanded in a power series around zero: its Taylor series at zero would be identically zero, hence \(f\) would vanish on a whole neighborhood of zero, contradicting positivity. However, the answer to our question is positive and we’ll prove that \[
f(x) = \left\{\begin{array}{lll}
e^{-\frac{1}{x^2}} &\text{if} &x \neq 0\\
0 &\text{if} &x = 0 \end{array}\right. \] provides an example.

\(f\) is well defined and positive for \(x \neq 0\). As \(\lim\limits_{x \to 0} -\frac{1}{x^2} = -\infty\), we get \(\lim\limits_{x \to 0} f(x) = 0 = f(0)\), proving that \(f\) is continuous on \(\mathbb R\). Let’s prove by induction that for \(x \neq 0\) and \(n \in \mathbb N\), \(f^{(n)}(x)\) can be written as \[
f^{(n)}(x) = \frac{P_n(x)}{x^{3n}}e^{-\frac{1}{x^2}}\] where \(P_n\) is a polynomial function. The statement holds for \(n = 1\) as \(f^\prime(x) = \frac{2}{x^3}e^{-\frac{1}{x^2}}\), i.e. with \(P_1(x) = 2\). Suppose that the statement is true for \(n\); then \[
f^{(n+1)}(x)=\left[\frac{P_n^\prime(x)}{x^{3n}} - \frac{3n P_n(x)}{x^{3n+1}}+\frac{2 P_n(x)}{x^{3n+3}}\right] e^{-\frac{1}{x^2}},\] hence the statement is also true for \(n+1\) by taking \(P_{n+1}(x)=
x^3 P_n^\prime(x) - 3n x^2 P_n(x) + 2 P_n(x)\), which concludes our induction proof.
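
As a quick sanity check of this recurrence (a small sympy sketch, not part of the proof), one can generate the polynomials \(P_n\) from \(P_{n+1}(x) = x^3 P_n^\prime(x) - 3n x^2 P_n(x) + 2 P_n(x)\) and compare \(\frac{P_n(x)}{x^{3n}}e^{-\frac{1}{x^2}}\) with the derivative computed symbolically:

import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)   # the map f for x != 0

# P_1(x) = 2 and P_{n+1}(x) = x^3 P_n'(x) - 3n x^2 P_n(x) + 2 P_n(x)
P = sp.Integer(2)
for n in range(1, 5):
    direct = sp.diff(f, x, n)                       # n-th derivative computed by sympy
    formula = P * sp.exp(-1 / x**2) / x**(3 * n)    # P_n(x) e^(-1/x^2) / x^(3n)
    print(n, sp.simplify(direct - formula) == 0)    # expected: True
    P = sp.expand(x**3 * sp.diff(P, x) - 3 * n * x**2 * P + 2 * P)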

Finally, we have to prove that for all \(n \in \mathbb N\), \(\lim\limits_{x \to 0} f^{(n)}(x) = 0\). For that, we use the power expansion of the exponential map \(e^u = \sum_{k=0}^\infty \frac{u^k}{k!}\). Keeping only the term of index \(k = 2n\) in the expansion of \(e^{\frac{1}{x^2}}\), we get for \(x \neq 0\) \[
\left\vert x \right\vert^{3n} e^{\frac{1}{x^2}} \ge \frac{\vert x \vert^{3n}}{(2n)! \vert x \vert ^{4n}} = \frac{1}{(2n)! \vert x \vert^n}.\] Therefore \(\lim\limits_{x \to 0} \left\vert x \right\vert^{3n} e^{\frac{1}{x^2}} = \infty\) and, as \(f^{(n)}(x) = \frac{P_n(x)}{x^{3n} e^{\frac{1}{x^2}}}\) with \(P_n\) a polynomial function, \(\lim\limits_{x \to 0} f^{(n)}(x) = 0\).
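
To see numerically how fast this happens (a purely illustrative evaluation), one can compute the dominating factor \(\frac{1}{\vert x \vert^{3n} e^{1/x^2}}\) for, say, \(n = 3\); since \(P_n\) is bounded near \(0\), \(f^{(n)}(x)\) vanishes comparably fast:

import math

# f^(n)(x) = P_n(x) / (x^(3n) e^(1/x^2)); the factor 1 / (|x|^(3n) e^(1/x^2))
# collapses to 0 faster than any power of x, here shown for n = 3:
for x in (0.5, 0.3, 0.2, 0.1):
    print(x, 1.0 / (abs(x) ** 9 * math.exp(1.0 / x**2)))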

A power series converging everywhere on its circle of convergence defining a non-continuous function

Consider a complex power series \(\displaystyle \sum_{k=0}^\infty a_k z^k\) with radius of convergence \(0 \lt R \lt \infty\) and suppose that for every \(w\) with \(\vert w \vert = R\), \(\displaystyle \sum_{k=0}^\infty a_k w^k\) converges.

We provide an example where the function defined on the closed disk \(\vert z \vert \le R\) by the power expansion at the origin \[
\displaystyle f(z) = \sum_{k=0}^\infty a_k z^k\] is not continuous.

The function \(f\) is constructed as an infinite sum \[
\displaystyle f(z) = \sum_{n=1}^\infty f_n(z)\] with \(f_n(z) = \frac{\delta_n}{a_n-z}\) where \((\delta_n)_{n \in \mathbb N}\) is a sequence of positive real numbers and \((a_n)\) a sequence of complex numbers of modulus larger than one and converging to one. Let \(f_n^{(r)}(z)\) denote the sum of the first \(r\) terms in the power series expansion of \(f_n(z)\) and \(\displaystyle f^{(r)}(z) \equiv \sum_{n=1}^\infty f_n^{(r)}(z)\).
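
Since \(\vert a_n \vert > 1\), each \(f_n\) expands on the closed unit disk as a geometric series; explicitly (a routine computation spelled out here for convenience), \[
f_n(z) = \frac{\delta_n}{a_n - z} = \frac{\delta_n}{a_n} \sum_{k=0}^\infty \left(\frac{z}{a_n}\right)^k \quad \text{and} \quad f_n^{(r)}(z) = \frac{\delta_n}{a_n} \sum_{k=0}^{r-1} \left(\frac{z}{a_n}\right)^k = \frac{\delta_n}{a_n - z}\left(1 - \left(\frac{z}{a_n}\right)^r\right).\]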

We’ll prove that:

  1. If \(\sum_n \delta_n \lt \infty\) then \(\sum_{n=1}^\infty f_n^{(r)}(z)\) converges and \(f(z) = \lim\limits_{r \to \infty} \sum_{n=1}^\infty f_n^{(r)}(z)\) for \(\vert z \vert \le 1\) and \(z \neq 1\).
  2. If \(a_n=1+i \epsilon_n\) and \(\sum_n \delta_n/\epsilon_n < \infty\) then \(\sum_{n=1}^\infty f_n^{(r)}(1)\) converges and \(f(1) = \lim\limits_{r \to \infty} \sum_{n=1}^\infty f_n^{(r)}(1)\).
  3. If \(\delta_n/\epsilon_n^2 \to \infty\) then \(f(z)\) is unbounded on the disk \(\vert z \vert \le 1\).

First, let’s recall this corollary of Lebesgue’s dominated convergence theorem:

Let \((u_{n,i})_{(n,i) \in \mathbb N \times \mathbb N}\) be a double sequence of complex numbers. Suppose that \(u_{n,i} \to v_i\) for all \(i\) as \(n \to \infty\), and that \(\vert u_{n,i} \vert \le w_i\) for all \(n\) with \(\sum_i w_i < \infty\). Then for all \(n\) the series \(\sum_i u_{n,i}\) is absolutely convergent and \(\lim_n \sum_i u_{n,i} = \sum_i v_i\).
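
As a plausibility check of claims 2 and 3, here is a purely numerical sketch; the parameter choice \(\delta_n = n^{-4}\), \(\epsilon_n = n^{-5/2}\) is made up for this illustration and is not taken from the post, but it does satisfy \(\sum_n \delta_n < \infty\), \(\sum_n \delta_n/\epsilon_n = \sum_n n^{-3/2} < \infty\) and \(\delta_n/\epsilon_n^2 = n \to \infty\):

import numpy as np

# Hypothetical parameters for this sketch (not taken from the post):
# delta_n = n^(-4), eps_n = n^(-5/2), a_n = 1 + i*eps_n, so that
# sum delta_n < oo, sum delta_n/eps_n = sum n^(-3/2) < oo and delta_n/eps_n^2 = n -> oo.
N = 100_000
n = np.arange(1, N + 1, dtype=float)
delta = n ** -4.0
eps = n ** -2.5
a = 1.0 + 1j * eps

def f(z, terms=N):
    # Truncated defining sum  sum_{n <= terms} delta_n / (a_n - z).
    return np.sum(delta[:terms] / (a[:terms] - z))

# The defining sum converges at z = 1 (consistent with claim 2): partial sums stabilise.
for terms in (1_000, 10_000, 100_000):
    print(terms, f(1.0, terms))

# f is unbounded on the closed unit disk (claim 3): at z_m = a_m / |a_m|, which lies on
# the unit circle and tends to 1, the m-th term alone is roughly 2*delta_m/eps_m^2 = 2m.
# (Pushing m much further runs into double-precision cancellation in a_m - z_m.)
for m in (10, 50, 100, 300):
    z_m = a[m - 1] / abs(a[m - 1])
    print(m, abs(f(z_m)))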

Root test

The root test is a test for the convergence of a series \[
\sum_{n=1}^\infty a_n \] where each term is a real or complex number. The root test was first developed by Augustin-Louis Cauchy.

We denote \[l = \limsup\limits_{n \to \infty} \sqrt[n]{\vert a_n \vert}.\] \(l\) is a non-negative real number, possibly equal to \(\infty\). The root test states that:

  • if \(l < 1\) then the series converges absolutely;
  • if \(l > 1\) then the series diverges.

The root test is inconclusive when \(l = 1\).
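
For a quick numerical feel (the series below are chosen for this illustration), one can evaluate \(\sqrt[n]{\vert a_n \vert}\) at a single large \(n\), working with logarithms to avoid overflow:

import math

def nth_root(log_abs_a, n):
    # |a_n|^(1/n), where log_abs_a(n) returns log|a_n|; logarithms avoid overflow.
    return math.exp(log_abs_a(n) / n)

# a_n = n^3 / 2^n : |a_n|^(1/n) -> 1/2 < 1, the series converges absolutely.
print(nth_root(lambda n: 3 * math.log(n) - n * math.log(2), 10_000))   # ~0.50

# a_n = 3^n / n^2 : |a_n|^(1/n) -> 3 > 1, the series diverges.
print(nth_root(lambda n: n * math.log(3) - 2 * math.log(n), 10_000))   # ~2.99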

A case where \(l=1\) and the series diverges

The harmonic series \(\displaystyle \sum_{n=1}^\infty \frac{1}{n}\) is divergent. However \[\sqrt[n]{\frac{1}{n}} = \frac{1}{n^{\frac{1}{n}}}=e^{- \frac{1}{n} \ln n} \] and \(\limsup\limits_{n \to \infty} \sqrt[n]{\frac{1}{n}} = 1\) as \(\lim\limits_{n \to \infty} \frac{\ln n}{n} = 0\).

A case where \(l=1\) and the series converges

Consider the series \(\displaystyle \sum_{n=1}^\infty \frac{1}{n^2}\). We have \[\sqrt[n]{\frac{1}{n^2}} = \frac{1}{n^{\frac{2}{n}}}=e^{- \frac{2}{n} \ln n} \] Therefore \(\limsup\limits_{n \to \infty} \sqrt[n]{\frac{1}{n^2}} = 1\), while the series \(\displaystyle \sum_{n=1}^\infty \frac{1}{n^2}\) is convergent as we have seen in the ratio test article.
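
A small numerical sketch contrasting the two cases (again only an illustration): the \(n\)-th roots of \(\frac{1}{n}\) and \(\frac{1}{n^2}\) both tend to \(1\), yet the partial sums of the harmonic series grow like \(\ln N\) while those of \(\sum \frac{1}{n^2}\) approach \(\frac{\pi^2}{6}\).

import math

# The n-th roots of 1/n and 1/n^2 both tend to 1 ...
for N in (10, 1_000, 100_000):
    print(N, (1 / N) ** (1 / N), (1 / N**2) ** (1 / N))

# ... yet the partial sums behave very differently: sum 1/n grows like log N + gamma,
# while sum 1/n^2 settles near pi^2/6 ~ 1.644934.
for N in (10, 1_000, 100_000):
    harmonic = sum(1 / k for k in range(1, N + 1))
    basel = sum(1 / k**2 for k in range(1, N + 1))
    print(N, round(harmonic, 4), round(basel, 6))
print("log(100000) =", round(math.log(100_000), 4))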

Ratio test

The ratio test is a test for the convergence of a series \[
\sum_{n=1}^\infty a_n \] where each term is a real or complex number and is nonzero when \(n\) is large. The test is sometimes known as d’Alembert’s ratio test.

Suppose that \[\lim\limits_{n \to \infty} \left\vert \frac{a_{n+1}}{a_n} \right\vert = l\] The ratio test states that:

  • if \(l < 1\) then the series converges absolutely;
  • if \(l > 1\) then the series diverges.

What if \(l = 1\)? The test is inconclusive in that case.
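
As with the root test, a one-line numerical check (series chosen for this illustration) gives a feel for the three regimes:

import math

def ratio(a, n=50):
    # |a_{n+1} / a_n| evaluated at a single (large-ish) index n.
    return abs(a(n + 1) / a(n))

# a_n = 1/n! : the ratio tends to 0 < 1, the series converges (to e - 1).
print(ratio(lambda n: 1 / math.factorial(n)))   # ~0.02

# a_n = 2^n / n : the ratio tends to 2 > 1, the series diverges.
print(ratio(lambda n: 2**n / n))                # ~1.96

# a_n = 1/n and a_n = 1/n^2 : both ratios tend to 1, the test is inconclusive.
print(ratio(lambda n: 1 / n), ratio(lambda n: 1 / n**2))   # ~0.98  ~0.96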

Cases where \(l=1\) and the series diverges

Consider the harmonic series \(\displaystyle \sum_{n=1}^\infty \frac{1}{n}\). We have \(\lim\limits_{n \to \infty} \left\vert \frac{a_{n+1}}{a_n} \right\vert = \lim\limits_{n \to \infty} \frac{n}{n+1} = 1\). It is well known that the harmonic series diverges. Recall that one proof uses the Cauchy convergence criterion, based for \(k \ge 1\) on the inequality: \[
\sum_{n=2^k+1}^{2^{k+1}} \frac{1}{n} \ge \sum_{n=2^k+1}^{2^{k+1}} \frac{1}{2^{k+1}} = \frac{2^{k+1}-2^k}{2^{k+1}} = \frac{1}{2}\]
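
As a quick sanity check of these dyadic blocks (not needed for the proof):

# Each dyadic block of the harmonic series sums to at least 1/2
# (and tends to log 2 ~ 0.693 as k grows), so the partial sums are not Cauchy.
for k in range(1, 8):
    block = sum(1 / n for n in range(2**k + 1, 2**(k + 1) + 1))
    print(k, round(block, 6))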

An even simpler case is the series \(\displaystyle \sum_{n=1}^\infty 1\).

Cases where \(l=1\) and the series converges

We also have \(\lim\limits_{n \to \infty} \left\vert \frac{a_{n+1}}{a_n} \right\vert = 1\) for the infinite series \(\displaystyle \sum_{n=1}^\infty \frac{1}{n^2}\). The series is however convergent as for \(n \ge 1\) we have:\[
0 \le \frac{1}{(n+1)^2} \le \frac{1}{n(n+1)} = \frac{1}{n} - \frac{1}{n+1}\] and the series \(\displaystyle \sum_{n=1}^\infty \left(\frac{1}{n} - \frac{1}{n+1} \right)\) telescopes: its \(N\)-th partial sum equals \(1 - \frac{1}{N+1}\), hence it converges.

Another example is the alternating harmonic series \(\displaystyle \sum_{n=1}^\infty \frac{(-1)^n}{n}\), which converges by the alternating series test even though \(\lim\limits_{n \to \infty} \left\vert \frac{a_{n+1}}{a_n} \right\vert = 1\).

Radius of convergence of power series

We look here at the radius of convergence of the sum and product of power series.

Let’s recall that for a power series \(\displaystyle \sum_{n=0}^\infty a_n x^n\) for which \(0\) is not the only point of convergence, the radius of convergence is the unique \(R \in (0, \infty]\) such that the series converges whenever \(\vert x \vert < R\) and diverges whenever \(\vert x \vert > R\).

Given two power series with radii of convergence \(R_1\) and \(R_2\), i.e.
\begin{align*}
f_1(x) &= \sum_{n=0}^\infty a_n x^n, \ \vert x \vert < R_1\\
f_2(x) &= \sum_{n=0}^\infty b_n x^n, \ \vert x \vert < R_2
\end{align*}
their sum
\begin{align*}
f_1(x) + f_2(x) &= \sum_{n=0}^\infty a_n x^n + \sum_{n=0}^\infty b_n x^n \\
&= \sum_{n=0}^\infty (a_n + b_n) x^n
\end{align*}
and their Cauchy product
\begin{align*}
f_1(x) \cdot f_2(x) &= \left(\sum_{n=0}^\infty a_n x^n\right) \cdot \left(\sum_{n=0}^\infty b_n x^n \right) \\
&= \sum_{n=0}^\infty \left( \sum_{l=0}^n a_l b_{n-l}\right) x^n
\end{align*}
both have radii of convergence greater than or equal to \(\min \{R_1,R_2\}\).

The radii can indeed be greater than \(\min \{R_1,R_2\}\). Let’s give examples.
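
Here is a small numerical warm-up (with series chosen for this sketch, not necessarily those of the post): take \(f_1(x) = \sum_{n=0}^\infty x^n\), so \(R_1 = 1\), and \(f_2(x) = 1 - x\), so \(R_2 = \infty\); their Cauchy product is the constant series \(1\), whose radius \(\infty\) exceeds \(\min\{R_1, R_2\} = 1\), and a suitable choice of \(b_n\) does the same for the sum.

import numpy as np

N = 10                          # number of coefficients kept in each series
a = np.ones(N)                  # f1(x) = sum x^n, radius R1 = 1
b = np.zeros(N)                 # f2(x) = 1 - x,   radius R2 = infinity
b[0], b[1] = 1.0, -1.0

# Cauchy product coefficients c_n = sum_{l=0}^n a_l b_{n-l}.
c = np.convolve(a, b)[:N]
print(c)        # [1, 0, 0, ...]: f1*f2 = 1 has infinite radius > min(R1, R2) = 1

# Sum of the series: here a_n + b_n keeps radius 1 ...
print(a + b)    # [2, 0, 1, 1, ...]

# ... but with b_n = 2^(-n) - 1 (still radius 1), a_n + b_n = 2^(-n) has radius 2.
b2 = 0.5 ** np.arange(N) - 1.0
print(a + b2)   # [1, 0.5, 0.25, ...]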