A sequence is an ordered list of numbers indexed by the natural numbers. A series is the result of summing the terms of a sequence. Both concepts require precise definitions of convergence, and the tests for deciding whether a series converges constitute a substantial body of classical analysis.

Sequences

Definition. A sequence is a function $a: \mathbb{N} \to \mathbb{R}$, written $(a_n)_{n=1}^\infty$ or simply $(a_n)$.

Definition (Convergence). A sequence $(a_n)$ converges to $L$ if for every $\varepsilon > 0$ there exists $N \in \mathbb{N}$ such that $n > N \Rightarrow |a_n - L| < \varepsilon$. Write $\lim_{n\to\infty} a_n = L$.

If no such $L$ exists, the sequence diverges. Divergence includes the cases where $a_n \to \pm\infty$ and the case where the sequence oscillates without settling.

A convergent sequence is bounded. The converse is false: $((-1)^n)$ is bounded but diverges.

Monotone Convergence Theorem

Theorem. A sequence that is monotone and bounded converges.

More precisely:

  • If $(a_n)$ is non-decreasing ($a_{n+1} \geq a_n$) and bounded above, it converges to $\sup_n a_n$.
  • If $(a_n)$ is non-increasing and bounded below, it converges to $\inf_n a_n$.

Proof. By the completeness of $\mathbb{R}$, a non-empty bounded set has a supremum. Let $L = \sup_n a_n$. Given $\varepsilon > 0$, $L - \varepsilon$ is not an upper bound, so there exists $N$ with $a_N > L - \varepsilon$. Since the sequence is non-decreasing, $n \geq N \Rightarrow a_n \geq a_N > L - \varepsilon$. Also $a_n \leq L$. So $|a_n - L| < \varepsilon$. $\blacksquare$

This theorem is foundational: it proves the convergence of $e = \lim_{n\to\infty}(1+1/n)^n$, of continued fractions, and of many iterative algorithms without requiring knowledge of the limit in advance.
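A minimal numerical sketch (Python, not part of the proof): the sequence $a_n = (1+1/n)^n$ is increasing and bounded above, so the theorem guarantees convergence, and the limit is $e$.

```python
import math

# a_n = (1 + 1/n)^n: increasing and bounded above, hence convergent (to e).
def a(n: int) -> float:
    return (1 + 1 / n) ** n

terms = [a(n) for n in (1, 10, 100, 1_000, 10_000)]

assert all(x < y for x, y in zip(terms, terms[1:]))  # monotone increasing
assert all(t < math.e for t in terms)                # bounded above by e
```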

Cauchy Sequences and Completeness

Definition. A sequence $(a_n)$ is Cauchy if for every $\varepsilon > 0$ there exists $N$ such that $m, n > N \Rightarrow |a_m - a_n| < \varepsilon$.

Theorem (Cauchy criterion). A sequence of real numbers converges if and only if it is Cauchy.

The “only if” direction is easy (triangle inequality). The “if” direction is equivalent to the completeness of $\mathbb{R}$ - it fails in $\mathbb{Q}$ (the sequence of rational approximations to $\sqrt{2}$ is Cauchy in $\mathbb{Q}$ but has no rational limit). Completeness is what distinguishes $\mathbb{R}$ from $\mathbb{Q}$, and the Cauchy criterion provides a convergence test requiring no prior knowledge of the limit.
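To see the criterion fail in $\mathbb{Q}$, one can run Newton's iteration $x_{k+1} = (x_k + 2/x_k)/2$ in exact rational arithmetic (a Python sketch; the choice of Newton's iteration is an assumption here, made because it converges to $\sqrt{2}$):

```python
from fractions import Fraction

# Newton's iteration for sqrt(2): every iterate is rational, yet the
# sequence is Cauchy and its limit, sqrt(2), lies outside Q.
x = Fraction(2)
iterates = [x]
for _ in range(6):
    x = (x + 2 / x) / 2
    iterates.append(x)

gaps = [abs(b - a) for a, b in zip(iterates, iterates[1:])]
assert all(g2 < g1 for g1, g2 in zip(gaps, gaps[1:]))  # gaps shrink: Cauchy
assert abs(float(iterates[-1]) ** 2 - 2) < 1e-12       # squares to ~2
```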

Infinite Series

Definition. Given a sequence $(a_n)$, define the $N$-th partial sum $S_N = \sum_{n=1}^N a_n$. The infinite series $\sum_{n=1}^\infty a_n$ converges if the sequence of partial sums $(S_N)$ converges, and the sum is $S = \lim_{N\to\infty} S_N$.

S_N
|                         *   *  *  *  * · · ·  → S
|                   *   *
|             *   *
|       *   *
|   * *
|  *
|
+--+--+--+--+--+--+--+--+-----------→ N
   1  2  3  4  5  6  7  8

Partial sums increasing toward limit S
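The picture can be reproduced numerically; a small Python sketch with $a_n = 2^{-n}$, whose partial sums climb toward $S = 1$:

```python
# Partial sums S_N of a_n = 1/2^n increase monotonically toward S = 1.
partial_sums = []
s = 0.0
for n in range(1, 30):
    s += 1 / 2**n
    partial_sums.append(s)

assert all(x < y for x, y in zip(partial_sums, partial_sums[1:]))  # increasing
assert abs(partial_sums[-1] - 1.0) < 1e-8                          # -> S = 1
```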

Geometric Series

Theorem. For $|r| < 1$:

$$\sum_{n=0}^\infty r^n = \frac{1}{1-r}.$$

For $|r| \geq 1$ the series diverges.

Proof. Partial sum: $S_N = \sum_{n=0}^N r^n$. Multiply by $r$: $rS_N = \sum_{n=1}^{N+1} r^n$. Subtract: $(1-r)S_N = 1 - r^{N+1}$, so $S_N = (1-r^{N+1})/(1-r)$. For $|r| < 1$, $r^{N+1} \to 0$, giving $S_N \to 1/(1-r)$. For $|r| \geq 1$, $r^{N+1}$ does not tend to $0$, and the partial sums diverge. $\blacksquare$

The geometric series is ubiquitous: it gives the sum of a geometric progression, the value of a repeating decimal ($0.\overline{9} = \sum_{n=1}^\infty 9/10^n = 1$), and is the prototype for understanding convergence of power series.
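The closed form for the partial sums can be checked directly (a Python sketch; $r = 0.3$ is an arbitrary choice):

```python
# S_N = (1 - r^(N+1)) / (1 - r), and S_N -> 1/(1 - r) for |r| < 1.
def geometric_partial(r: float, N: int) -> float:
    return sum(r**n for n in range(N + 1))

r = 0.3
for N in (5, 10, 20):
    closed = (1 - r ** (N + 1)) / (1 - r)
    assert abs(geometric_partial(r, N) - closed) < 1e-12

assert abs(geometric_partial(r, 60) - 1 / (1 - r)) < 1e-12  # tail vanished
```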

Convergence Tests

Divergence Test (Necessary Condition)

Theorem. If $\sum a_n$ converges, then $a_n \to 0$.

Proof. $a_N = S_N - S_{N-1} \to S - S = 0$. $\blacksquare$

The converse is false: the harmonic series $\sum 1/n$ diverges even though $1/n \to 0$.
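A Python sketch of this failure mode: the harmonic partial sums $H_N$ grow like $\ln N$, without bound, even though the terms vanish.

```python
import math

# H_N = 1 + 1/2 + ... + 1/N grows like ln N + gamma (gamma ~ 0.5772),
# so the terms tend to 0 but the partial sums do not converge.
def H(N: int) -> float:
    return sum(1 / n for n in range(1, N + 1))

assert H(10_000) > H(100) > H(10)                         # still growing
assert abs(H(10_000) - math.log(10_000) - 0.5772) < 1e-3  # ~ ln N + gamma
```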

Integral Test

Theorem. Let $f: [1,\infty) \to \mathbb{R}$ be positive, continuous, and decreasing. Then $\sum_{n=1}^\infty f(n)$ converges if and only if $\int_1^\infty f(x)\,dx$ converges.

Proof sketch. From the monotonicity of $f$: $f(n+1) \leq \int_n^{n+1} f(x)\,dx \leq f(n)$. Summing from $1$ to $N-1$ gives $S_N - f(1) \leq \int_1^N f(x)\,dx \leq S_N - f(N)$. The partial sums are bounded iff the integral is bounded. $\blacksquare$

Application. The harmonic series $\sum 1/n$ diverges because $\int_1^\infty 1/x\,dx = \infty$. More generally, $\sum n^{-p}$ (the p-series) converges iff $p > 1$, matching the behaviour of $\int_1^\infty x^{-p}\,dx$.
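The bounding inequality from the proof sketch can be checked numerically for $p = 2$ (Python; the upper bound $S_N \leq f(1) + \int_1^N x^{-2}\,dx$ follows from the inequality above):

```python
import math

# For f(x) = 1/x^2: S_N <= f(1) + integral_1^N x^(-2) dx = 1 + (1 - 1/N) < 2.
def S(N: int) -> float:
    return sum(1 / n**2 for n in range(1, N + 1))

for N in (10, 100, 10_000):
    assert S(N) <= 1 + (1 - 1 / N)  # the integral-test upper bound

assert abs(S(10_000) - math.pi**2 / 6) < 1e-3  # the sum is pi^2/6 (Basel)
```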

Comparison and Limit Comparison Tests

Theorem (Comparison). If $0 \leq a_n \leq b_n$ for all large $n$, then:

  • $\sum b_n$ convergent $\Rightarrow$ $\sum a_n$ convergent.
  • $\sum a_n$ divergent $\Rightarrow$ $\sum b_n$ divergent.

Theorem (Limit Comparison). If $a_n, b_n > 0$ and $\lim_{n\to\infty} a_n/b_n = L \in (0,\infty)$, then $\sum a_n$ and $\sum b_n$ either both converge or both diverge.

Ratio Test

Theorem. Let $a_n > 0$ and $\rho = \lim_{n\to\infty} a_{n+1}/a_n$.

  • If $\rho < 1$: $\sum a_n$ converges.
  • If $\rho > 1$: $\sum a_n$ diverges.
  • If $\rho = 1$: the test is inconclusive.

Proof (convergence case). Choose $r \in (\rho, 1)$. Since $a_{n+1}/a_n \to \rho < r$, there exists $N$ with $a_{n+1}/a_n < r$ for all $n \geq N$; iterating gives $a_{N+k} \leq a_N r^k$. The tail $\sum_{k=0}^\infty a_{N+k} \leq a_N \sum_{k=0}^\infty r^k = a_N/(1-r) < \infty$. $\blacksquare$

The ratio test compares the series to a geometric series. It is decisive for series involving factorials and exponentials. It fails ($\rho = 1$) for p-series.
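For the exponential series $\sum x^n/n!$ the ratio is $x/(n+1) \to 0$, so the series converges for every $x$; a Python sketch with $x = 5$ (an arbitrary choice):

```python
import math

# a_n = x^n / n!  =>  a_{n+1}/a_n = x/(n+1) -> 0, so rho = 0 < 1: converges.
x = 5.0
s, term = 1.0, 1.0  # running partial sum and current term (n = 0)
for n in range(1, 40):
    assert term * x / n < term or n <= x  # ratio < 1 once n > x
    term *= x / n
    s += term

assert abs(s - math.exp(x)) < 1e-9  # the sum is e^x
```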

Root Test (Cauchy)

Theorem. Let $\rho = \limsup_{n\to\infty} |a_n|^{1/n}$.

  • If $\rho < 1$: $\sum a_n$ converges absolutely.
  • If $\rho > 1$: $\sum a_n$ diverges.
  • If $\rho = 1$: inconclusive.

The root test uses $\limsup$ (not $\lim$) because the limit may not exist; it is strictly more powerful than the ratio test (whenever the ratio test gives a conclusion, so does the root test, but not vice versa).
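A standard example of that gap (the sequence here is an illustrative assumption, not from the text): $a_n = 2^{-n+(-1)^n}$. Its consecutive ratios oscillate between $2$ and $1/8$, so the ratio test is silent, while $a_n^{1/n} \to 1/2$ and the root test concludes convergence.

```python
# a_n = 2^(-n + (-1)^n): ratio test inconclusive, root test decisive.
def a(n: int) -> float:
    return 2.0 ** (-n + (-1) ** n)

ratios = [a(n + 1) / a(n) for n in range(1, 20)]
assert min(ratios) < 1 < max(ratios)           # ratios oscillate: no limit rho

roots = [a(n) ** (1 / n) for n in (10, 100, 1_000)]
assert all(abs(r - 0.5) < 0.1 for r in roots)  # n-th roots settle near 1/2
```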

Alternating Series Test (Leibniz)

Theorem. If $(b_n)$ is a decreasing sequence with $b_n \to 0$, then $\sum_{n=1}^\infty (-1)^{n+1} b_n$ converges.

Moreover, the error after $N$ terms satisfies $|S - S_N| \leq b_{N+1}$.

Proof. The partial sums $S_{2k}$ are increasing (adding $b_{2k+1} - b_{2k+2} \geq 0$) and bounded above by $b_1$. The partial sums $S_{2k+1}$ are decreasing and bounded below by $0$. Both subsequences converge to the same limit since $S_{2k+1} - S_{2k} = b_{2k+1} \to 0$. $\blacksquare$

Example. The alternating harmonic series $\sum (-1)^{n+1}/n = \ln 2$ converges (by Leibniz), but the harmonic series diverges - so it is conditionally convergent.
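A Python check of both the convergence and the Leibniz error bound $|S - S_N| \leq b_{N+1}$ for this series:

```python
import math

# Alternating harmonic series: S = ln 2, with |S - S_N| <= 1/(N+1).
def S(N: int) -> float:
    return sum((-1) ** (n + 1) / n for n in range(1, N + 1))

for N in (10, 100, 1_000):
    assert abs(S(N) - math.log(2)) <= 1 / (N + 1)  # Leibniz bound holds
```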

Absolute vs Conditional Convergence

Definition. $\sum a_n$ is absolutely convergent if $\sum |a_n| < \infty$. It is conditionally convergent if it converges but $\sum |a_n| = \infty$.

Theorem. Absolute convergence implies convergence.

The striking fact about conditional convergence: by Riemann’s rearrangement theorem, the terms of a conditionally convergent series can be rearranged to sum to any prescribed value in $[-\infty, +\infty]$. Absolute convergence is immune to rearrangement - the sum is the same regardless of order.
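Riemann's construction is easy to sketch in Python: greedily take positive terms $1, 1/3, 1/5, \ldots$ while below the target and negative terms $-1/2, -1/4, \ldots$ while above it (the target values below are arbitrary):

```python
# Greedy rearrangement of the alternating harmonic series: the positive
# and negative parts each diverge while the terms tend to 0, so the
# running sum can be steered toward any prescribed target.
def rearranged_sum(target: float, steps: int) -> float:
    s = 0.0
    p, q = 1, 2  # next odd (positive) and even (negative) denominators
    for _ in range(steps):
        if s <= target:
            s += 1 / p  # add a positive term
            p += 2
        else:
            s -= 1 / q  # add a negative term
            q += 2
    return s

assert abs(rearranged_sum(1.0, 100_000) - 1.0) < 1e-3
assert abs(rearranged_sum(-2.0, 100_000) + 2.0) < 1e-3
```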

Power Series

A power series centred at $a$ is $\sum_{n=0}^\infty c_n (x-a)^n$.

Theorem (Radius of Convergence). For any power series, there exists $R \in [0,\infty]$ (the radius of convergence) such that:

  • The series converges absolutely for $|x - a| < R$.
  • The series diverges for $|x - a| > R$.
  • Behaviour at $|x - a| = R$ must be checked separately.

Finding $R$ via the ratio test. If $\lim_{n\to\infty} |c_{n+1}/c_n| = L$, then $R = 1/L$ (with $R = \infty$ if $L = 0$ and $R = 0$ if $L = \infty$).

Cauchy-Hadamard formula. $R = 1/\limsup_{n\to\infty} |c_n|^{1/n}$.

Term-by-term Differentiation and Integration

Theorem. A power series $f(x) = \sum_{n=0}^\infty c_n(x-a)^n$ with radius of convergence $R > 0$ is differentiable on $(a-R, a+R)$, and

$$f'(x) = \sum_{n=1}^\infty n\,c_n(x-a)^{n-1}.$$

The differentiated series has the same radius of convergence $R$.

Similarly, $\int_a^x f(t)\,dt = \sum_{n=0}^\infty \frac{c_n}{n+1}(x-a)^{n+1}$, also with radius $R$.

Example. Starting from $\frac{1}{1-x} = \sum_{n=0}^\infty x^n$ (geometric, $|x| < 1$), integrate to get $-\ln(1-x) = \sum_{n=1}^\infty \frac{x^n}{n}$, hence $\ln(1+x) = \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}x^n$. Setting $x = 1$ (allowed by Abel’s theorem since the series converges at the endpoint) gives $\ln 2 = \sum_{n=1}^\infty (-1)^{n+1}/n$.
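The resulting series can be checked against the library logarithm (a Python sketch):

```python
import math

# ln(1 + x) = sum_{n>=1} (-1)^(n+1) x^n / n, valid for -1 < x <= 1.
def log1p_series(x: float, terms: int) -> float:
    return sum((-1) ** (n + 1) * x**n / n for n in range(1, terms + 1))

assert abs(log1p_series(0.5, 60) - math.log(1.5)) < 1e-12  # inside |x| < 1: fast
assert abs(log1p_series(1.0, 1_000) - math.log(2)) < 1e-3  # endpoint: slow (Leibniz)
```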

Examples

Algorithm analysis. The cost of many recursive algorithms satisfies a recurrence whose solution involves geometric series. Mergesort: $T(n) = 2T(n/2) + n$. Unrolling for $k$ levels gives the sum $\sum_{j=0}^{k-1} 2^j \cdot n/2^j = kn$; with $k = \log_2 n$ levels this is $n\log_2 n$ - each level contributes exactly $n$. Summing a geometric series is the engine of the Master Theorem.

Expected cost of algorithms. Expected number of trials until first success in Bernoulli($p$) trials: $E[T] = \sum_{k=1}^\infty k\,(1-p)^{k-1}p = 1/p$ - derived by differentiating a geometric series term by term.
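A numerical check of that term-by-term differentiation (Python; the infinite sum is truncated at a large cutoff):

```python
# E[T] = sum_{k>=1} k (1-p)^(k-1) p equals 1/p: the identity
# sum_{k>=1} k r^(k-1) = 1/(1-r)^2 evaluated at r = 1 - p.
def expected_trials(p: float, terms: int = 10_000) -> float:
    return sum(k * (1 - p) ** (k - 1) * p for k in range(1, terms + 1))

for p in (0.1, 0.5, 0.9):
    assert abs(expected_trials(p) - 1 / p) < 1e-6
```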

Function approximation. A neural network computing $\text{softmax}$ or $\exp$ in hardware uses polynomial approximations derived from power series, truncated at a degree where the error is below floating-point precision. The error bound from the alternating series test or Taylor remainder theorem guarantees correctness within the radius of convergence.

Numerical stability. Evaluating $e^x - 1$ for small $x$ naively as $\exp(x) - 1$ is numerically unstable (catastrophic cancellation), even though the function itself is well-conditioned there. The series $e^x - 1 = \sum_{n=1}^\infty x^n/n!$ avoids the cancellation - this is the idea behind the expm1 function in standard numerical libraries, whose error analysis rests on convergence of the series (and, for negative $x$, on alternating-series bounds).
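A Python sketch of the cancellation (math.expm1 is the library routine; at this magnitude three terms of the series already exceed double precision):

```python
import math

x = 1e-12
naive = math.exp(x) - 1           # subtracts two nearly equal numbers
series = x + x**2 / 2 + x**3 / 6  # leading terms of sum_{n>=1} x^n / n!

# The series truncation agrees with expm1 to full precision...
assert abs(series - math.expm1(x)) < 1e-25
# ...while the naive form has lost roughly four decimal digits.
assert abs(naive - math.expm1(x)) / math.expm1(x) > 1e-5
```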

