Lyapunov's central limit theorem

In probability theory, Lyapunov's central limit theorem is one of the variants of the central limit theorem. Unlike the classical central limit theorem, which requires that the random variables in question be both independent and identically distributed, it only requires that they be independent. It is named for the Russian mathematician Aleksandr Lyapunov.

Statement of the theorem
Let $$X_{n}$$, $$n \in \mathbb{N}$$, be a sequence of independent random variables. Suppose that each $$X_{n}$$ has finite expected value $$\mathbb{E} [X_{n}] = \mu_{n}$$ and finite variance $$\mathrm{Var} [X_{n}] = \sigma_{n}^{2}$$. Suppose also that the third absolute central moments


 * $$r_{n}^{3} := \mathbb{E} \left[ \left| X_{n} - \mu_{n} \right|^{3} \right]$$

are finite and satisfy the Lyapunov condition


 * $$\lim_{N \to \infty} \frac{\left( \sum_{n = 1}^{N} r_{n}^{3} \right)^{1/3}}{\left( \sum_{n = 1}^{N} \sigma_{n}^{2} \right)^{1/2}} = 0.$$
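
For example, if the $$X_{n}$$ are also identically distributed, with common variance $$\sigma^{2} > 0$$ and common third absolute central moment $$r^{3} < \infty$$, then the ratio above reduces to

 * $$\frac{\left( N r^{3} \right)^{1/3}}{\left( N \sigma^{2} \right)^{1/2}} = \frac{r}{\sigma} \, N^{-1/6} \to 0,$$

so the Lyapunov condition holds automatically, and the classical central limit theorem (under a finite third moment) is recovered as a special case.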

Let the random variable $$S_{N} := X_{1} + \dots + X_{N}$$ denote the $$N$$th partial sum of the random variables $$X_{n}$$. Then the normalised partial sum


 * $$Z_{N} := \frac{S_{N} - \sum_{n = 1}^{N} \mu_{n}}{\left( \sum_{n = 1}^{N} \sigma_{n}^{2} \right)^{1/2}}$$

converges in distribution to a standard normal random variable as $$N \to \infty$$.

Less formally, for "large" $$N$$, $$S_{N}$$ is approximately normally distributed with expected value


 * $$\mathbb{E} [S_{N}] = \sum_{n = 1}^{N} \mathbb{E} [X_{n}] = \sum_{n = 1}^{N} \mu_{n}$$

and variance


 * $$\mathrm{Var} [S_{N}] = \sum_{n = 1}^{N} \mathrm{Var} [X_{n}] = \sum_{n = 1}^{N} \sigma_{n}^{2}.$$

Only the normality is approximate: the identity for the mean holds exactly by linearity of expectation, and the identity for the variance holds exactly because the $$X_{n}$$ are independent.
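
The theorem can be illustrated numerically. The following sketch (not part of the theorem; the choice of uniform distributions, scales, and sample sizes is an arbitrary assumption made for illustration) draws independent but not identically distributed variables, verifies that the Lyapunov ratio is small, and checks that the normalised partial sum has approximately standard normal mean and variance:

```python
import numpy as np

# Independent, non-identically distributed summands:
# X_n ~ Uniform(-a_n, a_n) with a_n = 1 + 1/n, so that
# mu_n = 0, sigma_n^2 = a_n^2 / 3, and E|X_n|^3 = a_n^3 / 4.
rng = np.random.default_rng(0)

N = 2000          # number of summands per partial sum
trials = 20_000   # independent replications of Z_N

a = 1.0 + 1.0 / np.arange(1, N + 1)   # half-widths a_n
sigma2 = a**2 / 3.0                   # Var[X_n]
r3 = a**3 / 4.0                       # E|X_n - mu_n|^3

# Lyapunov ratio (sum r_n^3)^(1/3) / (sum sigma_n^2)^(1/2);
# it shrinks like N^(-1/6) here, consistent with the condition.
ratio = r3.sum() ** (1 / 3) / np.sqrt(sigma2.sum())

# Each row of X is one realisation of (X_1, ..., X_N); the means
# are all zero, so Z_N = S_N / sqrt(sum of variances).
X = rng.uniform(-a, a, size=(trials, N))
Z = X.sum(axis=1) / np.sqrt(sigma2.sum())

print(ratio)      # small for large N
print(Z.mean())   # should be close to 0
print(Z.std())    # should be close to 1
```

The empirical mean and standard deviation of the simulated $$Z_{N}$$ values match the standard normal parameters to within sampling error, as the theorem predicts.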