Rayleigh distribution

In probability theory and statistics, the Rayleigh distribution is a continuous probability distribution. It can arise when a two-dimensional vector (e.g. wind velocity) has components that are normally distributed, uncorrelated, and of equal variance. The vector's magnitude (e.g. wind speed) will then have a Rayleigh distribution. The distribution also arises for random complex numbers whose real and imaginary components are i.i.d. Gaussian: the modulus of such a complex number is Rayleigh-distributed. The distribution is named after Lord Rayleigh.

The Rayleigh probability density function is


 * $$f(x|\sigma) = \frac{x \exp\left(\frac{-x^2}{2\sigma^2}\right)}{\sigma^2}$$

for $$x \in [0,\infty).$$
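As a quick numerical sanity check, the density above can be evaluated on a grid and integrated; the result should be 1. This is an illustrative sketch (the helper name and grid choices are ours, not canonical):

```python
import numpy as np

def rayleigh_pdf(x, sigma):
    """Rayleigh density f(x | sigma) = (x / sigma^2) exp(-x^2 / (2 sigma^2)) for x >= 0."""
    return (x / sigma**2) * np.exp(-x**2 / (2 * sigma**2))

sigma = 1.5
# Grid wide enough that the tail beyond it is negligible (~13 sigma).
x = np.linspace(0.0, 20.0, 200001)
dx = x[1] - x[0]
total = float(np.sum(rayleigh_pdf(x, sigma)) * dx)  # Riemann sum, should be ~1
```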

Properties
The raw moments are given by:


 * $$\mu_k=\sigma^k2^{k/2}\,\Gamma(1+k/2)\,$$

where $$\Gamma(z)$$ is the Gamma function.

The mean and variance of a Rayleigh random variable may be expressed as:


 * $$\mu(X) = \sigma \sqrt{\frac{\pi}{2}}\,$$

and


 * $$\textrm{var}(X) = \frac{4 - \pi}{2} \sigma^2\,.$$
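The moment formula above can be checked directly: plugging $$k=1$$ and $$k=2$$ into it reproduces the mean and variance. A small illustrative sketch using only the standard library (function names are ours):

```python
import math

def rayleigh_raw_moment(k, sigma):
    """k-th raw moment: sigma^k * 2^(k/2) * Gamma(1 + k/2)."""
    return sigma**k * 2**(k / 2) * math.gamma(1 + k / 2)

sigma = 2.0
mean = rayleigh_raw_moment(1, sigma)           # equals sigma * sqrt(pi / 2)
var = rayleigh_raw_moment(2, sigma) - mean**2  # equals (4 - pi) / 2 * sigma^2
```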

The skewness is given by:


 * $$\gamma_1=\frac{2\sqrt{\pi}(\pi - 3)}{(4-\pi)^{3/2}}.$$

The excess kurtosis is given by:


 * $$\gamma_2=-\frac{6\pi^2 - 24\pi +16}{(4-\pi)^2}.$$

The characteristic function is given by:


 * $$\varphi(t)=1-\sigma te^{-\sigma^2t^2/2}\sqrt{\frac{\pi}{2}}\left(\textrm{erfi}\left(\frac{\sigma t}{\sqrt{2}}\right)-i\right)$$

where $$\operatorname{erfi}(z)$$ is the imaginary error function. The moment generating function is given by


 * $$M(t)=1+\sigma t\,e^{\sigma^2t^2/2}\sqrt{\frac{\pi}{2}}\left(\textrm{erf}\left(\frac{\sigma t}{\sqrt{2}}\right)+1\right),$$

where $$\operatorname{erf}(z)$$ is the error function.

Information entropy
The information entropy is given by


 * $$H = 1 + \ln\left(\frac{\sigma}{\sqrt{2}}\right) + \frac{\gamma}{2}$$

where $$\gamma$$ is the Euler–Mascheroni constant.
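This closed form can be compared against a direct numerical evaluation of $$-\int f \ln f \, dx$$. An illustrative sketch (grid bounds and variable names are arbitrary choices of ours):

```python
import numpy as np

sigma = 1.0
gamma_em = 0.5772156649015329  # Euler-Mascheroni constant

# Start slightly above 0 to avoid log(0); x * ln(x) -> 0 there anyway.
x = np.linspace(1e-9, 20.0, 200001)
dx = x[1] - x[0]
f = (x / sigma**2) * np.exp(-x**2 / (2 * sigma**2))

h_numeric = float(-np.sum(f * np.log(f)) * dx)            # -integral of f ln f
h_closed = 1 + np.log(sigma / np.sqrt(2)) + gamma_em / 2  # closed form above
```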

Parameter estimation
Given N independent and identically distributed Rayleigh random variables with parameter $$\sigma$$, the maximum likelihood estimate of $$\sigma$$ is


 * $$\hat{\sigma}=\sqrt{\frac{1}{2N}\sum_{i=1}^N x_i^2}.$$
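The estimator can be sketched as follows, with synthetic Rayleigh data generated via the two-normals construction (the function name, seed, and sample size are our choices for illustration):

```python
import numpy as np

def rayleigh_mle(samples):
    """Maximum likelihood estimate of sigma: sqrt(sum(x_i^2) / (2N))."""
    samples = np.asarray(samples, dtype=float)
    return float(np.sqrt(np.mean(samples**2) / 2))

rng = np.random.default_rng(0)
sigma_true = 3.0
# A Rayleigh(sigma) sample is the magnitude of two i.i.d. N(0, sigma^2) draws.
x = rng.normal(0.0, sigma_true, 100_000)
y = rng.normal(0.0, sigma_true, 100_000)
sigma_hat = rayleigh_mle(np.hypot(x, y))  # should be close to sigma_true
```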

Related distributions

 * $$R \sim \mathrm{Rayleigh}(\sigma)$$ if $$R = \sqrt{X^2 + Y^2}$$, where $$X \sim N(0, \sigma^2)$$ and $$Y \sim N(0, \sigma^2)$$ are independent normal random variables. (This motivates the use of the symbol $$\sigma$$ in the parameterization of the Rayleigh density above.)
 * If $$R \sim \mathrm{Rayleigh}(1)$$ then $$R^2$$ has a chi-square distribution with two degrees of freedom: $$R^2 \sim \chi^2_2$$
 * If $$X$$ has an exponential distribution, $$X \sim \mathrm{Exponential}(\lambda)$$, then $$Y=\sqrt{2X\sigma^2\lambda} \sim \mathrm{Rayleigh}(\sigma)$$.


 * If $$R_i \sim \mathrm{Rayleigh}(\sigma)$$ are independent, then $$\sum_{i=1}^N R_i^2$$ has a gamma distribution with shape parameter $$N$$ and scale parameter $$2\sigma^2$$: $$\sum_{i=1}^N R_i^2 \sim \Gamma(N,2\sigma^2)$$.


 * The Chi distribution is a generalization of the Rayleigh distribution.
 * The Rice distribution is a generalization of the Rayleigh distribution.
 * The Weibull distribution is a generalization of the Rayleigh distribution.
 * The Maxwell–Boltzmann distribution describes the magnitude of a normal vector in three dimensions.
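Several of these relationships can be verified by simulation, e.g. drawing $$\mathrm{Rayleigh}(1)$$ samples as the magnitude of two standard normals and checking the first two moments against $$\sigma\sqrt{\pi/2}$$ and the chi-square mean of 2. An illustrative sketch (seed and sample size are ours):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Rayleigh(1) as the magnitude of two independent standard normals.
r = np.hypot(rng.standard_normal(n), rng.standard_normal(n))

mean_r = float(r.mean())        # should approach sqrt(pi / 2) ~ 1.2533
mean_r2 = float((r**2).mean())  # should approach 2, the mean of chi-square(2)
```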