An inequality on location and scale parameters

For probability distributions having an expected value and a median, the mean (i.e., the expected value) and the median can never differ from each other by more than one standard deviation. To express this in mathematical notation, let $\mu$, $m$, and $\sigma$ be, respectively, the mean, the median, and the standard deviation. Then


 * $$\left|\mu-m\right| \leq \sigma.$$

(There is no need to assume that the variance is finite. Unlike the situation with the expected value, saying the variance exists is equivalent to saying the variance is finite; but if the variance is infinite, then $\sigma = \infty$ and the inequality holds trivially.)
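As a quick numerical illustration (a sketch, not part of the proof), one can verify the inequality for a few distributions whose mean, median, and standard deviation have closed forms:

```python
import math

# Check |mu - m| <= sigma for distributions with known closed forms.
# Each entry: (name, mean, median, standard deviation).
distributions = [
    # Exponential with rate 1: mean 1, median ln 2, sd 1.
    ("exponential(1)", 1.0, math.log(2), 1.0),
    # Uniform on [0, 1]: mean 1/2, median 1/2, sd 1/sqrt(12).
    ("uniform(0,1)", 0.5, 0.5, 1 / math.sqrt(12)),
    # Pareto with alpha = 3, x_m = 1: mean 3/2, median 2^(1/3),
    # variance alpha / ((alpha - 1)^2 (alpha - 2)) = 3/4, sd sqrt(3)/2.
    ("pareto(3)", 1.5, 2 ** (1 / 3), math.sqrt(3) / 2),
]

for name, mu, m, sigma in distributions:
    gap = abs(mu - m)
    assert gap <= sigma, name
    print(f"{name}: |mu - m| = {gap:.4f} <= sigma = {sigma:.4f}")
```

For the exponential distribution, for instance, the gap is $|1 - \ln 2| \approx 0.307$, comfortably below $\sigma = 1$.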

Proof
This proof uses Jensen's inequality twice. We have


 * $$\left|\mu-m\right| = \left|\mathrm{E}(X-m)\right| \leq \mathrm{E}\left(\left|X-m\right|\right) \leq \mathrm{E}\left(\left|X-\mu\right|\right) \leq \sqrt{\mathrm{E}\left((X-\mu)^2\right)} = \sigma.$$

The first inequality comes from (the convex version of) Jensen's inequality applied to the absolute value function, which is convex. The second comes from the fact that the median minimizes the absolute deviation function


 * $$a \mapsto \mathrm{E}(\left|X-a\right|).$$

The third inequality comes from (the concave version of) Jensen's inequality applied to the square root function, which is concave. Q.E.D.
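The key fact behind the second inequality — that the median minimizes $a \mapsto \mathrm{E}(\left|X-a\right|)$ — can be illustrated numerically on a small discrete distribution (an illustrative sketch, not part of the proof):

```python
# For a finite sample (uniform weights), E|X - a| is minimized at the
# sample median. Illustrative check on a small skewed data set.
xs = [0, 1, 2, 3, 100]  # skewed sample; the median is 2


def mean_abs_dev(a):
    """Mean absolute deviation of the sample about the point a."""
    return sum(abs(x - a) for x in xs) / len(xs)


median = sorted(xs)[len(xs) // 2]  # middle order statistic: 2

# Scan a grid of candidate locations and confirm none beats the median.
candidates = [t / 10 for t in range(-50, 1050)]
best = min(candidates, key=mean_abs_dev)
assert mean_abs_dev(median) <= mean_abs_dev(best) + 1e-12
print(f"median = {median}, minimal E|X - a| = {mean_abs_dev(median)}")
```

Note how the outlier at 100 pulls the mean far to the right, yet the minimizer of the mean absolute deviation stays at the median.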

Alternative proof
The one-tailed version of Chebyshev's inequality is


 * $$\Pr(X-\mu \geq k\sigma)\leq\frac{1}{1+k^2}.$$

Setting $k = 1$ gives $\Pr(X \geq \mu + \sigma) \leq 1/2$, and applying the same bound to $-X$ (whose mean is $-\mu$) gives $\Pr(X \leq \mu - \sigma) \leq 1/2$. Since the median $m$ satisfies both $\Pr(X \leq m) \geq 1/2$ and $\Pr(X \geq m) \geq 1/2$, it must lie in the interval $[\mu - \sigma, \mu + \sigma]$; that is, the median is within one standard deviation of the mean.
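These two tail bounds can be checked in closed form for a concrete distribution (an illustrative sketch, not part of the proof). For the exponential distribution with rate 1, $\mu = \sigma = 1$:

```python
import math

# Exponential distribution with rate 1: mu = sigma = 1, so
# Pr(X >= mu + sigma) = Pr(X >= 2) = exp(-2), and
# Pr(X <= mu - sigma) = Pr(X <= 0) = 0.
upper_tail = math.exp(-2)  # ~0.1353
lower_tail = 0.0
assert upper_tail <= 0.5 and lower_tail <= 0.5

# Consistent with the conclusion: the median ln 2 lies in
# [mu - sigma, mu + sigma] = [0, 2].
assert 0 <= math.log(2) <= 2
print(f"Pr(X >= mu + sigma) = {upper_tail:.4f} <= 1/2")
```

Both tail probabilities are well below the Chebyshev bound of 1/2 here; the bound is what guarantees this for every distribution with finite variance.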