Chebyshev’s Theorem

The mean is important in that it identifies a “weighted center” for a random variable.   The variance is important in that it identifies a notion of “spread” for a random variable.   The variance has the important property that, on every scale, it bounds how likely a random variable is to be far from its mean.   This is made precise in the following theorem due to Chebyshev.   (Note that there are several variants of the English transliteration of this name   –   Tchebycheff is a common alternative.)

Theorem (Chebyshev): For any random variable   $X$   with mean   $\mu$   and standard deviation   $\sigma\,$ , and any   $k\gt 0\,$ , we have $$P\left( \big| X-\mu \big| \ge k\,\sigma \right) \le \frac{1}{k^2} \;\text{.}$$

To see this, let   $p$   be the density function of   $X\;$ .   On the region where   $|x-\mu|\ge k\,\sigma\,$ , we have   $\displaystyle{ \left(\frac{x-\mu}{k\,\sigma}\right)^2\ge 1 }\,$ , which justifies the first comparison in the string below.   (For a discrete random variable, the same argument runs with sums in place of integrals.)
$$\begin{array}{rl} P\left( \big| X-\mu \big| \ge k\,\sigma \right) & =\int_{|x-\mu|\ge k\,\sigma}\, p(x)\, {\rm d}x \\ & \\ & \le \int_{|x-\mu|\ge k\,\sigma}\, \left(\frac{x-\mu}{k\,\sigma}\right)^2\, p(x)\, {\rm d}x \\ & \\ & =\left(\frac{1}{k\,\sigma}\right)^2\,\int_{|x-\mu|\ge k\,\sigma}\, (x-\mu)^2\, p(x)\, {\rm d}x \\ & \\ & \le\left(\frac{1}{k\,\sigma}\right)^2\,\sigma^2 \\ & \\ & =\frac{1}{k^2} \end{array}$$
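As a quick numerical sanity check (not part of the argument above), the bound can be verified exactly for any concrete distribution.   The sketch below uses a small hypothetical discrete distribution, chosen only for illustration, and computes the tail probability and the Chebyshev bound directly:

```python
from math import sqrt

# A small hypothetical discrete distribution: value -> probability.
dist = {-2: 0.1, 0: 0.2, 1: 0.4, 3: 0.2, 6: 0.1}

mu = sum(x * p for x, p in dist.items())
var = sum((x - mu) ** 2 * p for x, p in dist.items())
sigma = sqrt(var)

def tail_prob(k):
    """Exact P(|X - mu| >= k*sigma) for the discrete distribution."""
    return sum(p for x, p in dist.items() if abs(x - mu) >= k * sigma)

# Chebyshev: the tail probability never exceeds 1/k^2.
for k in (1.5, 2.0, 3.0):
    assert tail_prob(k) <= 1.0 / k**2
```

Note that the bound is often far from tight: here the actual tails are well below   $1/k^2\;$ .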

An immediate equivalence, sometimes stated as Chebyshev’s inequality, is
$$P\left( \big| X-\mu \big| \lt k\,\sigma \right) \ge 1- \frac{1}{k^2} \;\text{.}$$

Here is a simple example where we estimate the likelihood that a random variable takes on sufficiently small values.

Example: A random variable has mean   $\mu =3$   and standard deviation   $\sigma =2\;$ .   Estimate the probability   $\displaystyle{ P\left( \big| X-3 \big| \lt 5 \right) }\;$ .

$$\begin{array}{rl} P\left( \big| X-3 \big| \lt 5 \right) & =P\left( \big| X-\mu \big| \lt \frac{5}{2}\,\sigma \right) \\ & \\ & \ge 1- \left(\frac{2}{5}\right)^2 \\ & \\ & =\frac{21}{25} \end{array}$$
That is, the probability that   $X$   is between   $-2$   and   $10$   is at least   $.84\;$ .
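The arithmetic above can be checked exactly, using only the numbers given in the example.   A minimal sketch with Python's `fractions` module:

```python
from fractions import Fraction

mu, sigma = 3, 2
# The event |X - 3| < 5 is |X - mu| < k*sigma with k = 5/2.
k = Fraction(5, 2)

# Chebyshev complement: P(|X - mu| < k*sigma) >= 1 - 1/k^2.
bound = 1 - 1 / k**2
assert bound == Fraction(21, 25)   # exactly 0.84
```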

Here is an example where Chebyshev’s inequality is used to estimate   $\sigma$   for a random variable   $X\;$ .

Example: If it is known that random variable   $X$   has mean   $\mu =0\,$ , possesses a standard deviation, and satisfies $\displaystyle{ P\left( \big| X \big| \gt 3 \right) =\frac{1}{4} }\,$ , estimate   $\sigma\;$ .

Let   $k\,\sigma =3\,$ , so that   $\displaystyle{ k=\frac{3}{\sigma} }\;$ .   Then
$$P\left( \big| X \big| \gt 3 \right) =\frac{1}{4} \le \left( \frac{\sigma}{3} \right)^2 \;\text{.}$$
Thus
$$\frac{1}{2} \le \frac{\sigma}{3} \,\text{,}$$
and
$$\sigma\ge\frac{3}{2} \;\text{.}$$
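As a consistency check, any distribution satisfying the hypotheses must have a standard deviation at least   $3/2\;$ .   The distribution below is hypothetical, chosen only because it has mean   $0$   and   $P\left( |X| \gt 3 \right) = 1/4\,$ ; its standard deviation does indeed clear the bound:

```python
from math import sqrt

# Hypothetical distribution with mean 0 and P(|X| > 3) = 1/4.
dist = {-4: 0.125, 0: 0.75, 4: 0.125}

mu = sum(x * p for x, p in dist.items())
tail = sum(p for x, p in dist.items() if abs(x) > 3)
sigma = sqrt(sum((x - mu) ** 2 * p for x, p in dist.items()))

assert mu == 0.0
assert tail == 0.25
assert sigma >= 1.5   # consistent with the lower bound derived above
```

Here   $\sigma = 2\,$ , comfortably above the Chebyshev lower bound of   $3/2\,$ , as expected.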