Gauss's inequality

In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode.

Let X be a unimodal random variable with mode m, and let τ² be the expected value of (X − m)². (τ² can also be expressed as (μ − m)² + σ², where μ and σ are the mean and standard deviation of X.) Then for any positive value of k,

[math]\displaystyle{ \Pr(|X - m| \gt k) \leq \begin{cases} \left( \frac{2\tau}{3k} \right)^2 & \text{if } k \geq \frac{2\tau}{\sqrt{3}} \\[6pt] 1 - \frac{k}{\tau\sqrt{3}} & \text{if } 0 \leq k \leq \frac{2\tau}{\sqrt{3}}. \end{cases} }[/math]
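The two branches agree at the crossover point k = 2τ/√3, where each equals 1/3. The following is a minimal numerical sketch (Python; the function name is illustrative, and the standard normal used in the check is just one convenient unimodal distribution, with mode 0 and τ = 1, not part of the theorem):

```python
import numpy as np

def gauss_bound(k, tau):
    """Gauss's upper bound on P(|X - m| > k) for unimodal X with mode m,
    where tau**2 = E[(X - m)**2]."""
    crossover = 2 * tau / np.sqrt(3)
    if k >= crossover:
        return (2 * tau / (3 * k)) ** 2   # quadratic tail branch
    return 1 - k / (tau * np.sqrt(3))     # linear near-mode branch

# Empirical check against a standard normal (mode m = 0, tau = 1):
rng = np.random.default_rng(seed=1)
sample = rng.standard_normal(1_000_000)
for k in (0.5, 1.0, 2.0, 3.0):
    tail = np.mean(np.abs(sample) > k)    # empirical P(|X| > k)
    print(f"k = {k}: tail {tail:.4f} <= bound {gauss_bound(k, 1.0):.4f}")
```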

The theorem was first proved by Carl Friedrich Gauss in 1823.

Extensions to higher-order moments

Winkler in 1866 extended Gauss's inequality to rth moments,[1] where r > 0 and the distribution is unimodal with a mode of zero. This result is sometimes called the Camp–Meidell inequality.[2][3]

[math]\displaystyle{ P( | X | \ge k ) \le \left( \frac{ r } { r + 1 } \right)^r \frac{ \operatorname{ E }( | X |^r ) } { k^r } \quad \text{if} \quad k^r \ge \frac{ r^r } { ( r + 1 )^{ r - 1 } } \operatorname{ E }( | X |^r ), }[/math]
[math]\displaystyle{ P( | X | \ge k ) \le 1 - \left[ \frac{ k^r }{ ( r + 1 ) \operatorname{ E }( | X |^r ) } \right]^{ 1 / r } \quad \text{if} \quad k^r \le \frac{ r^r } { ( r + 1 )^{ r - 1 } } \operatorname{ E }( | X |^r ). }[/math]
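With r = 2 and E(|X|²) = τ², the two branches reduce to Gauss's inequality for a mode at zero. A minimal sketch of the bound (Python; the function name and test values are illustrative, and the rth absolute moment is supplied directly rather than estimated):

```python
def winkler_bound(k, r, m_r):
    """Winkler / Camp-Meidell bound on P(|X| >= k) for X unimodal with
    mode 0, where m_r = E(|X|**r) and r > 0."""
    crossover = (r ** r / (r + 1) ** (r - 1)) * m_r
    if k ** r >= crossover:                              # tail branch
        return (r / (r + 1)) ** r * m_r / k ** r
    return 1 - (k ** r / ((r + 1) * m_r)) ** (1.0 / r)   # near-mode branch

# r = 2 with m_r = tau**2 reproduces gauss_bound from the sketch above,
# e.g. winkler_bound(2.0, 2, 1.0) ≈ 1/9 ≈ gauss_bound(2.0, 1.0).
```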

Gauss's bound has subsequently been sharpened and extended to apply to departures from the mean rather than the mode; this refinement is the Vysochanskiï–Petunin inequality. The latter has in turn been extended by Dharmadhikari and Joag-Dev:[4]

[math]\displaystyle{ P( | X | \gt k ) \le \max\left( \left[ \frac{ r } { ( r + 1 ) k } \right]^r \operatorname{ E }( | X |^r ),\ \frac{ s } { ( s - 1 ) k^r } \operatorname{ E }( | X |^r ) - \frac{ 1 } { s - 1 } \right) }[/math]

where r > 0 and s is a constant satisfying both s > r + 1 and s(s − r − 1) = r^r.
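Since s(s − r − 1) = r^r is a quadratic in s, the constant has the closed form s = ((r + 1) + √((r + 1)² + 4r^r))/2. A small illustrative sketch of the bound (Python; names hypothetical):

```python
import math

def dj_bound(k, r, m_r):
    """Dharmadhikari-Joag-Dev bound on P(|X| > k), where m_r = E(|X|**r),
    r > 0, and s > r + 1 solves s * (s - r - 1) = r**r, i.e. s is the
    positive root of s**2 - (r + 1) * s - r**r = 0."""
    s = ((r + 1) + math.sqrt((r + 1) ** 2 + 4 * r ** r)) / 2
    term1 = (r / ((r + 1) * k)) ** r * m_r
    term2 = s * m_r / ((s - 1) * k ** r) - 1 / (s - 1)
    return max(term1, term2)

# For r = 2, s = 4, so the bound is max(4*m_r/(9*k**2), 4*m_r/(3*k**2) - 1/3),
# a Vysochanskii-Petunin-style pair; e.g. dj_bound(2.0, 2, 1.0) ≈ 1/9.
```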

It can be shown that these inequalities are the best possible: any further sharpening of the bounds requires placing additional restrictions on the distributions.

See also

  • Vysochanskiï–Petunin inequality, a similar result for the distance from the mean rather than the mode
  • Chebyshev's inequality, concerns distance from the mean without requiring unimodality
  • Concentration inequality – a summary of tail bounds on random variables

References

  1. Winkler, A. (1866). Math.-Naturwiss. Kl. Akad. Wiss. Wien, Zweite Abt. 53, 6–41
  2. Pukelsheim, Friedrich (May 1994). "The Three Sigma Rule". The American Statistician 48 (2): 88–91. doi:10.1080/00031305.1994.10476030. ISSN 0003-1305. http://www.tandfonline.com/doi/abs/10.1080/00031305.1994.10476030.
  3. Bickel, Peter J.; Krieger, Abba M. (1992). "Extensions of Chebyshev's Inequality with Applications". Probability and Mathematical Statistics 13 (2): 293–310. ISSN 0208-4147. http://www.math.uni.wroc.pl/~pms/files/13.2/Article/13.2.11.pdf. Retrieved 6 October 2012. 
  4. Dharmadhikari, S. W.; Joag-Dev, K. (1985). "The Gauss–Tchebyshev inequality for unimodal distributions". Teoriya Veroyatnostei i ee Primeneniya 30 (4): 817–820. http://www.mathnet.ru/links/1043b08ee307d37ab64be90bd9332b88/tvp1919.pdf. 
  • Gauss, C. F. (1823). "Theoria Combinationis Observationum Erroribus Minimis Obnoxiae, Pars Prior". Commentationes Societatis Regiae Scientiarum Gottingensis Recentiores 5. 
  • Upton, Graham; Cook, Ian (2008). "Gauss inequality". A Dictionary of Statistics. Oxford University Press. http://www.answers.com/topic/gauss-inequality. 
  • Sellke, T.M.; Sellke, S.H. (1997). "Chebyshev inequalities for unimodal distributions". American Statistician (American Statistical Association) 51 (1): 34–40. doi:10.2307/2684690. 