
A CHEBYSHEV INEQUALITY FOR MULTIVARIATE NORMAL DISTRIBUTION

Published online by Cambridge University Press:  27 February 2007

Davaadorjin Monhor
Affiliation:
Faculty of Geoinformatics, University of West Hungary, H-8002 Szekesfehervar, Hungary, E-mail: monhor@ella.hu

Abstract

The Chebyshev inequality bounds the probability that a random variable deviates from its mathematical expectation by more than a given amount, in terms of the variance of the random variable. In modern probability theory, the Chebyshev inequality is the most frequently used tool for proving various convergence results; for example, it plays a fundamental role in the proofs of various forms of the law of large numbers. The bound on the probability in the Chebyshev inequality has a very simple mathematical expression and can easily be modified for different kinds of sequences of random variables (e.g., for sums of independent random variables). This fact lies behind its frequent application. In this setting, the Chebyshev inequality has purely theoretical "applications" in probability theory: its role is to provide "a guarantee" of convergence, not to give a bound on a concrete probability content.
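In its classical univariate form, the inequality reads, for a random variable X with finite variance and any ε > 0,

\[
  P\bigl(|X - \mathbb{E}X| \ge \varepsilon\bigr) \;\le\; \frac{\operatorname{Var}(X)}{\varepsilon^{2}}.
\]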

In the present article we consider the Chebyshev inequality as a probability bound in its own right: once easy-to-compute multivariate generalizations are derived, the inequality can be carried over from its conventional theoretical applications to practical settings.

Such an inequality is proved for random vectors having a multivariate normal distribution. The new inequality gives a lower bound, in terms of the variances, on the probability that the random vector in question falls into a Euclidean ball centered at the mean vector. The need for, and importance of, this kind of multivariate Chebyshev inequality stem from several problems in engineering and the informational sciences (Hassibi and Boyd [9], Jeng [10], Jeng and Woods [11], Molina, Katseggelos, Mateos, Hermoso, and Segall [13]). Jeng [10] derived an inequality that gives an upper bound for the probability in question. Applying the established multivariate Chebyshev inequality and Jeng's inequality simultaneously is useful in practical problems, as it provides both lower and upper bounds on the probability content.
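To illustrate the flavor of such a bound (a sketch under an assumption, not the article's result): applying Markov's inequality to ||X − μ||² yields, for any random vector X with mean vector μ and component variances σᵢ², the easy-to-compute lower bound P(||X − μ|| < ε) ≥ 1 − (σ₁² + ··· + σₙ²)/ε². The short Python check below verifies a bound of this form by Monte Carlo for a multivariate normal vector; the covariance matrix and radius are illustrative choices, not taken from the article.

import numpy as np

# Illustrative (hypothetical) parameters; not taken from the article.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])
cov = np.array([[1.0, 0.3, 0.0],
                [0.3, 2.0, 0.1],
                [0.0, 0.1, 0.5]])
eps = 3.0  # radius of the Euclidean ball centered at the mean vector

# Monte Carlo estimate of P(||X - mu|| < eps) for X ~ N(mu, cov).
x = rng.multivariate_normal(mu, cov, size=200_000)
p_hat = np.mean(np.linalg.norm(x - mu, axis=1) < eps)

# Chebyshev-type lower bound in terms of the variances: 1 - tr(cov)/eps^2.
lower = 1.0 - np.trace(cov) / eps**2

print(f"Monte Carlo estimate of the ball probability: {p_hat:.4f}")
print(f"Chebyshev-type lower bound:                   {lower:.4f}")
assert p_hat >= lower  # the bound holds for this example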

The inequality is attractive because it is easy to compute and similar in form to the original Chebyshev inequality, in contrast to the well-known but complicated multivariate Chebyshev inequalities. The present article also gives some insight into the very origin of the Chebyshev inequality, which makes the article self-contained.

Type
Research Article
Copyright
© 2007 Cambridge University Press


References

Anderson, T.W. (1958). An introduction to multivariate statistical analysis. New York: Wiley.
Berge, P.O. (1937). A note on a form of Tchebycheff's theorem for two variables. Biometrika 29: 405–406.
Bickel, P.J. & Krieger, A.M. (1992). Extensions of Chebyshev's inequality with applications. Probability and Mathematical Statistics 13: 293–310.
Bienaymé, I.J. (1853). Considérations à l'appui de la découverte de Laplace sur la loi de probabilité dans la méthode des moindres carrés. Comptes Rendus de l'Académie des Sciences, Paris 37: 309–324.
Chebyshev, P.L. (1867). Des valeurs moyennes. Journal de Mathématiques Pures et Appliquées 12: 177–184.
Chebyshev, P.L. (1929). In D.E. Smith (ed.), A source book in mathematics. New York: McGraw-Hill, pp. 580–587.
Gauss, C.F. (1823). Theoria combinationis observationum erroribus minimis obnoxiae. Pars prior, pars posterior, supplementum. Göttingen: Dieterich.
Gauss, K.F. (1995). Theory of the combination of observations least subject to errors, Part One, Part Two, Supplement. G.W. Stewart (trans.). Philadelphia: SIAM.
Hassibi, A. & Boyd, S. (1998). Integer parameter estimation in linear models with applications to GPS. IEEE Transactions on Signal Processing 46: 2938–2952.
Jeng, F.C. (1988). Compound Gauss–Markov random fields for image estimation and restoration. PhD thesis, Rensselaer Polytechnic Institute, Troy, NY.
Jeng, F.C. & Woods, J.W. (1990). Simulated annealing in compound Gauss–Markov random fields. IEEE Transactions on Information Theory 36: 94–107.
Lal, D.N. (1955). A note on a form of Tchebycheff's inequality for two variables. Sankhya 15: 317–320.
Molina, R., Katseggelos, A.K., Mateos, J., Hermoso, A., & Segall, C.A. (2000). Restoration of severely blurred high range images using stochastic and deterministic relaxation algorithms in compound Gauss–Markov random fields. Pattern Recognition 33: 555–571.
Monhor, D. (1983). An inequality for multivariate Dirichlet distribution. Acta Mathematica Hungarica 47: 161–163.
Monhor, D. & Takemoto, S. (2005). Understanding the concept of outlier and its relevance to the assessment of data quality: Probabilistic background theory. Earth, Planets and Space 57: 1009–1018.
Olkin, I. & Pratt, J.W. (1958). A multivariate Tchebycheff inequality. Annals of Mathematical Statistics 29: 226–234.
Ross, S.M. (1993). Introduction to probability models, 5th ed. San Diego: Academic Press.
Wozencraft, J.M. & Jacobs, I.M. (1965). Principles of communication engineering. New York: Wiley.