Publication:
On asymptotic properties of information-theoretic divergences

Full text at PDC
Publication Date
2003-07
Publisher
Institute of Electrical and Electronics Engineers
Abstract
Mutual asymptotic equivalence is established within three classes of information-theoretic divergences of discrete probability distributions, namely, the φ-divergences of Csiszár, the φ-divergences of Bregman, and the φ-divergences of Burbea–Rao. These equivalences are used to find the asymptotic distributions of the corresponding divergence statistics for testing goodness of fit when the hypothetical distribution is uniform. All results are based on standard expansion techniques and on a new relation between the Bregman and Burbea–Rao divergences formulated in Lemma 2.
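As an illustrative sketch of the objects the abstract refers to (not the paper's construction; the convex functions chosen and the cell counts below are invented for the example), a Csiszár φ-divergence of two discrete distributions p, q is D_φ(p, q) = Σᵢ qᵢ φ(pᵢ/qᵢ) for convex φ with φ(1) = 0. Plugging the empirical distribution and the uniform hypothesis into D_φ yields the divergence statistics whose asymptotic distributions the paper studies:

```python
import math

def csiszar_divergence(p, q, phi):
    """Csiszar phi-divergence D_phi(p, q) = sum_i q_i * phi(p_i / q_i)
    for discrete distributions p, q with all q_i > 0."""
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q))

# Two classical choices of the convex function phi:
# phi(t) = t*log(t) gives the Kullback-Leibler divergence,
# phi(t) = (t - 1)**2 gives the Pearson chi-square divergence.
kl_phi = lambda t: t * math.log(t) if t > 0 else 0.0
pearson_phi = lambda t: (t - 1.0) ** 2

# Hypothetical observed cell counts (illustrative data, not from the paper).
counts = [12, 8, 10, 10]
n = sum(counts)
k = len(counts)
p_hat = [c / n for c in counts]   # empirical distribution
u = [1.0 / k] * k                 # uniform hypothetical distribution

# Divergence statistics for testing goodness of fit against uniformity:
# n * D_phi with the Pearson phi is the classical chi-square statistic
# sum (O - E)^2 / E; 2n * D_phi with the KL phi is the likelihood-ratio
# statistic G^2. Both are asymptotically chi-square with k - 1 degrees
# of freedom under the uniform null.
chi2 = n * csiszar_divergence(p_hat, u, pearson_phi)
g2 = 2 * n * csiszar_divergence(p_hat, u, kl_phi)
```

On this sample, `chi2` equals 0.8 and `g2` is close to it, a small-scale glimpse of the asymptotic equivalence the abstract describes.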
Citation
L. M. Bregman, "The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming," USSR Comput. Math. and Math. Phys., vol. 7, pp. 200–217, 1967.
J. Burbea and C. R. Rao, "On the convexity of some divergence measures based on entropy functions," IEEE Trans. Inform. Theory, vol. IT-28, pp. 489–495, May 1982.
T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: Wiley, 1991.
I. Csiszár, "Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten," Publ. Math. Inst. Hungar. Acad. Sci., Ser. A, vol. 8, pp. 85–108, 1963.
I. Csiszár, "Information-type measures of difference of probability distributions and indirect observations," Studia Sci. Math. Hungar., vol. 2, pp. 299–318, 1967.
I. Csiszár, "Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems," Ann. Statist., vol. 19, pp. 2031–2066, 1991.
I. Csiszár, "Maximum entropy and related methods," in Trans. 12th Prague Conf. Information Theory, Statistical Decision Functions and Random Processes. Prague, Czech Rep.: Czech Acad. Sci., 1994, pp. 58–62.
R. M. Gray, Entropy and Information Theory. New York: Springer-Verlag, 1990.
L. Györfi, G. Morvai, and I. Vajda, "Information-theoretic methods in testing the goodness of fit," in Proc. Int. Symp. Information Theory (ISIT 2000), Sorrento, Italy, June 25–30, 2000, p. 28.
L. Györfi and I. Vajda, "A class of modified Pearson and Neyman statistics," Statist. Decisions, vol. 19, pp. 239–251, 2001.
L. Györfi and I. Vajda, "Asymptotic distributions for goodness-of-fit statistics in a sequence of multinomial models," Statist. Probab. Lett., vol. 56, pp. 57–67, 2002.
G. H. Hardy, J. E. Littlewood, and G. Pólya, Inequalities, 2nd ed. London, U.K.: Cambridge Univ. Press, 1952.
T. Inglot, T. Jurlewicz, and T. Ledwina, "Asymptotics for multinomial goodness of fit tests for simple hypothesis," Theory Probab. Appl., vol. 35, pp. 771–777, 1990.
L. Le Cam, Asymptotic Methods in Statistical Decision Theory. New York: Springer-Verlag, 1986.
F. Liese and I. Vajda, Convex Statistical Distances. Leipzig, Germany: Teubner, 1987.
A. W. Marshall and I. Olkin, Inequalities: Theory of Majorization and Its Applications. New York: Academic, 1979.
M. L. Menéndez, D. Morales, L. Pardo, and I. Vajda, "Two approaches to grouping of data and related disparity statistics," Commun. Statist. Theory Methods, vol. 27, pp. 609–633, 1998.
M. C. Pardo, "On Burbea–Rao divergence based goodness-of-fit tests for multinomial models," J. Multivariate Anal., vol. 69, pp. 65–87, 1999.
L. Pardo, D. Morales, M. Salicrú, and M. L. Menéndez, "Rφh-divergence statistics in applied categorical data analysis with stratified sampling," Utilitas Math., vol. 44, pp. 145–164, 1993.
M. C. Pardo and I. Vajda, "About distances of discrete distributions satisfying the data processing theorem of information theory," IEEE Trans. Inform. Theory, vol. 43, pp. 1288–1293, July 1997.
T. R. C. Read and N. Cressie, Goodness-of-Fit Statistics for Discrete Multivariate Data. New York: Springer-Verlag, 1988.
S. T. Rachev, "On a class of minimum functionals in a space of probability measures," Theory Probab. Appl., vol. 29, pp. 41–48, 1984.
R. J. Serfling, Approximation Theorems of Mathematical Statistics. New York: Wiley, 1980.
S. H. Tumanyan, "On the asymptotic distribution of the chi-square criterion," Dokl. Akad. Nauk SSSR, vol. 94, pp. 1011–1012, 1954.
I. Vajda, Theory of Statistical Inference and Information. Boston, MA: Kluwer, 1989.
K. Zografos, K. Ferentinos, and T. Papaioannou, "φ-divergence statistics: Sampling properties and multinomial goodness of fit and divergence tests," Commun. Statist. Theory Methods, vol. 19, pp. 1785–1802, 1990.