Publication:
Minimum (h,φ)-divergences estimators with weights

Full text at PDC
Publication Date
2003-07-30
Publisher
Elsevier
Abstract
We consider experiments involving the observation of a discrete random variable or a quantitative classification process, and we further assume that, in addition to the probability of each value or class, we know its "utility" or "weight" (more precisely, that we can quantify the "nature" of each value or class). In this paper a minimum divergence estimation procedure based on (h,φ)-divergences is analyzed for such experiments, and its asymptotic behaviour is studied. © 2002 Elsevier Science Inc. All rights reserved.
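To make the class of divergences named in the abstract concrete, a minimal sketch follows. It is not reproduced from the paper itself: the unweighted form below is the standard (h,φ)-divergence from the literature (a Csiszár φ-divergence composed with an increasing function h), while the weighted variant and the weight vector U = (u_1, ..., u_M) are assumptions modeled on Guiasu's weighted-entropy idea. For distributions P = (p_1, ..., p_M) and Q = (q_1, ..., q_M),

\[
D_{\varphi}^{h}(P,Q) = h\!\left(\sum_{i=1}^{M} q_i\,\varphi\!\left(\frac{p_i}{q_i}\right)\right),
\qquad \varphi \text{ convex},\ \varphi(1)=0,\ h \text{ increasing},\ h(0)=0.
\]

The assumed weighted variant attaches a utility \(u_i\) to each value or class,

\[
D_{\varphi}^{h}(P,Q;U) = h\!\left(\sum_{i=1}^{M} u_i\, q_i\,\varphi\!\left(\frac{p_i}{q_i}\right)\right),
\]

and a minimum (h,φ)-divergence estimator with weights would then select the parameter value whose model probabilities \(p(\theta)\) lie closest, in this sense, to the empirical frequencies \(\hat{p}\) observed in the sample:

\[
\hat{\theta} = \arg\min_{\theta \in \Theta} D_{\varphi}^{h}\left(\hat{p},\, p(\theta);U\right).
\]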