Publication:
Asymptotic behaviour of an estimator based on Rao's divergence

Full text at PDC
Publication Date
1997
Publisher
Kybernetika
Abstract
In this work, the minimum divergence estimation procedure based on the divergence of Burbea and Rao [2] is analyzed. The asymptotic behaviour of these estimators is established, and a comparative study of Rao's estimator against other classical estimators is carried out by computer simulation.
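As a concrete illustration (not the paper's own implementation): Burbea and Rao [2] study Jensen-difference divergences J_H(p, q) = H((p + q)/2) - (H(p) + H(q))/2 for a concave entropy function H; with the Shannon entropy this is the Jensen-Shannon divergence. The Python sketch below minimises this divergence between the empirical distribution of a multinomial sample and a one-parameter model family; the family `model_probs` (a truncated geometric) is a hypothetical stand-in chosen here for illustration, not the model used in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def burbea_rao(p, q, eps=1e-12):
    """Jensen-difference (Burbea-Rao) divergence for the Shannon entropy:
    J(p, q) = H((p + q)/2) - (H(p) + H(q))/2."""
    def H(r):
        r = np.clip(r, eps, None)          # avoid log(0) on empty cells
        return -np.sum(r * np.log(r))
    return H(0.5 * (p + q)) - 0.5 * (H(p) + H(q))

def model_probs(theta, m):
    """Hypothetical one-parameter family: truncated geometric on m cells."""
    w = theta ** np.arange(m)
    return w / w.sum()

def min_divergence_estimate(counts):
    """Minimum Burbea-Rao divergence estimator: the theta whose model
    distribution is closest, in J, to the empirical distribution."""
    p_hat = counts / counts.sum()
    res = minimize_scalar(
        lambda t: burbea_rao(p_hat, model_probs(t, len(counts))),
        bounds=(1e-6, 1 - 1e-6), method="bounded",
    )
    return res.x

# Simulated check: sample from the model at theta = 0.6 and re-estimate it.
rng = np.random.default_rng(0)
counts = rng.multinomial(1000, model_probs(0.6, 5))
print(min_divergence_estimate(counts))   # approaches 0.6 as n grows
```

For large samples the minimiser concentrates near the true parameter, which is the kind of asymptotic behaviour the paper makes precise for this class of estimators.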
Description
This research has been supported by DGICYT grant PB 94-0308.
References
[1] M. W. Birch: A new proof of the Pearson-Fisher theorem. Ann. Math. Statist. 35 (1964), 817-824.
[2] J. Burbea, C. R. Rao: On the convexity of some divergence measures based on entropy functions. IEEE Trans. Inform. Theory 28 (1982), 489-495.
[3] D. G. Dannenbring: Procedures for estimating optimal solution values for large combinatorial problems. Management Sci. 23 (1977), 1273-1283.
[4] J. C. Fryer, C. A. Robertson: A comparison of some methods for estimating mixed normal distributions. Biometrika 59 (1972), 639-648.
[5] J. Kiefer, J. Wolfowitz: Consistency of the maximum likelihood estimator in the presence of infinitely many incidental parameters. Ann. Math. Statist. 27 (1956), 887-906.
[6] S. Kullback, R. Leibler: On information and sufficiency. Ann. Math. Statist. 22 (1951), 79-86.
[7] D. Morales, L. Pardo, I. Vajda: Asymptotic divergence of estimates of discrete distributions. J. Statist. Plann. Inference 48 (1995), 347-369.
[8] M. C. Pardo, I. Vajda: About distances of discrete distributions satisfying the data processing theorem of information theory. IEEE Trans. Inform. Theory 43 (1997), 4, 1288-1293.