Biblioteca de la Universidad Complutense de Madrid

Minimum (h,phi )-divergences estimators with weights


Landaburu Jiménez, María Elena and Pardo Llorente, Leandro (2003) Minimum (h,phi )-divergences estimators with weights. Applied Mathematics and Computation, 140 (1). pp. 15-28. ISSN 0096-3003

PDF
Restricted to authorized repository staff only until 31 December 2020.




We consider experiments involving the observation of a discrete random variable, or a quantitative classification process, and we also assume that, in addition to the probability of each value or class, we know its "utility" or "weight" (or, more precisely, that we can quantify the "nature" of each value or class). In this paper a minimum divergence estimation procedure based on (h,phi)-divergences is analyzed for the experiments considered, and its asymptotic behaviour is studied. (C) 2002 Elsevier Science Inc. All rights reserved.
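The abstract above describes estimating a model parameter by minimizing an (h,phi)-divergence in which each class of a discrete distribution carries a utility weight. As an illustrative sketch only (not the paper's construction), the following minimizes a weighted Kullback-Leibler divergence, one member of the phi-divergence family, between observed multinomial frequencies and a hypothetical one-parameter model; the model, the frequencies, and the weights are all invented for illustration.

```python
import math

def weighted_kl(p_hat, p_model, w):
    # Weighted Kullback-Leibler divergence: sum_i w_i * p_hat_i * log(p_hat_i / q_i),
    # where w_i is the utility/weight attached to class i.
    return sum(wi * pi * math.log(pi / qi)
               for wi, pi, qi in zip(w, p_hat, p_model) if pi > 0)

def model_probs(theta):
    # Hypothetical 3-cell multinomial model indexed by theta in (0, 1)
    # (a Hardy-Weinberg-style parametrization, chosen only for illustration).
    return [theta**2, 2 * theta * (1 - theta), (1 - theta)**2]

# Observed relative frequencies and class weights (invented numbers).
p_hat = [0.30, 0.45, 0.25]
w = [1.0, 2.0, 1.5]

# Crude grid search for the minimum weighted-divergence estimate of theta.
thetas = [i / 1000 for i in range(1, 1000)]
theta_hat = min(thetas, key=lambda t: weighted_kl(p_hat, model_probs(t), w))
```

Note how the weights shift the estimate: with all weights equal to 1 the minimizer would coincide with the maximum likelihood estimate (here about 0.525), while the larger weight on the middle cell pulls the weighted estimate toward values that fit that cell better.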

Document type: Article
Keywords: Weights; Asymptotic distributions; ($h,\phi$)-divergences; Minimum ($h,\phi$)-divergence weighted estimator; Rényi's divergence
Subjects: Sciences > Mathematics > Probability
ID Code: 16500


Deposited: 25 Sep 2012 09:02
Last Modified: 07 Feb 2014 09:30
