Minimum divergence estimators based on grouped data




Menéndez Calleja, María Luisa and Morales González, Domingo and Pardo Llorente, Leandro and Vajda, Igor (2001) Minimum divergence estimators based on grouped data. Annals of the Institute of Statistical Mathematics, 53 (2). pp. 277-288. ISSN 0020-3157




The paper considers statistical models with real-valued observations i.i.d. by F(x, θ₀) from a family of distribution functions {F(x, θ) : θ ∈ Θ}, Θ ⊂ Rˢ, s ≥ 1. For random quantizations defined by the sample quantiles (Fₙ⁻¹(λ₁), ..., Fₙ⁻¹(λ_{m-1})) of arbitrary fixed orders 0 < λ₁ < ... < λ_{m-1} < 1, the paper studies estimators θ̂_{φ,n} of θ₀ that minimize φ-divergences between the theoretical and empirical cell probabilities. Under appropriate regularity conditions, all these estimators are shown to be as efficient (first order, in the sense of Rao) as the MLE in the model quantized nonrandomly by (F⁻¹(λ₁, θ₀), ..., F⁻¹(λ_{m-1}, θ₀)). Moreover, the Fisher information matrix I_m(θ₀, λ) of the latter model with the equidistant orders λ = (λⱼ = j/m : 1 ≤ j ≤ m-1) approximates the Fisher information I(θ₀) of the original model arbitrarily closely when m is sufficiently large. Thus random binning by a large number of quantiles of equidistant orders leads to appropriate estimates of the type considered.
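The estimation scheme described in the abstract can be illustrated with a short numerical sketch. Assuming a normal location family with known scale (a hypothetical choice for illustration, not an example from the paper), we bin the sample by its own quantiles at equidistant orders λⱼ = j/m and minimize the Kullback-Leibler divergence (the φ-divergence with φ(t) = t log t) between the empirical cell probabilities and the model cell probabilities:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
theta0 = 2.0
x = rng.normal(loc=theta0, scale=1.0, size=2000)  # i.i.d. sample F(x, theta0)

# Random quantization by sample quantiles at equidistant orders j/m
m = 10
lam = np.arange(1, m) / m                 # orders lambda_j = j/m
q = np.quantile(x, lam)                   # sample quantiles F_n^{-1}(lambda_j)

# Empirical cell probabilities induced by the quantile bins (here ~ 1/m each)
p_emp = np.diff(np.concatenate(([0.0], lam, [1.0])))

def p_model(theta):
    """Theoretical cell probabilities F(q_j, theta) - F(q_{j-1}, theta)."""
    cdf = norm.cdf(q, loc=theta, scale=1.0)
    return np.diff(np.concatenate(([0.0], cdf, [1.0])))

def kl_divergence(theta):
    """phi-divergence with phi(t) = t log t between empirical and model cells."""
    p = np.clip(p_model(theta), 1e-12, None)
    return float(np.sum(p_emp * np.log(p_emp / p)))

res = minimize_scalar(kl_divergence, bounds=(0.0, 4.0), method="bounded")
theta_hat = res.x
```

With m moderately large, the estimate θ̂ should land close to θ₀, reflecting the abstract's point that quantile binning at equidistant orders recovers nearly the full Fisher information. Other φ-divergences (e.g. Pearson's chi-square) could be substituted for the KL objective in `kl_divergence`.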

Item Type: Article
Uncontrolled Keywords: minimum divergence estimators; random quantization; asymptotic normality; efficiency; Fisher information; optimization
Subjects: Sciences > Mathematics > Applied statistics
ID Code: 18053
Deposited On: 30 Jan 2013 09:48
Last Modified: 26 Jul 2018 07:14

