Publication:
New Family Of Estimators For The Loglinear Model Of Quasi-Independence Based On Power-Divergence Measures

Full text at PDC
Publication Date
2007-05
Publisher
Taylor & Francis
Abstract
We study the minimum power-divergence estimator, introduced and studied by N. Cressie and T. R. C. Read [Multinomial goodness-of-fit tests. J. R. Stat. Soc., Ser. B 46, 440–464 (1984)], in the loglinear model of quasi-independence. A simulation study illustrates that the minimum chi-squared estimator and the Cressie-Read estimator are good alternatives to the classical maximum-likelihood estimator for this problem. The estimator obtained for λ = 2 is the most robust and efficient estimator among the family of minimum power-divergence estimators.
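The estimators compared in the abstract all minimize a member of the Cressie-Read power-divergence family indexed by a parameter λ. As a minimal illustrative sketch (not the paper's own code), the statistic for observed counts against expected frequencies can be computed as below; λ = 1 recovers Pearson's chi-squared and the limit λ → 0 gives the likelihood-ratio statistic G².

```python
import math

def power_divergence(obs, exp, lam):
    """Cressie-Read power-divergence statistic between observed counts
    and expected frequencies.

    lam = 1    -> Pearson chi-squared
    lam -> 0   -> likelihood-ratio statistic G^2 (taken as a limit here)
    lam = 2/3  -> the value recommended by Cressie and Read (1984)
    """
    if abs(lam) < 1e-12:  # limiting case lam -> 0: G^2
        return 2.0 * sum(o * math.log(o / e) for o, e in zip(obs, exp) if o > 0)
    if abs(lam + 1.0) < 1e-12:  # limiting case lam -> -1: modified G^2
        return 2.0 * sum(e * math.log(e / o) for o, e in zip(obs, exp))
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(obs, exp))

# sanity check on hypothetical counts: lam = 1 reproduces Pearson's statistic
obs = [30, 14, 34, 45, 27]
exp = [30.0, 30.0, 30.0, 30.0, 30.0]
pearson = sum((o - e) ** 2 / e for o, e in zip(obs, exp))
print(power_divergence(obs, exp, 1.0))  # same value as pearson
```

The minimum power-divergence estimator of the paper minimizes this quantity over the expected frequencies implied by the quasi-independence model; the counts above are illustrative only.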
Keywords
loglinear model, quasi-independence, maximum likelihood, minimum power-divergence estimator
Citation
Agresti, A., 2002, Categorical Data Analysis (2nd ed.) (Wiley).
Powers, D.A. and Xie, Y., 2000, Statistical Methods for Categorical Data Analysis (Academic Press).
Andersen, E.B., 1990, The Statistical Analysis of Categorical Data (Springer-Verlag).
Kullback, S., 1985, Kullback information. In: S. Kotz and N.L. Johnson (Eds) Encyclopedia of Statistical Sciences, Vol. 4 (New York: John Wiley & Sons), pp. 421–425.
Kullback, S., 1985, Minimum discrimination information (MDI) estimation. In: S. Kotz and N.L. Johnson (Eds) Encyclopedia of Statistical Sciences, Vol. 5 (New York: John Wiley), pp. 527–529.
Cressie, N. and Pardo, L., 2000, Minimum φ-divergence estimator and hierarchical testing in loglinear models. Statistica Sinica, 10(3), 867–884.
Neyman, J., 1949, Contribution to the theory of the χ2-test. Proceedings of the Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley, pp. 239–273.
Matusita, K., 1954, On the estimation by the minimum distance method. Annals of the Institute of Statistical Mathematics, 5, 59–65.
Cressie, N. and Read, T.R.C., 1984, Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society, Series B, 46, 440–464.
Pardo, L., 2006, Statistical Inference Based on Divergence Measures (New York: Chapman & Hall/CRC).
Rao, C.R., 1961, Asymptotic efficiency and limiting information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 531–546.
Rao, C.R., 1962, Efficient estimates and optimum inference procedures in large samples. Journal of the Royal Statistical Society, Series B, 24, 46–72.
Rao, C.R., 1963, Criteria of estimation in large samples. Sankhya, Series A, 25, 189–206.
Read, T.R.C. and Cressie, N.A.C., 1988, Goodness-of-Fit Statistics for Discrete Multivariate Data (New York: Springer).
Lindsay, B.G., 1994, Efficiency versus robustness: the case for minimum Hellinger distance and related methods. Annals of Statistics, 22, 1081–1114.
Berkson, J., 1980, Minimum chi-square, not maximum likelihood! Annals of Statistics, 8, 457–487.
Parr, W.C., 1981, Minimum distance estimation: a bibliography. Communications in Statistics: Theory and Methods, 10, 1205–1224.
Causey, B.D., 1983, Estimation of proportions for multinomial contingency tables subject to marginal constraints. Communications in Statistics: Theory and Methods, 12, 2581–2587.
Harris, R.R. and Kanji, G.K., 1983, On the use of minimum chi-square estimation. The Statistician, 32, 379–394.
Hodges, J.L. and Lehmann, E.L., 1970, Deficiency. Annals of Mathematical Statistics, 41, 783–801.
Mitra, S., Basu, S. and Basu, A., 2000, Exact minimum disparity inference in complex multinomial models. Metron, 58, 167–185.
Basu, A. and Basu, S., 1998, Penalized minimum disparity methods for multinomial models. Statistica Sinica, 8, 841–860.