Publication:
Minimum phi-divergence estimators with constraints in multinomial populations

Full text at PDC
Publication Date
2002
Publisher
Elsevier Science
Abstract
This paper presents a minimum phi-divergence estimation procedure for multinomial models in which the probabilities depend on unknown parameters that are not mathematically independent but satisfy some functional relationships. This estimator is then used in a phi-divergence statistic to solve the goodness-of-fit problem when the unknown parameters in the probabilities are not mathematically independent. The asymptotic distribution of this family of statistics is obtained under the null and contiguous alternative hypotheses. The asymptotic distribution of the residuals, when the parameters are estimated using the minimum phi-divergence estimator, is also obtained.
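As a rough numerical illustration (not taken from the paper), a minimum phi-divergence estimate under a constraint can be computed directly. The sketch below uses phi(t) = t log t − t + 1, the Kullback–Leibler member of the phi-divergence family, and a hypothetical symmetry constraint p1 = p3 standing in for the functional relationships among the parameters; the 3-cell counts are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-cell multinomial example: estimate the cell probabilities p
# by minimizing the phi-divergence D_phi(p_hat, p) = sum_j p_j * phi(p_hat_j / p_j)
# subject to a functional constraint among the probabilities.
# phi(t) = t*log(t) - t + 1 yields the Kullback-Leibler divergence,
# one member of the phi-divergence family considered in the paper.

def phi_divergence(p_hat, p):
    t = p_hat / p
    return float(np.sum(p * (t * np.log(t) - t + 1.0)))

counts = np.array([30.0, 50.0, 20.0])   # invented multinomial counts
p_hat = counts / counts.sum()           # empirical proportions

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},  # probabilities sum to 1
    {"type": "eq", "fun": lambda p: p[0] - p[2]},    # illustrative constraint p1 = p3
]

res = minimize(lambda p: phi_divergence(p_hat, p), x0=p_hat,
               method="SLSQP", bounds=[(1e-6, 1.0)] * 3,
               constraints=constraints)

print(res.x)  # constrained estimate; approximately [0.25, 0.5, 0.25]
```

For the KL member, the constrained minimizer can be checked by hand: with p = (a, 1 − 2a, a), maximizing 0.5 log a + 0.5 log(1 − 2a) gives a = 0.25, matching the numerical solution.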
Citation
Aitchison, J., Silvey, S.D., 1958. Maximum likelihood estimation of parameters subject to constraints. Ann. Math. Statist. 29, 813–828.
Beran, R., 1977. Minimum Hellinger distance estimates for parametric models. Ann. Statist. 5, 445–463.
Bhapkar, V.P., 1979. On tests of symmetry when higher order interactions are absent. J. Indian Statist. Assoc. 17, 17–26.
Birch, M.W., 1964. A new proof of the Pearson–Fisher theorem. Ann. Math. Statist. 35, 817–824.
Bonett, D.G., 1989. Pearson chi-square estimator and test for log-linear models with expected frequencies subject to linear constraints. Statist. Probab. Lett. 8, 175–177.
Cox, C., 1984. An elementary introduction to maximum likelihood estimation for multinomial models: Birch's theorem and the delta method. Amer. Statist. 38, 283–287.
Cressie, N., Read, T.R.C., 1984. Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46, 440–464.
Csiszár, I., 1967. Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2, 105–113.
Diamond, E.L., Mitra, S.K., Roy, S.N., 1960. Asymptotic power and asymptotic independence in the statistical analysis of categorical data. Bull. Internat. Statist. Inst. 37, 3–23.
Ferguson, T.S., 1996. A Course in Large Sample Theory. Chapman & Hall, London.
Fraser, D.A.S., 1957. Nonparametric Methods in Statistics. Wiley, New York.
Gokhale, D.V., 1973. Iterative maximum likelihood estimation for discrete distributions. Sankhya 35, 293–298.
Haber, M., 1985. Maximum likelihood methods for linear and log-linear models in categorical data. Comput. Statist. Data Anal. 3, 1–10.
Haber, M., Brown, M.B., 1986. Maximum likelihood estimation for log-linear models when expected frequencies are subject to linear constraints. J. Amer. Statist. Assoc. 81, 477–482.
Haberman, S.J., 1974. The Analysis of Frequency Data. University of Chicago Press, Chicago.
Kullback, S., 1959. Information Theory and Statistics. Wiley, New York.
Lindsay, B.G., 1994. Efficiency versus robustness: the case for minimum Hellinger distance and related methods. Ann. Statist. 22, 1081–1114.
Morales, D., Pardo, L., Vajda, I., 1995. Asymptotic divergences of estimates of discrete distributions. J. Statist. Plann. Inference 48, 347–369.
Read, T.R.C., Cressie, N.A.C., 1988. Goodness-of-fit Statistics for Discrete Multivariate Data. Springer, New York.
Silvey, S.D., 1959. The Lagrangian multiplier test. Ann. Math. Statist. 30, 389–407.
Wedderburn, R.W.M., 1974. Generalized linear models specified in terms of constraints. J. Roy. Statist. Soc. Ser. B 36, 449–454.
Zografos, K., Ferentinos, K., Papaioannou, T., 1990. phi-divergence statistics: sampling properties and multinomial goodness of fit and divergence tests. Comm. Statist. (Theory and Methods) 19, 295–310.