Publication:
Some new statistics for testing hypotheses in parametric models

Full text at PDC
Publication Date
1997-07
Publisher
Academic Press
Abstract
The paper deals with simple and composite hypotheses in statistical models with i.i.d. observations and with arbitrary families dominated by finite measures and parametrized by vector-valued variables. It introduces phi-divergence test statistics as alternatives to the classical ones: the generalized likelihood ratio and the statistics of Wald and Rao. It is shown that, under standard assumptions on the hypotheses and model densities, the results on the asymptotic distribution of the classical statistics, established so far for the counting and Lebesgue dominating measures (discrete and continuous models), remain true also in the general case. Further, these results are extended to phi-divergence statistics with smooth convex functions phi. The choice of phi-divergence statistics that are optimal from the point of view of power is discussed and illustrated by several examples.
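The abstract does not give formulas, but a well-known subfamily of phi-divergence statistics is the Cressie–Read power-divergence family, which contains Pearson's X² (lambda = 1) and the likelihood-ratio statistic G² (lambda → 0) as special cases. The sketch below is purely illustrative and is not taken from the paper; the data are made up.

```python
import math

def power_divergence(counts, probs, lam):
    """Cressie-Read power-divergence statistic for multinomial counts
    against hypothesized cell probabilities `probs`.

    A phi-divergence family: lam=1 gives Pearson's X^2, the lam -> 0
    limit gives the likelihood-ratio statistic G^2.
    """
    n = sum(counts)
    if abs(lam) < 1e-12:
        # Likelihood-ratio limit: G^2 = 2 * sum c_i * log(c_i / (n p_i))
        return 2.0 * sum(c * math.log(c / (n * p))
                         for c, p in zip(counts, probs) if c > 0)
    # General case: 2 / (lam * (lam + 1)) * sum c_i * ((c_i / (n p_i))^lam - 1)
    s = sum(c * ((c / (n * p)) ** lam - 1.0) for c, p in zip(counts, probs))
    return 2.0 * s / (lam * (lam + 1.0))

# Illustrative data: 120 rolls of a die, fair-die null hypothesis.
counts = [18, 22, 19, 21, 17, 23]
probs = [1 / 6] * 6

x2 = power_divergence(counts, probs, 1.0)  # Pearson chi-square
g2 = power_divergence(counts, probs, 0.0)  # likelihood ratio
# Under the null, both are asymptotically chi-square with 5 degrees
# of freedom, in line with the asymptotic results the abstract describes.
```

Different members of the family (different lambda, or more generally different convex phi) share the same null asymptotics but differ in power against particular alternatives, which is the trade-off the paper's optimality discussion addresses.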
Citation
Beran, R. (1977). Minimum Hellinger distance estimator for parametric models. Ann. Statist. 5 455-463.
Brown, L. D. (1986). Fundamentals of Statistical Exponential Families. Lecture Notes, Vol. 9. Institute of Mathematical Statistics, Hayward, CA.
Csiszár, I. (1963). Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci. Ser. A 8 85-108.
Cox, D. R., and Hinkley, D. V. (1974). Theoretical Statistics. Chapman & Hall, London.
Fleming, W. H. (1965). Functions of Several Variables. Addison-Wesley, New York.
Fraser, D. A. S. (1957). Nonparametric Methods in Statistics. Wiley, New York.
Györfi, L., Vajda, I., and van der Meulen, E. (1994a). Minimum Hellinger distance point estimates under weak family regularity. Math. Methods Statist. 3 25-45.
Györfi, L., Vajda, I., and van der Meulen, E. (1994b). Parameter estimation by projecting on structural parametric models. In Asymptotic Statistics (P. Mandl and M. Hušková, Eds.), pp. 261-272. Physica Verlag, Vienna.
Kupperman, M. (1957). Further Applications of Information Theory to Multivariate Analysis and Statistical Inference. Ph.D. dissertation. George Washington University.
Lehmann, E. L. (1986). Testing Statistical Hypotheses, 2nd ed. Wiley, New York.
Liese, F., and Vajda, I. (1987). Convex Statistical Distances. Teubner, Leipzig.
Menéndez, M. L., Morales, D., Pardo, L., and Vajda, I. (1995). Divergence based estimation and testing of statistical models of classification. J. Multivariate Anal. 54 329-354.
Menéndez, M. L., Morales, D., Pardo, L., and Vajda, I. (1996). About divergence-based goodness-of-fit tests in the Dirichlet-multinomial model. Commun. Statist. Theory Methods 25 119-133.
Morales, D., Pardo, L., Salicrú, M., and Menéndez, M. L. (1994). Asymptotic properties of divergence statistics in a stratified random sampling and its applications to test statistical hypotheses. J. Statist. Planning Infer. 38 201-224.
Morales, D., Pardo, L., and Vajda, I. (1995). Asymptotic divergence of estimates of discrete distributions. J. Statist. Planning Infer. 48 347-369.
Morales, D., Pardo, L., and Vajda, I. (1996). About the Behaviour of Rényi's Divergence Statistics Under Null Composite Hypotheses. Res. Rep. 1870, Inst. of Inform. Theory and Automation, Czech. Acad. Sci., Prague.
Neyman, J., and Pearson, E. S. (1928). On the use and interpretation of certain test criteria for purposes of statistical inference. Biometrika A 20 175-240; 263-294.
Pardo, L., Morales, D., Salicrú, M., and Menéndez, M. L. (1993). The φ-divergence statistics in bivariate multinomial applications including stratification. Metrika 40 223-235.
Pardo, M. C. (1994). On testing independence for multidimensional contingency tables with stratified random sampling. Inform. Sci. 78 101-118.
Rao, C. R. (1947). Large sample tests of statistical hypotheses concerning several parameters with applications to problems of estimation. Proc. Cambridge Phil. Soc. 44 50-57.
Rao, C. R. (1973). Linear Statistical Inference and Its Applications, 2nd ed. Wiley, New York.
Read, T. R. C., and Cressie, N. A. C. (1988). Goodness-of-fit Statistics for Discrete Multivariate Data. Springer, New York.
Rényi, A. (1961). On measures of entropy and information. In Proc. 4th Berkeley Symp. on Probab. Math. Statist., Univ. of California, Berkeley, Vol. 1, pp. 547-561.
Salicrú, M., Menéndez, M. L., Morales, D., and Pardo, L. (1994). On the applications of divergence type measures in testing statistical hypotheses. J. Multivariate Anal. 51 372-391.
Sen, P. K., and Singer, J. M. (1993). Large Sample Methods in Statistics. Chapman & Hall, New York.
Serfling, R. J. (1980). Approximation Theorems of Mathematical Statistics. Wiley, New York.
Simpson, D. G. (1989). Hellinger deviance tests: Efficiency, breakdown points, and examples. J. Amer. Statist. Assoc. 84 107-113.
Wald, A. (1943). Tests of statistical hypotheses concerning several parameters when the number of observations is large. Trans. Amer. Math. Soc. 54 426-482.
Zografos, K., Ferentinos, K., and Papaioannou, T. (1990). φ-divergence statistics: sampling properties, multinomial goodness of fit and divergence tests. Commun. Statist. Theory Methods 18 1785-1802.
Zografos, K. (1993). Asymptotic properties of φ-divergence statistics and its application in contingency tables. Res. Rep. 185, Math. Dept., University of Ioannina, Ioannina, Greece.