
Phi-divergence statistics for testing linear hypotheses in logistic regression models

Menéndez Calleja, María Luisa and Pardo Llorente, Julio Ángel and Pardo Llorente, Leandro (2008) Phi-divergence statistics for testing linear hypotheses in logistic regression models. Communications in Statistics - Theory and Methods, 37 (4). pp. 494-507. ISSN 0361-0926


Official URL: http://www.tandfonline.com/doi/pdf/10.1080/03610920701669710



Abstract

In this paper we introduce and study two new families of statistics for testing linear combinations of the parameters in logistic regression models. Both families are based on phi-divergence measures: one contains the classical likelihood ratio statistic and the other the classical Pearson statistic for this problem. In both families the vector of unknown parameters is estimated with the minimum phi-divergence estimator rather than the maximum likelihood estimator; minimum phi-divergence estimators are a natural extension of the maximum likelihood estimator.
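
For orientation, a minimal sketch in generic notation (not taken from the paper itself) of the phi-divergence family and of the Cressie-Read subfamily that recovers the likelihood ratio and Pearson statistics as special cases:

    D_{\phi}(p, q) = \sum_{i=1}^{k} q_i \, \phi\!\left(\frac{p_i}{q_i}\right), \quad \text{with } \phi \text{ convex and } \phi(1) = 0,

    \phi_{\lambda}(x) = \frac{x^{\lambda+1} - x - \lambda (x - 1)}{\lambda (\lambda + 1)}, \quad \lambda \neq 0, -1.

Letting \lambda \to 0 yields the Kullback-Leibler divergence, from which likelihood-ratio-type statistics arise, while \lambda = 1 yields the Pearson chi-squared-type divergence. Test statistics built from such divergences are typically of the form (2n / \phi''(1)) \, D_{\phi} between the fitted and the hypothesis-restricted model, with a limiting chi-squared distribution under the null hypothesis.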


Item Type: Article
Uncontrolled Keywords: General linear hypotheses; Logistic regression model; Minimum phi-divergence estimator; Phi-divergence statistic
Subjects: Sciences > Mathematics > Mathematical statistics
ID Code: 17325
References:

Ali, S. M., Silvey, S. D. (1966). A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Ser. B 28:131–142.

Andersen, E. B. (1997). Introduction to the Statistical Analysis of Categorical Data. New York: Springer Verlag.

Bishop, Y. M. M., Fienberg, S. E., Holland, P. W. (1975). Discrete Multivariate Analysis: Theory and Practice. Cambridge, MA: The MIT Press.

Cressie, N., Read, T. R. C. (1984). Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46:440–464.

Cressie, N., Pardo, L. (2000). Minimum phi-divergence estimator and hierarchical testing in loglinear models. Statist. Sinica 10:867–884.

Cressie, N., Pardo, L. (2002). Phi-divergence statistics. In: El-Shaarawi, A. H., Piegorsch, W. W., eds. Encyclopedia of Environmetrics. Vol. 3. New York: Wiley, pp. 1551–1555.

Cressie, N., Pardo, L., Pardo, M. C. (2003). Size and power considerations for testing loglinear models using phi-divergence test statistics. Statist. Sinica 13:555–570.

Ferguson, T. S. (1996). A Course in Large Sample Theory. New York: Chapman & Hall.

Greene, W. H. (2003). Econometric Analysis. Upper Saddle River: Prentice Hall Inc.

Kullback, S. (1985). Kullback information. In: Kotz, S., Johnson, N. L., eds. Encyclopedia of Statistical Sciences. Vol. 4. New York: John Wiley.

Lawal, B. (2003). Categorical Data Analysis with SAS and SPSS Applications. London: Lawrence Erlbaum Associates, Publishers.

Liu, I., Agresti, A. (2005). The analysis of ordered categorical data: An overview and a survey of recent developments. Test 14:1–74.

Pardo, J. A., Pardo, L., Zografos, K. (2002). Minimum phi-divergence estimators with constraints in multinomial populations. J. Statist. Plann. Inference 104:221–237.

Pardo, J. A., Pardo, L., Pardo, M. C. (2005). Minimum phi-divergence estimator in logistic regression models. Statist. Papers 47:91–108.

Pardo, J. A., Pardo, L., Pardo, M. C. (2006). Testing in logistic regression models based on phi-divergence measures. J. Statist. Plann. Inference 136:982–1006.

Stokes, M. E., Davis, C. S., Koch, G. G. (1999). Categorical Data Analysis Using the SAS System. Cary, NC: SAS Institute Inc.

Deposited On: 05 Dec 2012 09:33
Last Modified: 07 Feb 2014 09:45
