Publication:
Restricted distance-type Gaussian estimators based on density power divergence and their applications in hypothesis testing

Full text at PDC
Publication Date
2023-03-17
Publisher
MDPI
Abstract
In this paper, we introduce the restricted minimum density power divergence Gaussian estimator (MDPDGE) and study its main asymptotic properties. In addition, we examine its robustness through an influence function analysis. Restricted estimators are required in many practical situations, such as testing composite null hypotheses, and in this case we provide estimators constrained by the inherent restrictions of the underlying distribution. Furthermore, we derive robust Rao-type test statistics based on the MDPDGE for testing a simple null hypothesis, and we deduce explicit expressions for some important distributions. Finally, we empirically evaluate the efficiency and robustness of the method through a simulation study.
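As a rough illustration of the kind of estimator the abstract describes, the sketch below numerically minimizes the density power divergence (DPD) objective of Basu et al. (1998) for a univariate Gaussian working model. This is a minimal, assumed implementation, not the authors' code: the names `dpd_objective` and `mdpdge`, the tuning value `alpha=0.3`, and the Nelder-Mead optimizer are illustrative choices, and the paper's closed-form expressions and restricted (constrained) estimators are not reproduced here.

```python
# Minimal sketch (not the authors' implementation): numerical minimization of the
# density power divergence objective for a univariate Gaussian working model.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """DPD objective H_n(mu, sigma) for the N(mu, sigma^2) model, alpha > 0."""
    mu, log_sigma = params            # optimize log(sigma) so that sigma stays positive
    sigma = np.exp(log_sigma)
    f = norm.pdf(x, loc=mu, scale=sigma)
    # The integral of f_theta^(1+alpha) has a closed form for the normal density.
    integral = (2 * np.pi) ** (-alpha / 2) * sigma ** (-alpha) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

def mdpdge(x, alpha=0.3):
    """Unrestricted minimum DPD estimate of (mu, sigma) under a Gaussian working model."""
    start = np.array([np.median(x), np.log(x.std() + 1e-8)])  # robust-ish starting point
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(0, 1, 95), rng.normal(8, 1, 5)])  # 5% outliers
    print(mdpdge(data, alpha=0.3))    # stays close to (0, 1) despite the contamination
```

For small `alpha` the objective approaches the negative log-likelihood, recovering (near) maximum likelihood efficiency, while larger `alpha` downweights outlying observations; a restricted version, as studied in the paper, would additionally impose the null-hypothesis constraints during the minimization.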