Library of the Universidad Complutense de Madrid

Ordering and selecting extreme populations by means of entropies and divergences


Menéndez Calleja, María Luisa y Pardo Llorente, Leandro y Zografos, Konstantinos (2009) Ordering and selecting extreme populations by means of entropies and divergences. Journal of Computational and Applied Mathematics, 232 (2). pp. 335-350. ISSN 0377-0427

PDF
Restricted to authorized repository staff only until 31 December 2020.




This paper studies the simultaneous selection of extreme populations from a set of independent populations. Two types of subset selection rules for k populations are proposed and studied. The first type selects one subset of populations that should contain the population with the smallest φ-entropy and another subset that should contain the population with the largest φ-entropy. The second type selects analogously, but in terms of the extreme φ-divergences with respect to a known control population. Properties of the proposed procedures are established and studied, and examples are presented to illustrate the results.
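The flavour of the first type of rule can be conveyed with a toy Gupta-style subset selection sketch. This is not the paper's actual procedure: the code below uses a plug-in Shannon-entropy estimate and fixed selection constants `d_low`, `d_high` (in the subset selection literature such constants are calibrated so that each subset contains the true extreme population with a prescribed probability P*), and all names, parameters, and populations are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy(sample, k):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy
    for a sample from a multinomial population with k cells."""
    counts = np.bincount(sample, minlength=k)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def select_extremes(entropies, d_low, d_high):
    """Gupta-style subset rules (illustrative): keep population i in the
    'low' subset when H_i <= min_j H_j + d_low, and in the 'high' subset
    when H_i >= max_j H_j - d_high."""
    e = np.asarray(entropies)
    low = np.flatnonzero(e <= e.min() + d_low)
    high = np.flatnonzero(e >= e.max() - d_high)
    return low, high

# Three multinomial populations with well-separated entropies:
probs = [
    [0.90, 0.05, 0.05],   # low entropy (nearly degenerate)
    [0.60, 0.25, 0.15],   # intermediate
    [1/3, 1/3, 1/3],      # maximal entropy (uniform)
]
samples = [rng.choice(3, size=2000, p=p) for p in probs]
H = [plugin_entropy(s, 3) for s in samples]
low, high = select_extremes(H, d_low=0.1, d_high=0.1)
print(low, high)
```

For these well-separated populations the low subset singles out the nearly degenerate population and the high subset the uniform one; with populations whose entropies differ by less than the selection constants, the subsets would (correctly) retain more than one candidate.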

Document type: Article
Keywords: Divergence; Entropy; Ordering populations; Selection of populations; Subset selection approach; Extreme populations
Subjects: Sciences > Mathematics > Mathematical statistics
ID code: 17375

R.E. Bechhofer, A single-sample multiple decision procedure for ranking means of normal populations with known variances, Ann. Math. Statist. 25 (1954) 16–39.

S.S. Gupta, On a selection and ranking procedure for gamma populations, Ann. Inst. Statist. Math. 14 (1963) 199–216.

S.S. Gupta, On some multiple decision (selection and ranking) rules, Technometrics 7 (1965) 225–245.

J.D. Gibbons, I. Olkin, M. Sobel, Selecting and Ordering Populations: A New Statistical Methodology, Wiley, New York, 1977.

S.S. Gupta, S. Panchapakesan, Multiple Decision Procedures: Theory and Methodology of Selecting and Ranking Populations, Wiley, New York, 1979.

P. van der Laan, L.R. Verdooren, Selection of populations: an overview and some recent results, Biometrical J. 31 (1989) 383–420.

M.L. Menéndez, L. Pardo, Ch. Tsairidis, K. Zografos, Selection of the best population: An information theoretic approach, Metrika 58 (2003) 117–147.

L. Thabane, M. Safiul Haq, On Bayesian selection of the best normal population using the Kullback–Leibler divergence measure, Statist. Neerlandica 53 (1999) 342–360.

K.-S. Song, Rényi information, loglikelihood and an intrinsic distribution measure, J. Stat. Plann. Inference 93 (2001) 51–69.

S.S. Gupta, D.Y. Huang, On subset selection procedures for the entropy function associated with the binomial population, Sankhya, Ser. A 38 (1976) 153–173.

E.J. Dudewicz, E.C. van der Meulen, Entropy-based statistical inference, II: Selection-of-the-best/complete ranking for continuous distributions on (0,1), with applications to random number generators, Statist. Decisions 1 (1983) 131–145.

S.S. Gupta, S. Panchapakesan, Statistical selection procedures in multivariate models, in: A.K. Gupta (Ed.), Advances in Multivariate Statistical Analysis, D. Reidel Publishing Company, 1987, pp. 141–160.

S.S. Gupta, L. Lii-Yuh, L. TaChen, Simultaneous selection for homogeneous multinomial populations based on entropy function: an empirical Bayes approach, J. Stat. Plann. Inference 54 (1996) 145–157.

S.N. Mishra, E.J. Dudewicz, Simultaneous selection of extreme populations: a subset selection approach, Biometrical J. 29 (1987) 471–483.

R.J. Serfling, Approximation Theorems of Mathematical Statistics, Wiley, New York, 1980.

I. Vajda, Theory of Statistical Inference and Information, Kluwer Academic Publishers, 1989.

J. Burbea, C.R. Rao, On the convexity of some divergence measures based on entropy functions, IEEE Trans. Inform. Theory IT-28 (1982) 489–495.

A. Perez, Risk estimates in terms of generalized f-entropies, in: Proceedings of the Colloquium on Information Theory, Vol. II, János Bolyai Mathematical Society, Budapest, 1967, pp. 299–315.

M. Ben-Bassat, f-entropies, probability of error, and feature selection, Inform. and Control 39 (1978) 227–242.

I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Magyar Tud. Akad. Mat. Kutató Int. Közl. 8 (1963) 85–108.

S.M. Ali, S.D. Silvey, A general class of coefficients of divergence of one distribution from another, J. Roy. Statist. Soc. B 28 (1966) 131–142.

N. Cressie, T.R.C. Read, Multinomial goodness-of-fit tests, J. Roy. Statist. Soc. Ser. B 46 (1984) 440–464.

F. Liese, I. Vajda, Convex statistical distances, in: Teubner Texts in Mathematics, vol. 95, Teubner Verlagsgesellschaft, Leipzig, 1987.

L. Pardo, Statistical Inference based on Divergence Measures, Chapman & Hall/CRC, 2006.

K.O. Bowman, L.R. Shenton, Problems with maximum likelihood estimation and the 3 parameter gamma distribution, J. Stat. Comput. Simul. 72 (2002) 391–401.

M.L. Menéndez, Shannon's entropy in exponential families: Statistical applications, Appl. Math. Lett. 13 (2000) 37–42.

Deposited: 12 Dec 2012 09:30
Last modified: 07 Feb 2014 09:46
