
Sensitivity to hyperprior parameters in Gaussian Bayesian networks




Gómez Villegas, Miguel A. and Main Yaque, Paloma and Navarro, H. and Susi, R. (2014) Sensitivity to hyperprior parameters in Gaussian Bayesian networks. Journal of Multivariate Analysis, 124, pp. 214-225. ISSN 0047-259X


Official URL: http://www.sciencedirect.com/science/article/pii/S0047259X13002340


Bayesian networks (BNs) have become an essential tool for reasoning under uncertainty in complex models. In particular, the subclass of Gaussian Bayesian networks (GBNs) can be used to model continuous variables with Gaussian distributions. Here we focus on the task of learning GBNs from data. Factorization of the multivariate Gaussian joint density according to a directed acyclic graph (DAG) provides an alternative and interchangeable representation of a GBN, using the Gaussian conditional univariate densities of each variable given its parents in the DAG. With this latter conditional specification of a GBN, the learning process involves determining the mean vector, the regression coefficients, and the conditional variance parameters. Several approaches have been proposed to learn these parameters from a Bayesian perspective using different priors, and therefore some hyperparameter values must be tuned. Our goal is to deal with the usual prior distributions of normal/inverse gamma form and to evaluate the effect of the prior hyperparameter choice on the posterior distribution. As usual in Bayesian robustness, a large class of priors expressed by many hyperparameter values should lead to a small collection of posteriors. From this perspective, and using Kullback-Leibler divergence to measure prior and posterior deviations, a local sensitivity measure is proposed to make comparisons. If a robust Bayesian analysis is developed by studying the sensitivity of Bayesian answers to uncertain inputs, this method will also be useful for selecting robust hyperparameter values.
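The robustness idea in the abstract can be illustrated in the simplest conjugate setting: perturb a prior hyperparameter, recompute the posterior, and measure how far the posterior moved with Kullback-Leibler divergence (a small divergence for a large hyperparameter perturbation indicates robustness). The sketch below is not the paper's actual procedure for GBNs; it uses a univariate Gaussian mean with known variance and a normal prior, and all data values and function names are hypothetical.

```python
import math

def kl_normal(mu_p, var_p, mu_q, var_q):
    """KL divergence KL( N(mu_p, var_p) || N(mu_q, var_q) ) between two univariate normals."""
    return 0.5 * (math.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def posterior_mean_params(data, sigma2, mu0, tau2):
    """Conjugate posterior of a Gaussian mean with known variance sigma2,
    under the prior N(mu0, tau2): returns (posterior mean, posterior variance)."""
    n = len(data)
    xbar = sum(data) / n
    post_var = 1.0 / (1.0 / tau2 + n / sigma2)
    post_mean = post_var * (mu0 / tau2 + n * xbar / sigma2)
    return post_mean, post_var

# Hypothetical data; compare posteriors under a baseline and a perturbed prior mean.
data = [1.2, 0.8, 1.1, 0.9, 1.0]
base = posterior_mean_params(data, sigma2=1.0, mu0=0.0, tau2=1.0)
pert = posterior_mean_params(data, sigma2=1.0, mu0=0.5, tau2=1.0)

# Local sensitivity: KL divergence between the perturbed and baseline posteriors.
sensitivity = kl_normal(*pert, *base)
print(sensitivity)  # small value -> posterior is robust to this hyperparameter change
```

With five observations, a prior-mean shift of 0.5 moves the posterior only slightly (KL of about 0.02), and the effect shrinks further as the sample size grows — the qualitative behavior the local sensitivity measure is designed to quantify.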

Item Type: Article
Uncontrolled Keywords: Gaussian Bayesian network; Kullback-Leibler divergence; Bayesian linear regression
Subjects: Sciences > Mathematics > Mathematical statistics
ID Code: 24698
Deposited On: 17 Mar 2014 11:23
Last Modified: 01 Jan 2023 23:00
