Robust inference and model criticism using bagged posteriors
Huggins, Jonathan H.; Miller, Jeffrey W.
Standard Bayesian inference is known to be sensitive to model misspecification,
leading to unreliable uncertainty quantification and poor predictive
performance. However, finding generally applicable and computationally
feasible methods for robust Bayesian inference under misspecification
has proven to be a difficult challenge. An intriguing, easy-to-use, and widely
applicable approach is to use bagging on the Bayesian posterior (“BayesBag”);
that is, to use the average of posterior distributions conditioned on
bootstrapped datasets. In this paper, we develop the asymptotic theory of
BayesBag, propose a model–data mismatch index for model criticism using
BayesBag, and empirically validate our theory and methodology on synthetic
and real-world data in linear regression, sparse logistic regression, and a hierarchical
mixed-effects model. We find that in the presence of significant misspecification,
BayesBag yields more reproducible inferences and has better
predictive accuracy than the standard Bayesian posterior; on the other hand,
when the model is correctly specified, BayesBag produces results that are as
good as or better than those of the standard posterior. Overall, our results
demonstrate that BayesBag combines the
attractive modeling features of standard Bayesian inference with the distributional
robustness properties of frequentist methods, providing benefits over
both Bayes alone and the bootstrap alone.
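
To make the procedure concrete, the following is a minimal sketch of BayesBag as described in the abstract: pool posterior draws across bootstrap resamples of the data, so that the pooled draws approximate the average of the posteriors conditioned on bootstrapped datasets. The conjugate normal-mean model, the heavy-tailed toy data, and all names (posterior_samples_normal_mean, bayesbag, n_boot, n_draws) are illustrative assumptions for this sketch, not the models or experiments used in the paper.

# A minimal sketch of BayesBag: average posterior distributions conditioned
# on bootstrapped datasets. The conjugate normal-mean model below is an
# illustrative assumption, chosen so posterior sampling is exact and the
# example runs with NumPy alone.
import numpy as np

rng = np.random.default_rng(0)

def posterior_samples_normal_mean(y, n_draws, sigma=1.0, prior_mean=0.0, prior_var=100.0):
    # Exact posterior draws for the mean of a Normal(mu, sigma^2) model
    # with a Normal(prior_mean, prior_var) prior and known sigma.
    n = len(y)
    post_var = 1.0 / (1.0 / prior_var + n / sigma**2)
    post_mean = post_var * (prior_mean / prior_var + y.sum() / sigma**2)
    return rng.normal(post_mean, np.sqrt(post_var), size=n_draws)

def bayesbag(y, n_boot=50, n_draws=200):
    # Pool posterior draws across bootstrap resamples of the data; the pooled
    # draws approximate the bagged (BayesBag) posterior.
    draws = []
    for _ in range(n_boot):
        y_boot = rng.choice(y, size=len(y), replace=True)
        draws.append(posterior_samples_normal_mean(y_boot, n_draws))
    return np.concatenate(draws)

# Toy data with heavy-tailed noise, a mild misspecification of the normal model.
y = rng.standard_t(df=2, size=100) + 3.0

standard = posterior_samples_normal_mean(y, 10_000)
bagged = bayesbag(y)
print("standard posterior: mean %.3f, sd %.3f" % (standard.mean(), standard.std()))
print("bagged posterior:   mean %.3f, sd %.3f" % (bagged.mean(), bagged.std()))

Under misspecification such as the heavy-tailed noise above, the bagged posterior typically spreads out relative to the standard posterior, reflecting the additional sampling variability captured by the bootstrap.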