Challenges in Markov chain Monte Carlo for Bayesian neural networks
by Theodore Papamarkou, Jacob Hinkle, M. Todd Young, and David Womble
2021
Abstract
Markov chain Monte Carlo (MCMC) methods have not been broadly adopted in
Bayesian neural networks (BNNs). This paper initially reviews the main
challenges in sampling from the parameter posterior of a neural network via
MCMC. Such challenges culminate in a lack of convergence to the parameter
posterior. Nevertheless, this paper shows that a non-converged Markov chain,
generated via MCMC sampling from the parameter space of a neural network, can
yield via Bayesian marginalization a valuable posterior predictive distribution
of the output of the neural network. Classification examples based on
multilayer perceptrons showcase highly accurate posterior predictive
distributions. The postulate of limited scope for MCMC developments in BNNs is
partially valid; an asymptotically exact parameter posterior seems less
plausible, yet an accurate posterior predictive distribution is a tenable
research avenue.
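The Bayesian marginalization step described in the abstract, averaging the network's predictive distribution over posterior samples of the parameters, can be sketched as follows. This is an illustrative sketch only: the one-hidden-layer MLP and the random stand-in parameter draws (playing the role of a possibly non-converged MCMC chain) are assumptions for demonstration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_probs(x, params):
    """Forward pass of a one-hidden-layer MLP; returns softmax class probabilities."""
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def sample_params(d_in=2, d_hidden=8, d_out=3):
    # Stand-in for one MCMC draw from the parameter posterior; a real chain
    # would come from a sampler such as HMC targeting p(theta | data).
    return (rng.normal(size=(d_in, d_hidden)), rng.normal(size=d_hidden),
            rng.normal(size=(d_hidden, d_out)), rng.normal(size=d_out))

def posterior_predictive(x, chain):
    # Bayesian marginalization: p(y | x, data) is approximated by the Monte
    # Carlo average of p(y | x, theta_t) over the draws theta_t in the chain.
    return np.mean([mlp_probs(x, p) for p in chain], axis=0)

chain = [sample_params() for _ in range(100)]
x = np.array([[0.5, -1.0]])
probs = posterior_predictive(x, chain)  # averaged class probabilities, shape (1, 3)
```

The key point mirrored from the abstract is that this average over draws can yield a useful predictive distribution even when the chain itself has not converged to the parameter posterior.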
Archived Files and Locations: application/pdf, 3.8 MB, arXiv 1910.06539v6, available via arxiv.org (repository) and web.archive.org (webarchive).