Radial and Directional Posteriors for Bayesian Neural Networks
by
Changyong Oh, Kamil Adamczewski, Mijung Park
2019
Abstract
We propose a new variational family for Bayesian neural networks. We
decompose the variational posterior into two components: the radial
component captures the strength of each neuron in terms of its magnitude, while
the directional component captures the statistical dependencies among the
weight parameters. The dependencies learned via the directional density provide
better modeling performance compared to the widely-used Gaussian
mean-field-type variational family. In addition, the strength of input and
output neurons learned via the radial density provides a structured way to
compress neural networks. Indeed, experiments show that our variational family
improves predictive performance and yields compressed networks simultaneously.
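The decomposition described above can be illustrated with a minimal sketch: each weight vector is written as w = r · d, with a scalar radial magnitude r and a unit direction d. The particular densities below (a log-normal radius and a uniform direction obtained by normalizing a Gaussian) are illustrative stand-ins, not the paper's exact variational choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_radial_directional(dim, n_samples, mu_r=0.0, sigma_r=0.1):
    """Draw weight vectors w = r * d, where r is a radial magnitude
    and d is a unit direction on the (dim-1)-sphere.

    Hypothetical stand-in densities for illustration only:
    r ~ LogNormal(mu_r, sigma_r), d = g / ||g|| with g ~ N(0, I).
    """
    g = rng.standard_normal((n_samples, dim))
    # Directional component: normalizing a Gaussian gives a uniform
    # direction on the unit sphere.
    d = g / np.linalg.norm(g, axis=1, keepdims=True)
    # Radial component: a positive magnitude per sample.
    r = rng.lognormal(mu_r, sigma_r, size=(n_samples, 1))
    return r * d

w = sample_radial_directional(dim=8, n_samples=1000)
print(w.shape)  # (1000, 8); each row's norm equals its radial draw
```

Because the norm of each sample is exactly the radial draw, a small learned radius for a neuron directly signals that it can be pruned, which is the structured-compression angle the abstract mentions.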
Archived Files and Locations
application/pdf 1.7 MB
arxiv.org (repository) · web.archive.org (webarchive)
arXiv:1902.02603v1