Tubular Riemannian Laplace for Bayesian Neural Networks
Analysis
This paper introduces the Tubular Riemannian Laplace (TRL) approximation for Bayesian neural networks, addressing the limitations of Euclidean Laplace approximations in capturing the complex geometry of deep learning models. TRL models the posterior as a probabilistic tube, using a Fisher/Gauss-Newton metric to decompose parameter uncertainty. The key contribution is a scalable reparameterized Gaussian approximation that estimates curvature implicitly. The paper's significance lies in its potential to improve calibration and reliability in Bayesian neural networks, matching the performance of Deep Ensembles at a significantly reduced computational cost.
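For context, the Euclidean Laplace baseline that TRL improves on fits a Gaussian around the MAP estimate with precision given by a Gauss-Newton/Fisher Hessian plus the prior precision. A minimal sketch for a one-layer (logistic-regression) model follows; the function names `fit_map` and `laplace_posterior` are illustrative, not from the paper, and this is the standard Euclidean construction, not TRL itself:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_map(X, y, prior_prec=1.0, steps=500, lr=0.1):
    """Gradient-descent MAP estimate for logistic regression."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) + prior_prec * w
        w -= lr * grad / len(y)
    return w

def laplace_posterior(X, w_map, prior_prec=1.0):
    """Gaussian posterior N(w_map, H^-1) with a GGN/Fisher Hessian."""
    p = sigmoid(X @ w_map)
    lam = p * (1.0 - p)                     # per-example GGN weights
    H = X.T @ (X * lam[:, None]) + prior_prec * np.eye(X.shape[1])
    return w_map, np.linalg.inv(H)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

w_map = fit_map(X, y)
mean, cov = laplace_posterior(X, w_map)

# Monte Carlo predictive mean: sample weights from the Gaussian posterior.
ws = rng.multivariate_normal(mean, cov, size=100)
probs = sigmoid(X @ ws.T).mean(axis=1)
```

TRL departs from this baseline by replacing the flat Euclidean geometry around the MAP point with a tube-shaped approximation under the Fisher/Gauss-Newton metric, while keeping the overall Laplace recipe (fit a mode, then build a local Gaussian) intact.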
Key Takeaways
- Introduces the Tubular Riemannian Laplace (TRL) approximation for Bayesian neural networks.
- Addresses limitations of Euclidean Laplace approximations in deep learning.
- Models the posterior as a probabilistic tube using a Fisher/Gauss-Newton metric.
- Achieves excellent calibration and reliability, comparable to Deep Ensembles.
- Requires only a fraction of the training cost of Deep Ensembles.
“TRL achieves excellent calibration, matching or exceeding the reliability of Deep Ensembles (in terms of ECE) while requiring only a fraction (1/5) of the training cost.”