Tubular Riemannian Laplace for Bayesian Neural Networks

Published: Dec 30, 2025 17:50
1 min read
ArXiv

Analysis

This paper introduces the Tubular Riemannian Laplace (TRL) approximation for Bayesian neural networks, addressing the limitations of Euclidean Laplace approximations in capturing the complex geometry of deep learning models. TRL models the posterior as a probabilistic tube and leverages a Fisher/Gauss-Newton metric to separate directions of uncertainty. The key contribution is a scalable, reparameterized Gaussian approximation that estimates curvature implicitly. The paper's significance lies in its potential to improve calibration and reliability in Bayesian neural networks, achieving performance comparable to Deep Ensembles at a significantly lower computational cost.
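For orientation, the sketch below shows a conventional curvature-based Laplace approximation in PyTorch, using a diagonal empirical Fisher as a stand-in for the Gauss-Newton metric. It is a minimal illustration, not TRL's tubular, reparameterized construction; the toy data, model architecture, and `prior_precision` value are assumptions made for the example.

```python
# Minimal sketch: diagonal-Fisher Laplace approximation around a trained MAP estimate.
# Illustrative only; does NOT implement the paper's tubular Riemannian construction.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 1-D regression data (assumed for illustration)
X = torch.linspace(-2, 2, 100).unsqueeze(1)
y = torch.sin(3 * X) + 0.1 * torch.randn_like(X)

model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()

# 1) Fit the MAP estimate with standard training.
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss_fn(model(X), y).backward()
    opt.step()

# 2) Accumulate a diagonal empirical Fisher: sum of squared per-example gradients.
params = list(model.parameters())
fisher = [torch.zeros_like(p) for p in params]
for i in range(X.shape[0]):
    model.zero_grad()
    loss_fn(model(X[i:i + 1]), y[i:i + 1]).backward()
    for f, p in zip(fisher, params):
        f += p.grad.detach() ** 2

# 3) Posterior precision = Fisher + isotropic Gaussian prior precision (assumed value).
prior_precision = 1.0
post_var = [1.0 / (f + prior_precision) for f in fisher]

# 4) Monte Carlo predictive: sample weights from the Gaussian posterior, average predictions.
map_params = [p.detach().clone() for p in params]
preds = []
with torch.no_grad():
    for _ in range(20):
        for p, mu, var in zip(params, map_params, post_var):
            p.copy_(mu + var.sqrt() * torch.randn_like(mu))
        preds.append(model(X))
    for p, mu in zip(params, map_params):
        p.copy_(mu)  # restore the MAP weights
mean_pred = torch.stack(preds).mean(0)
std_pred = torch.stack(preds).std(0)
print(mean_pred[:3].squeeze(), std_pred[:3].squeeze())
```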

Reference

TRL achieves excellent calibration, matching or exceeding the reliability of Deep Ensembles (measured by expected calibration error, ECE) while requiring only a fraction (1/5) of the training cost.
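For context on the metric, ECE is the bin-weighted average gap between a model's confidence and its accuracy. The snippet below is a standard binned-ECE computation; the synthetic confidences and labels are made up purely to demonstrate usage.

```python
# Standard binned expected calibration error (ECE); example data is synthetic.
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Weighted average of |accuracy - confidence| over equal-width confidence bins."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            acc = correct[mask].mean()
            conf = confidences[mask].mean()
            ece += mask.mean() * abs(acc - conf)
    return ece

# Usage with dummy, roughly-calibrated predictions
rng = np.random.default_rng(0)
conf = rng.uniform(0.5, 1.0, size=1000)
corr = (rng.uniform(size=1000) < conf).astype(float)
print(expected_calibration_error(conf, corr))
```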