Conference item

Rao-Blackwellised reparameterisation gradients

Abstract:
Latent Gaussian variables are widely used in probabilistic machine learning. In turn, gradient estimators are the machinery that facilitates gradient-based optimisation for models with latent Gaussian variables. The reparameterisation trick is often used as the default estimator, as it is simple to implement and yields low-variance gradients for variational inference. In this work, we propose the R2-G2 estimator as the Rao-Blackwellisation of the reparameterisation gradient estimator. Interestingly, we show that the local reparameterisation gradient estimator for Bayesian MLPs is an instance of the R2-G2 estimator and Rao-Blackwellisation. This lets us extend the benefits of Rao-Blackwellised gradients to a suite of probabilistic models. We show that initial training with R2-G2 consistently yields better performance in models with multiple applications of the reparameterisation trick.
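The reparameterisation trick the abstract refers to can be sketched as follows. This is a minimal Monte Carlo illustration of the plain reparameterisation gradient estimator for a Gaussian, not the paper's R2-G2 estimator; the function names and the quadratic test integrand are chosen here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparam_grad(mu, log_sigma, f_prime, n_samples=10000):
    """Reparameterisation gradient estimate of
    d/d(mu, log_sigma) E_{z ~ N(mu, sigma^2)}[f(z)],
    using z = mu + sigma * eps with eps ~ N(0, 1)."""
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps
    grad_mu = f_prime(z).mean()                         # dz/dmu = 1
    grad_log_sigma = (f_prime(z) * eps * sigma).mean()  # dz/dlog_sigma = sigma * eps
    return grad_mu, grad_log_sigma

# Sanity check with f(z) = z^2: E[f] = mu^2 + sigma^2, so at mu = 1,
# log_sigma = 0 the true gradients are d/dmu = 2 and d/dlog_sigma = 2.
g_mu, g_ls = reparam_grad(mu=1.0, log_sigma=0.0, f_prime=lambda z: 2 * z)
```

Both estimates concentrate around the true gradients as the sample count grows, which is the low-variance behaviour the abstract credits to this estimator; R2-G2 then reduces the variance further by Rao-Blackwellisation.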
Publication status: Accepted
Peer review status: Peer reviewed

Authors

Institution: University of Oxford
Division: MPLS
Department: Statistics
Role: Author
ORCID: 0000-0002-5887-7329

Institution: University of Oxford
Division: MPLS
Department: Statistics
Oxford college: Brasenose College
Role: Author
ORCID: 0000-0002-0821-4607

Institution: University of Oxford
Division: MPLS
Department: Statistics
Role: Author


Publisher: Neural Information Processing Systems Foundation
Acceptance date: 2025-09-18
Event title: Thirty-Ninth Annual Conference on Neural Information Processing Systems
Event location: San Diego, California, USA and Mexico City, Mexico
Event website: https://neurips.cc/
Event start date: 2025-11-30
Event end date: 2025-12-07


Language: English
Pubs id: 2129958
Local pid: pubs:2129958
Deposit date: 2025-09-19
