TU Berlin

Methoden der Künstlichen Intelligenz: Publications

Publication list

Perturbative Black Box Variational Inference
Citation key BaChOpma17
Authors Bamler R., Cheng Z., Opper M., Mandt S.
Book title Advances in Neural Information Processing Systems 30
Pages 11
Year 2017
Abstract Black box variational inference (BBVI) with reparameterization gradients triggered the exploration of divergence measures other than the Kullback-Leibler (KL) divergence, such as alpha divergences. These divergences can be tuned to be more mass-covering (preventing overfitting in complex models), but are also often harder to optimize using Monte-Carlo gradients. In this paper, we view BBVI with generalized divergences as a form of biased importance sampling. The choice of divergence determines a bias-variance tradeoff between the tightness of the bound (low bias) and the variance of its gradient estimators. Drawing on variational perturbation theory of statistical physics, we use these insights to construct a new variational bound which is tighter than the KL bound and more mass covering. Compared to alpha-divergences, its reparameterization gradients have a lower variance. We show in several experiments on Gaussian Processes and Variational Autoencoders that the resulting posterior covariances are closer to the true posterior and lead to higher likelihoods on held-out data.
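The abstract builds on standard reparameterization-gradient BBVI and describes a perturbative lower bound that is tighter than the KL (ELBO) bound, without giving its exact form. The snippet below is a minimal illustrative sketch, not code from the publication: it uses Monte Carlo samples with the reparameterization trick to estimate the usual ELBO and a perturbative-style bound built from an odd-order Taylor expansion of exp(V) around a reference point V0. The toy model, the expansion order K, and the choice V0 = mean(V) are assumptions made only for this example.

from math import factorial, log, pi

import numpy as np

rng = np.random.default_rng(0)

def log_joint(z, x):
    # Toy model (assumption for this sketch): z ~ N(0, 1), x | z ~ N(z, 1).
    return -0.5 * z**2 - 0.5 * (x - z)**2 - log(2 * pi)

def log_q(z, mu, sigma):
    # Gaussian variational distribution q(z) = N(mu, sigma^2).
    return -0.5 * ((z - mu) / sigma)**2 - np.log(sigma) - 0.5 * log(2 * pi)

def lower_bounds(mu, sigma, x, n_samples=100_000, K=3):
    # Reparameterization: z = mu + sigma * eps with eps ~ N(0, 1), so in an
    # autodiff framework gradients w.r.t. (mu, sigma) could flow through z.
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps
    V = log_joint(z, x) - log_q(z, mu, sigma)   # log importance weights
    elbo = V.mean()                             # standard KL (ELBO) lower bound
    # Perturbative-style bound: an odd-order Taylor polynomial of exp(V) around
    # V0 lower-bounds exp(V) pointwise, so its expectation lower-bounds p(x).
    V0 = V.mean()
    poly = sum((V - V0) ** k / factorial(k) for k in range(K + 1))
    perturbative = V0 + np.log(poly.mean())
    return elbo, perturbative

print(lower_bounds(mu=0.4, sigma=0.8, x=1.0))  # exact log p(x=1) is about -1.52

In this toy setting both values lower-bound log p(x), with the perturbative-style estimate typically sitting between the ELBO and the exact value, which is the qualitative behavior the abstract describes.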
