Machine Learning Seminar presentation
Topic: ELBO-within-Stein: General and integrated Stein Variational Inference (Part 3 of 3)
Speaker: Ola Rønning, PhD student in the Probabilistic Programming group of Prof. Thomas Hamelryck, Dept. of Computer Science, University of Copenhagen
Time: Wednesday, 2022.02.16, 10:00 CET
How to join: Please contact Jakub Lengiewicz
Bayesian inference provides a unified framework for quantifying uncertainty in probabilistic models with latent variables. However, exact inference algorithms generally scale poorly with the dimensionality of the model and the size of the data. To overcome this scaling issue, the ML community has turned to approximate inference. For large data sets, the most prominent method is variational inference (VI), which uses a simpler parametric model to approximate the target distribution of the latent variables. In recent years, Stein's method has caught the attention of the ML community as a way to formulate new schemes for performing variational inference. Stein's method provides a fundamental technique for approximating and bounding distances between probability distributions. The kernel Stein discrepancy underlies Stein Variational Gradient Descent (SVGD), which works by iteratively transporting particles sampled from a simple distribution to the target distribution. We introduce the ELBO-within-Stein algorithm, which combines SVGD and VI to alleviate issues due to high-dimensional models and large data sets. The ELBO-within-Stein algorithm is available in our computational framework EinStein, distributed with the deep probabilistic programming language NumPyro. We will draw upon our framework to illustrate key concepts with examples. EinStein is currently freely available on GitHub and will be included in NumPyro from the next release. The framework is an efficient inference tool for practitioners and a flexible, unified codebase for researchers.
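To make the particle-transport idea behind SVGD concrete, here is a minimal NumPy sketch of one SVGD update with an RBF kernel. This is an illustrative toy (fixed bandwidth, a 1D standard-normal target with score function -x), not the EinStein/NumPyro implementation discussed in the talk; all function names and parameters here are our own for illustration.

```python
import numpy as np

def rbf_kernel(x, bandwidth):
    """RBF kernel matrix k[j, i] = exp(-||x_j - x_i||^2 / h) and its gradient w.r.t. x_j."""
    diffs = x[:, None, :] - x[None, :, :]                 # (n, n, d): x_j - x_i at [j, i]
    sq_dists = np.sum(diffs ** 2, axis=-1)                # (n, n)
    k = np.exp(-sq_dists / bandwidth)                     # kernel matrix (symmetric)
    grad_k = -2.0 / bandwidth * diffs * k[..., None]      # d/dx_j k(x_j, x_i)
    return k, grad_k

def svgd_step(x, score_fn, step_size=0.05, bandwidth=1.0):
    """One SVGD update: attract particles toward high density, repel them apart.

    x: (n, d) particle positions; score_fn: x -> grad log p(x), shape (n, d).
    """
    n = x.shape[0]
    k, grad_k = rbf_kernel(x, bandwidth)
    # phi_i = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k @ score_fn(x) + grad_k.sum(axis=0)) / n
    return x + step_size * phi

# Toy example: transport uniform particles toward a standard normal, score(x) = -x.
rng = np.random.default_rng(0)
particles = rng.uniform(-4.0, 4.0, size=(100, 1))
for _ in range(500):
    particles = svgd_step(particles, lambda p: -p)
print(particles.mean(), particles.std())                  # roughly 0 and roughly 1
```

The first term in phi drives particles toward the target's modes; the kernel-gradient term acts as a repulsive force that spreads the particles out, which is what lets SVGD approximate the whole distribution rather than collapse to a point estimate.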
Video recording: https://youtu.be/vsI7J5pgTv0