Milad Zeraatpisheh: Bayesian neural networks and MC Dropout; ways to measure uncertainty in deep learning

Machine Learning Seminar presentation

Topic: Bayesian neural networks and MC Dropout; ways to measure uncertainty in deep learning

Speaker: Milad Zeraatpisheh, FSTM, University of Luxembourg

Time: Wednesday 2021.01.20, 10:00 CET

How to join: Please contact Jakub Lengiewicz

Abstract:

Deep learning methods represent the state of the art for numerous applications such as facial recognition systems, supercomputing, and speech recognition. Conventional training yields point estimates of the neural network parameters, and the resulting predictions can therefore be overconfident, since they do not account for uncertainty in the model parameters.

In this presentation, we will take a closer look at Bayesian neural networks as a way to measure this uncertainty. First, Bayesian inference on neural network weights will be discussed. Afterward, Monte Carlo Dropout, proposed by Gal & Ghahramani (2016), will be explained as another way to tackle uncertainty in deep learning.
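
As background for the Bayesian part of the talk: instead of predicting with a single weight estimate, a Bayesian neural network averages predictions over the posterior on the weights w given the training data D. A standard statement of this posterior predictive distribution (included here for orientation, not taken from the talk itself) is:

    p(y* | x*, D) = \int p(y* | x*, w) p(w | D) dw

Since this integral is intractable for deep networks, practical methods approximate it by sampling, which is exactly where MC Dropout comes in.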
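
As a taste of the second part, below is a minimal MC Dropout sketch in PyTorch. The model architecture, dropout rate, and sample count are illustrative assumptions, not the speaker's code. The key idea from Gal & Ghahramani (2016): keep dropout active at prediction time, run the network several times, and use the spread of the outputs as an uncertainty estimate.

    import torch
    import torch.nn as nn

    class MCDropoutNet(nn.Module):
        """Small regression net with dropout after each hidden layer."""
        def __init__(self, in_dim=1, hidden=64, out_dim=1, p=0.1):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
                nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
                nn.Linear(hidden, out_dim),
            )

        def forward(self, x):
            return self.net(x)

    def mc_dropout_predict(model, x, n_samples=100):
        """Run n_samples stochastic forward passes with dropout enabled."""
        model.train()  # train mode keeps dropout active at inference time
        with torch.no_grad():
            preds = torch.stack([model(x) for _ in range(n_samples)])
        # Predictive mean and standard deviation across the samples;
        # the std is the (epistemic) uncertainty estimate.
        return preds.mean(dim=0), preds.std(dim=0)

    model = MCDropoutNet()
    x = torch.linspace(-1.0, 1.0, 5).unsqueeze(-1)
    mean, std = mc_dropout_predict(model, x)
    print(mean.squeeze(), std.squeeze())

Each forward pass samples a different dropout mask, so the ensemble of passes behaves like samples from an approximate posterior over the weights; averaging them approximates the integral above.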

Additional material: