Normalizing Flows for Bayesian Experimental Design in Imaging Applications

Title: Normalizing Flows for Bayesian Experimental Design in Imaging Applications
Publication Type: Conference
Year of Publication: 2024
Authors: Rafael Orozco, Abhinav Prakash Gahlot, Peng Chen, Mathias Louboutin, Felix J. Herrmann
Conference Name: EAGE Annual Conference Proceedings
Month: June
Keywords: Bayesian inference, deep learning, expected information gain, normalizing flows, optimal experimental design, uncertainty quantification
Abstract

Neural density estimators, such as invertible normalizing flows, are capable of estimating the Bayesian posterior distribution in a variety of imaging problems, including medical MRI and seismic imaging/monitoring. So far, few works have explored making explicit use of the probabilistic information contained in the full Bayesian solution of these inverse problems. During our talk, we investigate how a simple modification to the training objective of conditional normalizing flows enables Bayesian experimental design without modifying the normalizing flow's neural architecture itself. By establishing a key relationship between the expected information gain (EIG) and the maximum likelihood attained during the training of normalizing flows, we show that optimal experimental design can be achieved. We first verify, on a stylized problem, that our method indeed maximizes the expected information gain, and then demonstrate the efficacy of our methodology on large-scale medical and seismic problems.
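As a minimal sketch of the EIG–likelihood relationship the abstract alludes to (our notation and derivation, not taken verbatim from the paper), the expected information gain of a design $\xi$ admits the standard variational lower bound

\[
\mathrm{EIG}(\xi)
= \mathbb{E}_{p(\mathbf{x})\,p(\mathbf{y}\mid\mathbf{x},\xi)}
  \left[\log \frac{p(\mathbf{x}\mid\mathbf{y},\xi)}{p(\mathbf{x})}\right]
\;\geq\;
\mathbb{E}_{p(\mathbf{x})\,p(\mathbf{y}\mid\mathbf{x},\xi)}
  \left[\log q_{\theta}(\mathbf{x}\mid\mathbf{y},\xi)\right]
+ \mathcal{H}\!\left[p(\mathbf{x})\right],
\]

where $q_{\theta}(\mathbf{x}\mid\mathbf{y},\xi)$ is the conditional normalizing flow and $\mathcal{H}[p(\mathbf{x})]$ is the (design-independent) prior entropy. Under this reading, the usual maximum-likelihood training objective of the conditional flow is, up to a constant, a lower bound on the EIG, so jointly maximizing it over the flow parameters $\theta$ and the design $\xi$ steers the experiment toward maximal expected information gain without touching the network architecture.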

URL: https://slim.gatech.edu/Publications/Public/Conferences/EAGE/2024/orozco2024EAGEnfb
Citation Key: orozco2024EAGEnfb