Faster Uncertainty Quantification for Inverse Problems with Conditional Normalizing Flows

Publication Type: Report
Year of Publication: 2020
Authors: Ali Siahkoohi, Gabrio Rizzuti, Philipp A. Witte, Felix J. Herrmann
Document Number: TR-CSE-2020-2
Institution: Georgia Institute of Technology
Keywords: deep learning, invertible networks, uncertainty quantification

In inverse problems, we often have access to data consisting of paired samples $(x, y) \sim p_{X,Y}(x, y)$, where $y$ are partial observations of a physical system and $x$ represents the unknowns of the problem. Under these circumstances, we can employ supervised training to learn a solution $x$ and its uncertainty from the observations $y$. We refer to this problem as the "supervised" case. However, the data $y \sim p_Y(y)$ collected at one point could be distributed differently than observations $y' \sim p_{Y'}(y')$ relevant for a current set of problems. In the context of Bayesian inference, we propose a two-step scheme that makes use of normalizing flows and joint data to train a conditional generator $q_{\theta}(x|y)$ to approximate the target posterior density $p_{X|Y}(x|y)$. Additionally, this preliminary phase provides a density function $q_{\theta}(x|y)$, which can be recast as a prior for the "unsupervised" problem, e.g., when only the observations $y' \sim p_{Y'}(y')$, a likelihood model $y'|x$, and a prior on $x$ are known. We then train another invertible generator with output density $q'_{\phi}(x|y')$ specifically for $y'$, allowing us to sample from the posterior $p_{X|Y'}(x|y')$. We present synthetic results that demonstrate a considerable training speedup when reusing the pretrained network $q_{\theta}(x|y')$ as a warm start or preconditioning for approximating $p_{X|Y'}(x|y')$, instead of learning from scratch. This training modality can be interpreted as an instance of transfer learning. This result is particularly relevant for large-scale inverse problems that employ expensive numerical simulations.
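The two-step scheme admits a compact illustration. The following is a minimal, hypothetical PyTorch sketch, not the authors' implementation: `CondCoupling`, `CondFlow`, `forward_op`, `log_prior`, and all dimensions and hyperparameters are assumptions, and a toy linear operator with Gaussian noise stands in for the expensive numerical simulations. Step 1 trains the conditional flow $q_{\theta}(x|y)$ by maximum likelihood on paired samples; step 2 reuses its weights as a warm start and minimizes a variational (KL) objective for a single new observation $y'$. For brevity, the sketch uses a simple Gaussian prior on $x$ rather than recasting the pretrained density as the prior.

```python
import torch
import torch.nn as nn

class CondCoupling(nn.Module):
    """Affine coupling layer; the scale/shift for one half of x are
    conditioned on the other half and on the observation y."""
    def __init__(self, dim, cond_dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x, y):                     # x -> z, with log|det dz/dx|
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(torch.cat([x1, y], 1)).chunk(2, 1)
        s = torch.tanh(s)                        # bounded log-scales for stability
        return torch.cat([x1, x2 * s.exp() + t], 1), s.sum(1)

    def inverse(self, z, y):                     # z -> x, with log|det dx/dz|
        z1, z2 = z[:, :self.d], z[:, self.d:]
        s, t = self.net(torch.cat([z1, y], 1)).chunk(2, 1)
        s = torch.tanh(s)
        return torch.cat([z1, (z2 - t) * (-s).exp()], 1), -s.sum(1)

class CondFlow(nn.Module):
    """Stack of conditional couplings with dimension flips for mixing."""
    def __init__(self, dim, cond_dim, depth=4):
        super().__init__()
        self.layers = nn.ModuleList(CondCoupling(dim, cond_dim) for _ in range(depth))

    def forward(self, x, y):
        logdet = x.new_zeros(x.shape[0])
        for layer in self.layers:
            x, ld = layer(x, y)
            x, logdet = x.flip([1]), logdet + ld
        return x, logdet

    def inverse(self, z, y):
        logdet = z.new_zeros(z.shape[0])
        for layer in reversed(self.layers):
            z, ld = layer.inverse(z.flip([1]), y)
            logdet = logdet + ld
        return z, logdet

def supervised_loss(flow, x, y):
    """Step 1: maximum-likelihood training of q_theta(x|y) on joint samples
    (x, y) ~ p_{X,Y}; negative log-density under a standard-normal base,
    up to an additive constant."""
    z, logdet = flow(x, y)
    return (0.5 * (z ** 2).sum(1) - logdet).mean()

def unsupervised_loss(flow, y_new, forward_op, sigma, log_prior, x_dim, n=64):
    """Step 2: variational objective for one new observation y'. Draw
    x = G(z; y') and minimize E_z[-log p(y'|x) - log p(x) - log|det dx/dz|],
    i.e. the KL divergence to the posterior up to a constant."""
    z = torch.randn(n, x_dim)
    y = y_new.expand(n, -1)
    x, logdet = flow.inverse(z, y)
    misfit = ((forward_op(x) - y) ** 2).sum(1) / (2 * sigma ** 2)
    return (misfit - log_prior(x) - logdet).mean()

# Toy linear likelihood y = Ax + noise, standing in for an expensive simulator.
x_dim, y_dim, sigma = 8, 8, 0.1
A = torch.randn(y_dim, x_dim) / x_dim ** 0.5
forward_op = lambda x: x @ A.T
flow = CondFlow(x_dim, y_dim)

# Step 1: supervised pretraining on paired samples (x, y) ~ p_{X,Y}.
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for _ in range(1000):
    x = torch.randn(128, x_dim)
    y = forward_op(x) + sigma * torch.randn(128, y_dim)
    opt.zero_grad(); supervised_loss(flow, x, y).backward(); opt.step()

# Step 2: warm start from the pretrained weights (the transfer-learning step)
# and fine-tune for an out-of-distribution observation y' ~ p_{Y'}.
x_true = torch.randn(1, x_dim) + 0.5             # shifted: p_{Y'} differs from p_Y
y_new = forward_op(x_true) + sigma * torch.randn(1, y_dim)
log_prior = lambda x: -0.5 * (x ** 2).sum(1)     # stand-in Gaussian prior on x
opt = torch.optim.Adam(flow.parameters(), lr=1e-4)
for _ in range(200):
    opt.zero_grad()
    unsupervised_loss(flow, y_new, forward_op, sigma, log_prior, x_dim).backward()
    opt.step()

# Posterior samples for y': x ~ q'_phi(x|y').
x_post, _ = flow.inverse(torch.randn(256, x_dim), y_new.expand(256, -1))
```

In this reading, the warm start matters because the step-2 objective requires repeated evaluations of the likelihood model: starting near the supervised solution reduces the number of fine-tuning iterations, which is the source of the speedup the abstract reports.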

Citation Key: siahkoohi2020TRfuqf