Reliable amortized variational inference with conditional normalizing flows via physics-based latent distribution correction
Title | Reliable amortized variational inference with conditional normalizing flows via physics-based latent distribution correction
Publication Type | Conference |
Year of Publication | 2022 |
Authors | Ali Siahkoohi, Gabrio Rizzuti, Rafael Orozco, Felix J. Herrmann |
Conference Name | IMAGE Workshop on Subsurface Uncertainty Description and Estimation - Moving Away from Single Prediction with Distribution Learning |
Month | 09 |
Keywords | Bayesian inference, deep learning, Generative models, Imaging, Inverse problems, SEG, Uncertainty quantification, workshop |
Abstract | Bayesian inference for high-dimensional inverse problems is challenged by the computational costs associated with the forward operator during posterior sampling, as well as by the selection of an appropriate prior distribution that encodes our prior knowledge about the unknown. Amortized variational inference addresses these challenges by pretraining a conditional normalizing flow (cNF) that approximates the posterior distribution over existing joint samples of model and data, where the prior is implicitly learned from the data. When fed data and normally distributed latent samples as input, the pretrained cNF provides posterior samples for previously unseen data virtually for free. The accuracy of this purely data-driven approach, however, depends on the availability of high-fidelity training data, which is rarely the case in geophysical inverse problems due to the highly heterogeneous structure of the Earth. To overcome this challenge and to minimize the negative bias induced by data distribution shifts during inference, we propose learning a physics-based correction to the cNF latent distribution that yields a more accurate approximation to the posterior distribution. We accomplish this by parameterizing the cNF latent distribution as a Gaussian with an unknown mean and diagonal covariance, which are estimated by minimizing the Kullback-Leibler divergence between the corrected posterior estimate and the true posterior distribution. For a relatively well-trained cNF, this approach provides reliable posterior samples at a limited computational cost while remaining consistent with both the data and the physics. We showcase the computational gains of this approach on a "quasi" real seismic imaging example.
Notes | (IMAGE Workshop, Houston) |
URL | https://www.imageevent.org/Workshop/subsurface-uncertainty-description-estimation-moving-away-single-prediction-distribution-learning |
Citation Key | siahkoohi2022SEGWSrav |
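
The abstract describes the latent-distribution correction only in words. Below is a minimal, hedged sketch of how such a correction could be set up, assuming the corrected latent distribution is a Gaussian with unknown mean and diagonal covariance, reparameterized and pushed through the pretrained cNF, and fitted by minimizing a Monte-Carlo estimate of the Kullback-Leibler divergence against the data likelihood and the implicit (standard-normal latent) prior. All names (cnf_latent_to_model, forward_operator, d_obs, sigma_noise) are placeholders for illustration, not the authors' implementation.

# Hedged sketch (not the authors' code) of a physics-based correction to the
# cNF latent distribution. Placeholder names: cnf_latent_to_model maps a
# latent sample and observed data to a model-space posterior sample;
# forward_operator is the physics (e.g., a linearized imaging operator).
import torch

def corrected_kl_loss(mu, log_s, cnf_latent_to_model, forward_operator,
                      d_obs, sigma_noise, num_mc=4):
    """Monte-Carlo estimate (up to a constant) of the KL divergence between
    the corrected posterior approximation and the true posterior.

    The corrected latent distribution is N(mu, diag(exp(log_s))^2). Samples
    are reparameterized as z = mu + exp(log_s) * eps, pushed through the
    pretrained cNF conditioned on the observed data, and scored against the
    data via the forward operator; the standard-normal term on z plays the
    role of the prior implicitly learned by the cNF.
    """
    s = torch.exp(log_s)
    loss = 0.0
    for _ in range(num_mc):
        eps = torch.randn_like(mu)
        z = mu + s * eps                        # corrected latent sample
        x = cnf_latent_to_model(z, d_obs)       # posterior sample from the cNF
        d_pred = forward_operator(x)            # physics-based data prediction
        data_misfit = 0.5 * ((d_pred - d_obs) ** 2).sum() / sigma_noise ** 2
        latent_prior = 0.5 * (z ** 2).sum()     # standard-normal latent term
        loss = loss + data_misfit + latent_prior
    loss = loss / num_mc
    entropy = log_s.sum()                       # Gaussian entropy (up to a constant)
    return loss - entropy

if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs end to end; replace with a pretrained
    # cNF and the actual forward operator.
    latent_dim = 16
    cnf_latent_to_model = lambda z, d: z        # placeholder flow (latent -> model)
    forward_operator = lambda x: x              # placeholder physics
    d_obs = torch.randn(latent_dim)
    sigma_noise = 0.1

    mu = torch.zeros(latent_dim, requires_grad=True)
    log_s = torch.zeros(latent_dim, requires_grad=True)
    opt = torch.optim.Adam([mu, log_s], lr=1e-2)
    for step in range(200):
        opt.zero_grad()
        loss = corrected_kl_loss(mu, log_s, cnf_latent_to_model,
                                 forward_operator, d_obs, sigma_noise)
        loss.backward()
        opt.step()

Only the low-dimensional mean and diagonal covariance are optimized here, which is what keeps the correction cheap relative to retraining the cNF; the pretrained flow and the forward operator are treated as fixed.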