Effective scaling of numerical surrogates via domain-decomposed Fourier neural operators

Title: Effective scaling of numerical surrogates via domain-decomposed Fourier neural operators
Publication Type: Presentation
Year of Publication: 2022
Authors: Thomas J. Grady II, Rishi Khan, Mathias Louboutin, Ziyi Yin, Philipp A. Witte, Ranveer Chandra, Russell J. Hewett, Felix J. Herrmann
Keywords: CCS, deep learning, Fourier neural operators, HPC, ML4SEISMIC, SLIM

Numerical surrogates are models that learn to mimic a complex physical process (such as the solution to a PDE produced by a solver) from a set of input/output pairs. Fourier neural operators (FNOs) are a specific type of numerical surrogate that uses a learned matched filter to quickly approximate solutions to relatively smooth complex physical processes. In the case of carbon capture and sequestration (CCS) technology, FNOs have been shown to approximate solutions to the two-phase flow equations well, with speedups of 1,000 to 10,000 times at inference time versus a traditional solver. This speed, combined with the fact that FNOs are differentiable with respect to their input parameters, theoretically allows inverse and uncertainty quantification problems to be solved on real 3D data, a previously intractable task. However, due to the size of the input data, network weights, and optimizer state, FNOs have thus far been limited to small- and medium-scale 2D and 3D problems, well below the size of an industry standard such as the Sleipner benchmark. Here we alleviate this problem by proposing a model-parallel FNO that makes use of domain decomposition of the input data and network weights, and exploits architectural features of FNOs to also include a natural form of asynchronous pipeline parallelism. Our network can scale to arbitrary problem sizes on CPU and GPU systems.
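The "learned matched filter" at the heart of an FNO can be sketched in a few lines: transform the input to the Fourier domain, multiply the lowest-frequency modes by learned complex weights, and transform back. The following is a minimal single-channel 1D illustration in NumPy (the function name, shapes, and the identity-weight example are ours, not from the presentation); a full FNO block also includes lifting/projection layers, a pointwise skip connection, and a nonlinearity, all omitted here.

```python
import numpy as np

def fno_spectral_layer(u, weights, modes):
    """One FNO-style spectral convolution (1D, single channel).

    u:       real input signal, shape (n,)
    weights: learned complex filter coefficients, shape (modes,)
    modes:   number of low frequencies retained (spectral truncation)
    """
    u_hat = np.fft.rfft(u)                     # to the Fourier domain
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = u_hat[:modes] * weights  # learned filter on low modes only
    return np.fft.irfft(out_hat, n=u.size)     # back to physical space

# Example: identity weights reduce the layer to a low-pass projection.
rng = np.random.default_rng(0)
n, modes = 64, 8
u = rng.standard_normal(n)
w = np.ones(modes, dtype=complex)
v = fno_spectral_layer(u, w, modes)
```

Because only the first `modes` Fourier coefficients carry weights, the parameter count is independent of the grid resolution, which is what makes FNOs attractive for smooth solution operators; the domain-decomposed variant proposed here additionally partitions `u` and the weights across workers.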

Citation Key: grady2022ML4SEISMICesn