Sub-Nyquist sampling and sparsity: how to get more information from fewer samples

Title: Sub-Nyquist sampling and sparsity: how to get more information from fewer samples
Publication Type: Conference
Year of Publication: 2009
Authors: Felix J. Herrmann
Conference Name: SEG Technical Program Expanded Abstracts
Month: October
Publisher: SEG
Keywords: Presentation, SEG
Abstract

Seismic exploration relies on the collection of massive data volumes that are subsequently mined for information during seismic processing. While this approach has been extremely successful in the past, the current trend of incessantly pushing for higher-quality images in increasingly complicated regions of the Earth continues to reveal fundamental shortcomings in our workflows for handling massive high-dimensional data volumes. Two causes can be identified as the main culprits responsible for this barrier. First, there is the so-called ‘‘curse of dimensionality’’ exemplified by Nyquist’s sampling criterion, which puts disproportionate strain on current acquisition and processing systems as the size and desired resolution of our survey areas continue to increase. Second, there is the recent ‘‘departure from Moore’s law’’ that forces us to lower our expectations of computing our way out of this curse of dimensionality. In this paper, we offer a way out of this predicament through deliberate randomized subsampling combined with structure-exploiting transform-domain sparsity promotion. Our approach is successful because it reduces the size of seismic data volumes without loss of information. Because of this size reduction, both impediments are removed, and we end up with a new technology where the costs of acquisition and processing are no longer dictated by the size of the acquisition but by the transform-domain sparsity of the end-product after processing.
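The combination of randomized subsampling with transform-domain sparsity promotion described in the abstract is an instance of compressive sensing. The Python sketch below is a minimal illustration (not the paper's implementation) of the core idea under simple assumptions: a signal that is sparse in the Fourier domain is observed at a random subset of sub-Nyquist sample positions and then recovered by L1-penalized least squares, solved with iterative soft thresholding (ISTA). All variable names and parameter values (n, k, m, lam, the iteration count) are illustrative assumptions.

    # Minimal compressive-sensing sketch: randomized subsampling plus
    # sparsity promotion in the Fourier domain. Illustrative only; the
    # paper's actual transforms, solvers, and parameters may differ.
    import numpy as np

    rng = np.random.default_rng(0)
    n, k, m = 512, 5, 128        # full length, sparsity, kept samples (25%)

    # Build a test signal that is k-sparse in the (unitary) Fourier domain.
    support = rng.choice(n, size=k, replace=False)
    z_true = np.zeros(n, dtype=complex)
    z_true[support] = rng.standard_normal(k) + 1j * rng.standard_normal(k)
    x = np.fft.ifft(z_true, norm="ortho")

    # Randomized sub-Nyquist sampling: keep m of n samples at random.
    idx = np.sort(rng.choice(n, size=m, replace=False))
    y = x[idx]

    def soft(z, t):
        # Complex soft thresholding: the proximal map of the L1 norm.
        mag = np.abs(z)
        return np.where(mag > t, (1 - t / np.maximum(mag, 1e-12)) * z, 0)

    # ISTA for min_z 0.5*||R F^H z - y||^2 + lam*||z||_1, where F^H is the
    # unitary inverse DFT and R the restriction to idx. ||R F^H|| <= 1, so
    # a unit step size is safe.
    lam = 0.005
    z = np.zeros(n, dtype=complex)
    for _ in range(300):
        residual = np.fft.ifft(z, norm="ortho")[idx] - y
        g = np.zeros(n, dtype=complex)
        g[idx] = residual                        # adjoint of the restriction
        z = soft(z - np.fft.fft(g, norm="ortho"), lam)

    print("relative error:", np.linalg.norm(z - z_true) / np.linalg.norm(z_true))

Using unitary ("ortho"-normalized) transforms keeps the forward operator's norm at most one, so no step-size tuning is needed; the L1 penalty slightly shrinks the recovered coefficients, which a debiasing least-squares step on the recovered support would remove.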

URL: https://slim.gatech.edu/Publications/Public/Conferences/SEG/2009/herrmann09SEGsns/herrmann09SEGsns.pdf
DOI: 10.1190/1.3255570
Citation Key: herrmann2009SEGsns