Randomized sampling strategies

Title: Randomized sampling strategies
Publication Type: Conference
Year of Publication: 2010
Authors: Felix J. Herrmann
Conference Name: EAGE Annual Conference Proceedings
Keywords: EAGE, Presentation

Seismic exploration relies on the collection of massive data volumes that are subsequently mined for information during seismic processing. While this approach has been extremely successful in the past, the current trend towards higher quality images in increasingly complicated regions continues to reveal fundamental shortcomings in our workflows for high-dimensional data volumes. Two causes can be identified. First, there is the so-called "curse of dimensionality" exemplified by Nyquist's sampling criterion, which puts disproportionate strain on current acquisition and processing systems as the size and desired resolution of our survey areas continue to increase. Second, there is the recent "departure from Moore's law" that forces us to lower our expectations of computing ourselves out of this curse of dimensionality. In this paper, we offer a way out of this situation via deliberate randomized subsampling combined with structure-exploiting transform-domain sparsity promotion. Our approach is successful because it reduces the size of seismic data volumes without loss of information. As a result, we end up with a new technology where the costs of acquisition and processing are no longer dictated by the size of the acquisition but by the transform-domain sparsity of the end-product.
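The abstract describes the idea at a conceptual level only. As a toy illustration (not the authors' implementation), the combination of randomized subsampling with transform-domain sparsity promotion can be sketched in Python: a signal that is sparse in an orthonormal DCT basis is observed at a random subset of its Nyquist-rate samples, and the transform coefficients are recovered by iterative soft thresholding (ISTA). The basis choice, sample counts, threshold value, and the helper `soft_threshold` are all our own assumptions for this sketch.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# --- Test signal that is sparse in an orthonormal DCT-II basis (assumed model) ---
n, k = 256, 5                                  # signal length, number of active modes
t = np.arange(n)
C = np.sqrt(2.0 / n) * np.cos(np.pi * np.outer(t, t + 0.5) / n)
C[0] /= np.sqrt(2.0)                           # row 0 scaling makes C orthonormal: C @ C.T = I

coeffs = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
coeffs[support] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 2.0, size=k)
signal = C.T @ coeffs                          # time-domain signal, k-sparse in the DCT domain

# --- Deliberate randomized subsampling: keep ~30% of the Nyquist-rate samples ---
m = 80
idx = np.sort(rng.choice(n, size=m, replace=False))
y = signal[idx]                                # the only data we "acquire"
A = C.T[idx, :]                                # measurement matrix: restricted inverse DCT

# --- Sparsity-promoting recovery via ISTA (one standard l1 solver; a choice, not the paper's) ---
def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

lam = 0.005                                    # small l1 penalty weight
c_hat = np.zeros(n)
for _ in range(3000):
    # step size 1 is safe: A's rows come from an orthonormal matrix, so ||A||_2 <= 1
    c_hat = soft_threshold(c_hat + A.T @ (y - A @ c_hat), lam)

rel_err = np.linalg.norm(c_hat - coeffs) / np.linalg.norm(coeffs)
recovered = C.T @ c_hat                        # reconstructed full-length signal
```

Despite acquiring well under half the samples, the sparse coefficient vector (and hence the full signal) is recovered to small relative error, which is the sense in which the data volume shrinks "without loss of information."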

Citation Key: herrmann2010EAGErss