Simply denoise: wavefield reconstruction via jittered undersampling
For regularly undersampled data along one or more spatial coordinates, i.e., data spatially sampled below the Nyquist rate, there exists a wide variety of wavefield reconstruction techniques. Filter-based methods interpolate by convolution with a filter designed such that the interpolation error is white noise. The most common of these filters are prediction-error filters (PEFs), which can handle aliased events (Spitz, 1991). Wavefield-operator-based methods represent another type of interpolation approach that explicitly includes wave propagation (Stolt, 2002; Canning and Gardner, 1996; Biondi et al., 1998). Finally, transform-based methods also provide efficient algorithms for seismic data regularization (Sacchi et al., 1998; Herrmann and Hennenfent, 2007; Zwartjes and Sacchi, 2007; Trad et al., 2003). However, for irregularly sampled data, e.g., binned data with some empty bins, or data that are continuously randomly undersampled, the performance of most of the aforementioned interpolation methods deteriorates.
The objective of this paper is to demonstrate that irregular/random
undersampling is not a drawback for particular transform-based
interpolation methods and for many other advanced processing
algorithms as was already observed by other authors
(Trad and Ulrych, 1999; Xu et al., 2005; Zhou and Schuster, 1995; Abma and Kabir, 2006; Zwartjes and Sacchi, 2007; Sun et al., 1997).
We explain why random undersampling is an advantage and how it can be
used to our benefit when designing coarse sampling schemes. To keep
the discussion as clear and concise as possible, we focus on regular
sampling with randomly missing data points, i.e., discrete random
(under)sampling. Our conclusions extend to continuous random
undersampling, though. Unless otherwise specified, the term random is
used in the remainder of the text in the discrete sense.
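The advantage of random over regular undersampling can be illustrated with a small numerical experiment (not from the paper; signal, grid size, and undersampling factor are chosen for illustration only): regularly discarding samples of a single harmonic folds its energy into a few coherent aliases of the same amplitude as the true peak, whereas discarding the same number of samples at random positions on the grid spreads that energy into weak, noise-like spectral leakage that a denoising-type procedure can suppress.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)
signal = np.cos(2 * np.pi * 60 * t / n)  # single harmonic at bin 60

# Regular undersampling: keep every 4th sample, zero-fill the rest.
regular = np.zeros(n)
regular[::4] = signal[::4]

# Discrete random undersampling: keep the same number of samples,
# but at randomly chosen positions on the regular grid.
keep = rng.choice(n, size=n // 4, replace=False)
random_u = np.zeros(n)
random_u[keep] = signal[keep]

spec_reg = np.abs(np.fft.rfft(regular))
spec_rnd = np.abs(np.fft.rfft(random_u))

# Compare the strongest spurious spectral component to the true peak.
true_bin = 60
alias_reg = np.delete(spec_reg, true_bin).max()
alias_rnd = np.delete(spec_rnd, true_bin).max()

# Regular undersampling: coherent aliases as strong as the true peak.
print(alias_reg / spec_reg[true_bin])
# Random undersampling: incoherent, much weaker leakage.
print(alias_rnd / spec_rnd[true_bin])
```

The first ratio is essentially 1 (the aliases are exact, coherent copies of the signal peak), while the second is well below it, which is precisely why random undersampling turns interpolation into a denoising problem for transform-based methods.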