Sparse recovery by non-convex optimization - instance optimality
Title | Sparse recovery by non-convex optimization - instance optimality |
Publication Type | Journal Article |
Year of Publication | 2010 |
Authors | Rayan Saab, Ozgur Yilmaz |
Journal | Applied and Computational Harmonic Analysis |
Volume | 29 |
Pagination | 30-48 |
Month | 07 |
Keywords | Compressive Sensing, non-convex |
Abstract | In this note, we address the theoretical properties of Δp, a class of compressed sensing decoders that rely on ℓp minimization with p ∈ (0,1) to recover estimates of sparse and compressible signals from incomplete and inaccurate measurements. In particular, we extend the results of Candès, Romberg and Tao [3] and Wojtaszczyk [30] regarding the decoder Δ1, based on ℓ1 minimization, to Δp with p ∈ (0,1). Our results are twofold. First, we show that under certain sufficient conditions, which are weaker than the analogous sufficient conditions for Δ1, the decoders Δp are robust to noise and stable in the sense that they are (2,p) instance optimal. Second, we extend the results of Wojtaszczyk to show that, like Δ1, the decoders Δp are (2,2) instance optimal in probability provided the measurement matrix is drawn from an appropriate distribution. While the extension of the results of [3] to the setting where p ∈ (0,1) is straightforward, the extension of the instance optimality in probability result of [30] is non-trivial. In particular, we need to prove that the LQ1 property, introduced in [30] and shown to hold for Gaussian matrices and matrices whose columns are drawn uniformly from the sphere, generalizes to an LQp property for the same classes of matrices. Our proof is based on a result by Gordon and Kalton [18] about the Banach–Mazur distances of p-convex bodies to their convex hulls. |
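Note | The abstract concerns decoders Δp that solve min ‖x‖_p subject to ‖Ax − b‖₂ ≤ ε with p ∈ (0,1). As a minimal illustrative sketch (not the paper's construction), the non-convex ℓp objective is often approximated in practice by iteratively reweighted least squares (IRLS) on a penalized surrogate; the function name, the penalty weight lam, the smoothing schedule, and the stopping rule below are all illustrative assumptions, not taken from Saab and Yılmaz. |

```python
# Hedged sketch: IRLS approximation to an l_p (0 < p < 1) decoder.
# Solves a penalized surrogate of  min ||x||_p^p  s.t.  ||A x - b||_2 <= eps.
import numpy as np

def irls_lp_decoder(A, b, p=0.5, lam=1e-3, eps_reg=1.0, n_iter=50):
    """Approximately minimize ||x||_p^p + (1/(2*lam)) * ||A x - b||_2^2."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # least-squares initialization
    for _ in range(n_iter):
        # Weights w_i = (x_i^2 + eps_reg)^(p/2 - 1) smooth the non-convex l_p penalty.
        w = (x**2 + eps_reg) ** (p / 2.0 - 1.0)
        # Solve the weighted ridge system (A^T A + lam * diag(w)) x = A^T b.
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
        eps_reg = max(eps_reg * 0.7, 1e-10)           # anneal the smoothing parameter
    return x

# Usage on a synthetic sparse recovery instance.
rng = np.random.default_rng(0)
n, m, k = 200, 60, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true + 1e-4 * rng.standard_normal(m)
x_hat = irls_lp_decoder(A, b, p=0.5)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```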
URL | http://www.sciencedirect.com/science/article/pii/S1063520309000864 |
DOI | 10.1016/j.acha.2009.08.002 |
Citation Key | saab2008ACHAsrb |