zbMATH — the first resource for mathematics

On verifiable sufficient conditions for sparse signal recovery via ℓ1 minimization. (English) Zbl 1211.90333
Summary: We discuss necessary and sufficient conditions for a sensing matrix to be “s-good”, that is, to allow exact ℓ1-recovery of sparse signals with s nonzero entries when no measurement noise is present. We then express the error bounds for imperfect ℓ1-recovery (nonzero measurement noise, a nearly s-sparse signal, a near-optimal solution of the optimization problem yielding the ℓ1-recovery) in terms of the characteristics underlying these conditions. Further, we demonstrate (and this is the principal result of the paper) that these characteristics, although difficult to evaluate, lead to verifiable sufficient conditions for exact sparse ℓ1-recovery and to efficiently computable upper bounds on those s for which a given sensing matrix is s-good. We also establish instructive links between our approach and basic concepts of compressed sensing theory, such as the Restricted Isometry and Restricted Eigenvalue properties.
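The ℓ1-recovery discussed in the summary is the basis-pursuit linear program min ‖x‖1 subject to Ax = y, and the classical mutual-coherence test is one well-known example of an efficiently computable sufficient condition for s-goodness (the paper's verifiable conditions are designed to be much stronger). A minimal sketch, assuming NumPy and SciPy are available; the function names are illustrative and not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog


def l1_recover(A, y):
    """Basis pursuit: min ||x||_1 subject to A x = y, recast as an LP
    in variables (x, t) with |x_i| <= t_i."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # minimize sum(t)
    I = np.eye(n)
    # |x_i| <= t_i  <=>  x - t <= 0  and  -x - t <= 0
    A_ub = np.block([[I, -I], [-I, -I]])
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])
    bounds = [(None, None)] * n + [(0, None)] * n   # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y, bounds=bounds)
    return res.x[:n]


def coherence_s_bound(A):
    """Classical coherence certificate (not the paper's conditions):
    with unit-norm columns and mutual coherence mu = max_{i != j} |a_i' a_j|,
    A is s-good whenever s < (1 + 1/mu) / 2."""
    An = A / np.linalg.norm(A, axis=0)              # normalize columns
    G = np.abs(An.T @ An)
    np.fill_diagonal(G, 0.0)
    mu = G.max()
    return int(np.floor((1.0 + 1.0 / mu) / 2.0))
```

With, say, a 30×60 Gaussian sensing matrix and a 3-sparse signal, `l1_recover` typically returns the signal exactly, while `coherence_s_bound` yields the usually very conservative coherence certificate that verifiable conditions of the kind studied in the paper aim to improve upon.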
MSC:
90C90 Applications of mathematical programming
90C05 Linear programming
65K05 Mathematical programming (numerical methods)
Software:
Mosek
References:
[1]Andersen E.D., Andersen K.D.: The MOSEK optimization tools manual. Version 5.0 http://www.mosek.com/fileadmin/products/5_0/tools/doc/html/tools/index.html
[2]Ben-Tal A., Nemirovski A.: Lectures on Modern Convex Optimization. SIAM, Philadelphia (2001)
[3]Bickel P.J.: Discussion of “The Dantzig selector: statistical estimation when p is much larger than n” by E. J. Candès and T. Tao. Ann. Stat. 35, 2352–2357 (2007)
[4]Bickel, P.J., Ritov, Ya., Tsybakov, A.: Simultaneous analysis of Lasso and Dantzig selector. Ann. Stat. (to appear 2008)
[5]Candès E. J., Tao T.: The Dantzig selector: statistical estimation when p is much larger than n. Ann. Stat. 35, 2313–2351 (2007) · Zbl 1139.62019 · doi:10.1214/009053606000001523
[6]Candès E. J., Tao T.: Decoding by linear programming. IEEE Trans. Inform. Theory 51, 4203–4215 (2005) · Zbl 1264.94121 · doi:10.1109/TIT.2005.858979
[7]Candès E. J., Tao T.: Near-optimal signal recovery from random projections and universal encoding strategies. IEEE Trans. Inform. Theory 52, 5406–5425 (2006) · Zbl 05455295 · doi:10.1109/TIT.2006.885507
[8]Candès E. J., Romberg J., Tao T.: Stable signal recovery from incomplete and inaccurate measurements. Comm. Pure Appl. Math. 59(8), 1207–1223 (2006) · Zbl 1098.94009 · doi:10.1002/cpa.20124
[9]Candès, E. J.: Compressive sampling. In: Sanz-Solé, M., Soria, J., Varona, J. L., Verdera, J. (eds.) International Congress of Mathematicians, Madrid 2006, vol. III, pp. 1437–1452. European Mathematical Society Publishing House (2006)
[10]Candès E. J.: The restricted isometry property and its implications for compressed sensing. Comptes Rendus de l’Acad. des Sci. Ser. I 346, 589–592 (2008)
[11]Cohen, A., Dahmen, W., DeVore, R.: Compressed Sensing and Best k-term Approximation. Preprint, http://www.math.sc.edu/devore/publications/CDDSensing_6.pdf (2006)
[12]d’Aspremont, A., El Ghaoui, L.: Testing the Nullspace Property using Semidefinite Programming. Preprint. http://arxiv.org/abs/0807.3520 (2008)
[13]DeVore, R.: Deterministic Constructions of Compressed Sensing Matrices. Preprint, Department of Mathematics, University of South Carolina (2007)
[14]Donoho D., Huo X.: Uncertainty principles and ideal atomic decomposition. IEEE Trans. Inf. Theory 47(7), 2845–2862 (2001) · Zbl 1019.94503 · doi:10.1109/18.959265
[15]Donoho, D.: High-dimensional centrally-symmetric polytopes with neighborliness proportional to dimension. Technical report, Department of Statistics, Stanford University (2004)
[16]Donoho, D.: Neighborly polytopes and sparse solutions of underdetermined linear equations. Technical report, Department of Statistics, Stanford University (2004)
[17]Donoho D., Elad M., Temlyakov V. N.: Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Trans. Inf. Theory 52, 6–18 (2006) · Zbl 05454300 · doi:10.1109/TIT.2005.860430
[18]Fuchs J.-J.: On sparse representations in arbitrary redundant bases. IEEE Trans. Inf. Theory 50, 1341–1344 (2004) · Zbl 05454053 · doi:10.1109/TIT.2004.828141
[19]Fuchs J.-J.: Recovery of exact sparse representations in the presence of bounded noise. IEEE Trans. Inf. Theory 51, 3601–3608 (2005) · Zbl 05454375 · doi:10.1109/TIT.2005.855614
[20]Nemirovski A.: Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J. Optim. 15, 229–251 (2004) · Zbl 1106.90059 · doi:10.1137/S1052623403425629
[21]Tibshirani R.: Regression shrinkage and selection via the Lasso. J. R. Stat. Soc. Ser. B 58, 267–288 (1996)
[22]Tropp J.A.: Just relax: convex programming methods for identifying sparse signals. IEEE Trans. Inf. Theory 52(3), 1030–1051 (2006) · Zbl 05455043 · doi:10.1109/TIT.2005.864420
[23]Zhang, Y.: A simple proof for recoverability of ℓ1-minimization: go over or under? Rice CAAM Department Technical Report TR05-09 (2005)