Free Access
Volume 23, 2019
Pages 430–463
Published online 07 August 2019
  1. M. Abramowitz and I. Stegun, Handbook of Mathematical Functions. National Bureau of Standards, Washington DC (1965)
  2. R. Adamczak, A. E. Litvak, A. Pajor and N. Tomczak-Jaegermann, Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling. Constr. Approx. 34 (2011) 61–88
  3. D. Amelunxen, M. Lotz, M. McCoy and J. A. Tropp, Living on the edge: phase transitions in convex programs with random data. Inf. Inference 3 (2014) 224–294
  4. U. Ayaz, S. Dirksen and H. Rauhut, Uniform recovery of fusion frame structured sparse signals. Appl. Comput. Harmon. Anal. 41 (2016) 341–361
  5. J.-M. Azaïs, Y. De Castro and S. Mourareau, A Rice method proof of the null-space property over the Grassmannian. Ann. Inst. Henri Poincaré - Probab. Stat. 53 (2017) 1821–1838
  6. B. Bah and J. Tanner, Improved bounds on restricted isometry constants for Gaussian matrices. SIAM J. Matrix Anal. Appl. 31 (2010) 2882–2898
  7. B. Bah and J. Tanner, Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices. Linear Algebra Appl. 441 (2014) 88–109
  8. R. Baraniuk, M. Davenport, R. DeVore and M. Wakin, A simple proof of the restricted isometry property for random matrices. Constr. Approx. 28 (2008) 253–263
  9. K. Bertin, E. Le Pennec and V. Rivoirard, Adaptive Dantzig density estimation. Ann. Inst. Henri Poincaré - Probab. Stat. 47 (2011) 43–74
  10. P. J. Bickel, Y. Ritov and A. B. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector. Ann. Stat. 37 (2009) 1705–1732
  11. J. D. Blanchard, C. Cartis and J. Tanner, Compressed sensing: how sharp is the restricted isometry property? SIAM Rev. 53 (2011) 105–125
  12. A. Borodin and P. Forrester, Increasing subsequences and the hard-to-soft edge transition in matrix ensembles. J. Phys. A 36 (2003) 2963–2981
  13. T. T. Cai and A. Zhang, Sparse representation of a polytope and recovery of sparse signals and low-rank matrices. IEEE Trans. Inf. Theory 60 (2014) 122–132
  14. E. J. Candès, J. Romberg and T. Tao, Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inf. Theory 52 (2006) 489–509
  15. E. J. Candès and T. Tao, Decoding by linear programming. IEEE Trans. Inf. Theory 51 (2005) 4203–4215
  16. E. J. Candès and T. Tao, Near-optimal signal recovery from random projections: universal encoding strategies? IEEE Trans. Inf. Theory 52 (2006) 5406–5425
  17. E. J. Candès, The restricted isometry property and its implications for compressed sensing. C. R. Math. Acad. Sci. Paris 346 (2008) 589–592
  18. Y. De Castro, A remark on the Lasso and the Dantzig selector. Stat. Probab. Lett. 83 (2013) 304–314
  19. D. Chafaï, O. Guédon, G. Lecué and A. Pajor, Interaction Between Compressed Sensing, Random Matrices and High Dimensional Geometry. No. 37 of Panoramas et Synthèses. Société Mathématique de France (2012)
  20. K. R. Davidson and S. J. Szarek, Local operator theory, random matrices and Banach spaces. In Handbook of the Geometry of Banach Spaces, vol. 1. Elsevier, Amsterdam (2001) 317–366
  21. S. Dirksen, Tail bounds via generic chaining. Electron. J. Probab. 20 (2015) 53
  22. D. L. Donoho and J. Tanner, Neighborliness of randomly projected simplices in high dimensions. Proc. Nat. Acad. Sci. USA 102 (2005) 9452–9457
  23. D. L. Donoho and J. Tanner, Counting faces of randomly projected polytopes when the projection radically lowers dimension. J. Am. Math. Soc. 22 (2009) 1–53
  24. D. L. Donoho and J. Tanner, Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 367 (2009) 4273–4293
  25. O. N. Feldheim and S. Sodin, A universality result for the smallest eigenvalues of certain sample covariance matrices. Geom. Funct. Anal. 20 (2010) 88–123
  26. S. Foucart and M.-J. Lai, Sparsest solutions of underdetermined linear systems via ℓq-minimization for 0 < q ≤ 1. Appl. Comput. Harmon. Anal. 26 (2009) 395–407
  27. S. Foucart and H. Rauhut, A Mathematical Introduction to Compressive Sensing. Springer (2013)
  28. K. Johansson, Shape fluctuations and random matrices. Commun. Math. Phys. 209 (2000) 437–476
  29. A. Juditsky and A. Nemirovski, Accuracy guarantees for ℓ1-recovery. IEEE Trans. Inf. Theory 57 (2011) 7818–7839
  30. G. Lecué and S. Mendelson, Sparse recovery under weak moment assumptions. J. Eur. Math. Soc. 19 (2017) 881–904
  31. G. Lecué and S. Mendelson, Regularization and the small-ball method I: sparse recovery. Ann. Stat. 46 (2018) 611–641
  32. M. Ledoux and B. Rider, Small deviations for beta ensembles. Electron. J. Probab. 15 (2010) 1319–1343
  33. M. Ledoux, Deviation inequalities on largest eigenvalues. In Geometric Aspects of Functional Analysis, vol. 1910 of Lecture Notes in Mathematics. Springer, Berlin (2007) 167–219
  34. C. Liaw, A. Mehrabian, Y. Plan and R. Vershynin, A simple tool for bounding the deviation of random matrices on geometric sets. In Geometric Aspects of Functional Analysis. Springer (2017) 277–299
  35. A. E. Litvak, A. Pajor, M. Rudelson and N. Tomczak-Jaegermann, Smallest singular value of random matrices and geometry of random polytopes. Adv. Math. 195 (2005) 491–523
  36. M. B. McCoy and J. A. Tropp, Sharp recovery bounds for convex demixing, with applications. Found. Comput. Math. 14 (2014) 503–567
  37. S. Mendelson, A. Pajor and N. Tomczak-Jaegermann, Uniform uncertainty principle for Bernoulli and subgaussian ensembles. Constr. Approx. 28 (2008) 277–289
  38. S. Péché, Universality results for the largest eigenvalues of some sample covariance matrix ensembles. Probab. Theory Relat. Fields 143 (2009) 481–516
  39. N. Pillai and J. Yin, Universality of covariance matrices. Ann. Appl. Probab. 24 (2014) 935–1001
  40. M. Rudelson and R. Vershynin, On sparse reconstruction from Fourier and Gaussian measurements. Commun. Pure Appl. Math. 61 (2008) 1025–1045
  41. M. Rudelson and R. Vershynin, Non-asymptotic theory of random matrices: extreme singular values. In Proceedings of the International Congress of Mathematicians, vol. III. Hindustan Book Agency, New Delhi (2010) 1576–1602
  42. A. Soshnikov, A note on universality of the distribution of the largest eigenvalues in certain sample covariance matrices. J. Stat. Phys. 108 (2002) 1033–1056
  43. Y. Traonmilin and R. Gribonval, Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all. Appl. Comput. Harmon. Anal. 45 (2016) 170–205
  44. S. A. van de Geer and P. Bühlmann, On the conditions used to prove oracle results for the Lasso. Electron. J. Stat. 3 (2009) 1360–1392
  45. K. Wang, Random covariance matrices: universality of local statistics of eigenvalues up to the edge. Random Matrices Theory Appl. 1 (2012) 1150005
