Volume 19, 2015
Pages 725–745
Published online 11 December 2015
