ESAIM: PS
Volume 10, September 2006
Page(s) 164–183
DOI https://doi.org/10.1051/ps:2006004
Published online 09 March 2006
  1. F. Abramovich, Y. Benjamini, D. Donoho and I.M. Johnstone, Adapting to unknown sparsity by controlling the false discovery rate. Technical Report 2000-19, Department of Statistics, Stanford University (2000).
  2. H. Akaike, Information theory and an extension of the maximum likelihood principle, in 2nd International Symposium on Information Theory, B.N. Petrov and F. Csaki Eds., Akadémiai Kiadó, Budapest (1973) 267–281.
  3. H. Akaike, A Bayesian analysis of the minimum AIC procedure. Ann. Inst. Statist. Math. 30 (1978) 9–14.
  4. A. Antoniadis, I. Gijbels and G. Grégoire, Model selection using wavelet decomposition and applications. Biometrika 84 (1997) 751–763.
  5. Y. Baraud, S. Huet and B. Laurent, Adaptive tests of qualitative hypotheses. ESAIM: PS 7 (2003) 147–159.
  6. A. Barron, L. Birgé and P. Massart, Risk bounds for model selection via penalization. Probab. Theory Rel. Fields 113 (1999) 301–413.
  7. Y. Benjamini and Y. Hochberg, Controlling the false discovery rate: a practical and powerful approach to multiple testing. J. R. Statist. Soc. B 57 (1995) 289–300.
  8. L. Birgé and P. Massart, Gaussian model selection. J. Eur. Math. Soc. (JEMS) 3 (2001) 203–268.
  9. L. Birgé and P. Massart, A generalized Cp criterion for Gaussian model selection. Technical report, Univ. Paris 6, Paris 7, Paris (2001).
  10. B.S. Cirel'son, I.A. Ibragimov and V.N. Sudakov, Norms of Gaussian sample functions, in Proceedings of the 3rd Japan-USSR Symposium on Probability Theory, Springer-Verlag, Berlin. Springer Lect. Notes Math. 550 (1976) 20–41.
  11. H.A. David, Order Statistics. Wiley Series in Probability and Mathematical Statistics. John Wiley and Sons, NY (1981).
  12. G.E.P. Box and R.D. Meyer, An analysis for unreplicated fractional factorials. Technometrics 28 (1986) 11–18.
  13. D.P. Foster and R.A. Stine, Adaptive variable selection competes with Bayes experts. Technical report, The Wharton School of the University of Pennsylvania, Philadelphia (2002).
  14. S. Huet, Comparison of methods for estimating the non zero components of a Gaussian vector. Technical report, INRA, MIA-Jouy, www.inra.fr/miaj/apps/cgi-bin/raptech.cgi (2005).
  15. C.M. Hurvich and C.-L. Tsai, Regression and time series model selection in small samples. Biometrika 76 (1989) 297–307.
  16. I.M. Johnstone and B.W. Silverman, Empirical Bayes selection of wavelet thresholds. Available from www.stats.ox.ac.uk/~silverma/papers.html (2003).
  17. B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection. Ann. Statist. 28 (2000) 1302–1338.
  18. R. Nishii, Maximum likelihood principle and model selection when the true model is unspecified. J. Multivariate Anal. 27 (1988) 392–403.
  19. P.D. Haaland and M.A. O'Connell, Inference for effect-saturated fractional factorials. Technometrics 37 (1995) 82–93.
  20. J. Rissanen, Universal coding, information, prediction and estimation. IEEE Trans. Inform. Theory 30 (1984) 629–636.
  21. R.V. Lenth, Quick and easy analysis of unreplicated factorials. Technometrics 31 (1989) 469–473.
  22. G. Schwarz, Estimating the dimension of a model. Ann. Statist. 6 (1978) 461–464.
