Slope heuristics: overview and implementation. Model selection is a general paradigm that encompasses many statistical problems. One of the most fruitful and popular approaches to carrying it out is the minimization of a penalized criterion. L. Birgé and P. Massart [Probab. Theory Relat. Fields 138, No. 1–2, 33–73 (2007; Zbl 1112.62082)] have proposed a promising data-driven method, the "slope heuristics", to calibrate such criteria when the penalty is known up to a multiplicative factor. Theoretical works validate this heuristic method in some situations, and several papers report good practical behavior in various frameworks. The purpose of this work is twofold. First, an introduction to the slope heuristics and an overview of the theoretical and practical results about it are presented. Second, we focus on the practical difficulties that arise when applying the slope heuristics. A new practical approach is proposed and compared to the standard dimension jump method. All the practical solutions discussed in this paper in different frameworks are implemented and brought together in a Matlab graphical user interface called CAPUSHE. Supplemental Materials containing further information and an additional application, the CAPUSHE package, and the datasets presented in this paper are available on the journal Web site.
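The dimension jump method mentioned above can be sketched as follows. This is a minimal illustration of the idea, not the CAPUSHE implementation: for each candidate constant kappa on a grid, select the model minimizing the penalized contrast, locate the kappa at which the selected dimension drops most sharply, and retain the model selected with twice that constant. The function name, the grid, and the toy inputs are assumptions for illustration.

```python
import numpy as np

def dimension_jump(dims, contrasts, kappas=None):
    """Calibrate the penalty constant by the dimension jump method.

    dims      : dimensions D_m of the candidate models
    contrasts : minimum empirical contrasts gamma_n(m)
    Returns (kappa_hat, index of the finally selected model).
    """
    dims = np.asarray(dims, dtype=float)
    contrasts = np.asarray(contrasts, dtype=float)
    if kappas is None:
        # Grid of candidate multiplicative constants (an arbitrary choice)
        kappas = np.linspace(0.0, 2.0, 2001)
    # Dimension of the model selected for each candidate kappa
    selected_dims = np.array(
        [dims[np.argmin(contrasts + k * dims)] for k in kappas]
    )
    # kappa_hat: location of the largest drop in the selected dimension
    jumps = selected_dims[:-1] - selected_dims[1:]
    kappa_hat = kappas[np.argmax(jumps) + 1]
    # Slope heuristics: the final penalty uses twice the calibrated constant
    m_hat = int(np.argmin(contrasts + 2.0 * kappa_hat * dims))
    return kappa_hat, m_hat
```

On a synthetic contrast curve whose variance term is linear in the dimension, the selected dimension stays at its maximum until kappa exceeds the minimal penalty constant and then drops abruptly, which is the jump the method detects.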

References in zbMATH (referenced in 37 articles, 1 standard article)

Showing results 1 to 20 of 37, sorted by year (citations).


  1. Lehéricy, Luc: Consistent order estimation for nonparametric hidden Markov models (2019)
  2. Brault, Vincent; Ouadah, Sarah; Sansonnet, Laure; Lévy-Leduc, Céline: Nonparametric multiple change-point estimation for analyzing large Hi-C data matrices (2018)
  3. Comte, Fabienne; Samson, Adeline; Stirnemann, Julien J.: Hazard estimation with censoring and measurement error: application to length of pregnancy (2018)
  4. Comte, F.; Duval, C.: Statistical inference for renewal processes (2018)
  5. Devijver, Emilie; Gallopin, Mélina: Block-diagonal covariance selection for high-dimensional Gaussian graphical models (2018)
  6. Fop, Michael; Murphy, Thomas Brendan: Variable selection methods for model-based clustering (2018)
  7. Garreau, Damien; Arlot, Sylvain: Consistent change-point detection with kernels (2018)
  8. Gassiat, Elisabeth; Rousseau, Judith; Vernet, Elodie: Efficient semiparametric estimation and model selection for multidimensional mixtures (2018)
  9. Lehéricy, Luc: State-by-state minimax adaptive estimation for nonparametric hidden Markov models (2018)
  10. Li, Le; Guedj, Benjamin; Loustau, Sébastien: A quasi-Bayesian perspective to online clustering (2018)
  11. Chagny, G.; Comte, F.; Roche, A.: Adaptive estimation of the hazard rate with multiplicative censoring (2017)
  12. Devijver, Emilie: Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model (2017)
  13. Lacour, Claire; Massart, Pascal; Rivoirard, Vincent: Estimator selection: a new method with applications to kernel density estimation (2017)
  14. Navarro, Fabien; Saumard, Adrien: Slope heuristics and V-fold model selection in heteroscedastic regression using strongly localized bases (2017)
  15. Brunel, Élodie; Mas, André; Roche, Angelina: Non-asymptotic adaptive prediction in functional linear models (2016)
  16. Comte, Fabienne; Rebafka, Tabea: Nonparametric weighted estimators for biased data (2016)
  17. Mabon, Gwennaëlle: Adaptive deconvolution of linear functionals on the nonnegative real line (2016)
  18. Baudry, Jean-Patrick: Estimation and model selection for model-based clustering with the conditional classification likelihood (2015)
  19. Baudry, Jean-Patrick; Celeux, Gilles: EM for mixtures (2015)
  20. Bouveyron, Charles; Côme, Etienne; Jacques, Julien: The discriminative functional mixture model for a comparative analysis of bike sharing systems (2015)
