The EbayesThresh library is a collection of MATLAB™ scripts that complements the papers "Needles and straw in haystacks: Empirical Bayes approaches to thresholding a possibly sparse sequence" and "Empirical Bayes selection of wavelet thresholds" by Iain M. Johnstone and Bernard W. Silverman, submitted for publication in 2002. A paper giving a general description of the software, together with details of both the general methodology and some specific technical matters, is available here. The scripts in this library are a translation of the corresponding R package and S-PLUS library.

The ebayesthresh_wavelet function applies the approach to wavelet transforms obtained with the WaveLab MATLAB toolbox developed at Stanford by Buckheit, Chen, Donoho, Johnstone & Scargle (1995). If wavelet transforms are obtained using other software, the routine will not be directly applicable, but it should still serve as a model for users writing their own wavelet smoothing routine based on the function ebayesthresh.

The software may be downloaded and used freely for academic purposes, provided its use is acknowledged. Commercial use is not allowed without the permission of the authors. Please bring any problems or errors to the authors' attention. The entire MATLAB source code, in compressed zip form, is available for download from: ..
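To illustrate the empirical Bayes thresholding idea that ebayesthresh implements, here is a minimal, self-contained Python sketch. It is not the library's algorithm: EbayesThresh uses a heavy-tailed (Laplace or quasi-Cauchy) slab, whereas this sketch substitutes a Gaussian slab to keep the code short, and the function names `estimate_weight` and `posterior_median` are hypothetical, not part of the library. The two steps are the same in spirit, though: estimate the mixing weight of a spike-and-slab prior by marginal maximum likelihood, then replace each observation by its posterior median, which maps small observations exactly to zero.

```python
import math

SQRT2 = math.sqrt(2.0)

def norm_pdf(x, sd=1.0):
    """Density of N(0, sd^2) at x."""
    return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / SQRT2))

def estimate_weight(xs, tau=2.0):
    """Grid-search marginal MLE of the weight w in the prior
    w * N(0, tau^2) + (1 - w) * delta_0, for data x_i ~ N(theta_i, 1).
    (Gaussian slab: a simplification of the heavy-tailed priors
    actually used by EbayesThresh.)"""
    sd1 = math.sqrt(1.0 + tau * tau)      # marginal sd when theta != 0
    best_w, best_ll = 0.5, float("-inf")
    for i in range(1, 100):
        w = i / 100.0
        ll = sum(math.log(w * norm_pdf(x, sd1) + (1.0 - w) * norm_pdf(x))
                 for x in xs)
        if ll > best_ll:
            best_w, best_ll = w, ll
    return best_w

def posterior_median(x, w, tau=2.0):
    """Posterior median of theta given x ~ N(theta, 1) under the
    spike-and-slab prior above.  Small |x| are mapped exactly to 0;
    this thresholding property is the point of the method."""
    sd1 = math.sqrt(1.0 + tau * tau)
    shrink = tau * tau / (1.0 + tau * tau)
    mu = shrink * x                        # posterior mean given theta != 0
    s = math.sqrt(shrink)                  # posterior sd given theta != 0
    f = w * norm_pdf(x, sd1) + (1.0 - w) * norm_pdf(x)
    p = w * norm_pdf(x, sd1) / f           # posterior P(theta != 0)

    def cdf(m):  # posterior CDF: continuous part plus an atom at 0
        return p * norm_cdf((m - mu) / s) + (1.0 - p) * (1.0 if m >= 0 else 0.0)

    lo, hi = mu - 10.0, mu + 10.0          # bracket the median, then bisect
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if cdf(mid) >= 0.5:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

In a wavelet setting, a routine in the spirit of ebayesthresh_wavelet would apply this level by level: estimate w from the coefficients at each resolution level, shrink them via the posterior median, and invert the transform.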

References in zbMATH (referenced in 112 articles)

Showing results 1 to 20 of 112, sorted by year (citations).


  1. Amato, Umberto; Antoniadis, Anestis; De Feis, Italia: Flexible, boundary adapted, nonparametric methods for the estimation of univariate piecewise-smooth functions (2020)
  2. Belitser, Eduard; Nurushev, Nurzhan: Needles and straw in a haystack: robust confidence for possibly sparse sequences (2020)
  3. Castillo, Ismaël; Szabó, Botond: Spike and slab empirical Bayes sparse credible sets (2020)
  4. Dattner, Itai; Ship, Harold; Voit, Eberhard O.: Separable nonlinear least-squares parameter estimation for complex dynamic systems (2020)
  5. Fischer, Aurélie; Picard, Dominique: On change-point estimation under Sobolev sparsity (2020)
  6. Piironen, Juho; Paasiniemi, Markus; Vehtari, Aki: Projective inference in high-dimensional problems: prediction and feature selection (2020)
  7. Bai, Ray; Ghosh, Malay: Large-scale multiple hypothesis testing with the normal-beta prime prior (2019)
  8. Bhadra, Anindya; Datta, Jyotishka; Polson, Nicholas G.; Willard, Brandon: Lasso meets horseshoe: a survey (2019)
  9. De Wiel, Mark A. van; Te Beest, Dennis E.; Münch, Magnus M.: Learning from a lot: empirical Bayes for high-dimensional model-based prediction (2019)
  10. van der Vaart, Aad: Comment: “Bayes, oracle Bayes and empirical Bayes” (2019)
  11. Zhu, Li; Huo, Zhiguang; Ma, Tianzhou; Oesterreich, Steffi; Tseng, George C.: Bayesian indicator variable selection to incorporate hierarchical overlapping group structure in multi-omics applications (2019)
  12. Brown, Lawrence D.; Mukherjee, Gourab; Weinstein, Asaf: Empirical Bayes estimates for a two-way cross-classified model (2018)
  13. Castillo, Ismaël; Mismer, Romain: Empirical Bayes analysis of spike and slab posterior distributions (2018)
  14. Faulkner, James R.; Minin, Vladimir N.: Locally adaptive smoothing with Markov random fields and shrinkage priors (2018)
  15. Ma, Li; Soriano, Jacopo: Efficient functional ANOVA through wavelet-domain Markov groves (2018)
  16. Park, Junyong: Simultaneous estimation based on empirical likelihood and general maximum likelihood estimation (2018)
  17. Ročková, Veronika: Bayesian estimation of sparse signals with a continuous spike-and-slab prior (2018)
  18. Ročková, Veronika; George, Edward I.: The spike-and-slab LASSO (2018)
  19. Belitser, Eduard; Nurushev, Nurzhan: Local posterior concentration rate for multilevel sparse sequences (2017)
  20. Fenga, Livio; Politis, Dimitris N.: LASSO order selection for sparse autoregression: a bootstrap approach (2017)
