EigenPrism: inference for high dimensional signal-to-noise ratios

Consider the following three important problems in statistical inference: constructing confidence intervals for the error of a high-dimensional (p > n) regression estimator, for the linear regression noise level, and for the genetic signal-to-noise ratio of a continuous-valued trait (related to the heritability). All three problems turn out to be closely related to the little-studied problem of performing inference on the ℓ₂-norm of the signal in high-dimensional linear regression. We derive a novel procedure for this, which is asymptotically correct when the covariates are multivariate Gaussian and produces valid confidence intervals in finite samples as well. The procedure, called EigenPrism, is computationally fast and makes no assumptions on coefficient sparsity or knowledge of the noise level. We investigate the width of the EigenPrism confidence intervals, including a comparison with a Bayesian setting in which our interval is just 5% wider than the Bayes credible interval. We are then able to unify the three aforementioned problems by showing that EigenPrism, with only minor modifications, can make important contributions to all three. We also investigate the robustness of coverage and find that the method applies in practice and in finite samples much more widely than just the case of multivariate Gaussian covariates. Finally, we apply EigenPrism to a genetic data set to estimate the genetic signal-to-noise ratio for a number of continuous phenotypes.
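To make the estimand concrete, the sketch below simulates the high-dimensional linear model y = Xβ + ε with i.i.d. standard Gaussian covariates and forms a simple method-of-moments point estimate of the signal strength ‖β‖², the noise level σ², and their ratio (the signal-to-noise ratio). This is only an illustration of the quantity the abstract discusses, not the EigenPrism procedure itself; the estimator uses the moment identities E‖y‖² = n(‖β‖² + σ²) and E‖Xᵀy‖² = n(n+p+1)‖β‖² + npσ², which hold for i.i.d. N(0,1) design entries. All function and variable names are illustrative choices, not from the paper.

```python
import numpy as np

def moment_snr_estimate(X, y):
    """Method-of-moments estimates of the signal strength ||beta||^2,
    the noise level sigma^2, and the SNR ||beta||^2 / sigma^2,
    assuming the entries of X are i.i.d. standard Gaussian.
    A simple sketch of the inferential target -- NOT EigenPrism."""
    n, p = X.shape
    A = np.sum(y ** 2) / n           # estimates ||beta||^2 + sigma^2
    B = np.sum((X.T @ y) ** 2) / n   # estimates (n+p+1)||beta||^2 + p*sigma^2
    tau2 = (B - p * A) / (n + 1)     # solve the two moment equations
    sigma2 = A - tau2
    return tau2, sigma2, tau2 / sigma2

# Simulated high-dimensional regime (p > n), as in the abstract.
rng = np.random.default_rng(0)
n, p = 500, 1000
beta = rng.normal(size=p)
beta *= np.sqrt(2.0) / np.linalg.norm(beta)  # true ||beta||^2 = 2
X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(size=n)            # true sigma^2 = 1, SNR = 2
tau2, sigma2, snr = moment_snr_estimate(X, y)
```

Unlike this point estimator, EigenPrism additionally delivers finite-sample-valid confidence intervals for these quantities without assuming sparsity or a known noise level.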

References in zbMATH (referenced in 15 articles, 1 standard article)


  1. Bradic, Jelena; Fan, Jianqing; Zhu, Yinchu: Testability of high-dimensional linear models with nonsparse structures (2022)
  2. Gamarnik, David; Zadik, Ilias: Sparse high-dimensional linear regression. Estimating squared error and a phase transition (2022)
  3. Comminges, L.; Collier, O.; Ndaoud, M.; Tsybakov, A. B.: Adaptive robust estimation in sparse vector model (2021)
  4. Law, Michael; Ritov, Ya’acov: Inference without compatibility: using exponential weighting for inference on a parameter of a linear model (2021)
  5. Wang, Rui; Xu, Xingzhong: A Bayesian-motivated test for high-dimensional linear regression models with fixed design matrix (2021)
  6. Azriel, David; Schwartzman, Armin: Estimation of linear projections of non-sparse coefficients in high-dimensional regression (2020)
  7. Barber, Rina Foygel; Candès, Emmanuel J.: A knockoff filter for high-dimensional selective inference (2019)
  8. Guo, Zijian; Wang, Wanjie; Cai, T. Tony; Li, Hongzhe: Optimal estimation of genetic relatedness in high-dimensional linear models (2019)
  9. Zhao, Qingyuan: Covariate balancing propensity score by tailored loss functions (2019)
  10. Cai, T. Tony; Guo, Zijian: Accuracy assessment for high-dimensional linear regression (2018)
  11. Holmes, Susan: Statistical proof? The problem of irreproducibility (2018)
  12. Javanmard, Adel; Montanari, Andrea: Debiasing the Lasso: optimal sample size for Gaussian designs (2018)
  13. Verzelen, Nicolas; Gassiat, Elisabeth: Adaptive estimation of high-dimensional signal-to-noise ratios (2018)
  14. Zhu, Yinchu; Bradic, Jelena: Significance testing in non-sparse high-dimensional linear models (2018)
  15. Janson, Lucas; Barber, Rina Foygel; Candès, Emmanuel: EigenPrism: inference for high dimensional signal-to-noise ratios (2017)