hierNet: A Lasso for Hierarchical Interactions. Fits sparse interaction models for continuous and binary responses subject to the strong (or weak) hierarchy restriction that an interaction between two variables is included only if both (or at least one) of the variables is included as a main effect. For more details, see Bien, J., Taylor, J., Tibshirani, R. (2013). "A Lasso for Hierarchical Interactions." Annals of Statistics 41(3), 1111-1141.
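hierNet itself is an R package, so the sketch below is only an illustration of the hierarchy restriction described above, not the package's API. Assuming fitted main-effect and interaction coefficients are available as plain dictionaries (hypothetical names throughout), it checks whether they satisfy strong or weak hierarchy:

```python
def satisfies_hierarchy(main, inter, strong=True, tol=1e-8):
    """Check the hierarchy restriction on a set of fitted coefficients.

    main   : dict mapping variable name -> main-effect coefficient
    inter  : dict mapping (var_i, var_j) -> interaction coefficient
    strong : True requires both main effects nonzero (strong hierarchy);
             False requires at least one (weak hierarchy)
    """
    for (i, j), theta in inter.items():
        if abs(theta) <= tol:
            continue  # absent interactions impose no constraint
        active_i = abs(main.get(i, 0.0)) > tol
        active_j = abs(main.get(j, 0.0)) > tol
        ok = (active_i and active_j) if strong else (active_i or active_j)
        if not ok:
            return False
    return True

# Example: the x1:x2 interaction is present, but only x1 has a main effect,
# so the fit satisfies weak hierarchy but violates strong hierarchy.
main = {"x1": 1.3, "x2": 0.0}
inter = {("x1", "x2"): 0.8}
```

In this example `satisfies_hierarchy(main, inter, strong=False)` is `True` while `satisfies_hierarchy(main, inter, strong=True)` is `False`, which is exactly the distinction between the weak and strong restrictions.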

References in zbMATH (referenced in 36 articles, 1 standard article)

Showing results 1 to 20 of 36.
Sorted by year (citations)


  1. Antonelli, Joseph; Mazumdar, Maitreyi; Bellinger, David; Christiani, David; Wright, Robert; Coull, Brent: Estimating the health effects of environmental mixtures using Bayesian semiparametric regression and sparsity inducing priors (2020)
  2. Hui, Francis K. C.; Müller, Samuel; Welsh, A. H.: The LASSO on latent indices for regression modeling with ordinal categorical predictors (2020)
  3. Shen, Sumin; Zhang, Zhiyang; Deng, Xinwei: On design and analysis of funnel testing experiments in webpage optimization (2020)
  4. Tang, Cheng Yong; Fang, Ethan X.; Dong, Yuexiao: High-dimensional interactions detection with sparse principal Hessian matrix (2020)
  5. Bhadra, Anindya; Datta, Jyotishka; Polson, Nicholas G.; Willard, Brandon: Lasso meets horseshoe: a survey (2019)
  6. Dong, Hongbo: On integer and MPCC representability of affine sparsity (2019)
  7. Dong, Hongbo; Ahn, Miju; Pang, Jong-Shi: Structural properties of affine sparsity constraints (2019)
  8. Lederer, Johannes; Yu, Lu; Gaynanova, Irina: Oracle inequalities for high-dimensional prediction (2019)
  9. Li, Yang; Liu, Jun S.: Robust variable and interaction selection for logistic regression and general index models (2019)
  10. Mak, Simon; Wu, C. F. Jeff: cmenet: A new method for bi-level variable selection of conditional main effects (2019)
  11. Sato, Toshiki; Takano, Yuichi; Nakahara, Takanobu: Investigating consumers’ store-choice behavior via hierarchical variable selection (2019)
  12. Tan, Kean Ming; Lu, Junwei; Zhang, Tong; Liu, Han: Layer-wise learning strategy for nonparametric tensor product smoothing spline regression and graphical models (2019)
  13. Tyagi, Hemant; Vybiral, Jan: Learning general sparse additive models from point queries in high dimensions (2019)
  14. Daniel, Jeffrey; Horrocks, Julie; Umphrey, Gary J.: Penalized composite likelihoods for inhomogeneous Gibbs point process models (2018)
  15. Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter: Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting (2018)
  16. Dong, Yao; Jiang, He: A two-stage regularization method for variable selection and forecasting in high-order interaction model (2018)
  17. Hao, Ning; Feng, Yang; Zhang, Hao Helen: Model selection for high-dimensional quadratic regression via regularization (2018)
  18. Kim, Joungyoun; Lim, Johan; Kim, Yongdai; Jang, Woncheol: Bayesian variable selection with strong heredity constraints (2018)
  19. Koslovsky, M. D.; Swartz, M. D.; Leon-Novelo, L.; Chan, W.; Wilkinson, A. V.: Using the EM algorithm for Bayesian variable selection in logistic regression models with related covariates (2018)
  20. She, Yiyuan; Wang, Zhifeng; Jiang, He: Group regularized estimation under structural hierarchy (2018)
