glasso
The graphical lasso: new insights and alternatives. The graphical lasso [5] is an algorithm for learning the structure of an undirected Gaussian graphical model, using ℓ1 regularization to control the number of zeros in the precision matrix Θ = Σ⁻¹ [2, 11]. The R package glasso [5] is popular, fast, and allows one to efficiently build a path of models for different values of the tuning parameter. Convergence of glasso can be tricky; the converged precision matrix might not be the inverse of the estimated covariance, and occasionally it fails to converge with warm starts. In this paper we explain this behavior and propose new algorithms that appear to outperform glasso. By studying the “normal equations” we see that glasso solves the dual of the graphical lasso penalized likelihood by block coordinate ascent, a result that can also be found in [2]. In this dual, the target of estimation is Σ, the covariance matrix, rather than the precision matrix Θ. We propose similar primal algorithms, p-glasso and dp-glasso, that also operate by block coordinate descent, but with Θ as the optimization target. We study all of these algorithms, and in particular different approaches to solving their coordinate sub-problems. We conclude that dp-glasso is superior from several points of view.
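As an illustration of the ℓ1-penalized precision estimation that glasso performs, the following sketch uses scikit-learn's GraphicalLasso (a Python implementation of the same penalized log-determinant problem, not the R package itself); the tridiagonal precision matrix and the penalty value alpha=0.05 are illustrative choices, not from the paper.

```python
# Sketch: estimate a sparse precision matrix Theta = inv(Sigma) with an
# l1 penalty, as the graphical lasso does. Uses scikit-learn's
# GraphicalLasso as a stand-in for the R glasso package.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Illustrative sparse ground truth: a tridiagonal precision matrix.
p = 5
theta_true = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
sigma_true = np.linalg.inv(theta_true)

# Sample Gaussian data with that covariance.
X = rng.multivariate_normal(np.zeros(p), sigma_true, size=500)

# Fit; alpha is the l1 tuning parameter controlling sparsity in Theta.
model = GraphicalLasso(alpha=0.05).fit(X)
precision = model.precision_  # the estimate of Theta = inv(Sigma)
print(precision.shape)        # (5, 5)
```

A larger alpha drives more off-diagonal entries of the estimated Θ to exactly zero, which is how the graph structure (missing edges) is selected.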
References in zbMATH (referenced in 251 articles, 1 standard article)
Showing results 221 to 240 of 251.
Sorted by year.
- Städler, Nicolas; Bühlmann, Peter: Missing values: sparse inverse covariance estimation and an extension to sparse regression (2012)
- Wen, Zaiwen; Goldfarb, Donald; Scheinberg, Katya: Block coordinate descent methods for semidefinite programming (2012)
- Xue, Lingzhou; Zou, Hui: Regularized rank-based estimation of high-dimensional nonparanormal graphical models (2012)
- Yin, Jianxin; Li, Hongzhe: Model selection and estimation in the matrix normal graphical model (2012)
- Zhao, Tuo; Liu, Han; Roeder, Kathryn; Lafferty, John; Wasserman, Larry: The \texttt{huge} package for high-dimensional undirected graph estimation in \texttt{R} (2012)
- Agarwal, Deepak; Zhang, Liang; Mazumder, Rahul: Modeling item-item similarities for personalized recommendations on Yahoo! front page (2011)
- Chiquet, Julien; Grandvalet, Yves; Ambroise, Christophe: Inferring multiple graphical structures (2011)
- Finegold, Michael; Drton, Mathias: Robust graphical modeling of gene networks using classical and alternative (t)-distributions (2011)
- Gehrmann, Helene: Lattices of graphical Gaussian models with symmetries (2011)
- Huang, Jian; Ma, Shuangge; Li, Hongzhe; Zhang, Cun-Hui: The sparse Laplacian shrinkage estimator for high-dimensional regression (2011)
- Lian, Heng: Shrinkage tuning parameter selection in precision matrices estimation (2011)
- Negahban, Sahand; Wainwright, Martin J.: Estimation of (near) low-rank matrices with noise and high-dimensional scaling (2011)
- Pourahmadi, Mohsen: Covariance estimation: the GLM and regularization perspectives (2011)
- Ravikumar, Pradeep; Wainwright, Martin J.; Raskutti, Garvesh; Yu, Bin: High-dimensional covariance estimation by minimizing (\ell_1)-penalized log-determinant divergence (2011)
- Yin, Jianxin; Li, Hongzhe: A sparse conditional Gaussian graphical model for analysis of genetical genomics data (2011)
- Yun, Sangwoon; Tseng, Paul; Toh, Kim-Chuan: A block coordinate gradient descent method for regularized convex separable optimization and covariance selection (2011)
- Kolar, Mladen; Song, Le; Ahmed, Amr; Xing, Eric P.: Estimating time-varying networks (2010)
- Li, Lu; Toh, Kim-Chuan: An inexact interior point method for (L_1)-regularized sparse covariance selection (2010)
- Lu, Zhaosong: Adaptive first-order methods for general sparse inverse covariance selection (2010)
- Tseng, Paul: Approximation accuracy, gradient methods, and error bound for structured convex optimization (2010)