FPC_AS
FPC_AS (fixed-point continuation and active set) is a MATLAB solver for the ℓ₁-regularized least squares problem: a fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation. We propose a fast algorithm for solving the ℓ₁-regularized minimization problem min_{x∈ℝⁿ} μ∥x∥₁ + ∥Ax−b∥₂² for recovering sparse solutions to an underdetermined system of linear equations Ax = b. The algorithm consists of two stages that are performed repeatedly. In the first stage, a first-order iterative “shrinkage” method yields an estimate of the subset of components of x likely to be nonzero in an optimal solution. Restricting the decision variables x to this subset and fixing their signs at their current values reduces the ℓ₁-norm ∥x∥₁ to a linear function of x. The resulting subspace problem, which involves the minimization of a smaller, smooth quadratic function, is solved in the second stage. Our code FPC_AS embeds this basic two-stage algorithm in a continuation (homotopy) approach by assigning a decreasing sequence of values to μ. This code exhibits state-of-the-art performance in terms of both its speed and its ability to recover sparse signals.
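The two-stage scheme described above can be sketched in a few lines of Python. This is a toy illustration of the idea, not the actual MATLAB code of FPC_AS: the function name `fpc_as_sketch`, the fixed shrinkage iteration count, and the μ-reduction factor 0.25 are arbitrary choices made for this sketch.

```python
import numpy as np

def shrink(y, t):
    """Soft-thresholding (shrinkage) operator."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def fpc_as_sketch(A, b, mu_final, n_outer=8, n_shrink=100):
    """Toy two-stage shrinkage / active-set solver with continuation.
    Minimizes mu*||x||_1 + ||A x - b||_2^2 (illustrative only)."""
    m, n = A.shape
    L = 2.0 * np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth part
    tau = 1.0 / L
    x = np.zeros(n)
    # Continuation (homotopy): start from a large mu and decrease toward mu_final.
    mu = max(mu_final, 0.1 * np.max(np.abs(A.T @ b)))
    for _ in range(n_outer):
        # Stage 1: shrinkage iterations estimate the likely support of x.
        for _ in range(n_shrink):
            g = 2.0 * (A.T @ (A @ x - b))     # gradient of ||Ax - b||^2
            x = shrink(x - tau * g, tau * mu)
        # Stage 2: restrict to the active set S, fix the signs s, and solve
        # the smooth quadratic subproblem  mu*s^T x_S + ||A_S x_S - b||^2,
        # whose stationarity condition is  A_S^T A_S x_S = A_S^T b - (mu/2) s.
        S = np.flatnonzero(x)
        if S.size:
            s = np.sign(x[S])
            As = A[:, S]
            H = As.T @ As + 1e-10 * np.eye(S.size)  # small ridge for stability
            xs = np.linalg.solve(H, As.T @ b - 0.5 * mu * s)
            xs[np.sign(xs) != s] = 0.0        # drop components whose sign flipped
            x = np.zeros(n)
            x[S] = xs
        mu = max(mu_final, 0.25 * mu)         # homotopy step: decrease mu
    return x
```

The stage-2 solve acts as a debiasing step: once the shrinkage phase has identified the correct support, the reduced least squares problem recovers the nonzero values far more accurately than first-order iterations alone, which is one reason the two-stage scheme is fast in practice.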
References in zbMATH (referenced in 54 articles, 1 standard article)
Showing results 1 to 20 of 54.
Sorted by year:
- Zhang, Chao; Chen, Xiaojun: A smoothing active set method for linearly constrained non-Lipschitz nonconvex optimization (2020)
- Azmi, Behzad; Kunisch, Karl: A hybrid finite-dimensional RHC for stabilization of time-varying parabolic equations (2019)
- Becker, Stephen; Fadili, Jalal; Ochs, Peter: On quasi-Newton forward-backward splitting: proximal calculus and convergence (2019)
- Cheng, Wanyou; Hu, Qingjie; Li, Donghui: A fast conjugate gradient algorithm with active set prediction for (\ell_1) optimization (2019)
- Esmaeili, Hamid; Shabani, Shima; Kimiaei, Morteza: A new generalized shrinkage conjugate gradient method for sparse recovery (2019)
- Lin, Meixia; Liu, Yong-Jin; Sun, Defeng; Toh, Kim-Chuan: Efficient sparse semismooth Newton methods for the clustered Lasso problem (2019)
- Nutini, Julie; Schmidt, Mark; Hare, Warren: “Active-set complexity” of proximal gradient: how long does it take to find the sparsity pattern? (2019)
- Yang, Tianbao; Zhang, Lijun; Jin, Rong; Zhu, Shenghuo; Zhou, Zhi-Hua: A simple homotopy proximal mapping algorithm for compressive sensing (2019)
- Cheng, Wanyou; Dai, Yu-Hong: Gradient-based method with active set strategy for (\ell_1) optimization (2018)
- Cheng, Wan-You; Li, Dong-Hui: A preconditioned conjugate gradient method with active set strategy for (\ell_1)-regularized least squares (2018)
- Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.: Compressive sensing with cross-validation and stop-sampling for sparse polynomial chaos expansions (2018)
- Li, Chong-Jun; Zhong, Yi-Jun: A pseudo-heuristic parameter selection rule for (\ell_1)-regularized minimization problems (2018)
- Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
- Xiao, Xiantao; Li, Yongfeng; Wen, Zaiwen; Zhang, Liwei: A regularized semi-smooth Newton method with projection steps for composite convex programs (2018)
- Eghbali, Reza; Fazel, Maryam: Decomposable norm minimization with proximal-gradient homotopy algorithm (2017)
- Karimi, Sahar; Vavasis, Stephen: IMRO: A proximal quasi-Newton method for solving (\ell_1)-regularized least squares problems (2017)
- Stella, Lorenzo; Themelis, Andreas; Patrinos, Panagiotis: Forward-backward quasi-Newton methods for nonsmooth optimization problems (2017)
- Sun, Tao; Jiang, Hao; Cheng, Lizhi: Global convergence of proximal iteratively reweighted algorithm (2017)
- Byrd, Richard H.; Chin, Gillian M.; Nocedal, Jorge; Oztoprak, Figen: A family of second-order methods for convex (\ell_1)-regularized optimization (2016)
- De Santis, Marianna; Lucidi, Stefano; Rinaldi, Francesco: A fast active set block coordinate descent algorithm for (\ell_1)-regularized least squares (2016)