ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals

In this article, we propose an algorithm, nesta-lasso, for the lasso problem, i.e., an underdetermined linear least-squares problem with a 1-norm constraint on the solution. We prove, under the assumption of the restricted isometry property (rip) and a sparsity condition on the solution, that nesta-lasso is guaranteed to be almost always locally linearly convergent. As in the case of the algorithm nesta, proposed by Becker, Bobin, and Candès, we rely on Nesterov's accelerated proximal gradient method, which takes $O(\sqrt{1/\varepsilon})$ iterations to come within $\varepsilon > 0$ of the optimal value. We introduce a modification to Nesterov's method that regularly updates the prox-center in a provably optimal manner; the aforementioned linear convergence is in part due to this modification. In the second part of this article, we solve the basis pursuit denoising (bpdn) problem (i.e., approximating the minimum 1-norm solution to an underdetermined least-squares problem) by using nesta-lasso in conjunction with the Pareto root-finding method employed by van den Berg and Friedlander in their spgl1 solver. The resulting algorithm is called parnes. We provide numerical evidence showing that it is comparable to currently available solvers.
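To make the abstract concrete, the sketch below shows a generic Nesterov-style accelerated projected gradient iteration for the lasso problem $\min \tfrac12\|Ax-b\|_2^2$ subject to $\|x\|_1 \le \tau$, the problem class nesta-lasso targets. This is a minimal illustration of the accelerated-gradient template, not the authors' nesta-lasso (in particular, it omits the prox-center updates and the Pareto root-finding outer loop); all function names are ours, and the $\ell_1$-ball projection uses the standard sort-based scheme.

```python
import numpy as np

def project_l1_ball(v, tau):
    """Euclidean projection of v onto {x : ||x||_1 <= tau} (sort-based)."""
    if np.abs(v).sum() <= tau:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]                      # sorted magnitudes, descending
    css = np.cumsum(u)
    k = np.arange(1, len(u) + 1)
    rho = np.nonzero(u - (css - tau) / k > 0)[0][-1]  # last index satisfying the KKT test
    theta = (css[rho] - tau) / (rho + 1.0)            # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def accel_proj_grad(A, b, tau, iters=500):
    """Nesterov-style accelerated projected gradient for the lasso problem:
    min 0.5*||Ax - b||^2  subject to  ||x||_1 <= tau.
    Achieves the O(sqrt(1/eps)) iteration complexity quoted in the abstract."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                          # extrapolated (momentum) point
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)
        x_new = project_l1_ball(y - grad / L, tau)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x
```

In a full bpdn solver along the lines of parnes or spgl1, this inner lasso solve would be wrapped in a Newton root-finding loop on the Pareto curve $\varphi(\tau) = \|Ax_\tau - b\|_2$ to locate the $\tau$ matching the desired residual level.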

References in zbMATH (referenced in 12 articles)


  1. Derksen, Harm: A general theory of singular values with applications to signal denoising (2018)
  2. Alli-Oke, Razak O.; Heath, William P.: A secant-based Nesterov method for convex functions (2017)
  3. Karimi, Sahar; Vavasis, Stephen: IMRO: A proximal quasi-Newton method for solving $\ell_1$-regularized least squares problems (2017)
  4. Liang, Jingwei; Fadili, Jalal; Peyré, Gabriel: Activity identification and local linear convergence of forward-backward-type methods (2017)
  5. Neumaier, Arnold: OSGA: a fast subgradient algorithm with optimal complexity (2016)
  6. Lin, Qihang; Xiao, Lin: An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization (2015)
  7. O’Donoghue, Brendan; Candès, Emmanuel: Adaptive restart for accelerated gradient schemes (2015)
  8. Zhu, Wei; Shu, Shi; Cheng, Lizhi: An efficient proximity point algorithm for total-variation-based image restoration (2014)
  9. Gu, Ming; Lim, Lek-Heng; Wu, Cinna Julie: ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals (2013)
  10. Xiao, Lin; Zhang, Tong: A proximal-gradient homotopy method for the sparse least-squares problem (2013)
  11. Becker, Stephen R.; Candès, Emmanuel J.; Grant, Michael C.: Templates for convex cone problems with applications to sparse signal recovery (2011)
  12. Zhang, H.; Cheng, L. Z.: Projected Landweber iteration for matrix completion (2010)