TFOCS

TFOCS: Templates for First-Order Conic Solvers. TFOCS (pronounced "tee-fox") provides a set of MATLAB templates, or building blocks, for constructing efficient, customized solvers for a variety of convex models, in particular those used in sparse recovery applications. It was conceived and written by Stephen Becker, Emmanuel J. Candès and Michael Grant.
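As a sketch of this template style (based on the standard TFOCS calling pattern; the exact signature should be checked against the user guide shipped with the package), an ℓ_1-regularized least-squares problem, min_x ½‖Ax − b‖² + λ‖x‖_1, can be assembled from a smooth part, an affine map, and a proximal part:

```matlab
% Sketch: LASSO via the TFOCS template interface. The data below are
% illustrative; only tfocs, smooth_quad, and prox_l1 come from TFOCS.
A      = randn(50, 200);                              % measurement matrix
b      = A * full(sprandn(200, 1, 0.05)) ...
         + 0.01 * randn(50, 1);                       % noisy sparse-signal data
lambda = 0.1;                                         % regularization weight
x0     = zeros(200, 1);                               % initial point

% smooth_quad supplies f(z) = 0.5*||z||^2, { A, -b } is the affine map
% z = A*x - b, and prox_l1(lambda) is the proximal operator of lambda*||x||_1.
x = tfocs( smooth_quad, { A, -b }, prox_l1( lambda ), x0 );
```

The same template accepts other smooth/proximal pairs, which is what makes the building-block design reusable across the convex models mentioned above.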


References in zbMATH (referenced in 120 articles, 1 standard article)

Showing results 1 to 20 of 120, sorted by year (citations).


  1. Gong, Yuxuan; Li, Peijun; Wang, Xu; Xu, Xiang: Numerical solution of an inverse random source problem for the time fractional diffusion equation via phaselift (2021)
  2. Jakob S. Jørgensen, Evelina Ametova, Genoveva Burca, Gemma Fardell, Evangelos Papoutsellis, Edoardo Pasca, Kris Thielemans, Martin Turner, Ryan Warr, William R. B. Lionheart, Philip J. Withers: Core Imaging Library - Part I: a versatile Python framework for tomographic imaging (2021) arXiv
  3. Folberth, James; Becker, Stephen: Safe feature elimination for non-negativity constrained convex optimization (2020)
  4. Kikuchi, Paula A.; Oliveira, Aurelio R. L.: New preconditioners applied to linear programming and the compressive sensing problems (2020)
  5. Ahookhosh, Masoud: Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (2019)
  6. Bao, Weizhu; Ruan, Xinran: Computing ground states of Bose-Einstein condensates with higher order interaction via a regularized density function formulation (2019)
  7. Beck, Amir; Guttmann-Beck, Nili: FOM -- a MATLAB toolbox of first-order methods for solving convex optimization problems (2019)
  8. Friedlander, Michael P.; Macêdo, Ives; Pong, Ting Kei: Polar convolution (2019)
  9. Liu, Tianxiang; Pong, Ting Kei; Takeda, Akiko: A successive difference-of-convex approximation method for a class of nonconvex nonsmooth optimization problems (2019)
  10. Lorenz, Dirk A.; Tran-Dinh, Quoc: Non-stationary Douglas-Rachford and alternating direction method of multipliers: adaptive step-sizes and convergence (2019)
  11. Renegar, James: Accelerated first-order methods for hyperbolic programming (2019)
  12. Sun, Tianxiao; Quoc, Tran-Dinh: Generalized self-concordant functions: a recipe for Newton-type methods (2019)
  13. Tran-Dinh, Quoc: Proximal alternating penalty algorithms for nonsmooth constrained convex optimization (2019)
  14. Wen, Bo; Xue, Xiaoping: On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems (2019)
  15. Wong, Raymond K. W.; Zhang, Xiaoke: Nonparametric operator-regularized covariance function estimation for functional data (2019)
  16. Yu, Peiran; Pong, Ting Kei: Iteratively reweighted ℓ_1 algorithms with extrapolation (2019)
  17. Zhang, Rui; Feng, Xiangchu; Yang, Lixia; Chang, Lihong; Zhu, Xiaolong: A global sparse gradient based coupled system for image denoising (2019)
  18. Ahookhosh, Masoud: Optimal subgradient methods: computational properties for large-scale linear inverse problems (2018)
  19. Aravkin, Aleksandr Y.; Burke, James V.; Pillonetto, Gianluigi: Generalized system identification with stable spline kernels (2018)
  20. Bottou, Léon; Curtis, Frank E.; Nocedal, Jorge: Optimization methods for large-scale machine learning (2018)
