SAGA

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives. In this work we introduce a new optimisation method called SAGA, in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and supports composite objectives, where a proximal operator is applied to the regulariser. Unlike SDCA, SAGA handles non-strongly convex problems directly and is adaptive to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.
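To make the mechanics concrete, the sketch below illustrates the basic SAGA update for a finite sum (1/n) Σ_i f_i(x) + h(x): a table of stored component gradients, one fresh gradient per iteration, and a proximal step on the regulariser h. This is a minimal illustration under assumed interfaces, not the paper's own code; the names grad_i, prox, and the step-size handling are placeholders supplied by the user.

```python
import numpy as np

def saga(grad_i, prox, x0, n, step, n_iters, rng=None):
    """Minimal SAGA sketch for min_x (1/n) sum_i f_i(x) + h(x).

    grad_i(i, x): gradient of the i-th component f_i at x (user-supplied).
    prox(v, step): proximal operator of step * h applied to v (user-supplied).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    # Table of the most recently evaluated gradient of each f_i, plus its
    # running average: this stored "memory" is what reduces the variance.
    table = np.stack([grad_i(i, x) for i in range(n)])
    avg = table.mean(axis=0)

    for _ in range(n_iters):
        j = rng.integers(n)               # sample one component uniformly
        g_new = grad_i(j, x)              # fresh gradient of f_j at the current point
        v = g_new - table[j] + avg        # unbiased, variance-reduced search direction
        x = prox(x - step * v, step)      # proximal step handles the regulariser h
        avg += (g_new - table[j]) / n     # keep the stored average consistent
        table[j] = g_new                  # overwrite the stored gradient for f_j
    return x
```

In the paper's analysis the step size is a constant on the order of 1/L, where L is a Lipschitz constant of the component gradients (e.g. 1/(3L) in the general convex case); the sketch leaves it as a parameter.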


References in zbMATH (referenced in 85 articles)

Showing results 1 to 20 of 85, sorted by year (citations).


  1. Belomestny, Denis; Iosipoi, Leonid; Moulines, Eric; Naumov, Alexey; Samsonov, Sergey: Variance reduction for dependent sequences with applications to stochastic gradient MCMC (2021)
  2. Bian, Fengmiao; Liang, Jingwei; Zhang, Xiaoqun: A stochastic alternating direction method of multipliers for non-smooth and non-convex optimization (2021)
  3. Chen, Chenxi; Chen, Yunmei; Ye, Xiaojing: A randomized incremental primal-dual method for decentralized consensus optimization (2021)
  4. Duchi, John C.; Ruan, Feng: Asymptotic optimality in stochastic optimization (2021)
  5. Gower, Robert M.; Richtárik, Peter; Bach, Francis: Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (2021)
  6. Gürbüzbalaban, M.; Ozdaglar, A.; Parrilo, P. A.: Why random reshuffling beats stochastic gradient descent (2021)
  7. Hu, Bin; Seiler, Peter; Lessard, Laurent: Analysis of biased stochastic gradient descent using sequential semidefinite programs (2021)
  8. Lu, Haihao; Freund, Robert M.: Generalized stochastic Frank-Wolfe algorithm with stochastic “substitute” gradient for structured convex optimization (2021)
  9. Nguyen, Lam M.; Scheinberg, Katya; Takáč, Martin: Inexact SARAH algorithm for stochastic optimization (2021)
  10. Qian, Xun; Qu, Zheng; Richtárik, Peter: L-SVRG and L-Katyusha with arbitrary sampling (2021)
  11. Tuckute, Greta; Hansen, Sofie Therese; Kjaer, Troels Wesenberg; Hansen, Lars Kai: Real-time decoding of attentional states using closed-loop EEG neurofeedback (2021)
  12. Xiao, Xiantao: A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods (2021)
  13. Yu, Tengteng; Liu, Xin-Wei; Dai, Yu-Hong; Sun, Jie: Stochastic variance reduced gradient methods using a trust-region-like scheme (2021)
  14. Zhang, Junyu; Xiao, Lin: Multilevel composite stochastic optimization via nested variance reduction (2021)
  15. Aravkin, Aleksandr; Davis, Damek: Trimmed statistical estimation via variance reduction (2020)
  16. Boffi, Nicholas M.; Slotine, Jean-Jacques E.: A continuous-time analysis of distributed stochastic gradient (2020)
  17. Drori, Yoel; Taylor, Adrien B.: Efficient first-order methods for convex minimization: a constructive approach (2020)
  18. Gu, Bin; Xian, Wenhan; Huo, Zhouyuan; Deng, Cheng; Huang, Heng: A unified q-memorization framework for asynchronous stochastic optimization (2020)
  19. Hassani, Hamed; Karbasi, Amin; Mokhtari, Aryan; Shen, Zebang: Stochastic conditional gradient++: (Non)convex minimization and continuous submodular maximization (2020)
  20. Hazan, Tamir; Sabach, Shoham; Voldman, Sergey: Stochastic proximal linear method for structured non-convex problems (2020)
