NESUN

NESUN - Nesterov’s universal gradient method: Universal gradient methods for convex optimization problems. In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results with encouraging numerical experiments, which demonstrate that the fast rate of convergence, typical of smooth optimization problems, can sometimes be achieved even on nonsmooth problem instances.
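To illustrate the idea, the following is a minimal Python sketch of a universal-type gradient scheme in the spirit of the description above: a plain gradient step combined with backtracking on the smoothness estimate, where only the target accuracy eps enters the acceptance test. The names (universal_gradient_method, f, grad, L0) are illustrative assumptions, not part of the NESUN software, and the accelerated and dual variants analysed in the paper are not reproduced here.

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps, L0=1.0, max_iter=500):
    """Sketch of a universal (primal) gradient method.

    The method never needs the true Hölder/Lipschitz constant: it keeps a
    running estimate L and doubles it until an eps-relaxed quadratic upper
    bound holds at the trial gradient step.
    """
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(max_iter):
        fx, g = f(x), grad(x)          # value and (sub)gradient at x
        M = L
        for _ in range(60):            # cap the doubling to avoid an endless inner loop
            x_new = x - g / M          # gradient step with step size 1/M
            d = x_new - x
            # Accept M if the eps-relaxed quadratic model overestimates f(x_new)
            if f(x_new) <= fx + g @ d + 0.5 * M * (d @ d) + 0.5 * eps:
                break
            M *= 2.0
        x = x_new
        L = M / 2.0                    # let the estimate decrease between iterations
    return x

# Example use on a nonsmooth convex function, f(x) = |x_1| + x_2^2:
f = lambda x: abs(x[0]) + x[1] ** 2
grad = lambda x: np.array([np.sign(x[0]), 2.0 * x[1]])  # a subgradient
x_star = universal_gradient_method(f, grad, np.array([3.0, -2.0]), eps=1e-3)
```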


References in zbMATH (referenced in 43 articles)

Showing results 1 to 20 of 43, sorted by year (citations).

  1. Hu, Yaohua; Li, Gongnong; Yu, Carisa Kwok Wai; Yip, Tsz Leung: Quasi-convex feasibility problems: subgradient methods and convergence rates (2022)
  2. Ahookhosh, Masoud; Hien, Le Thi Khanh; Gillis, Nicolas; Patrinos, Panagiotis: A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization (2021)
  3. Dvinskikh, Darina; Gasnikov, Alexander: Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems (2021)
  4. Dvurechensky, Pavel; Gorbunov, Eduard; Gasnikov, Alexander: An accelerated directional derivative method for smooth stochastic convex optimization (2021)
  5. Hanzely, Filip; Richtárik, Peter; Xiao, Lin: Accelerated Bregman proximal gradient methods for relatively smooth convex optimization (2021)
  6. Ito, Masaru; Fukuda, Mituhiro: Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach (2021)
  7. Matyukhin, Vladislav; Kabanikhin, Sergey; Shishlenin, Maxim; Novikov, Nikita; Vasin, Artem; Gasnikov, Alexander: Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems (2021)
  8. Nesterov, Yurii: Implementable tensor methods in unconstrained convex optimization (2021)
  9. Song, Chaobing; Jiang, Yong; Ma, Yi: Unified acceleration of high-order algorithms under general Hölder continuity (2021)
  10. Uribe, César A.; Lee, Soomin; Gasnikov, Alexander; Nedić, Angelia: A dual approach for optimal algorithms in distributed optimization over networks (2021)
  11. Berger, Guillaume O.; Absil, P.-A.; Jungers, Raphaël M.; Nesterov, Yurii: On the quality of first-order approximation of functions with Hölder continuous gradient (2020)
  12. Dolgopolik, M. V.: The method of codifferential descent for convex and global piecewise affine optimization (2020)
  13. Lei, Lihua; Jordan, Michael I.: On the adaptivity of stochastic gradient-based optimization (2020)
  14. Rodomanov, Anton; Nesterov, Yurii: Smoothness parameter of power of Euclidean norm (2020)
  15. Roulet, Vincent; d’Aspremont, Alexandre: Sharpness, restart, and acceleration (2020)
  16. Scieur, Damien; d’Aspremont, Alexandre; Bach, Francis: Regularized nonlinear acceleration (2020)
  17. Silveti-Falls, Antonio; Molinari, Cesare; Fadili, Jalal: Generalized conditional gradient with augmented Lagrangian for composite minimization (2020)
  18. Ahookhosh, Masoud: Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (2019)
  19. Ahookhosh, Masoud; Neumaier, Arnold: An optimal subgradient algorithm with subspace search for costly convex optimization problems (2019)
  20. Baimurzina, D. R.; Gasnikov, A. V.; Gasnikova, E. V.; Dvurechensky, P. E.; Ershov, E. I.; Kubentaeva, M. B.; Lagunovskaya, A. A.: Universal method of searching for equilibria and stochastic equilibria in transportation networks (2019)
