LDGB

New limited memory bundle method for large-scale nonsmooth optimization

Many practical optimization problems involve nonsmooth (that is, not necessarily differentiable) functions of hundreds or thousands of variables. In such problems the direct application of smooth gradient-based methods may fail because of the nonsmoothness of the problem. On the other hand, none of the current general nonsmooth optimization methods is efficient in large-scale settings. In this article we describe a new limited memory variable metric bundle method for nonsmooth large-scale optimization. In addition, we introduce a new set of academic test problems for large-scale nonsmooth minimization. Finally, we report encouraging results from numerical experiments with both academic and practical test problems.
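As an illustration of the problem class targeted here, the sketch below minimizes MAXQ, f(x) = max_i x_i^2, a classic academic nonsmooth test function of the kind the article's test set generalizes, using a plain normalized subgradient method. This is not the limited memory bundle method of the article; the step-size rule, starting point, and iteration count are illustrative assumptions only.

```python
# Illustrative sketch: minimizing the classic nonsmooth test function
# MAXQ, f(x) = max_i x_i^2, with a plain subgradient method.
# NOT the limited memory bundle method of the article; it only shows
# the kind of large-scale nonsmooth problem being targeted.

def maxq(x):
    """Objective value and one subgradient of f(x) = max_i x_i^2."""
    i = max(range(len(x)), key=lambda j: x[j] * x[j])
    f = x[i] * x[i]
    g = [0.0] * len(x)
    g[i] = 2.0 * x[i]          # subgradient: 2*x_i at an attaining index
    return f, g

def subgradient_method(x, iters=500):
    """Diminishing-step subgradient descent (step rule is an assumption)."""
    for k in range(1, iters + 1):
        f, g = maxq(x)
        norm = sum(gi * gi for gi in g) ** 0.5 or 1.0
        step = 1.0 / k                      # classic 1/k diminishing step
        x = [xi - step * gi / norm for xi, gi in zip(x, g)]
    return x

n = 1000                                    # large-scale: many variables
x0 = [float(i % 7 - 3) for i in range(n)]   # arbitrary starting point
f0, _ = maxq(x0)
xk = subgradient_method(x0)
fk, _ = maxq(xk)
print(f0, fk)   # the objective decreases from its starting value
```

Subgradient methods like this one converge slowly on such problems, which is precisely the motivation for the bundle and variable metric machinery described in the abstract.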


References in zbMATH (referenced in 42 articles, 1 standard article)

Showing results 1 to 20 of 42, sorted by year (citations).


  1. Abdollahi, Fahimeh; Fatemi, Masoud: An efficient conjugate gradient method with strong convergence properties for non-smooth optimization (2021)
  2. Bagirov, Adil M.; Taheri, Sona; Joki, Kaisa; Karmitsa, Napsu; Mäkelä, Marko M.: Aggregate subgradient method for nonsmooth DC optimization (2021)
  3. Dinc Yalcin, Gulcin; Kasimbeyli, Refail: Weak subgradient method for solving nonsmooth nonconvex optimization problems (2021)
  4. Larson, Jeffrey; Leyffer, Sven; Palkar, Prashant; Wild, Stefan M.: A method for convex black-box integer global optimization (2021)
  5. Joki, Kaisa; Bagirov, Adil M.; Karmitsa, Napsu; Mäkelä, Marko M.; Taheri, Sona: Clusterwise support vector linear regression (2020)
  6. Li, Xiangrong: A limited memory BFGS subspace algorithm for bound constrained nonsmooth problems (2020)
  7. Mahdavi-Amiri, N.; Shaeiri, M.: A conjugate gradient sampling method for nonsmooth optimization (2020)
  8. Maleknia, Morteza; Shamsi, Mostafa: A gradient sampling method based on ideal direction for solving nonsmooth optimization problems (2020)
  9. Maleknia, M.; Shamsi, M.: A new method based on the proximal bundle idea and gradient sampling technique for minimizing nonsmooth convex functions (2020)
  10. Woldu, Tsegay Giday; Zhang, Haibin; Zhang, Xin; Fissuh, Yemane Hailu: A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization (2020)
  11. Fiege, Sabrina; Walther, Andrea; Griewank, Andreas: An algorithm for nonsmooth optimization by successive piecewise linearization (2019)
  12. Karmitsa, N.; Gaudioso, M.; Joki, K.: Diagonal bundle method with convex and concave updates for large-scale nonconvex and nonsmooth optimization (2019)
  13. Keskar, N.; Wächter, Andreas: A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization (2019)
  14. Liu, Shuai: A simple version of bundle method with linear programming (2019)
  15. Yuan, Gonglin; Li, Tingting; Hu, Wujie: A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration (2019)
  16. Helou, Elias S.; Santos, Sandra A.; Simões, Lucas E. A.: A fast gradient and function sampling method for finite-max functions (2018)
  17. Hoseini, N.; Nobakhtian, S.: A new trust region method for nonsmooth nonconvex optimization (2018)
  18. Ou, Yigui; Zhou, Xin: A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization (2018)
  19. Karmitsa, Napsu; Bagirov, Adil M.; Taheri, Sona: New diagonal bundle method for clustering problems in large data sets (2017)
  20. Mahdavi-Amiri, N.; Shaeiri, M.: An adaptive competitive penalty method for nonsmooth constrained optimization (2017)



Further publications can be found at: http://napsu.karmitsa.fi/publications/