SCALCG

SCALCG – Scaled conjugate gradient algorithms for unconstrained optimization. In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The spectral conjugate gradient algorithm SCG of Birgin and Martínez (2001), which is essentially a scaled variant of Perry's (1977) method, is modified in such a manner as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded in the Beale–Powell restart philosophy. The parameter scaling the gradient is selected either as the spectral gradient or in an anticipative manner, by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results on a set of 500 unconstrained optimization test problems show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient algorithm SCG.
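To make the ingredients of the abstract concrete, the following is a minimal, illustrative sketch of a spectral-scaled conjugate gradient iteration of the Perry/Birgin–Martínez type on a simple strongly convex quadratic. It is not the SCALCG method itself: the BFGS-based correction of the search-direction matrix, the Beale–Powell restarts, the anticipative scaling formula, and the Wolfe line search are all omitted (a plain Armijo backtracking step is used instead), and the particular formulas for `theta` and `beta` below are an assumed textbook form of the spectral/Perry direction.

```python
# Illustrative spectral-scaled CG sketch (NOT the full SCALCG algorithm;
# BFGS correction, Beale-Powell restarts and Wolfe conditions are omitted).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def f(x):
    # Strongly convex quadratic f(x) = sum_i (x_i - 1)^2, minimizer x* = (1,...,1).
    return sum((xi - 1.0) ** 2 for xi in x)

def grad(x):
    return [2.0 * (xi - 1.0) for xi in x]

def scaled_cg(x, max_iter=100, tol=1e-8):
    g = grad(x)
    d = [-gi for gi in g]                       # first direction: steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        # Backtracking Armijo line search (SCALCG uses Wolfe conditions instead).
        alpha, fx, slope = 1.0, f(x), dot(g, d)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]   # step:      s_k = x_{k+1} - x_k
        y = [a - b for a, b in zip(g_new, g)]   # grad diff: y_k = g_{k+1} - g_k
        # Spectral scaling parameter theta = s^T s / s^T y (assumed form).
        theta = dot(s, s) / dot(s, y)
        # Perry-type scaled direction d = -theta*g + [(theta*y - s)^T g / s^T y] s.
        beta = dot([theta * yi - si for yi, si in zip(y, s)], g_new) / dot(s, y)
        d = [-theta * gi + beta * si for gi, si in zip(g_new, s)]
        x, g = x_new, g_new
    return x

x_star = scaled_cg([5.0, -3.0, 0.5])            # converges to (1, 1, 1)
```

The lack-of-positive-definiteness issue the abstract refers to arises because the implicit matrix behind this direction need not be positive definite, so `d` is not guaranteed to be a descent direction in general; the SCALCG modification repairs this via the BFGS update.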


References in zbMATH (referenced in 93 articles)

Showing results 1 to 20 of 93.
Sorted by year (citations)


  1. Bojari, S.; Eslahchi, M. R.: Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization (2020)
  2. Mahdavi-Amiri, N.; Shaeiri, M.: A conjugate gradient sampling method for nonsmooth optimization (2020)
  3. Nataj, Sarah; Lui, S. H.: Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point (2020)
  4. Yuan, Gonglin; Li, Tingting; Hu, Wujie: A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems (2020)
  5. Aminifard, Zohre; Babaie-Kafaki, Saman: An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (2019)
  6. Babaie-Kafaki, Saman; Aminifard, Zohre: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (2019)
  7. Dehghani, R.; Mahdavi-Amiri, N.: Scaled nonlinear conjugate gradient methods for nonlinear least squares problems (2019)
  8. Faramarzi, Parvaneh; Amini, Keyvan: A modified spectral conjugate gradient method with global convergence (2019)
  9. Khoshgam, Zahra; Ashrafi, Ali: A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function (2019)
  10. Liu, Hongwei; Liu, Zexian: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization (2019)
  11. Rezaee, Saeed; Babaie-Kafaki, Saman: An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems (2019)
  12. Xue, Yanqin; Liu, Hongwei; Liu, Zexian: An improved nonmonotone adaptive trust region method (2019)
  13. Andrei, Neculai: A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization (2018)
  14. Andrei, Neculai: A double parameter scaled BFGS method for unconstrained optimization (2018)
  15. Andrei, Neculai: A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization (2018)
  16. Babaie-Kafaki, Saman; Ghanbari, Reza: A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique (2018)
  17. Caliciotti, Andrea; Fasano, Giovanni; Roma, Massimo: Preconditioned nonlinear conjugate gradient methods based on a modified secant equation (2018)
  18. Dong, XiaoLiang; Han, Deren; Dai, Zhifeng; Li, Lixiang; Zhu, Jianguang: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition (2018)
  19. Li, Ming; Liu, Hongwei; Liu, Zexian: A new family of conjugate gradient methods for unconstrained optimization (2018)
  20. Li, Ming; Liu, Hongwei; Liu, Zexian: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization (2018)
