Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent

Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition g_kᵀ d_k ≤ −(7/8) ‖g_k‖² and which is globally convergent whenever the line search fulfills the Wolfe conditions. This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.
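The descent condition above can be checked numerically. Below is a minimal sketch (not the authors' Fortran implementation) that applies the Hager-Zhang conjugate gradient direction update to a small convex quadratic and verifies g_kᵀ d_k ≤ −(7/8) ‖g_k‖² at each step; the function name `hz_direction` and the test problem are illustrative choices, not part of the published code.

```python
import numpy as np

def hz_direction(g_new, g_old, d_old):
    """One Hager-Zhang CG direction update (2005 paper):
    beta = (y - 2*d*(y.y)/(d.y)) . g_new / (d.y),  with y = g_new - g_old."""
    y = g_new - g_old
    dy = d_old @ y
    beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d_old

# Illustrative check of the guaranteed-descent property
#   g_k^T d_k <= -(7/8) ||g_k||^2
# on a random convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M + 5.0 * np.eye(5)        # symmetric positive definite
b = rng.standard_normal(5)
grad = lambda x: A @ x - b

x = rng.standard_normal(5)
g = grad(x)
d = -g                               # first direction: steepest descent
for _ in range(3):
    alpha = -(g @ d) / (d @ A @ d)   # exact line search for the quadratic
    x = x + alpha * d
    g_new = grad(x)
    d = hz_direction(g_new, g, d)
    # the guaranteed-descent condition from the abstract
    assert g_new @ d <= -(7.0 / 8.0) * (g_new @ g_new) + 1e-12
    g = g_new
```

With an exact line search the condition holds with room to spare (g_kᵀ d_k = −‖g_k‖²); the paper's contribution is that it holds for any line search satisfying the Wolfe conditions.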

This software was also peer reviewed by the journal ACM Transactions on Mathematical Software (TOMS).

References in zbMATH (referenced in 105 articles, 1 standard article)

Showing results 1 to 20 of 105.
Sorted by year (citations)


  1. Abubakar, Auwal Bala; Kumam, Poom; Mohammad, Hassan; Awwal, Aliyu Muhammed: A Barzilai-Borwein gradient projection method for sparse signal and blurred image restoration (2020)
  2. Andrei, Neculai: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method (2020)
  3. Karasözen, Bülent; Uzunca, Murat; Küçükseyhan, Tuğba: Reduced order optimal control of the convective FitzHugh-Nagumo equations (2020)
  4. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  5. Liu, Zexian; Liu, Hongwei; Dai, Yu-Hong: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization (2020)
  6. Sabi’u, Jamilu; Shah, Abdullah; Waziri, Mohammed Yusuf: Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations (2020)
  7. Tarek, Mohamed; Ray, Tapabrata: Adaptive continuation solid isotropic material with penalization for volume constrained compliance minimization (2020)
  8. Yuan, Gonglin; Wang, Xiaoliang; Sheng, Zhou: The projection technique for two open problems of unconstrained optimization problems (2020)
  9. Abubakar, Auwal Bala; Kumam, Poom: A descent Dai-Liao conjugate gradient method for nonlinear equations (2019)
  10. Abubakar, Auwal Bala; Kumam, Poom; Awwal, Aliyu Muhammed: A descent Dai-Liao projection method for convex constrained nonlinear monotone equations with applications (2019)
  11. Aminifard, Z.; Babaie-Kafaki, S.: Matrix analyses on the Dai-Liao conjugate gradient method (2019)
  12. Aminifard, Zohre; Babaie-Kafaki, Saman: An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (2019)
  13. Aminifard, Zohre; Babaie-Kafaki, Saman: A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions (2019)
  14. Awwal, Aliyu Muhammed; Kumam, Poom; Abubakar, Auwal Bala: A modified conjugate gradient method for monotone nonlinear equations with convex constraints (2019)
  15. Babaie-Kafaki, Saman; Aminifard, Zohre: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (2019)
  16. Dehghani, R.; Mahdavi-Amiri, N.: Scaled nonlinear conjugate gradient methods for nonlinear least squares problems (2019)
  17. Fazzio, N. S.; Schuverdt, M. L.: Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems (2019)
  18. Jiang, Xianzhen; Jian, Jinbao: Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search (2019)
  19. Liu, Hongwei; Liu, Zexian: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization (2019)
  20. Liu, Zexian; Liu, Hongwei: An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization (2019)
