Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition g_k^T d_k ≤ −(7/8) ‖g_k‖² and which is globally convergent whenever the line search fulfills the Wolfe conditions. This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.
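As a minimal sketch of the direction update behind this descent property (not the authors' released CG_DESCENT code, whose line search and safeguards are more involved), the Hager-Zhang search direction can be written in a few lines of NumPy; the function name and the random spot-check below are illustrative assumptions:

```python
import numpy as np

def hz_direction(g_new, g_old, d_old):
    """Hager-Zhang conjugate gradient direction (a sketch):

        d_{k+1} = -g_{k+1} + beta_k * d_k,
        beta_k  = (y_k - 2 d_k ||y_k||^2 / d_k^T y_k)^T g_{k+1} / (d_k^T y_k),

    with y_k = g_{k+1} - g_k.  For any d_k with d_k^T y_k != 0, this
    direction satisfies g_{k+1}^T d_{k+1} <= -(7/8) ||g_{k+1}||^2."""
    y = g_new - g_old
    dy = d_old @ y                                    # d_k^T y_k (assumed nonzero)
    beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d_old

# Spot-check the guaranteed-descent condition on random vectors.
rng = np.random.default_rng(0)
g_old, g_new, d_old = rng.standard_normal((3, 5))
d_new = hz_direction(g_new, g_old, d_old)
print(g_new @ d_new <= -0.875 * (g_new @ g_new))     # descent condition
```

The condition is an algebraic consequence of the beta formula itself (it holds for arbitrary vectors with d_k^T y_k ≠ 0), which is why the descent guarantee does not depend on the accuracy of the line search.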

This software is also peer reviewed by the journal TOMS (ACM Transactions on Mathematical Software).

References in zbMATH (referenced in 101 articles, 1 standard article)

Showing results 1 to 20 of 101.
Sorted by year (citations)


  1. Andrei, Neculai: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method (2020)
  2. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  3. Liu, Zexian; Liu, Hongwei; Dai, Yu-Hong: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization (2020)
  4. Sabi’u, Jamilu; Shah, Abdullah; Waziri, Mohammed Yusuf: Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations (2020)
  5. Tarek, Mohamed; Ray, Tapabrata: Adaptive continuation solid isotropic material with penalization for volume constrained compliance minimization (2020)
  6. Abubakar, Auwal Bala; Kumam, Poom: A descent Dai-Liao conjugate gradient method for nonlinear equations (2019)
  7. Aminifard, Z.; Babaie-Kafaki, S.: Matrix analyses on the Dai-Liao conjugate gradient method (2019)
  8. Aminifard, Zohre; Babaie-Kafaki, Saman: An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (2019)
  9. Aminifard, Zohre; Babaie-Kafaki, Saman: A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions (2019)
  10. Awwal, Aliyu Muhammed; Kumam, Poom; Abubakar, Auwal Bala: A modified conjugate gradient method for monotone nonlinear equations with convex constraints (2019)
  11. Babaie-Kafaki, Saman; Aminifard, Zohre: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (2019)
  12. Dehghani, R.; Mahdavi-Amiri, N.: Scaled nonlinear conjugate gradient methods for nonlinear least squares problems (2019)
  13. Fazzio, N. S.; Schuverdt, M. L.: Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems (2019)
  14. Jiang, Xianzhen; Jian, Jinbao: Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search (2019)
  15. Liu, Hongwei; Liu, Zexian: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization (2019)
  16. Liu, Zexian; Liu, Hongwei: An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization (2019)
  17. Li, Yufei; Liu, Zexian; Liu, Hongwei: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization (2019)
  18. Mu, Xiaojie; Zhang, Qimin; Rong, Libin: Optimal vaccination strategy for an SIRS model with imprecise parameters and Lévy noise (2019)
  19. Rezaee, Saeed; Babaie-Kafaki, Saman: An adaptive nonmonotone trust region algorithm (2019)
  20. Rezaee, Saeed; Babaie-Kafaki, Saman: An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems (2019)
