CUTEr

CUTEr is a versatile testing environment for optimization and linear algebra solvers. The package contains a collection of test problems, along with Fortran 77, Fortran 90/95 and Matlab tools intended to help developers design, compare and improve new and existing solvers. The test problems provided are written in the so-called Standard Input Format (SIF). A decoder that converts problems from this format into well-defined Fortran 77 subroutines and data files is available as a separate package. Once translated, these files may be combined with the tools to test optimization packages. Ready-to-use interfaces to existing packages, such as MINOS, SNOPT, filterSQP, Knitro, and more, are provided. See the interfaces section for a complete list.
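The decode-then-test workflow described above might look like the following sketch. The script name `sifdecode` (from the companion SifDec decoder package), the example problem name `ROSENBR`, and the output file names follow SifDec conventions, but your installation's command names and options may differ; consult the local CUTEr/SifDec documentation.

```shell
# Sketch of a typical CUTEr session (assumes SifDec and CUTEr are
# installed and their scripts are on the PATH -- names may vary).

# Decode a SIF test problem into Fortran 77 subroutines and a data file.
# SifDec conventionally emits ELFUN.f, GROUP.f, RANGE.f and OUTSDIF.d
# into the current directory.
sifdecode ROSENBR

# Compile the generated subroutines against a solver's CUTEr interface
# and run it on the decoded problem (illustrative driver name only).
ls ELFUN.f GROUP.f RANGE.f OUTSDIF.d
```

In practice the package-specific interface scripts bundle these steps, so a single command decodes a problem and invokes the chosen solver on it.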


References in zbMATH (referenced in 540 articles, 1 standard article)

Showing results 1 to 20 of 540.
Sorted by year (citations)


  1. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  2. Liu, Zexian; Liu, Hongwei; Dai, Yu-Hong: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization (2020)
  3. Aminifard, Z.; Babaie-Kafaki, S.: Matrix analyses on the Dai-Liao conjugate gradient method (2019)
  4. Aminifard, Zohre; Babaie-Kafaki, Saman: A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions (2019)
  5. Aminifard, Zohre; Babaie-Kafaki, Saman: An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (2019)
  6. Amini, Keyvan; Faramarzi, Parvaneh; Pirfalah, Nasrin: A modified Hestenes-Stiefel conjugate gradient method with an optimal property (2019)
  7. Andrei, Neculai: A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization (2019)
  8. Armand, Paul; Tran, Ngoc Nguyen: An augmented Lagrangian method for equality constrained optimization with rapid infeasibility detection capabilities (2019)
  9. Audet, Charles; Le Digabel, Sébastien; Tribes, Christophe: The mesh adaptive direct search algorithm for granular and discrete variables (2019)
  10. Babaie-Kafaki, Saman; Aminifard, Zohre: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (2019)
  11. Boggs, Paul T.; Byrd, Richard H.: Adaptive, limited-memory BFGS algorithms for unconstrained optimization (2019)
  12. Brust, Johannes; Burdakov, Oleg; Erway, Jennifer B.; Marcia, Roummel F.: A dense initialization for limited-memory quasi-Newton methods (2019)
  13. Dong, Wen-Li; Li, Xing; Peng, Zheng: A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems (2019)
  14. Faramarzi, Parvaneh; Amini, Keyvan: A modified spectral conjugate gradient method with global convergence (2019)
  15. Faramarzi, Parvaneh; Amini, Keyvan: A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem (2019)
  16. Huang, Na; Ma, Chang-Feng: Spectral analysis of the preconditioned system for the (3 \times 3) block saddle point problem (2019)
  17. Jiang, Xianzhen; Jian, Jinbao: Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search (2019)
  18. Khoshgam, Zahra; Ashrafi, Ali: A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function (2019)
  19. Khoshgam, Zahra; Ashrafi, Ali: A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function (2019)
  20. Liu, Hongwei; Liu, Zexian: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization (2019)
