DFO

DFO is a Fortran package for solving general nonlinear optimization problems with the following characteristics: they are relatively small scale (fewer than 100 variables), their objective function is relatively expensive to compute, and derivatives of the objective are not available and cannot be estimated efficiently. There may also be some noise in the function evaluation procedure. Such optimization problems arise, for example, in engineering design, where the objective function evaluation is a simulation package treated as a black box.
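To illustrate the problem class DFO targets, here is a minimal Python sketch of derivative-free minimization of a black-box function. It uses a basic compass (coordinate) direct search, which is *not* DFO's own interpolation-based trust-region algorithm; the function, starting point, and tolerances are illustrative assumptions.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_evals=500):
    """Minimize f using only function values (no derivatives).

    Poll along +/- each coordinate direction; move to any improving
    point, otherwise halve the step size until it falls below tol.
    This is a simple stand-in for DFO's model-based trust-region method.
    """
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x[:]
                y[i] += s
                fy = f(y)
                evals += 1
                if fy < fx:          # accept any improving poll point
                    x, fx = y, fy
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # shrink the poll radius

    return x, fx

# Treat this quadratic as an expensive black box; minimum at (1, -2).
xmin, fmin = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                            [0.0, 0.0])
```

In practice, each call to `f` would invoke the expensive simulation, so a method like DFO that builds interpolation models to reuse past evaluations needs far fewer calls than this naive search.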


References in zbMATH (referenced in 115 articles, 1 standard article)

Showing results 1 to 20 of 115, sorted by year (citations).


  1. Berahas, Albert S.; Byrd, Richard H.; Nocedal, Jorge: Derivative-free optimization of noisy functions via quasi-Newton methods (2019)
  2. Falini, Antonella; Jüttler, Bert: THB-splines multi-patch parameterization for multiply-connected planar domains via template segmentation (2019)
  3. Larson, Jeffrey; Menickelly, Matt; Wild, Stefan M.: Derivative-free optimization methods (2019)
  4. Wang, Peng; Zhu, Detong; Song, Yufeng: Derivative-free feasible backtracking search methods for nonlinear multiobjective optimization with simple boundary constraint (2019)
  5. Costa, Alberto; Nannicini, Giacomo: RBFOpt: an open-source library for black-box optimization with costly function evaluations (2018)
  6. Maggiar, Alvaro; Wächter, Andreas; Dolinskaya, Irina S.; Staum, Jeremy: A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling (2018)
  7. Zhou, Zhe; Bai, Fusheng: An adaptive framework for costly black-box global optimization based on radial basis function interpolation (2018)
  8. Echebest, N.; Schuverdt, M. L.; Vignau, R. P.: An inexact restoration derivative-free filter method for nonlinear programming (2017)
  9. Fang, Xiaowei; Ni, Qin: A frame-based conjugate gradients direct search method with radial basis function interpolation model (2017)
  10. Hare, W.: Compositions of convex functions and fully linear models (2017)
  11. Rahmanpour, Fardin; Hosseini, Mohammad Mehdi; Maalek Ghaini, Farid Mohammad: Penalty-free method for nonsmooth constrained optimization via radial basis functions (2017)
  12. Tenne, Yoel: Machine-learning in optimization of expensive black-box functions (2017)
  13. Verdério, Adriano; Karas, Elizabeth W.; Pedroso, Lucas G.; Scheinberg, Katya: On the construction of quadratic models for derivative-free trust-region algorithms (2017)
  14. Cauwet, Marie-Liesse; Liu, Jialin; Rozière, Baptiste; Teytaud, Olivier: Algorithm portfolios for noisy optimization (2016)
  15. Garmanjani, R.; Júdice, D.; Vicente, L. N.: Trust-region methods without using derivatives: worst case complexity and the nonsmooth case (2016)
  16. Lazar, Markus; Jarre, Florian: Calibration by optimization without using derivatives (2016)
  17. Tröltzsch, Anke: A sequential quadratic programming algorithm for equality-constrained optimization without derivatives (2016)
  18. Wang, Jueyu; Zhu, Detong: Conjugate gradient path method without line search technique for derivative-free unconstrained optimization (2016)
  19. Audet, Charles; Le Digabel, Sébastien; Peyrega, Mathilde: Linear equalities in blackbox optimization (2015)
  20. Ferreira, Priscila S.; Karas, Elizabeth W.; Sachine, Mael: A globally convergent trust-region algorithm for unconstrained derivative-free optimization (2015)
