NOMAD

Algorithm 909: NOMAD: Nonlinear Optimization with the MADS Algorithm. NOMAD is software that implements the Mesh Adaptive Direct Search (MADS) algorithm for blackbox optimization under general nonlinear constraints. Blackbox optimization concerns problems in which the objective and constraint functions are typically given as costly programs that provide no derivative information and may fail to return a value for a significant fraction of the calls attempted. NOMAD is designed for such problems and aims to find the best possible solution with a small number of evaluations. The objective of this article is to describe the underlying algorithm, the software's functionalities, and its implementation.
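To make the blackbox setting concrete, the following is an illustrative sketch, not taken from the article, of the kind of blackbox NOMAD is built to drive. In NOMAD's batch mode the blackbox is a standalone executable: it receives a file containing the coordinates of the point to evaluate and writes the objective and constraint values to standard output, and a nonzero exit code (or unreadable output) marks the evaluation as failed. The two-variable objective and constraint below are placeholders chosen only for illustration.

// bb.cpp -- illustrative two-variable blackbox (placeholder functions).
// NOMAD (batch mode) invokes this executable with a file of point
// coordinates as its first argument and reads "f c" from standard output.
#include <cmath>
#include <fstream>
#include <iostream>

int main(int argc, char** argv) {
    if (argc < 2) return 1;               // no input point: signal a failed evaluation
    std::ifstream in(argv[1]);
    double x1, x2;
    if (!(in >> x1 >> x2)) return 1;      // unreadable point: signal a failed evaluation

    // Hypothetical costly simulation, stubbed out for illustration.
    double f = std::pow(x1 - 1.0, 2) + std::pow(x2 - 2.0, 2);  // objective to minimize
    double c = x1 * x1 + x2 * x2 - 4.0;                        // constraint, feasible when c <= 0

    std::cout << f << " " << c << std::endl;
    return 0;                             // nonzero exit would mean the evaluation failed
}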


References in zbMATH (referenced in 90 articles)

Showing results 1 to 20 of 90, sorted by year (citations).


  1. Alimo, Ryan; Beyhaghi, Pooriya; Bewley, Thomas R.: Delaunay-based derivative-free optimization via global surrogates. III: nonconvex constraints (2020)
  2. Audet, Charles; Côté, Pascal; Poissant, Catherine; Tribes, Christophe: Monotonic grey box direct search optimization (2020)
  3. Bajaj, Ishan; Hasan, M. M. Faruque: Global dynamic optimization using edge-concave underestimator (2020)
  4. Bhosekar, Atharv; Ierapetritou, Marianthi: A discontinuous derivative-free optimization framework for multi-enterprise supply chain (2020)
  5. Cocchi, Guido; Levato, Tommaso; Liuzzi, Giampaolo; Sciandrone, Marco: A concave optimization-based approach for sparse multiobjective programming (2020)
  6. Jiang, Su; Sun, Wenyue; Durlofsky, Louis J.: A data-space inversion procedure for well control optimization and closed-loop reservoir management (2020)
  7. Liuzzi, Giampaolo; Lucidi, Stefano; Rinaldi, Francesco: An algorithmic framework based on primitive directions and nonmonotone line searches for black-box optimization problems with integer variables (2020)
  8. Manno, Andrea; Amaldi, Edoardo; Casella, Francesco; Martelli, Emanuele: A local search method for costly black-box problems and its application to CSP plant start-up optimization refinement (2020)
  9. Sauk, Benjamin; Ploskas, Nikolaos; Sahinidis, Nikolaos: GPU parameter tuning for tall and skinny dense linear least squares problems (2020)
  10. Verma, Aekaansh; Wong, Kwai; Marsden, Alison L.: A concurrent implementation of the surrogate management framework with application to cardiovascular shape optimization (2020)
  11. Audet, Charles; Côté-Massicotte, Julien: Dynamic improvements of static surrogates in direct search optimization (2019)
  12. Audet, Charles; Le Digabel, Sébastien; Tribes, Christophe: The mesh adaptive direct search algorithm for granular and discrete variables (2019)
  13. Berahas, Albert S.; Byrd, Richard H.; Nocedal, Jorge: Derivative-free optimization of noisy functions via quasi-Newton methods (2019)
  14. Bűrmen, Árpád; Fajfar, Iztok: Mesh adaptive direct search with simplicial Hessian update (2019)
  15. Gratton, S.; Royer, C. W.; Vicente, L. N.; Zhang, Z.: Direct search based on probabilistic feasible descent for bound and linearly constrained problems (2019)
  16. Larson, Jeffrey; Menickelly, Matt; Wild, Stefan M.: Derivative-free optimization methods (2019)
  17. Liuzzi, Giampaolo; Lucidi, Stefano; Rinaldi, Francesco; Vicente, Luis Nunes: Trust-region methods for the derivative-free optimization of nonsmooth black-box functions (2019)
  18. Müller, Juliane; Day, Marcus: Surrogate optimization of computationally expensive black-box problems with hidden constraints (2019)
  19. Sanguinetti, Guido (ed.); Huynh-Thu, Vân Anh (ed.): Gene regulatory networks. Methods and protocols (2019)
  20. Amaioua, Nadir; Audet, Charles; Conn, Andrew R.; Le Digabel, Sébastien: Efficient solution of quadratically constrained quadratic subproblems within the mesh adaptive direct search algorithm (2018)
