DFO

DFO is a Fortran package for solving general nonlinear optimization problems with the following characteristics: they are relatively small scale (fewer than 100 variables), their objective function is relatively expensive to compute, and derivatives of the objective are unavailable and cannot be estimated efficiently. The function evaluations may also be contaminated by noise. Such problems arise, for example, in engineering design, where each objective function evaluation runs a simulation package that is treated as a black box.
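
As a rough illustration of the problem class only (not DFO's Fortran interface), the sketch below minimizes a small, noisy, expensive black-box objective with a generic derivative-free method from SciPy; Nelder-Mead is used here purely as a stand-in for DFO's model-based trust-region approach, and the objective is a hypothetical placeholder for a simulation code.

```python
# Illustrative sketch only: a generic derivative-free solve of a small,
# noisy black-box objective. This is NOT the DFO package or its API.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def expensive_black_box(x):
    # Placeholder for a costly simulation (e.g. an engineering design code):
    # a smooth test function plus a small amount of evaluation noise.
    base = (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2
    return base + 1e-3 * rng.standard_normal()

x0 = np.zeros(2)
result = minimize(
    expensive_black_box,
    x0,
    method="Nelder-Mead",                     # derivative-free: no gradients needed
    options={"maxfev": 200,                   # cap the number of expensive evaluations
             "xatol": 1e-3, "fatol": 1e-3},
)
print(result.x, result.fun, result.nfev)
```

Capping the number of function evaluations (maxfev above) reflects the intended setting: each evaluation is expensive, so the solver's budget, not its per-iteration cost, dominates.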


References in zbMATH (referenced in 129 articles, 1 standard article)

Showing results 61 to 80 of 129, sorted by year (citations).
  1. Auger, Anne; Teytaud, Olivier: Continuous lunches are free plus the design of optimal optimization algorithms (2010)
  2. Colson, Benoît; Bruyneel, Michaël; Grihon, Stéphane; Raick, Caroline; Remouchamps, Alain: Optimization methods for advanced design of aircraft panels: a comparison (2010)
  3. Jakobsson, Stefan; Patriksson, Michael; Rudholm, Johan; Wojciechowski, Adam: A method for simulation based optimization using radial basis functions (2010)
  4. Probst, M.; Lülfesmann, M.; Nicolai, M.; Bücker, H. M.; Behr, M.; Bischof, C. H.: Sensitivity of optimal shapes of artificial grafts with respect to flow parameters (2010)
  5. Scheinberg, K.; Toint, Ph. L.: Self-correcting geometry in model-based algorithms for derivative-free unconstrained optimization (2010)
  6. Conn, Andrew R.; Scheinberg, Katya; Vicente, Luís N.: Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points (2009)
  7. Deng, Geng; Ferris, Michael C.: Variable-number sample-path optimization (2009)
  8. Du, Peng; Peng, Jiming; Terlaky, Tamás: Self-adaptive support vector machines: modelling and experiments (2009)
  9. Fasano, Giovanni; Morales, José Luis; Nocedal, Jorge: On the geometry phase in model-based algorithms for derivative-free optimization (2009)
  10. Finkel, D. E.; Kelley, C. T.: Convergence analysis of sampling methods for perturbed Lipschitz functions (2009)
  11. Hvattum, Lars Magnus; Glover, Fred: Finding local optima of high-dimensional functions using direct search methods (2009)
  12. Mutapcic, Almir; Boyd, Stephen: Cutting-set methods for robust convex optimization with pessimizing oracles (2009)
  13. Regis, Rommel G.; Shoemaker, Christine A.: Parallel stochastic global optimization using radial basis functions (2009)
  14. Sainvitu, Caroline: How much do approximate derivatives hurt filter methods? (2009)
  15. Tenne, Yoel; Armfield, S. W.: A framework for memetic optimization using variable global and local surrogate models (2009)
  16. Caboussat, A.; Francois, M. M.; Glowinski, R.; Kothe, D. B.; Sicilian, J. M.: A numerical method for interface reconstruction of triple points within a volume tracking algorithm (2008)
  17. Conn, Andrew R.; Scheinberg, Katya; Vicente, Luís N.: Geometry of sample sets in derivative-free optimization: Polynomial regression and underdetermined interpolation (2008)
  18. Conn, A. R.; Scheinberg, K.; Vicente, Luís N.: Geometry of interpolation sets in derivative free optimization (2008)
  19. Diniz-Ehrhardt, M. A.; Martínez, J. M.; Raydan, M.: A derivative-free nonmonotone line-search technique for unconstrained optimization (2008)
  20. Uğur, Ö.; Karasözen, B.; Schäfer, M.; Yapıcı, K.: Derivative free optimization methods for optimizing stirrer configurations (2008)