UOBYQA: unconstrained optimization by quadratic approximation

A new algorithm for general unconstrained optimization calculations is described. It takes account of the curvature of the objective function by forming quadratic models by interpolation; consequently, no first derivatives are required. A typical iteration of the algorithm generates a new vector of variables either by minimizing the quadratic model subject to a trust region bound, or by a procedure that should improve the accuracy of the model. The paper addresses the initial positions of the interpolation points and the adjustment of the trust region radii.

The algorithm works explicitly with the Lagrange functions of the interpolation equations, so their coefficients are updated whenever an interpolation point is moved. The Lagrange functions assist the procedure that improves the model, and they also provide an estimate of the error of the quadratic approximation of the function being minimized. Results are very promising for functions of fewer than twenty variables.
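The two ingredients of the iteration described above can be illustrated with a minimal sketch: fit a quadratic model to function values at (n+1)(n+2)/2 interpolation points, then take an approximate step within the trust region. This is not Powell's implementation; the point placement is ad hoc, and the trust-region subproblem is approximated by random sampling rather than solved exactly, purely for illustration.

```python
import numpy as np

def quadratic_model(points, fvals):
    # For n = 2 variables a quadratic model
    #   m(x) = c + g.x + 0.5 x^T H x
    # has (n+1)(n+2)/2 = 6 coefficients, fixed by interpolating
    # the objective at 6 points (the counting used in UOBYQA).
    rows = [[1.0, x, y, 0.5 * x * x, x * y, 0.5 * y * y] for (x, y) in points]
    c, gx, gy, hxx, hxy, hyy = np.linalg.solve(np.array(rows), np.array(fvals))
    return c, np.array([gx, gy]), np.array([[hxx, hxy], [hxy, hyy]])

def trust_region_step(c, g, H, delta, samples=200):
    # Crude stand-in for the trust-region subproblem: evaluate the
    # model at random points with ||d|| <= delta and keep the best.
    # (The actual algorithm minimizes the model within the bound.)
    rng = np.random.default_rng(0)
    best_d, best_m = np.zeros(2), c
    for _ in range(samples):
        d = rng.normal(size=2)
        d *= delta * np.sqrt(rng.random()) / np.linalg.norm(d)
        m = c + g @ d + 0.5 * d @ H @ d
        if m < best_m:
            best_d, best_m = d, m
    return best_d

if __name__ == "__main__":
    f = lambda x, y: (x - 1.0) ** 2 + 2.0 * (y + 0.5) ** 2  # toy objective
    pts = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1), (1, 1)]
    c, g, H = quadratic_model(pts, [f(x, y) for (x, y) in pts])
    d = trust_region_step(c, g, H, delta=2.0)
    print(d)  # approximate step toward the minimizer near (1, -0.5)
```

Because the toy objective is itself quadratic, the interpolation recovers it exactly; for a general objective the model error is what the Lagrange functions of the interpolation equations help to estimate.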

References in zbMATH (referenced in 58 articles, 1 standard article)

Showing results 1 to 20 of 58.
Sorted by year (citations)


  1. Ahmadvand, Mohammad; Esmaeilbeigi, Mohsen; Kamandi, Ahmad; Yaghoobi, Farajollah Mohammadi: An improved hybrid-ORBIT algorithm based on point sorting and MLE technique (2019)
  2. Berahas, Albert S.; Byrd, Richard H.; Nocedal, Jorge: Derivative-free optimization of noisy functions via quasi-Newton methods (2019)
  3. Cartis, Coralia; Roberts, Lindon: A derivative-free Gauss-Newton method (2019)
  4. Larson, Jeffrey; Menickelly, Matt; Wild, Stefan M.: Derivative-free optimization methods (2019)
  5. Audet, Charles; Ihaddadene, Amina; Le Digabel, Sébastien; Tribes, Christophe: Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm (2018)
  6. Chen, R.; Menickelly, M.; Scheinberg, K.: Stochastic optimization using a trust-region method and random models (2018)
  7. Gobbi, Paula E.: Childcare and commitment within households (2018)
  8. He, Xinyu; Hu, Yangzhou; Powell, Warren B.: Optimal learning for nonlinear parametric belief models over multidimensional continuous spaces (2018)
  9. Maggiar, Alvaro; Wächter, Andreas; Dolinskaya, Irina S.; Staum, Jeremy: A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling (2018)
  10. Shashaani, Sara; Hashemi, Fatemeh S.; Pasupathy, Raghu: ASTRO-DF: a class of adaptive sampling trust-region algorithms for derivative-free stochastic optimization (2018)
  11. Zhou, Zhe; Bai, Fusheng: An adaptive framework for costly black-box global optimization based on radial basis function interpolation (2018)
  12. Cartis, Coralia; Roberts, Lindon: A derivative-free Gauss-Newton method (2017) arXiv
  13. Hare, W.: Compositions of convex functions and fully linear models (2017)
  14. Rahmanpour, Fardin; Hosseini, Mohammad Mehdi; Maalek Ghaini, Farid Mohammad: Penalty-free method for nonsmooth constrained optimization via radial basis functions (2017)
  15. Regis, Rommel G.; Wild, Stefan M.: CONORBIT: constrained optimization by radial basis function interpolation in trust regions (2017)
  16. Verdério, Adriano; Karas, Elizabeth W.; Pedroso, Lucas G.; Scheinberg, Katya: On the construction of quadratic models for derivative-free trust-region algorithms (2017)
  17. Chen, Xiaojun; Kelley, C. T.: Optimization with hidden constraints and embedded Monte Carlo computations (2016)
  18. Larson, Jeffrey; Billups, Stephen C.: Stochastic derivative-free optimization using a trust region framework (2016)
  19. Ni, Qin; Jiang, Cui; Liu, Hao: A new direct search method based on separable fractional interpolation model (2016)
  20. Wang, Jueyu; Zhu, Detong: Conjugate gradient path method without line search technique for derivative-free unconstrained optimization (2016)
