TN

Newton-type minimization via the Lanczos method

This paper discusses the use of the linear conjugate-gradient method (developed via the Lanczos method) in the solution of large-scale unconstrained minimization problems. It is shown how the equivalent Lanczos characterization of the linear conjugate-gradient method may be exploited to define a modified Newton method that can be applied to problems which do not necessarily have positive-definite Hessian matrices. This derivation also makes it possible to compute a negative-curvature direction at a stationary point. The modified Lanczos algorithm just described requires up to n iterations to compute the search direction, where n denotes the number of variables of the problem. The idea of a truncated Newton method is to terminate these iterations earlier. A preconditioned truncated Newton method is described that defines a search direction interpolating between the direction given by a nonlinear conjugate-gradient-type method and a modified Newton direction. Numerical results are given which show the promising performance of truncated Newton methods. (Source: http://plato.asu.edu)
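The following is a minimal Python sketch of the truncation idea summarized above: an inner conjugate-gradient loop approximately solves the Newton system, stops early once a residual-based test is met, and exits with a usable descent (or negative-curvature) direction if the Hessian proves indefinite. The function and parameter names (truncated_newton_step, grad, hessvec, cg_tol) and the particular residual-based stopping rule are illustrative assumptions; the paper's actual algorithm is formulated through the Lanczos recurrence and includes preconditioning, which this sketch omits.

import numpy as np

def truncated_newton_step(grad, hessvec, x, cg_tol=0.5, max_cg_iter=None):
    """Approximately solve H(x) d = -grad(x) by conjugate gradients,
    truncating the inner iteration early (a minimal sketch, not the TN code)."""
    g = grad(x)
    n = g.size
    if max_cg_iter is None:
        max_cg_iter = n                     # at most n inner iterations
    d = np.zeros(n)                         # current approximate Newton direction
    r = -g.copy()                           # residual of H d = -g
    p = r.copy()                            # CG search direction
    rs_old = r @ r
    for _ in range(max_cg_iter):
        Hp = hessvec(x, p)                  # Hessian-vector product along p
        curvature = p @ Hp
        if curvature <= 0.0:
            # Indefinite Hessian detected: return the direction built so far,
            # or the negative-curvature direction itself on the first pass
            # (which then equals the steepest-descent direction -g).
            return d if d.any() else p
        alpha = rs_old / curvature
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        # Truncation test: stop once the residual is small relative to ||g||.
        if np.sqrt(rs_new) <= cg_tol * np.linalg.norm(g):
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return d

Combined with a line search along the returned direction, this gives a basic truncated Newton iteration; hessvec may be an exact Hessian-vector product or a finite-difference approximation such as (grad(x + eps*v) - grad(x)) / eps. SciPy's optimize.minimize offers related implementations (method='Newton-CG', and the bound-constrained 'TNC', the latter reportedly derived from Nash's Fortran TN/TNBC code).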


References in zbMATH (referenced in 134 articles)

Showing results 1 to 20 of 134, sorted by year (citations).


  1. Curtis, Frank E.; Robinson, Daniel P.; Royer, Clément W.; Wright, Stephen J.: Trust-region Newton-CG with strong second-order complexity guarantees for nonconvex optimization (2021)
  2. Fasano, Giovanni; Pesenti, Raffaele: Polarity and conjugacy for quadratic hypersurfaces: a unified framework with recent advances (2021)
  3. Hu, Xinyu; Qian, Min; Cheng, Bin; Cheung, Ying Kuen: Personalized policy learning using longitudinal mobile health data (2021)
  4. Al-Baali, Mehiddin; Caliciotti, Andrea; Fasano, Giovanni; Roma, Massimo: A class of approximate inverse preconditioners based on Krylov-subspace methods for large-scale nonconvex optimization (2020)
  5. Andrei, Neculai: Diagonal approximation of the Hessian by finite differences for unconstrained optimization (2020)
  6. Brás, C. P.; Martínez, J. M.; Raydan, M.: Large-scale unconstrained optimization using separable cubic modeling and matrix-free subspace minimization (2020)
  7. Chang, Haw-Shiuan; Vembu, Shankar; Mohan, Sunil; Uppaal, Rheeya; McCallum, Andrew: Using error decay prediction to overcome practical issues of deep active learning for named entity recognition (2020)
  8. De Leone, Renato; Fasano, Giovanni; Roma, Massimo; Sergeyev, Yaroslav D.: Iterative Grossone-based computation of negative curvature directions in large-scale optimization (2020)
  9. Fung, Samy Wu; Di, Zichao: Multigrid optimization for large-scale ptychographic phase retrieval (2020)
  10. Guadarrama, Lili; Prieto, Carlos; Van Houten, Elijah: An optimization problem based on a Bayesian approach for the 2D Helmholtz equation (2020)
  11. Andrei, Neculai: A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization (2019)
  12. Andrei, Neculai: A diagonal quasi-Newton updating method for unconstrained optimization (2019)
  13. Austin, Anthony P.; Di, Zichao; Leyffer, Sven; Wild, Stefan M.: Simultaneous sensing error recovery and tomographic inversion using an optimization-based approach (2019)
  14. Busseti, Enzo; Moursi, Walaa M.; Boyd, Stephen: Solution refinement at regular points of conic problems (2019)
  15. Grote, Marcus J.; Nahum, Uri: Adaptive eigenspace for multi-parameter inverse scattering problems (2019)
  16. Józsa, Tamas I.; Balaras, E.; Kashtalyan, M.; Borthwick, A. G. L.; Viola, I. M.: Active and passive in-plane wall fluctuations in turbulent channel flows (2019)
  17. Xu, Min; Zhou, Bojian; He, Jie: Improving truncated Newton method for the logit-based stochastic user equilibrium problem (2019)
  18. Zhou, W.; Akrotirianakis, I. G.; Yektamaram, S.; Griffin, J. D.: A matrix-free line-search algorithm for nonconvex optimization (2019)
  19. Caliciotti, Andrea; Fasano, Giovanni; Nash, Stephen G.; Roma, Massimo: An adaptive truncation criterion, for linesearch-based truncated Newton methods in large scale nonconvex optimization (2018)
  20. Campos, Juan S.; Parpas, Panos: A multigrid approach to SDP relaxations of sparse polynomial optimization problems (2018)
