ve08

On the unconstrained optimization of partially separable functions

We consider the problem of minimizing a smooth objective function f of n real variables. For n > 200 we can only hope to locate a local minimum of f within the usual limitations on storage and computing time by using a minimization algorithm that exploits some special structure of f. One such possibility is that the Hessian G(x) of f(x) has clustered eigenvalues at a minimizer x*, in which case conjugate gradient and limited memory variable metric methods were found to work quite well. However, in general, the performance of these methods is rather unpredictable since, except for certain test functions, the eigenvalue structure of G at or near x* is usually not known. Therefore we pursue the traditional approach of approximating f by local quadratic models, which is computationally feasible even for large n if f has a certain separability structure. This structure is always implied by sparsity of G, and depends only on the way in which the components of x enter into f, and not on the numerical values of f or its derivatives. (Source: http://plato.asu.edu)
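
To make the separability structure concrete, the following is a minimal Python sketch (not part of ve08; the names elements, f, and hessian are purely illustrative) of an objective written as a sum of element functions, each depending on only a few components of x, so that the full Hessian assembles from small element Hessians and inherits a sparse pattern.

import numpy as np

# Illustrative partially separable objective:
#   f(x) = sum_i f_i(x_{I_i}),
# where each element function f_i depends only on the small index set I_i.
# Here each element is a convex quadratic 0.5 * x_I^T A_i x_I (an assumption
# made only to keep the example short; ve08 itself handles general elements).

n = 6
rng = np.random.default_rng(0)
elements = []
for idx in ([0, 1], [1, 2], [2, 3], [3, 4], [4, 5]):
    m = len(idx)
    B = rng.standard_normal((m, m))
    elements.append((idx, B @ B.T + np.eye(m)))  # small positive definite A_i

def f(x):
    # Total objective value: sum over element functions.
    return sum(0.5 * x[idx] @ A @ x[idx] for idx, A in elements)

def hessian(x):
    # The full Hessian is assembled from the small element Hessians, so it is
    # sparse whenever each element involves only a few variables.
    G = np.zeros((n, n))
    for idx, A in elements:
        G[np.ix_(idx, idx)] += A
    return G

x = rng.standard_normal(n)
print(f(x))
print(hessian(x))  # banded/sparse pattern induced by the element index sets

The same element-by-element bookkeeping is what makes partitioned quasi-Newton updates feasible for large n: each small element Hessian can be approximated and updated separately instead of storing one dense n-by-n matrix.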


References in zbMATH (referenced in 140 articles, 3 standard articles)

Showing results 1 to 20 of 140, sorted by year (citations).


  1. Galli, Leonardo; Galligari, Alessandro; Sciandrone, Marco: A unified convergence framework for nonmonotone inexact decomposition methods (2020)
  2. Hosseini Dehmiry, Alireza: The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique (2020)
  3. Kuřátko, Jan; Ratschan, Stefan: Solving reachability problems by a scalable constrained optimization method (2020)
  4. Chen, Xiaojun; Toint, Ph. L.; Wang, H.: Complexity of partially separable convexly constrained optimization with non-Lipschitzian singularities (2019)
  5. Gao, Wenbo; Goldfarb, Donald: Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions (2019)
  6. García, Oscar: Estimating reducible stochastic differential equations by conversion to a least-squares problem (2019)
  7. Petra, Cosmin G.; Chiang, Naiyuan; Anitescu, Mihai: A structured quasi-Newton algorithm for optimizing with incomplete Hessian information (2019)
  8. Tyagi, Hemant; Vybiral, Jan: Learning general sparse additive models from point queries in high dimensions (2019)
  9. Gao, Wenbo; Goldfarb, Donald: Block BFGS methods (2018)
  10. Kronqvist, Jan; Lundell, Andreas; Westerlund, Tapio: Reformulations for utilizing separability when solving convex MINLP problems (2018)
  11. Petra, C. G.; Qiang, F.; Lubin, M.; Huchette, J.: On efficient Hessian computation using the edge pushing algorithm in Julia (2018)
  12. Yuan, Gonglin; Sheng, Zhou; Wang, Bopeng; Hu, Wujie; Li, Chunnian: The global convergence of a modified BFGS method for nonconvex functions (2018)
  13. Cao, Hui-Ping; Li, Dong-Hui: Partitioned quasi-Newton methods for sparse nonlinear equations (2017)
  14. Yuan, Gonglin; Wei, Zengxin; Lu, Xiwen: Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search (2017)
  15. Cao, Huiping; Yao, Lan: A partitioned PSB method for partially separable unconstrained optimization problems (2016)
  16. Gaur, Daya R.; Hossain, Shahadat; Saha, Anik: Determining sparse Jacobian matrices using two-sided compression: an algorithm and lower bounds (2016)
  17. Janka, Dennis; Kirches, Christian; Sager, Sebastian; Wächter, Andreas: An SR1/BFGS SQP algorithm for nonconvex nonlinear programs with block-diagonal Hessian matrix (2016)
  18. Huang, Wen; Gallivan, K. A.; Absil, P.-A.: A Broyden class of quasi-Newton methods for Riemannian optimization (2015)
  19. Bidabadi, Narges; Mahdavi-Amiri, Nezam: Superlinearly convergent exact penalty methods with projected structured secant updates for constrained nonlinear least squares (2014)
  20. Dai, Yu-Hong; Yamashita, Nobuo: Analysis of sparse quasi-Newton updates with positive definite matrix completion (2014)
