References in zbMATH (referenced in 55 articles)

Showing results 1 to 20 of 55.
Sorted by year (citations)


  1. Goda, Takashi; Hironaka, Tomohiko; Kitade, Wataru; Foster, Adam: Unbiased MLMC stochastic gradient-based optimization of Bayesian experimental designs (2022)
  2. Barakat, Anas; Bianchi, Pascal: Convergence and dynamical behavior of the ADAM algorithm for nonconvex stochastic optimization (2021)
  3. Barakat, Anas; Bianchi, Pascal; Hachem, Walid; Schechtman, Sholom: Stochastic optimization with momentum: convergence, fluctuations, and traps avoidance (2021)
  4. Duruisseaux, Valentin; Schmitt, Jeremy; Leok, Melvin: Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization (2021)
  5. Flori, Andrea; Regoli, Daniele: Revealing pairs-trading opportunities with long short-term memory networks (2021)
  6. Grosnit, Antoine; Cowen-Rivers, Alexander I.; Tutunov, Rasul; Griffiths, Ryan-Rhys; Wang, Jun; Bou-Ammar, Haitham: Are we forgetting about compositional optimisers in Bayesian optimisation? (2021)
  7. Gunnarsson, Björn Rafn; vanden Broucke, Seppe; Baesens, Bart; Óskarsdóttir, María; Lemahieu, Wilfried: Deep learning for credit scoring: do or don’t? (2021)
  8. Jomaa, Hadi S.; Schmidt-Thieme, Lars; Grabocka, Josif: Dataset2Vec: learning dataset meta-features (2021)
  9. Kovachki, Nikola B.; Stuart, Andrew M.: Continuous time analysis of momentum methods (2021)
  10. Lakhmiri, Dounia; Le Digabel, Sébastien; Tribes, Christophe: HyperNOMAD: hyperparameter optimization of deep neural networks using mesh adaptive direct search (2021)
  11. Liu, Yang; Roosta, Fred: Convergence of Newton-MR under inexact Hessian information (2021)
  12. Prazeres, Mariana; Oberman, Adam M.: Stochastic gradient descent with Polyak’s learning rate (2021)
  13. Tang, Xueying; Zhang, Susu; Wang, Zhi; Liu, Jingchen; Ying, Zhiliang: ProcData: an R package for process data analysis (2021)
  14. Yan, Liang; Zhou, Tao: Stein variational gradient descent with local approximations (2021)
  15. Borisyak, Maxim; Ryzhikov, Artem; Ustyuzhanin, Andrey; Derkach, Denis; Ratnikov, Fedor; Mineeva, Olga: ((1 + \varepsilon))-class classification: an anomaly detection method for highly imbalanced or incomplete data sets (2020)
  16. Ciosek, Kamil; Whiteson, Shimon: Expected policy gradients for reinforcement learning (2020)
  17. Belotto da Silva, André; Gazeau, Maxime: A general system of differential equations to model first-order adaptive algorithms (2020)
  18. Do, Dieu T. T.; Nguyen-Xuan, H.; Lee, Jaehong: Material optimization of tri-directional functionally graded plates by using deep neural network and isogeometric multimesh design approach (2020)
  19. Duan, Shiyu; Yu, Shujian; Chen, Yunmei; Principe, Jose C.: On kernel method-based connectionist models and supervised deep learning without backpropagation (2020)
  20. França, Guilherme; Sulam, Jeremias; Robinson, Daniel P.; Vidal, René: Conformal symplectic and relativistic optimization (2020)
