Adam

Adam: A Method for Stochastic Optimization. We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has low memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and problems with very noisy and/or sparse gradients. The hyper-parameters have intuitive interpretations and typically require little tuning. Some connections to related algorithms that inspired Adam are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results demonstrate that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we discuss AdaMax, a variant of Adam based on the infinity norm.
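
The adaptive moment estimates mentioned above are exponential moving averages of the gradient and the squared gradient, with a bias correction for their initialization at zero. The sketch below is a minimal, illustrative NumPy version of one update step for Adam and for the infinity-norm variant AdaMax; the default hyper-parameters follow the paper, while the function and variable names (adam_step, adamax_step, theta, g, m, v, u) are our own and not part of any library API.

    import numpy as np

    def adam_step(theta, g, m, v, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam update for parameters theta given gradient g at step t (t >= 1)."""
        m = beta1 * m + (1 - beta1) * g        # biased first-moment estimate
        v = beta2 * v + (1 - beta2) * g**2     # biased second-moment estimate
        m_hat = m / (1 - beta1**t)             # bias-corrected first moment
        v_hat = v / (1 - beta2**t)             # bias-corrected second moment
        theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

    def adamax_step(theta, g, m, u, t, alpha=0.002, beta1=0.9, beta2=0.999):
        """One AdaMax update: the second moment is replaced by an infinity-norm estimate."""
        m = beta1 * m + (1 - beta1) * g
        u = np.maximum(beta2 * u, np.abs(g))   # exponentially weighted infinity norm
        # assumes u has become positive (i.e., some nonzero gradient has been seen)
        theta = theta - (alpha / (1 - beta1**t)) * m / u
        return theta, m, u

With m, v, u initialized to zeros and t counting from 1, repeatedly calling adam_step (or adamax_step) on minibatch gradients reproduces the per-parameter updates the abstract refers to; the invariance to diagonal rescaling comes from dividing the first-moment estimate by the (square-rooted or infinity-norm) second-moment estimate.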


References in zbMATH (referenced in 187 articles)

Showing results 41 to 60 of 187, sorted by year (citations).


  1. Wu, Pin; Sun, Junwu; Chang, Xuting; Zhang, Wenjie; Arcucci, Rossella; Guo, Yike; Pain, Christopher C.: Data-driven reduced order model with temporal convolutional neural network (2020)
  2. Xie, Fangzhou: Wasserstein index generation model: automatic generation of time-series index with application to economic policy uncertainty (2020)
  3. Yang, Liu; Zhang, Dongkun; Karniadakis, George Em: Physics-informed generative adversarial networks for stochastic differential equations (2020)
  4. Yao, Houpu; Gao, Yi; Liu, Yongming: FEA-Net: a physics-guided data-driven model for efficient mechanical response prediction (2020)
  5. Ye, Han-Jia; Sheng, Xiang-Rong; Zhan, De-Chuan: Few-shot learning with adaptively initialized task optimizer: a practical meta-learning approach (2020)
  6. Ahmadi, Ahmadreza; Tani, Jun: A novel predictive-coding-inspired variational RNN model for online prediction and recognition (2019)
  7. Beck, Christian; E., Weinan; Jentzen, Arnulf: Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations (2019)
  8. Biau, Gérard; Scornet, Erwan; Welbl, Johannes: Neural random forests (2019)
  9. Bubba, Tatiana A.; Kutyniok, Gitta; Lassas, Matti; März, Maximilian; Samek, Wojciech; Siltanen, Samuli; Srinivasan, Vignesh: Learning the invisible: a hybrid deep learning-shearlet framework for limited angle computed tomography (2019)
  10. Buehler, H.; Gonon, L.; Teichmann, J.; Wood, B.: Deep hedging (2019)
  11. Cao, Ying; Shen, Zuo-Jun Max: Quantile forecasting and data-driven inventory management under nonstationary demand (2019)
  12. Oh, ChangYong; Gavves, Efstratios; Welling, Max: BOCK: Bayesian optimization with cylindrical kernels (2019) arXiv
  13. Chan, Shing; Elsheikh, Ahmed H.: Parametric generation of conditional geological realizations using generative neural networks (2019)
  14. Chan-Wai-Nam, Quentin; Mikael, Joseph; Warin, Xavier: Machine learning for semilinear PDEs (2019)
  15. Chen, Guorong; Li, Tiange; Chen, Qijun; Ren, Shaofei; Wang, Chao; Li, Shaofan: Application of deep learning neural network to identify collision load conditions based on permanent plastic deformation of shell structures (2019)
  16. Dahl, Astrid; Bonilla, Edwin V.: Grouped Gaussian processes for solar power prediction (2019)
  17. Do, Kien; Tran, Truyen; Nguyen, Thin; Venkatesh, Svetha: Attentional multilabel learning over graphs: a message passing approach (2019)
  18. Riba, Edgar; Mishkin, Dmytro; Ponsa, Daniel; Rublee, Ethan; Bradski, Gary: Kornia: an open source differentiable computer vision library for PyTorch (2019) arXiv
  19. Fujii, Masaaki; Takahashi, Akihiko; Takahashi, Masayuki: Asymptotic expansion as prior knowledge in deep learning method for high dimensional BSDEs (2019)
  20. Geete, Kanu; Pandey, Manish: A noise-based stabilizer for convolutional neural networks (2019)
