• GELUs

  • Referenced in 5 articles [sw36443]
  • propose the Gaussian Error Linear Unit (GELU), a high-performing neural network activation function ... the GELU activation function is xΦ(x), where Φ(x) is the standard Gaussian cumulative distribution function ... the GELU nonlinearity weights inputs by their value, rather than gating inputs by their sign ... perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations...
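  • A minimal sketch of the quoted formula, assuming plain Python and the error-function form of the standard Gaussian CDF, Φ(x) = ½(1 + erf(x/√2)); the function name gelu here is illustrative, not part of the referenced package:

        import math

        def gelu(x: float) -> float:
            # Exact GELU: x * Phi(x), with Phi the standard Gaussian CDF
            # written via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
            return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

        # Inputs are weighted by their value rather than gated by their sign:
        # negative inputs are attenuated smoothly instead of zeroed as in ReLU.
        for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
            print(f"gelu({x:+.1f}) = {gelu(x):+.4f}")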
  • Adam

  • Referenced in 737 articles [sw22205]
  • Adam: A Method for Stochastic Optimization. We introduce...
  • SGDR

  • Referenced in 15 articles [sw30752]
  • SGDR: Stochastic Gradient Descent with Warm Restarts. Restart...
  • Zoneout

  • Referenced in 5 articles [sw36444]
  • Zoneout: Regularizing RNNs by Randomly Preserving Hidden Activations...