Evolino: hybrid neuroevolution / optimal linear search for sequence learning.

Current neural network learning algorithms are limited in their ability to model non-linear dynamical systems. Most supervised gradient-based recurrent neural networks (RNNs) suffer from a vanishing error signal that prevents learning from inputs far in the past. Those that do not still have problems when there are numerous local minima. We introduce a general framework for sequence learning, EVOlution of recurrent systems with LINear outputs (Evolino). Evolino uses evolution to discover good RNN hidden node weights, while using methods such as linear regression or quadratic programming to compute optimal linear mappings from hidden state to output. Using the Long Short-Term Memory RNN architecture, the method is tested in three very different problem domains: 1) context-sensitive languages, 2) multiple superimposed sine waves, and 3) the Mackey-Glass system. Evolino performs exceptionally well across all three tasks, whereas other methods show notable deficiencies in some of them.
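The core idea of the framework, evolving the recurrent weights while solving for the linear readout in closed form, can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a plain tanh RNN in place of the LSTM architecture, a simple (1+4) evolution strategy in place of the Enforced SubPopulations neuroevolution used in Evolino, and a toy sine-prediction task; all names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_in, n_hid = 200, 1, 8

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 8 * np.pi, T + 1)
seq = np.sin(t)
inputs, targets = seq[:-1, None], seq[1:, None]

def run_rnn(params, inputs):
    """Roll a tanh RNN over the sequence; return the hidden-state matrix."""
    W_in, W_rec = params
    h = np.zeros(n_hid)
    H = np.empty((len(inputs), n_hid))
    for i, x in enumerate(inputs):
        h = np.tanh(W_in @ x + W_rec @ h)
        H[i] = h
    return H

def fitness(params):
    """Evolino-style evaluation: the output weights are not evolved but
    computed by linear regression; fitness is the residual error."""
    H = run_rnn(params, inputs)
    W_out, *_ = np.linalg.lstsq(H, targets, rcond=None)
    err = np.mean((H @ W_out - targets) ** 2)
    return err, W_out

def mutate(params, sigma=0.1):
    return tuple(W + sigma * rng.standard_normal(W.shape) for W in params)

# (1+4) evolution strategy with elitism, so the best error never increases.
best = (rng.standard_normal((n_hid, n_in)),
        0.5 * rng.standard_normal((n_hid, n_hid)))
best_err, best_Wout = fitness(best)
init_err = best_err
for gen in range(30):
    for child in (mutate(best) for _ in range(4)):
        err, W_out = fitness(child)
        if err < best_err:
            best, best_err, best_Wout = child, err, W_out

print(f"initial MSE {init_err:.4f} -> evolved MSE {best_err:.4f}")
```

The division of labor is the point: evolution only searches the hard, non-convex part of the weight space (the recurrent dynamics), while the output mapping, which is linear in the hidden states, is obtained optimally and cheaply by least squares at every fitness evaluation.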

References in zbMATH (referenced in 18 articles, 1 standard article)

Showing results 1 to 18 of 18.
Sorted by year (citations)

  1. Yu, Yong; Si, Xiaosheng; Hu, Changhua; Zhang, Jianxun: A review of recurrent neural networks: LSTM cells and network architectures (2019)
  2. Vlachas, Pantelis R.; Byeon, Wonmin; Wan, Zhong Y.; Sapsis, Themistoklis P.; Koumoutsakos, Petros: Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks (2018)
  3. Morando, S.; Jemei, S.; Hissel, D.; Gouriveau, R.; Zerhouni, N.: ANOVA method applied to proton exchange membrane fuel cell ageing forecasting using an echo state network (2017)
  4. Schmidhuber, Jürgen: Deep learning in neural networks: an overview (2015)
  5. Yilmaz, Ozgur: Symbolic computation using cellular automata-based hyperdimensional computing (2015)
  6. Jan Koutník, Klaus Greff, Faustino Gomez, Jürgen Schmidhuber: A Clockwork RNN (2014) arXiv
  7. Chandra, Rohitash; Frean, Marcus; Zhang, Mengjie: Adapting modularity during learning in cooperative co-evolutionary recurrent neural networks (2012)
  8. Koryakin, Danil; Lohmann, Johannes; Butz, Martin V.: Balanced echo state networks (2012)
  9. Monner, Derek; Reggia, James A.: A generalized LSTM-like training algorithm for second-order recurrent neural networks (2012)
  10. Gallicchio, Claudio; Micheli, Alessio: Architectural and Markovian factors of echo state networks (2011)
  11. Reinhart, R. Felix; Steil, Jochen J.: A constrained regularization approach for input-driven recurrent neural networks (2011)
  12. Holzmann, Georg; Hauser, Helmut: Echo state networks with filter neurons and a delay&sum readout (2010)
  13. Lukoševičius, Mantas; Jaeger, Herbert: Reservoir computing approaches to recurrent neural network training (2009)
  14. Montana, David; Vanwyk, Eric; Brinn, Marshall; Montana, Joshua; Milligan, Stephen: Evolution of internal dynamics for neural network nodes (2009)
  15. Jaeger, Herbert; Lukoševičius, Mantas; Popovici, Dan; Siewert, Udo: Optimization and applications of echo state networks with leaky-integrator neurons (2007)
  16. Schmidhuber, Jürgen; Wierstra, Daan; Gagliolo, Matteo; Gomez, Faustino: Training recurrent networks by Evolino (2007)
  17. Xue, Yanbo; Yang, Le; Haykin, Simon: Decoupled echo state networks with lateral inhibition (2007)
  18. Schmidhuber, Jürgen; Gagliolo, Matteo; Wierstra, Daan; Gomez, Faustino J.: Evolino for recurrent support vector machines (2005)