darch: Package for deep architectures and restricted Boltzmann machines. The darch package is built on the basis of the code from G. E. Hinton and R. R. Salakhutdinov (available under "Matlab Code for deep belief nets"; last visited: 01.08.2013). The package generates neural networks with many layers (deep architectures) and trains them with the method introduced in the publications "A fast learning algorithm for deep belief nets" (G. E. Hinton, S. Osindero, Y. W. Teh) and "Reducing the dimensionality of data with neural networks" (G. E. Hinton, R. R. Salakhutdinov). This method combines pre-training with the contrastive divergence method published by G. E. Hinton (2002) and fine-tuning with well-known training algorithms such as backpropagation or conjugate gradient.
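The pre-training step described above can be illustrated with a minimal sketch of one-step contrastive divergence (CD-1) for a single restricted Boltzmann machine. This is a hypothetical NumPy illustration of the general algorithm, not darch's actual R implementation; the layer sizes, learning rate, and toy data are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy RBM: 6 visible and 3 hidden units (sizes chosen only for illustration)
n_vis, n_hid = 6, 3
W = rng.normal(0.0, 0.1, (n_vis, n_hid))  # weight matrix
b = np.zeros(n_vis)                        # visible biases
c = np.zeros(n_hid)                        # hidden biases

def cd1_update(v0, lr=0.1):
    """One CD-1 step: positive phase, one Gibbs step, parameter update."""
    global W, b, c
    # Positive phase: hidden probabilities given the data vector
    h0_prob = sigmoid(v0 @ W + c)
    h0 = (rng.random(n_hid) < h0_prob).astype(float)  # sample hidden states
    # Negative phase: reconstruct visibles, then recompute hidden probabilities
    v1_prob = sigmoid(h0 @ W.T + b)
    h1_prob = sigmoid(v1_prob @ W + c)
    # Contrastive-divergence approximation to the log-likelihood gradient
    W += lr * (np.outer(v0, h0_prob) - np.outer(v1_prob, h1_prob))
    b += lr * (v0 - v1_prob)
    c += lr * (h0_prob - h1_prob)
    return np.mean((v0 - v1_prob) ** 2)  # reconstruction error

# Train on a small fixed set of random binary vectors (toy data)
data = (rng.random((20, n_vis)) < 0.5).astype(float)
errs = [np.mean([cd1_update(v) for v in data]) for _ in range(50)]
```

In the full deep-belief-net scheme this CD step would be applied layer by layer, each trained RBM's hidden activations serving as the visible data for the next, before fine-tuning the stacked network with backpropagation or conjugate gradient.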

References in zbMATH (referenced in 138 articles)

Showing results 1 to 20 of 138, sorted by year (citations).


  1. Lyubchich, Vyacheslav; Woodland, Ryan J.: Using isotope composition and other node attributes to predict edges in fish trophic networks (2019)
  2. Vergari, Antonio; Di Mauro, Nicola; Esposito, Floriana: Visualizing and understanding sum-product networks (2019)
  3. Wu, Ying Nian; Gao, Ruiqi; Han, Tian; Zhu, Song-Chun: A tale of three probabilistic families: discriminative, descriptive, and generative models (2019)
  4. Burkhardt, Sophie; Kramer, Stefan: Online multi-label dependency topic models for text classification (2018)
  5. Decelle, A.; Fissore, G.; Furtlehner, C.: Thermodynamics of restricted Boltzmann machines and related learning dynamics (2018)
  6. Fang, Le-Heng; Lin, Wei; Luo, Qiang: Brain-inspired constructive learning algorithms with evolutionally additive nonlinear neurons (2018)
  7. Fei, Hongxiao; Tan, Fengyun: Bidirectional grid long short-term memory (BiGridLSTM): a method to address context-sensitivity and vanishing gradient (2018)
  8. Ghadai, Sambit; Balu, Aditya; Sarkar, Soumik; Krishnamurthy, Adarsh: Learning localized features in 3D CAD models for manufacturability analysis of drilled holes (2018)
  9. Guo, Weili; Wei, Haikun; Ong, Yew-Soon; Hervas, Jaime Rubio; Zhao, Junsheng; Wang, Hai; Zhang, Kanjian: Numerical analysis near singularities in RBF networks (2018)
  10. Huang, Zhiqi; Wang, Ran; Zhu, Hong; Zhu, Jie: Discovering the impact of hidden layer parameters on non-iterative training of feed-forward neural networks (2018)
  11. Jiang, Bai; Wu, Tung-Yu; Jin, Yifan; Wong, Wing H.: Convergence of contrastive divergence algorithm in exponential family (2018)
  12. Lee, Seunghye; Ha, Jingwan; Zokhirova, Mehriniso; Moon, Hyeonjoon; Lee, Jaehong: Background information of deep learning for structural engineering (2018)
  13. Ma, Yuzhe; He, Kun; Hopcroft, John; Shi, Pan: Neighbourhood-preserving dimension reduction via localised multidimensional scaling (2018)
  14. Morningstar, Alan; Melko, Roger G.: Deep learning the Ising model near criticality (2018)
  15. Peng, Xuan; Gao, Xunzhang; Li, Xiang: On better training the infinite restricted Boltzmann machines (2018)
  16. Rao, Pattabhi R. K.; Lalitha Devi, S.: Enhancing multi-document summarization using concepts (2018)
  17. Schmitz, Morgan A.; Heitz, Matthieu; Bonneel, Nicolas; Ngolè, Fred; Coeurjolly, David; Cuturi, Marco; Peyré, Gabriel; Starck, Jean-Luc: Wasserstein dictionary learning: optimal transport-based unsupervised nonlinear dictionary learning (2018)
  18. Ueltzhöffer, Kai: Deep active inference (2018)
  19. Wang, Xizhao (ed.); Cao, Weipeng (ed.): Editorial: Non-iterative approaches in training feed-forward neural networks and their applications (2018)
  20. Zhang, Changfan; Cheng, Xiang; Liu, Jianhua; He, Jing; Liu, Guangwei: Deep sparse autoencoder for feature extraction and diagnosis of locomotive adhesion status (2018)
