darch

darch: Package for deep architectures and Restricted Boltzmann Machines. The darch package is built on the code from G. E. Hinton and R. R. Salakhutdinov (available under Matlab Code for deep belief nets; last visit: 01.08.2013). The package generates neural networks with many layers (deep architectures) and trains them with the method introduced in the publications "A fast learning algorithm for deep belief nets" (G. E. Hinton, S. Osindero, Y. W. Teh) and "Reducing the dimensionality of data with neural networks" (G. E. Hinton, R. R. Salakhutdinov). This method combines unsupervised pre-training with the contrastive divergence method published by G. E. Hinton (2002) and fine-tuning with well-known training algorithms such as backpropagation or conjugate gradient.
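
For orientation, a minimal usage sketch in R, assuming the formula interface of a recent darch release; the exact parameter names (e.g. rbm.numEpochs, darch.numEpochs) may differ between package versions:

  library(darch)
  data(iris)

  # Pre-train a 4-20-3 network layer-wise with contrastive divergence,
  # then fine-tune the stacked network with backpropagation.
  model <- darch(Species ~ ., data = iris,
                 layers = c(4, 20, 3),    # units per layer: input, hidden, output
                 rbm.numEpochs = 10,      # contrastive-divergence pre-training epochs
                 darch.numEpochs = 100)   # backpropagation fine-tuning epochs

  predictions <- predict(model, newdata = iris, type = "class")

The two epoch parameters mirror the two training phases described above: each layer is first trained unsupervised as a Restricted Boltzmann Machine, and the resulting weights initialize the network for supervised fine-tuning.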


References in zbMATH (referenced in 210 articles)

Showing results 1 to 20 of 210, sorted by year (citations).


  1. Desana, Mattia; Schnörr, Christoph: Sum-product graphical models (2020)
  2. Har-Peled, Sariel; Jones, Mitchell: On separating points by lines (2020)
  3. Li, Xiang; Ning, Shaowu; Liu, Zhanli; Yan, Ziming; Luo, Chengcheng; Zhuang, Zhuo: Designing phononic crystal with anticipated band gap through a deep learning based data-driven method (2020)
  4. Oishi, Atsuya; Yagawa, Genki: A surface-to-surface contact search method enhanced by deep learning (2020)
  5. Tsionas, Mike G.; Andrikopoulos, Athanasios: On a high-dimensional model representation method based on copulas (2020)
  6. van Engelen, Jesper E.; Hoos, Holger H.: A survey on semi-supervised learning (2020)
  7. Zheng, Kunming; Hu, Youmin; Wu, Bo: Intelligent fuzzy sliding mode control for complex robot system with disturbances (2020)
  8. Zhou, Ding-Xuan: Universality of deep convolutional neural networks (2020)
  9. Zhou, Yicheng; Lu, Zhenzhou; Hu, Jinghan; Hu, Yingshi: Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square (2020)
  10. Bühlmann, Peter: Comments on “Data science, big data and statistics” (2019)
  11. Choi, Arthur; Wang, Ruocheng; Darwiche, Adnan: On the relative expressiveness of Bayesian and neural networks (2019)
  12. Chui, Charles K.; Lin, Shao-Bo; Zhou, Ding-Xuan: Deep neural networks for rotation-invariance approximation and learning (2019)
  13. Comsa, Iulia M.; Firsching, Moritz; Fischbacher, Thomas: SO(8) supergravity and the magic of machine learning (2019)
  14. Czaja, Wojciech; Li, Weilin: Analysis of time-frequency scattering transforms (2019)
  15. Flynn, Thomas; Vázquez-Abad, Felisa: A simultaneous perturbation weak derivative estimator for stochastic neural networks (2019)
  16. Ghahari, Azar; Newlands, Nathaniel K.; Lyubchich, Vyacheslav; Gel, Yulia R.: Deep learning at the interface of agricultural insurance risk and spatio-temporal uncertainty in weather extremes (2019)
  17. Görgel, Pelin; Simsek, Ahmet: Face recognition via deep stacked denoising sparse autoencoders (DSDSA) (2019)
  18. Grillo, Sebastián A.: A linear relation between input and first layer in neural networks (2019)
  19. Gui, Yuanmiao; Wang, Rujing; Wei, Yuanyuan; Wang, Xue: DNN-PPI: a large-scale prediction of protein-protein interactions based on deep neural networks (2019)
  20. Herzog, S.; Wörgötter, F.; Parlitz, U.: Convolutional autoencoder and conditional random fields hybrid for predicting spatial-temporal chaos (2019)
