darch: Package for deep architectures and Restricted Boltzmann Machines. The darch package is built on the code from G. E. Hinton and R. R. Salakhutdinov (available as Matlab Code for deep belief nets; last visited: 01.08.2013). The package generates neural networks with many layers (deep architectures) and trains them with the method introduced in the publications "A fast learning algorithm for deep belief nets" (G. E. Hinton, S. Osindero, Y. W. Teh) and "Reducing the dimensionality of data with neural networks" (G. E. Hinton, R. R. Salakhutdinov). This method comprises pre-training with the contrastive divergence method published by G. E. Hinton (2002) and fine-tuning with commonly known training algorithms such as backpropagation or conjugate gradient.
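The pre-training step mentioned above can be illustrated with a minimal sketch of contrastive divergence (CD-1) for a single Restricted Boltzmann Machine. This is a hedged, self-contained NumPy illustration of the method from Hinton (2002), not the darch package's own R API; the layer sizes, learning rate, and toy data are arbitrary assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy binary data: 6 samples, 4 visible units (illustrative only).
data = rng.integers(0, 2, size=(6, 4)).astype(float)

n_visible, n_hidden = 4, 3          # arbitrary sizes for the sketch
W = rng.normal(0.0, 0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)           # visible biases
b_h = np.zeros(n_hidden)            # hidden biases
lr = 0.1                            # learning rate (assumption)

for epoch in range(100):
    # Positive phase: hidden probabilities given the data.
    h_prob = sigmoid(data @ W + b_h)
    h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
    # Negative phase: one Gibbs step back to a visible reconstruction (CD-1).
    v_recon = sigmoid(h_sample @ W.T + b_v)
    h_recon = sigmoid(v_recon @ W + b_h)
    # Contrastive divergence update: data statistics minus model statistics.
    W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
    b_v += lr * (data - v_recon).mean(axis=0)
    b_h += lr * (h_prob - h_recon).mean(axis=0)
```

In a deep belief net, each trained RBM's hidden activations become the input for the next layer's RBM; the stacked weights then initialize a feed-forward network that is fine-tuned with backpropagation or conjugate gradient, as the description states.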

References in zbMATH (referenced in 262 articles)

Showing results 1 to 20 of 262.
Sorted by year (citations)


  1. Gu, Shihao; Kelly, Bryan; Xiu, Dacheng: Autoencoder asset pricing models (2021)
  2. Jiang, Su; Durlofsky, Louis J.: Data-space inversion using a recurrent autoencoder for time-series parameterization (2021)
  3. Su, Liang; Zhang, Jing-Quan; Huang, Xin; LaFave, James M.: Automatic operational modal analysis of structures based on image recognition of stabilization diagrams with uncertainty quantification (2021)
  4. Cui, Ying; He, Ziyu; Pang, Jong-Shi: Multicomposite nonconvex optimization for training deep neural networks (2020)
  5. Desana, Mattia; Schnörr, Christoph: Sum-product graphical models (2020)
  6. Drori, Iddo: Deep variational inference (2020)
  7. Duan, Shiyu; Yu, Shujian; Chen, Yunmei; Principe, Jose C.: On kernel method-based connectionist models and supervised deep learning without backpropagation (2020)
  8. Fehrman, Benjamin; Gess, Benjamin; Jentzen, Arnulf: Convergence rates for the stochastic gradient descent method for non-convex objective functions (2020)
  9. Frazier-Logue, Noah; Hanson, Stephen José: The stochastic delta rule: faster and more accurate deep learning through adaptive weight noise (2020)
  10. Gao, Jing; Li, Peng; Chen, Zhikui; Zhang, Jianing: A survey on deep learning for multimodal data fusion (2020)
  11. Gong, Maoguo; Pan, Ke; Xie, Yu; Qin, A. K.; Tang, Zedong: Preserving differential privacy in deep neural networks with relevance-based adaptive noise imposition (2020)
  12. Har-Peled, Sariel; Jones, Mitchell: On separating points by lines (2020)
  13. Jiang, Huiping; Wang, Zequn; Jiao, Rui; Chen, Mei: Research on emotional classification of EEG based on convolutional neural network (2020)
  14. Lee, Kookjin; Carlberg, Kevin T.: Model reduction of dynamical systems on nonlinear manifolds using deep convolutional autoencoders (2020)
  15. Li, Xiang; Ning, Shaowu; Liu, Zhanli; Yan, Ziming; Luo, Chengcheng; Zhuang, Zhuo: Designing phononic crystal with anticipated band gap through a deep learning based data-driven method (2020)
  16. Oishi, Atsuya; Yagawa, Genki: A surface-to-surface contact search method enhanced by deep learning (2020)
  17. Osman, Yousuf Babiker M.; Li, Wei: Soft sensor modeling of key effluent parameters in wastewater treatment process based on SAE-NN (2020)
  18. Otsuka, Hajime; Takemoto, Kenta: Deep learning and k-means clustering in heterotic string vacua with line bundles (2020)
  19. Pan, Wei; Wang, Jing; Sun, Deyan: Establishing simple relationship between eigenvector and matrix elements (2020)
  20. Pradhan, Anshuman; Mukerji, Tapan: Seismic Bayesian evidential learning: estimation and uncertainty quantification of sub-resolution reservoir properties (2020)
