darch
darch: Package for deep architectures and Restricted Boltzmann Machines. The darch package is built on the code from G. E. Hinton and R. R. Salakhutdinov (available as Matlab Code for deep belief nets; last visited: 01.08.2013). The package generates neural networks with many layers (deep architectures) and trains them with the method introduced in the publications "A fast learning algorithm for deep belief nets" (G. E. Hinton, S. Osindero, Y. W. Teh) and "Reducing the dimensionality of data with neural networks" (G. E. Hinton, R. R. Salakhutdinov). This method combines pre-training with the contrastive divergence method published by G. E. Hinton (2002) and fine-tuning with commonly used training algorithms such as backpropagation or conjugate gradient.
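The pre-training step mentioned above can be illustrated with a minimal sketch of one contrastive divergence (CD-1) update for a binary Restricted Boltzmann Machine. This is a generic NumPy illustration of the algorithm, not darch's R interface; all names (`cd1_update`, `sigmoid`, the toy shapes) are assumptions chosen for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_vis, b_hid, v0, lr=0.1, rng=None):
    """One CD-1 step for a binary RBM (illustrative sketch, not darch's API)."""
    if rng is None:
        rng = np.random.default_rng(0)
    # Positive phase: hidden probabilities given the data batch v0
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one Gibbs step back to the visible units and up again
    v1_prob = sigmoid(h0 @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)
    # Gradient approximation: <v h>_data - <v h>_model, averaged over the batch
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / v0.shape[0]
    b_vis += lr * (v0 - v1_prob).mean(axis=0)
    b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid

# Toy usage: 6 visible units, 3 hidden units, a batch of 8 binary vectors
rng = np.random.default_rng(42)
W = rng.normal(0.0, 0.01, (6, 3))
b_vis, b_hid = np.zeros(6), np.zeros(3)
v = (rng.random((8, 6)) < 0.5).astype(float)
W, b_vis, b_hid = cd1_update(W, b_vis, b_hid, v, rng=rng)
```

In a deep belief net, this update would be applied layer by layer (each trained RBM's hidden activations become the next RBM's input) before the fine-tuning pass with backpropagation or conjugate gradient.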
References in zbMATH (referenced in 255 articles)
Showing results 1 to 20 of 255.
Sorted by year.
- Cui, Ying; He, Ziyu; Pang, Jong-Shi: Multicomposite nonconvex optimization for training deep neural networks (2020)
- Desana, Mattia; Schnörr, Christoph: Sum-product graphical models (2020)
- Drori, Iddo: Deep variational inference (2020)
- Duan, Shiyu; Yu, Shujian; Chen, Yunmei; Principe, Jose C.: On kernel method-based connectionist models and supervised deep learning without backpropagation (2020)
- Fehrman, Benjamin; Gess, Benjamin; Jentzen, Arnulf: Convergence rates for the stochastic gradient descent method for non-convex objective functions (2020)
- Frazier-Logue, Noah; Hanson, Stephen José: The stochastic delta rule: faster and more accurate deep learning through adaptive weight noise (2020)
- Gao, Jing; Li, Peng; Chen, Zhikui; Zhang, Jianing: A survey on deep learning for multimodal data fusion (2020)
- Gong, Maoguo; Pan, Ke; Xie, Yu; Qin, A. K.; Tang, Zedong: Preserving differential privacy in deep neural networks with relevance-based adaptive noise imposition (2020)
- Har-Peled, Sariel; Jones, Mitchell: On separating points by lines (2020)
- Jiang, Huiping; Wang, Zequn; Jiao, Rui; Chen, Mei: Research on emotional classification of EEG based on convolutional neural network (2020)
- Li, Xiang; Ning, Shaowu; Liu, Zhanli; Yan, Ziming; Luo, Chengcheng; Zhuang, Zhuo: Designing phononic crystal with anticipated band gap through a deep learning based data-driven method (2020)
- Oishi, Atsuya; Yagawa, Genki: A surface-to-surface contact search method enhanced by deep learning (2020)
- Osman, Yousuf Babiker M.; Li, Wei: Soft sensor modeling of key effluent parameters in wastewater treatment process based on SAE-NN (2020)
- Otsuka, Hajime; Takemoto, Kenta: Deep learning and k-means clustering in heterotic string vacua with line bundles (2020)
- Pan, Wei; Wang, Jing; Sun, Deyan: Establishing simple relationship between eigenvector and matrix elements (2020)
- Pradhan, Anshuman; Mukerji, Tapan: Seismic Bayesian evidential learning: estimation and uncertainty quantification of sub-resolution reservoir properties (2020)
- Puligilla, Shivakanth Chary; Jayaraman, Balaji: Assessment of end-to-end and sequential data-driven learning for non-intrusive modeling of fluid flows (2020)
- Qi, Anna; Yang, Lihua; Huang, Chao: Convergence of Markovian stochastic approximation for Markov random fields with hidden variables (2020)
- Ruehle, Fabian: Data science applications to string theory (2020)
- Shang, Yifan; Mao, Xiaobo; Zhao, Yuping; Li, Nan; Wang, Yang: Classification of tongue color based on convolution neural network (2020)