PMTK

PMTK is a collection of Matlab/Octave functions written by Matt Dunham, Kevin Murphy, and various other contributors. The toolkit is primarily designed to accompany Kevin Murphy's textbook Machine Learning: A Probabilistic Perspective, but it can also be used independently of the book. The goal is to provide a unified conceptual and software framework encompassing machine learning, graphical models, and Bayesian statistics (hence the logo). Some methods from frequentist statistics, such as cross-validation, are also supported. Since December 2011, the toolbox has been in maintenance mode, meaning that bugs will be fixed but no new features will be added (at least not by Kevin or Matt).
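As a rough illustration of the toolkit's style, the minimal sketch below fits and applies a linear regression model using the fit/predict naming convention that pmtk3 models generally follow. The specific function names (initPmtk3, linregFit, linregPredict) and the synthetic data are assumptions for illustration; consult the toolkit's documentation for the exact interfaces.

    % Minimal sketch, assuming pmtk3 is installed and on the Matlab/Octave path,
    % and that linregFit/linregPredict follow the toolkit's fit/predict convention.
    initPmtk3();                          % assumed setup script from the pmtk3 repository
    N = 100; D = 5;
    X = randn(N, D);                      % synthetic inputs
    w = randn(D, 1);
    y = X * w + 0.1 * randn(N, 1);        % noisy linear responses
    model = linregFit(X, y);              % fit a linear regression model (assumed API)
    yhat  = linregPredict(model, X);      % predictions from the fitted model (assumed API)
    fprintf('training RMSE: %.3f\n', sqrt(mean((y - yhat).^2)));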


References in zbMATH (referenced in 177 articles)

Showing results 1 to 20 of 177, sorted by year (citations).


  1. Girolami, Mark; Febrianto, Eky; Yin, Ge; Cirak, Fehmi: The statistical finite element method (statFEM) for coherent synthesis of observation data and model predictions (2021)
  2. Keller, Rachael T.; Du, Qiang: Discovery of dynamics using linear multistep methods (2021)
  3. Kojevnikov, Denis; Marmer, Vadim; Song, Kyungchul: Limit theorems for network dependent random variables (2021)
  4. Montiel Olea, José Luis; Nesbit, James: (Machine) learning parameter regions (2021)
  5. Oliehoek, Frans A.; Witwicki, Stefan; Kaelbling, Leslie P.: A sufficient statistic for influence in structured multiagent environments (2021)
  6. Whiteley, Nick: Dimension-free Wasserstein contraction of nonlinear filters (2021)
  7. Bigoni, Caterina; Zhang, Zhenying; Hesthaven, Jan S.: Systematic sensor placement for structural anomaly detection in the absence of damaged states (2020)
  8. Chen, Nan; Majda, Andrew J.: Predicting observed and hidden extreme events in complex nonlinear dynamical systems with partial observations and short training time series (2020)
  9. Duan, Bojia; Yuan, Jiabin; Yu, Chao-Hua; Huang, Jianbang; Hsieh, Chang-Yu: A survey on HHL algorithm: from theory to application in quantum machine learning (2020)
  10. Dunlop, Matthew M.; Helin, Tapio; Stuart, Andrew M.: Hyperparameter estimation in Bayesian MAP estimation: parameterizations and consistency (2020)
  11. Gurevich, Pavel; Stuke, Hannes: Gradient conjugate priors and multi-layer neural networks (2020)
  12. He, Qizhi; Chen, Jiun-Shyan: A physics-constrained data-driven approach based on locally convex reconstruction for noisy database (2020)
  13. Holm-Jensen, Tue; Hansen, Thomas Mejer: Linear waveform tomography inversion using machine learning algorithms (2020)
  14. Hosseini, Reshad; Sra, Suvrit: An alternative to EM for Gaussian mixture models: batch and stochastic Riemannian optimization (2020)
  15. Hung, Ying-Chao; Michailidis, George; PakHai Lok, Horace: Locating infinite discontinuities in computer experiments (2020)
  16. Inatsu, Yu; Karasuyama, Masayuki; Inoue, Keiichi; Kandori, Hideki; Takeuchi, Ichiro: Active learning of Bayesian linear models with high-dimensional binary features by parameter confidence-region estimation (2020)
  17. Kaandorp, Mikael L. A.; Dwight, Richard P.: Data-driven modelling of the Reynolds stress tensor using random forests with invariance (2020)
  18. Keshavarzzadeh, Vahid; Kirby, Robert M.; Narayan, Akil: Stress-based topology optimization under uncertainty via simulation-based Gaussian process (2020)
  19. Kim, D. H.; Zohdi, T. I.; Singh, R. P.: Modeling, simulation and machine learning for rapid process control of multiphase flowing foods (2020)
  20. Kuwajima, Hiroshi; Yasuoka, Hirotoshi; Nakae, Toshihiro: Engineering problems in machine learning systems (2020)
