Gaussian processes for machine learning (GPML) toolbox. The GPML toolbox provides a wide range of functionality for Gaussian process (GP) inference and prediction. GPs are specified by mean and covariance functions; we offer a library of simple mean and covariance functions and mechanisms to compose more complex ones. Several likelihood functions are supported, including Gaussian and heavy-tailed likelihoods for regression, as well as others suitable for classification. Finally, a range of inference methods is provided, including exact and variational inference, Expectation Propagation, Laplace's method for dealing with non-Gaussian likelihoods, and FITC for dealing with large regression tasks.
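To make the exact-inference case described above concrete, here is a minimal NumPy sketch of GP regression with a zero mean function, a squared-exponential covariance, and a Gaussian likelihood. This is an illustration of the underlying computation, not the GPML toolbox API (which is MATLAB/Octave); the function and parameter names (`sq_exp_kernel`, `gp_predict`, `ell`, `sf`, `sn`) are chosen here for clarity.

```python
import numpy as np

def sq_exp_kernel(a, b, ell=1.0, sf=1.0):
    # Squared-exponential covariance: k(x, x') = sf^2 * exp(-(x - x')^2 / (2 ell^2))
    d2 = (a[:, None] - b[None, :]) ** 2
    return sf ** 2 * np.exp(-0.5 * d2 / ell ** 2)

def gp_predict(x, y, xs, ell=1.0, sf=1.0, sn=0.1):
    """Exact GP regression: predictive mean and variance at test inputs xs."""
    K = sq_exp_kernel(x, x, ell, sf) + sn ** 2 * np.eye(len(x))  # train covariance + noise
    Ks = sq_exp_kernel(x, xs, ell, sf)                           # train/test cross-covariance
    Kss = sq_exp_kernel(xs, xs, ell, sf)                         # test covariance
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))          # K^{-1} y via Cholesky
    mu = Ks.T @ alpha                                            # predictive mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)                  # predictive variance
    return mu, var
```

With near-zero observation noise, the predictive mean at the training inputs reproduces the training targets, and the predictive variance there shrinks toward zero; non-Gaussian likelihoods require the approximate inference schemes (Laplace, EP, variational) mentioned in the abstract.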

References in zbMATH (referenced in 44 articles, 1 standard article)

Showing results 1 to 20 of 44.
Sorted by year (citations)


  1. Jamie Fairbrother, Christopher Nemeth, Maxime Rischard, Johanni Brea, Thomas Pinder: GaussianProcesses.jl: A Nonparametric Bayes Package for the Julia Language (2022) not zbMATH
  2. Chakraborty, S.; Adhikari, S.; Ganguli, R.: The role of surrogate models in the development of digital twins of dynamic systems (2021)
  3. Chen, Kai; van Laarhoven, Twan; Marchiori, Elena: Gaussian processes with skewed Laplace spectral mixture kernels for long-term forecasting (2021)
  4. Pareek, Parikshit; Wang, Chuan; Nguyen, Hung D.: Non-parametric probabilistic load flow using Gaussian process learning (2021)
  5. Sun, Shiliang; Sun, Xuli; Liu, Qiuyang: Multi-view Gaussian processes with posterior consistency (2021)
  6. Yang, Xiu; Tartakovsky, Guzel; Tartakovsky, Alexandre M.: Physics information aided kriging using stochastic simulation models (2021)
  7. Bartels, Simon; Hennig, Philipp: Conjugate gradients for kernel machines (2020)
  8. Binois, Mickael; Picheny, Victor; Taillandier, Patrick; Habbal, Abderrahmane: The Kalai-Smorodinsky solution for many-objective Bayesian optimization (2020)
  9. Burkhart, Michael C.; Brandman, David M.; Franco, Brian; Hochberg, Leigh R.; Harrison, Matthew T.: The discriminative Kalman filter for Bayesian filtering with nonlinear and non-Gaussian observation models (2020)
  10. Chen, Chen; Liao, Qifeng: ANOVA Gaussian process modeling for high-dimensional stochastic computational models (2020)
  11. Chen, Hanshu; Meng, Zeng; Zhou, Huanlin: A hybrid framework of efficient multi-objective optimization of stiffened shells with imperfection (2020)
  12. Hartmann, Marcelo; Vanhatalo, Jarno: Laplace approximation and natural gradient for Gaussian process regression with heteroscedastic Student-(t) model (2019)
  13. Herlands, William; Neill, Daniel B.; Nickisch, Hannes; Wilson, Andrew Gordon: Change surfaces for expressive multidimensional changepoints and counterfactual prediction (2019)
  14. Li, Yongqiang; Yang, Chengzan; Hou, Zhongsheng; Feng, Yuanjing; Yin, Chenkun: Data-driven approximate Q-learning stabilization with optimality error bound analysis (2019)
  15. Mao, Zhiping; Li, Zhen; Karniadakis, George Em: Nonlocal flocking dynamics: learning the fractional order of PDEs from particle simulations (2019)
  16. Pang, Guofei; Yang, Liu; Karniadakis, George Em: Neural-net-induced Gaussian process regression for function approximation and PDE solution (2019)
  17. Price, Ilan; Fowkes, Jaroslav; Hopman, Daniel: Gaussian processes for unconstraining demand (2019)
  18. Seongil Jo; Taeryon Choi; Beomjo Park; Peter Lenk: bsamGP: An R Package for Bayesian Spectral Analysis Models Using Gaussian Process Priors (2019) not zbMATH
  19. Seshadri, Pranay; Yuchi, Shaowu; Parks, Geoffrey T.: Dimension reduction via Gaussian ridge functions (2019)
  20. Bradford, Eric; Schweidtmann, Artur M.; Lapkin, Alexei: Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm (2018)
