GPML

Gaussian processes for machine learning (GPML) toolbox. The GPML toolbox provides a wide range of functionality for Gaussian process (GP) inference and prediction. GPs are specified by mean and covariance functions; we offer a library of simple mean and covariance functions and mechanisms to compose more complex ones. Several likelihood functions are supported, including Gaussian and heavy-tailed likelihoods for regression as well as others suitable for classification. Finally, a range of inference methods is provided, including exact and variational inference, Expectation Propagation, Laplace's method for dealing with non-Gaussian likelihoods, and FITC for dealing with large regression tasks.
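GPML itself is a MATLAB/Octave toolbox, but the exact-inference path it supports (here with a zero mean function, a squared-exponential covariance, and a Gaussian likelihood) can be sketched in a few lines of NumPy. This is an illustrative sketch of the underlying computation, not the toolbox's API; the function names and default hyperparameters below are our own choices.

```python
import numpy as np

def sq_exp_kernel(x1, x2, ell=1.0, sf=1.0):
    """Squared-exponential covariance with lengthscale ell and
    signal std sf -- one of the simple covariance building blocks."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def gp_predict(x, y, xs, ell=1.0, sf=1.0, sn=0.1):
    """Exact GP regression: posterior mean and variance at test
    inputs xs, assuming a zero mean function and Gaussian noise
    with std sn (the standard Cholesky-based formulation)."""
    K = sq_exp_kernel(x, x, ell, sf) + sn**2 * np.eye(len(x))
    Ks = sq_exp_kernel(x, xs, ell, sf)
    Kss = sq_exp_kernel(xs, xs, ell, sf)
    L = np.linalg.cholesky(K)                      # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha                              # posterior mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0)      # posterior variance
    return mu, var

# Fit noisy samples of sin(x) and predict at three test points.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(x) + 0.05 * rng.standard_normal(20)
xs = np.array([1.0, 3.0, 5.0])
mu, var = gp_predict(x, y, xs, sn=0.05)
```

With 20 training points the posterior mean closely tracks sin at the test inputs, and the posterior variance stays small and non-negative; swapping in other mean/covariance/likelihood choices is what the toolbox's compositional design makes convenient.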


References in zbMATH (referenced in 32 articles, 1 standard article)

Showing results 1 to 20 of 32.
Sorted by year (citations)


  1. Chen, Hanshu; Meng, Zeng; Zhou, Huanlin: A hybrid framework of efficient multi-objective optimization of stiffened shells with imperfection (2020)
  2. Hartmann, Marcelo; Vanhatalo, Jarno: Laplace approximation and natural gradient for Gaussian process regression with heteroscedastic Student-t model (2019)
  3. Herlands, William; Neill, Daniel B.; Nickisch, Hannes; Wilson, Andrew Gordon: Change surfaces for expressive multidimensional changepoints and counterfactual prediction (2019)
  4. Li, Yongqiang; Yang, Chengzan; Hou, Zhongsheng; Feng, Yuanjing; Yin, Chenkun: Data-driven approximate Q-learning stabilization with optimality error bound analysis (2019)
  5. Mao, Zhiping; Li, Zhen; Karniadakis, George Em: Nonlocal flocking dynamics: learning the fractional order of PDEs from particle simulations (2019)
  6. Price, Ilan; Fowkes, Jaroslav; Hopman, Daniel: Gaussian processes for unconstraining demand (2019)
  7. Jo, Seongil; Choi, Taeryon; Park, Beomjo; Lenk, Peter: bsamGP: an R package for Bayesian spectral analysis models using Gaussian process priors (2019) not zbMATH
  8. Seshadri, Pranay; Yuchi, Shaowu; Parks, Geoffrey T.: Dimension reduction via Gaussian ridge functions (2019)
  9. Bradford, Eric; Schweidtmann, Artur M.; Lapkin, Alexei: Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm (2018)
  10. Schulz, Eric; Speekenbrink, Maarten; Krause, Andreas: A tutorial on Gaussian process regression: modelling, exploring, and exploiting functions (2018)
  11. Van Steenkiste, Tom; van der Herten, Joachim; Couckuyt, Ivo; Dhaene, Tom: Sequential sensitivity analysis of expensive black-box simulators with metamodelling (2018)
  12. Bussas, Matthias; Sawade, Christoph; Kühn, Nicolas; Scheffer, Tobias; Landwehr, Niels: Varying-coefficient models for geospatial transfer learning (2017)
  13. Ghosh, Sanmitra; Dasmahapatra, Srinandan; Maharatna, Koushik: Fast approximate Bayesian computation for estimating parameters in differential equations (2017)
  14. Li, Yongqiang; Hou, Zhongsheng; Feng, Yuanjing; Chi, Ronghu: Data-driven approximate value iteration with optimality error bound analysis (2017)
  15. Matthews, Alexander G. de G.; van der Wilk, Mark; Nickson, Tom; Fujii, Keisuke; Boukouvalas, Alexis; León-Villagrá, Pablo; Ghahramani, Zoubin; Hensman, James: GPflow: a Gaussian process library using TensorFlow (2017)
  16. Pang, Guofei; Perdikaris, Paris; Cai, Wei; Karniadakis, George Em: Discovering variable fractional orders of advection-dispersion equations from field data using multi-fidelity Bayesian optimization (2017)
  17. Zhang, Cheng; Shahbaba, Babak; Zhao, Hongkai: Hamiltonian Monte Carlo acceleration using surrogate functions with random bases (2017)
  18. Belyaev, Mikhail; Burnaev, Evgeny; Kapushev, Y.: Computationally efficient algorithm for Gaussian process regression in case of structured samples (2016)
  19. Mooij, Joris M.; Peters, Jonas; Janzing, Dominik; Zscheischler, Jakob; Schölkopf, Bernhard: Distinguishing cause from effect using observational data: methods and benchmarks (2016)
  20. Bouveyron, C.; Fauvel, M.; Girard, S.: Kernel discriminant analysis and clustering with parsimonious Gaussian process models (2015)
