GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration.

Despite advances in scalable models, the inference tools used for Gaussian processes (GPs) have yet to fully capitalize on developments in computing hardware. We present an efficient and general approach to GP inference based on Blackbox Matrix-Matrix multiplication (BBMM). BBMM inference uses a modified batched version of the conjugate gradients algorithm to derive all terms for training and inference in a single call. BBMM reduces the asymptotic complexity of exact GP inference from O(n^3) to O(n^2). Adapting this algorithm to scalable approximations and complex GP models simply requires a routine for efficient matrix-matrix multiplication with the kernel and its derivative. In addition, BBMM uses a specialized preconditioner to substantially speed up convergence. In experiments we show that BBMM effectively uses GPU hardware to dramatically accelerate both exact GP inference and scalable approximations. Additionally, we provide GPyTorch, a software platform for scalable GP inference via BBMM, built on PyTorch.
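The core idea of BBMM is that every interaction with the kernel matrix K goes through multiplication routines, so solves like K^{-1}y (needed for the GP marginal likelihood and predictive mean) are computed iteratively rather than via an O(n^3) Cholesky factorization. The sketch below illustrates this with a plain conjugate-gradients solver in NumPy; it is an illustrative simplification, not GPyTorch's actual implementation, which runs a batched, preconditioned variant on the GPU and extracts additional terms (log-determinant, trace estimates) from the same iterations.

```python
import numpy as np

def conjugate_gradients(matmul, b, max_iter=100, tol=1e-10):
    """Solve K x = b using only a blackbox routine matmul(v) = K @ v.

    K is never formed, factorized, or inverted explicitly: all access to
    it is through matrix multiplication, which is the BBMM premise.
    """
    x = np.zeros_like(b)
    r = b - matmul(x)          # residual
    p = r.copy()               # search direction
    rs = r @ r
    for _ in range(max_iter):
        Kp = matmul(p)
        alpha = rs / (p @ Kp)
        x += alpha * p
        r -= alpha * Kp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Example: an RBF kernel matrix plus observation noise (symmetric PD)
rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 1))
K = np.exp(-((X - X.T) ** 2)) + 0.1 * np.eye(50)
y = rng.normal(size=50)
solve = conjugate_gradients(lambda v: K @ v, y)
# solve approximates K^{-1} y, the central quantity in GP training/prediction
```

Because only matrix-vector (or, in BBMM proper, matrix-matrix) products are needed, the same skeleton applies unchanged to structured or approximate kernels where fast multiplication is available, which is the sense in which the paper calls the method "blackbox."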

References in zbMATH (referenced in 15 articles, 1 standard article)

Sorted by year (citations)

  1. Cortinovis, Alice; Kressner, Daniel; Massei, Stefano: Divide-and-conquer methods for functions of matrices with banded or hierarchical low-rank structure (2022)
  2. Paul Scherer, Thomas Gaudelet, Alison Pouplin, Suraj M S, Jyothish Soman, Lindsay Edwards, Jake P. Taylor-King: PyRelationAL: A Library for Active Learning Research and Development (2022) arXiv
  3. Wang, Hengjie; Planas, Robert; Chandramowlishwaran, Aparna; Bostanabad, Ramin: Mosaic flows: a transferable deep learning framework for solving PDEs on unseen domains (2022)
  4. Grosnit, Antoine; Cowen-Rivers, Alexander I.; Tutunov, Rasul; Griffiths, Ryan-Rhys; Wang, Jun; Bou-Ammar, Haitham: Are we forgetting about compositional optimisers in Bayesian optimisation? (2021)
  5. Haiping Lu, Xianyuan Liu, Robert Turner, Peizhen Bai, Raivo E Koot, Shuo Zhou, Mustafa Chasmai, Lawrence Schobs: PyKale: Knowledge-Aware Machine Learning from Multiple Sources in Python (2021) arXiv
  6. Shabat, Gil; Choshen, Era; Or, Dvir Ben; Carmel, Nadav: Fast and accurate Gaussian kernel ridge regression using matrix decompositions for preconditioning (2021)
  7. Shi, Tianyi; Townsend, Alex: On the compressibility of tensors (2021)
  8. Tomkins, Sabina; Liao, Peng; Klasnja, Predrag; Murphy, Susan: IntelligentPooling: practical Thompson sampling for mHealth (2021)
  9. Vincent Fortuin, Adrià Garriga-Alonso, Mark van der Wilk, Laurence Aitchison: BNNpriors: A library for Bayesian neural network inference with different prior distributions (2021) arXiv
  10. Wilson, James T.; Borovitskiy, Viacheslav; Terenin, Alexander; Mostowsky, Peter; Deisenroth, Marc Peter: Pathwise conditioning of Gaussian processes (2021)
  11. Burt, David R.; Rasmussen, Carl Edward; van der Wilk, Mark: Convergence of sparse variational inference in Gaussian processes regression (2020)
  12. Tibo, Alessandro; Jaeger, Manfred; Frasconi, Paolo: Learning and interpreting multi-multi-instance learning networks (2020)
  13. Herlands, William; Neill, Daniel B.; Nickisch, Hannes; Wilson, Andrew Gordon: Change surfaces for expressive multidimensional changepoints and counterfactual prediction (2019)
  14. Zhu, Yinhao; Zabaras, Nicholas; Koutsourelakis, Phaedon-Stelios; Perdikaris, Paris: Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data (2019)
  15. Jacob R. Gardner, Geoff Pleiss, David Bindel, Kilian Q. Weinberger, Andrew Gordon Wilson: GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration (2018) arXiv