GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration.

Despite advances in scalable models, the inference tools used for Gaussian processes (GPs) have yet to fully capitalize on developments in computing hardware. We present an efficient and general approach to GP inference based on Blackbox Matrix-Matrix multiplication (BBMM). BBMM inference uses a modified batched version of the conjugate gradients algorithm to derive all terms for training and inference in a single call. BBMM reduces the asymptotic complexity of exact GP inference from O(n³) to O(n²). Adapting this algorithm to scalable approximations and complex GP models simply requires a routine for efficient matrix-matrix multiplication with the kernel and its derivative. In addition, BBMM uses a specialized preconditioner to substantially speed up convergence. In experiments we show that BBMM effectively uses GPU hardware to dramatically accelerate both exact GP inference and scalable approximations. Additionally, we provide GPyTorch, a software platform for scalable GP inference via BBMM, built on PyTorch.
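The "blackbox" idea in the abstract is that inference only needs a routine computing products with the kernel matrix, never the matrix factorization itself. As a minimal illustration (not the paper's batched, preconditioned BBMM algorithm), the sketch below solves a kernel system K x = y with plain conjugate gradients in NumPy, touching K only through a user-supplied matrix-vector product; the RBF kernel and noise level are illustrative choices, not taken from the paper.

```python
import numpy as np

def conjugate_gradients(matmul, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive definite A,
    accessing A only through the black-box product `matmul`."""
    x = np.zeros_like(b)
    r = b - matmul(x)          # residual
    p = r.copy()               # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = matmul(p)
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Toy SPD kernel matrix: RBF kernel plus a noise "jitter" term.
rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 1))
K = np.exp(-0.5 * (X - X.T) ** 2) + 0.1 * np.eye(50)
y = rng.normal(size=50)

# The solver sees only the product v -> K @ v, never K's factorization.
solution = conjugate_gradients(lambda v: K @ v, y)
```

Because only `matmul` is needed, the same loop applies unchanged to structured or approximate kernels whose products are cheap, which is what lets BBMM-style inference exploit GPU-friendly batched matrix-matrix work.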
References in zbMATH (referenced in 3 articles, 1 standard article)
- Burt, David R.; Rasmussen, Carl Edward; van der Wilk, Mark: Convergence of sparse variational inference in Gaussian processes regression (2020)
- Herlands, William; Neill, Daniel B.; Nickisch, Hannes; Wilson, Andrew Gordon: Change surfaces for expressive multidimensional changepoints and counterfactual prediction (2019)
- Gardner, Jacob R.; Pleiss, Geoff; Bindel, David; Weinberger, Kilian Q.; Wilson, Andrew Gordon: GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration (2018) arXiv