Exact Hamiltonian Monte Carlo for Truncated Multivariate Gaussians

We present a Hamiltonian Monte Carlo algorithm to sample from multivariate Gaussian distributions in which the target space is constrained by linear and quadratic inequalities or products thereof. The Hamiltonian equations of motion can be integrated exactly, and there are no parameters to tune. The algorithm mixes faster and is more efficient than Gibbs sampling. The runtime depends on the number and shape of the constraints, but the algorithm is highly parallelizable. In many cases, we can exploit special structure in the covariance matrices of the untruncated Gaussian to further speed up the runtime. A simple extension of the algorithm permits sampling from distributions whose log-density is piecewise quadratic, as in the “Bayesian Lasso” model.
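To make the idea concrete, here is a minimal sketch of the linear-constraint case for a standard Gaussian truncated to {x : Fx + g >= 0}. For a Gaussian target the Hamiltonian trajectory is a harmonic oscillation, x(t) = x cos t + p sin t, so it can be followed exactly: one solves in closed form for the first time a constraint wall is hit and reflects the momentum off that wall. This is an illustrative reimplementation, not the authors' reference code; the function name, the fixed travel time, and the reduction to the whitened (identity-covariance) case are assumptions for the sketch, and the quadratic-constraint and piecewise-quadratic extensions are omitted.

```python
import numpy as np

def exact_hmc_tmg(x0, F, g, n_samples, travel_time=np.pi / 2, seed=0):
    """Sketch: exact HMC for a standard Gaussian truncated to {x: F @ x + g >= 0}.

    A general N(mu, Sigma) target can be reduced to this case by whitening.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    assert np.all(F @ x + g >= 0), "start point must be feasible"
    out = np.empty((n_samples, x.size))
    for i in range(n_samples):
        p = rng.standard_normal(x.size)        # fresh Gaussian momentum
        t_left = travel_time
        while True:
            a, b = F @ x, F @ p                # F_j x(t) = a_j cos t + b_j sin t
            r = np.hypot(a, b)
            t_hit, j_hit = np.inf, -1
            for j in range(len(g)):
                if r[j] <= abs(g[j]):          # trajectory never reaches this wall
                    continue
                phi = np.arctan2(b[j], a[j])   # so F_j x(t) = r_j cos(t - phi)
                w = np.arccos(-g[j] / r[j])    # wall touched when cos(t - phi) = -g_j / r_j
                for t in ((phi - w) % (2 * np.pi), (phi + w) % (2 * np.pi)):
                    if 1e-10 < t < t_hit:      # earliest strictly positive hit time
                        t_hit, j_hit = t, j
            if t_hit >= t_left:                # no wall before the travel time runs out
                x, p = (x * np.cos(t_left) + p * np.sin(t_left),
                        p * np.cos(t_left) - x * np.sin(t_left))
                break
            # advance exactly to the wall, then reflect the momentum off it
            x, p = (x * np.cos(t_hit) + p * np.sin(t_hit),
                    p * np.cos(t_hit) - x * np.sin(t_hit))
            f = F[j_hit]
            p = p - 2 * (f @ p) / (f @ f) * f
            t_left -= t_hit
        out[i] = x
    return out
```

For example, `exact_hmc_tmg([1.0, 1.0], np.eye(2), np.zeros(2), 200)` draws from a 2D standard Gaussian restricted to the positive orthant; every trajectory segment is followed analytically, so there is no step size or integrator error to tune, matching the "no parameters to tune" claim above.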

References in zbMATH (referenced in 20 articles)

Sorted by year (most recent first).

  1. Chalkis, Apostolos; Emiris, Ioannis Z.; Fisikopoulos, Vissarion; Repouskos, Panagiotis; Tsigaridas, Elias: Efficient sampling in spectrahedra and volume approximation (2022)
  2. Huang, Jingfang; Cao, Jian; Fang, Fuhui; Genton, Marc G.; Keyes, David E.; Turkiyyah, George: An O(N) algorithm for computing expectation of N-dimensional truncated multi-variate normal distribution. I: Fundamentals (2021)
  3. Schultheiss, Christoph; Renaux, Claude; Bühlmann, Peter: Multicarving for high-dimensional post-selection inference (2021)
  4. Bachoc, François; Helbert, Céline; Picheny, Victor: Gaussian process optimization with failures: classification and convergence proof (2020)
  5. López-Lopera, Andrés F.; Bachoc, François; Durrande, Nicolas; Rohmer, Jérémy; Idier, Déborah; Roustant, Olivier: Approximating Gaussian process emulators with linear inequality constraints and noisy observations via MC and MCMC (2020)
  6. Mulgrave, Jami J.; Ghosal, Subhashis: Bayesian inference in nonparanormal graphical models (2020)
  7. Nishimura, Akihiko; Dunson, David: Recycling intermediate steps to improve Hamiltonian Monte Carlo (2020)
  8. Ray, Pallavi; Pati, Debdeep; Bhattacharya, Anirban: Efficient Bayesian shape-restricted function estimation with constrained Gaussian process priors (2020)
  9. Bachoc, François; Lagnoux, Agnès; López-Lopera, Andrés F.: Maximum likelihood estimation for Gaussian processes under inequality constraints (2019)
  10. Benjamini, Yuval; Taylor, Jonathan; Irizarry, Rafael A.: Selection-corrected statistical inference for region detection with high-throughput assays (2019)
  11. Gunawan, D.; Tran, M.-N.; Suzuki, K.; Dick, J.; Kohn, R.: Computationally efficient Bayesian estimation of high-dimensional Archimedean copulas with discrete and mixed margins (2019)
  12. Azzimonti, Dario; Ginsbourger, David: Estimating orthant probabilities of high-dimensional Gaussian vectors with an application to set estimation (2018)
  13. Bierkens, Joris; Bouchard-Côté, Alexandre; Doucet, Arnaud; Duncan, Andrew B.; Fearnhead, Paul; Lienart, Thibaut; Roberts, Gareth; Vollmer, Sebastian J.: Piecewise deterministic Markov processes for scalable Monte Carlo on restricted domains (2018)
  14. Jacobovic, Royi: On the relation between the true and sample correlations under Bayesian modelling of gene expression datasets (2018)
  15. López-Lopera, Andrés F.; Bachoc, François; Durrande, Nicolas; Roustant, Olivier: Finite-dimensional Gaussian approximation with linear inequality constraints (2018)
  16. Tian, Xiaoying; Taylor, Jonathan: Selective inference with a randomized response (2018)
  17. Canale, Antonio; Pagui, Euloge Clovis Kenne; Scarpa, Bruno: Bayesian modeling of university first-year students’ grades after placement test (2016)
  18. Ridgway, James: Computation of Gaussian orthant probabilities in high dimension (2016)
  19. Burda, Martin: Constrained Hamiltonian Monte Carlo in BEKK GARCH with targeting (2015)
  20. Pakman, Ari; Huggins, Jonathan; Smith, Carl; Paninski, Liam: Fast state-space methods for inferring dendritic synaptic connectivity (2014)