Blendenpik

Blendenpik: supercharging LAPACK's least-squares solver. Several innovative random-sampling and random-mixing techniques for solving problems in linear algebra have been proposed in the last decade, but they have not yet made a significant impact on numerical linear algebra. We show that by using a high-quality implementation of one of these techniques, we obtain a solver that performs extremely well by the traditional yardsticks of numerical linear algebra: it is significantly faster than high-performance implementations of existing state-of-the-art algorithms, and it is numerically backward stable. More specifically, we describe a least-squares solver for dense, highly overdetermined systems that achieves residuals similar to those of direct QR-factorization-based solvers (LAPACK), outperforms LAPACK by large factors, and scales significantly better than any QR-based solver.
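
The recipe behind the solver is randomized preconditioning: mix the rows of A with a random-sign fast transform so that no row is special, sample a small subset of the mixed rows, QR-factor the sample, and use the resulting R factor as a preconditioner for LSQR on the original system. The following is a minimal Python sketch of that recipe; the function name blendenpik_sketch, the choice of the DCT as the mixing transform, and the sampling factor gamma are illustrative assumptions here, not the authors' reference implementation.

    import numpy as np
    from scipy.fft import dct
    from scipy.sparse.linalg import LinearOperator, lsqr

    def blendenpik_sketch(A, b, gamma=4.0, seed=None):
        # Sketch: solve min_x ||A x - b||_2 for tall, dense A (m >> n)
        # via randomized preconditioning in the spirit of Blendenpik.
        rng = np.random.default_rng(seed)
        m, n = A.shape
        # 1. Mix: flip row signs at random, then apply a fast orthogonal
        #    transform (a DCT here) to spread out the row norms.
        signs = rng.choice([-1.0, 1.0], size=m)
        M = dct(signs[:, None] * A, axis=0, norm="ortho")
        # 2. Sample roughly gamma * n of the mixed rows uniformly.
        k = min(m, int(np.ceil(gamma * n)))
        rows = rng.choice(m, size=k, replace=False)
        # 3. QR-factor the small sample; its R factor (assumed full rank
        #    in this sketch) preconditions the full problem.
        R = np.linalg.qr(M[rows], mode="r")
        # 4. Run LSQR on A R^{-1}, then recover x = R^{-1} y.
        AR = LinearOperator(
            (m, n),
            matvec=lambda v: A @ np.linalg.solve(R, v),
            rmatvec=lambda u: np.linalg.solve(R.T, A.T @ u),
            dtype=A.dtype,
        )
        y = lsqr(AR, b, atol=1e-12, btol=1e-12)[0]
        return np.linalg.solve(R, y)

Because the preconditioned matrix A R^{-1} is well conditioned with high probability, LSQR typically converges in a few dozen iterations; the expensive direct QR factorization is applied only to the small sampled matrix, which is where the speedup over a full LAPACK-style QR solve comes from.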


References in zbMATH (referenced in 35 articles; showing results 1 to 20, sorted by year)

  1. Malik, Osman Asif; Becker, Stephen: Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument (2020)
  2. Richtárik, Peter; Takáč, Martin: Stochastic reformulations of linear systems: algorithms and convergence theory (2020)
  3. Zhang, Liping; Wei, Yimin: Randomized core reduction for discrete ill-posed problem (2020)
  4. Bjarkason, Elvar K.: Pass-efficient randomized algorithms for low-rank matrix approximation using any number of views (2019)
  5. Mor-Yosef, Liron; Avron, Haim: Sketching for principal component regression (2019)
  6. Trogdon, Thomas: On spectral and numerical properties of random butterfly matrices (2019)
  7. Wang, Haiying: More efficient estimation for logistic regression with optimal subsamples (2019)
  8. Wu, Tao; Gleich, David F.: Multiway Monte Carlo method for linear systems (2019)
  9. Zhou, Quan; Guan, Yongtao: Fast model-fitting of Bayesian variable selection regression using the iterative complex factorization algorithm (2019)
  10. Battaglino, Casey; Ballard, Grey; Kolda, Tamara G.: A practical randomized CP tensor decomposition (2018)
  11. Shabat, Gil; Shmueli, Yaniv; Aizenbud, Yariv; Averbuch, Amir: Randomized LU decomposition (2018)
  12. Yang, Jiyan; Chow, Yin-Lam; Ré, Christopher; Mahoney, Michael W.: Weighted SGD for ℓ_p regression with randomized preconditioning (2018)
  13. Avron, Haim; Clarkson, Kenneth L.; Woodruff, David P.: Faster kernel ridge regression using sketching and preconditioning (2017)
  14. Zhang, Xiaojia Shelly; de Sturler, Eric; Paulino, Glaucio H.: Stochastic sampling for deterministic structural topology optimization with many load cases: density-based and ground structure approaches (2017)
  15. Clarkson, Kenneth L.; Drineas, Petros; Magdon-Ismail, Malik; Mahoney, Michael W.; Meng, Xiangrui; Woodruff, David P.: The fast Cauchy transform and faster robust linear regression (2016)
  16. Scott, Jennifer; Tuma, Miroslav: Preconditioning of linear least squares by robust incomplete factorization for implicitly held normal equations (2016)
  17. Wei, Yimin; Xie, Pengpeng; Zhang, Liping: Tikhonov regularization and randomized GSVD (2016)
  18. Zhang, Xiaowei; Cheng, Li; Chu, Delin; Liao, Li-Zhi; Ng, Michael K.; Tan, Roger C. E.: Incremental regularized least squares for dimensionality reduction of large-scale data (2016)
  19. Bourgain, Jean; Dirksen, Sjoerd; Nelson, Jelani: Toward a unified theory of sparse dimensionality reduction in Euclidean space (2015)
  20. Holodnak, John T.; Ipsen, Ilse C. F.: Randomized approximation of the Gram matrix: exact computation and probabilistic bounds (2015)