Blendenpik

Blendenpik: supercharging LAPACK's least-squares solver. Several innovative random-sampling and random-mixing techniques for solving problems in linear algebra have been proposed in the last decade, but they have not yet made a significant impact on numerical linear algebra. We show that by using a high-quality implementation of one of these techniques, we obtain a solver that performs extremely well in the traditional yardsticks of numerical linear algebra: it is significantly faster than high-performance implementations of existing state-of-the-art algorithms, and it is numerically backward stable. More specifically, we describe a least-squares solver for dense highly overdetermined systems that achieves residuals similar to those of direct QR factorization-based solvers (LAPACK), outperforms LAPACK by large factors, and scales significantly better than any QR-based solver.
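The recipe the abstract alludes to (mix the rows with a randomized orthogonal transform, uniformly sample mixed rows, QR-factor the sample, and use the resulting R as a preconditioner for an iterative solver) can be sketched in NumPy/SciPy. This is a simplified illustration, not the paper's tuned implementation: the function name, the DCT-based mixing step, and the oversampling factor `gamma` are assumptions for exposition.

```python
import numpy as np
from scipy.fft import dct
from scipy.linalg import qr, solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

def blendenpik_style_lsq(A, b, gamma=4, seed=0):
    """Randomized-preconditioned least squares in the spirit of Blendenpik.

    Simplified sketch:
      1. Mix rows with random sign flips and an orthogonal DCT, which tends
         to flatten the row leverage scores.
      2. Uniformly sample about gamma*n of the mixed rows.
      3. QR-factor the sample; its R serves as a right preconditioner.
      4. Run LSQR on the well-conditioned A @ inv(R), then undo the
         preconditioning.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # 1. Random row mixing: diagonal of +/-1 signs followed by an orthonormal DCT.
    signs = rng.choice([-1.0, 1.0], size=m)
    M = dct(signs[:, None] * A, axis=0, norm="ortho")
    # 2. Uniform sampling of the mixed rows (oversampling factor gamma).
    s = min(m, int(np.ceil(gamma * n)))
    rows = rng.choice(m, size=s, replace=False)
    # 3. R from the economy QR of the sampled rows is the preconditioner.
    R = qr(M[rows, :], mode="economic")[1]
    # 4. LSQR on the implicitly preconditioned operator A @ inv(R).
    ARinv = LinearOperator(
        (m, n),
        matvec=lambda x: A @ solve_triangular(R, x),
        rmatvec=lambda y: solve_triangular(R, A.T @ y, trans="T"),
    )
    y = lsqr(ARinv, b, atol=1e-12, btol=1e-12)[0]
    return solve_triangular(R, y)
```

Because the sampled QR captures the geometry of the mixed matrix, A @ inv(R) has a modest condition number and LSQR converges in few iterations; this is the mechanism behind the speedup over a full QR factorization of A.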


References in zbMATH (referenced in 43 articles)

Sorted by year (citations).


  1. Huang, Guangxin; Liu, Yuanyuan; Yin, Feng: Tikhonov regularization with MTRSVD method for solving large-scale discrete ill-posed problems (2022)
  2. Ailon, Nir; Yehuda, Gal: The complexity of computing (almost) orthogonal matrices with (\varepsilon)-copies of the Fourier transform (2021)
  3. Chi, Jocelyn T.; Ipsen, Ilse C. F.: Multiplicative perturbation bounds for multivariate multiple linear regression in Schatten (p)-norms (2021)
  4. Du, Yi-Shu; Hayami, Ken; Zheng, Ning; Morikuni, Keiichi; Yin, Jun-Feng: Kaczmarz-type inner-iteration preconditioned flexible GMRES methods for consistent linear systems (2021)
  5. Sobczyk, Aleksandros; Gallopoulos, Efstratios: Estimating leverage scores via rank revealing methods and randomization (2021)
  6. Chung, Julianne; Chung, Matthias; Tanner Slagel, J.; Tenorio, Luis: Sampled limited memory methods for massive linear inverse problems (2020)
  7. Malik, Osman Asif; Becker, Stephen: Fast randomized matrix and tensor interpolative decomposition using CountSketch (2020)
  8. Malik, Osman Asif; Becker, Stephen: Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument (2020)
  9. Richtárik, Peter; Takáč, Martin: Stochastic reformulations of linear systems: algorithms and convergence theory (2020)
  10. Zhang, Liping; Wei, Yimin: Randomized core reduction for discrete ill-posed problem (2020)
  11. Bjarkason, Elvar K.: Pass-efficient randomized algorithms for low-rank matrix approximation using any number of views (2019)
  12. Mor-Yosef, Liron; Avron, Haim: Sketching for principal component regression (2019)
  13. Trogdon, Thomas: On spectral and numerical properties of random butterfly matrices (2019)
  14. Wang, Haiying: More efficient estimation for logistic regression with optimal subsamples (2019)
  15. Wu, Tao; Gleich, David F.: Multiway Monte Carlo method for linear systems (2019)
  16. Zhou, Quan; Guan, Yongtao: Fast model-fitting of Bayesian variable selection regression using the iterative complex factorization algorithm (2019)
  17. Battaglino, Casey; Ballard, Grey; Kolda, Tamara G.: A practical randomized CP tensor decomposition (2018)
  18. Shabat, Gil; Shmueli, Yaniv; Aizenbud, Yariv; Averbuch, Amir: Randomized LU decomposition (2018)
  19. Yang, Jiyan; Chow, Yin-Lam; Ré, Christopher; Mahoney, Michael W.: Weighted SGD for (\ell_p) regression with randomized preconditioning (2018)
