ROBPCA: A New Approach to Robust Principal Component Analysis

We introduce a new method for robust principal component analysis (PCA). Classical PCA is based on the empirical covariance matrix of the data and hence is highly sensitive to outlying observations. Two robust approaches have been developed to date. The first is based on the eigenvectors of a robust scatter matrix, such as the minimum covariance determinant or an S-estimator, and is limited to relatively low-dimensional data. The second is based on projection pursuit and can handle high-dimensional data. Here we propose the ROBPCA approach, which combines projection pursuit ideas with robust scatter matrix estimation. ROBPCA yields more accurate estimates on noncontaminated datasets and more robust estimates on contaminated data. ROBPCA can be computed rapidly and is able to detect exact-fit situations. As a by-product, ROBPCA produces a diagnostic plot that displays and classifies the outliers. We apply the algorithm to several datasets from chemometrics and engineering.
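The two-stage idea described above can be illustrated with a minimal sketch: score each observation by a Stahel-Donoho-type outlyingness over random projection directions (the projection pursuit step), then fit classical PCA on the least outlying subset (the robust estimation step). This is a simplification for illustration only, not the authors' actual ROBPCA algorithm; the function name `robpca_sketch` and all parameter defaults are hypothetical.

```python
import numpy as np

def robpca_sketch(X, k=2, h_frac=0.75, n_dirs=250, seed=0):
    """Simplified ROBPCA-like sketch (illustrative, not the real algorithm):
    (1) score each point by outlyingness over random projection directions,
    (2) fit classical PCA on the h least outlying points."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Projection pursuit step: project onto random unit directions.
    dirs = rng.standard_normal((n_dirs, p))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    proj = X @ dirs.T                                  # shape (n, n_dirs)
    # Robust center/scale per direction: median and MAD.
    med = np.median(proj, axis=0)
    mad = np.median(np.abs(proj - med), axis=0) + 1e-12
    # Stahel-Donoho-type outlyingness: worst robust z-score over directions.
    outl = np.max(np.abs(proj - med) / mad, axis=1)
    # Keep the h least outlying observations.
    h = int(h_frac * n)
    keep = np.argsort(outl)[:h]
    Xh = X[keep]
    center = Xh.mean(axis=0)
    # Classical PCA on the clean subset, via SVD of the centered data.
    _, _, Vt = np.linalg.svd(Xh - center, full_matrices=False)
    return center, Vt[:k], outl

# Usage: data with most variance along the first coordinate,
# contaminated by 5 gross outliers in the last coordinate.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5)) @ np.diag([5.0, 1, 1, 1, 1])
X[:5, 4] += 50.0                       # contaminate rows 0..4
center, comps, outl = robpca_sketch(X)
# The contaminated rows receive the largest outlyingness scores, so they
# are excluded, and the first robust PC still aligns with coordinate 0.
```

The real ROBPCA additionally handles high-dimensional data by first projecting onto the affine subspace spanned by the observations, and refines the subset with a covariance-based reweighting; this sketch only conveys the projection pursuit intuition.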

References in zbMATH (referenced in 77 articles)

Showing results 1 to 20 of 77, sorted by year (citations).


  1. Majumdar, Subhabrata; Chatterjee, Snigdhansu: On weighted multivariate sign functions (2022)
  2. De Brabanter, Kris; De Brabanter, Jos: Robustness by reweighting for kernel estimators: an overview (2021)
  3. de Sousa, J.; Hron, K.; Fačevicová, K.; Filzmoser, P.: Robust principal component analysis for compositional tables (2021)
  4. Öner, Yüksel; Bulut, Hasan: A robust EM clustering approach: ROBEM (2021)
  5. Boudt, Kris; Rousseeuw, Peter J.; Vanduffel, Steven; Verdonck, Tim: The minimum regularized covariance determinant estimator (2020)
  6. Bulut, Hasan: Mahalanobis distance based on minimum regularized covariance determinant estimators for high dimensional data (2020)
  7. Cappozzo, Andrea; Greselin, Francesca; Murphy, Thomas Brendan: Anomaly and novelty detection for robust semi-supervised learning (2020)
  8. Hayashi, Kohei; Yoshida, Yuichi: Testing proximity to subspaces: approximate (\ell_\infty) minimization in constant time (2020)
  9. Paindaveine, Davy; Remy, Julien; Verdebout, Thomas: Testing for principal component directions under weak identifiability (2020)
  10. Sando, Keishi; Hino, Hideitsu: Modal principal component analysis (2020)
  11. Agostinelli, Claudio; Greco, Luca: Weighted likelihood estimation of multivariate location and scatter (2019)
  12. Bai, Jushan; Ng, Serena: Rank regularized estimation of approximate factor models (2019)
  13. Cevallos-Valdiviezo, Holger; Van Aelst, Stefan: Fast computation of robust subspace estimators (2019)
  14. Debruyne, Michiel; Höppner, Sebastiaan; Serneels, Sven; Verdonck, Tim: Outlyingness: which variables contribute most? (2019)
  15. Kirschstein, T.; Liebscher, Steffen: Assessing the market values of soccer players -- a robust analysis of data from German 1. and 2. Bundesliga (2019)
  16. Raymaekers, Jakob; Rousseeuw, Peter: A generalized spatial sign covariance matrix (2019)
  17. Alrawashdeh, Mufda Jameel; Radwan, Taha Radwan; Abunawas, Kalid Abunawas: Performance of linear discriminant analysis using different robust methods (2018)
  18. Archimbaud, Aurore; Nordhausen, Klaus; Ruiz-Gazen, Anne: ICS for multivariate outlier detection with application to quality control (2018)
  19. Bertsimas, Dimitris; Copenhaver, Martin S.: Characterization of the equivalence of robustification and regularization in linear and matrix regression (2018)
  20. Di Palma, M. A.; Filzmoser, P.; Gallo, M.; Hron, K.: A robust Parafac model for compositional data (2018)
