ROBPCA: A New Approach to Robust Principal Component Analysis

We introduce a new method for robust principal component analysis (PCA). Classical PCA is based on the empirical covariance matrix of the data and hence is highly sensitive to outlying observations. Two robust approaches have been developed to date. The first is based on the eigenvectors of a robust scatter matrix, such as the minimum covariance determinant (MCD) or an S-estimator, and is limited to relatively low-dimensional data. The second is based on projection pursuit and can handle high-dimensional data. Here we propose the ROBPCA approach, which combines projection pursuit ideas with robust scatter matrix estimation. ROBPCA yields more accurate estimates for noncontaminated datasets and more robust estimates for contaminated data. ROBPCA can be computed rapidly and is able to detect exact-fit situations. As a by-product, ROBPCA produces a diagnostic plot that displays and classifies the outliers. We apply the algorithm to several datasets from chemometrics and engineering.
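Since the abstract only outlines the method, the following minimal sketch (not taken from the paper, and deliberately simplified) illustrates the two ingredients it describes: a projection-pursuit outlyingness measure computed along random directions through pairs of observations, followed by classical PCA on the least outlying points, plus the score and orthogonal distances that underlie the diagnostic plot. The published ROBPCA algorithm additionally applies the MCD estimator within the retained subspace and a reweighting step, both omitted here; the function name simplified_robpca and the parameters alpha, n_dirs, and seed are illustrative choices, not part of the original method.

import numpy as np


def simplified_robpca(X, k=2, alpha=0.75, n_dirs=250, seed=0):
    """Simplified sketch of the ROBPCA idea (not the published algorithm).

    1) Projection pursuit: score each observation's outlyingness along
       random directions through pairs of data points.
    2) Classical PCA on the h least outlying observations.
    3) Score and orthogonal distances for a diagnostic (outlier) map.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    h = int(np.ceil(alpha * n))                 # size of the "clean" subset

    # 1) random directions through pairs of observations
    i = rng.integers(0, n, size=n_dirs)
    j = rng.integers(0, n, size=n_dirs)
    D = X[i] - X[j]
    D = D[np.linalg.norm(D, axis=1) > 0]        # drop degenerate pairs
    D /= np.linalg.norm(D, axis=1, keepdims=True)

    # Stahel-Donoho-type outlyingness: worst robustly standardized projection
    proj = X @ D.T                              # shape (n, n_directions)
    med = np.median(proj, axis=0)
    mad = np.median(np.abs(proj - med), axis=0) + 1e-12
    outlyingness = np.max(np.abs(proj - med) / mad, axis=1)

    # 2) classical PCA on the h least outlying observations
    idx = np.argsort(outlyingness)[:h]
    center = X[idx].mean(axis=0)
    _, s, Vt = np.linalg.svd(X[idx] - center, full_matrices=False)
    loadings = Vt[:k].T                         # (p, k) robust loadings
    eigvals = (s[:k] ** 2) / (h - 1)            # robust eigenvalues

    # 3) diagnostic distances for all observations
    scores = (X - center) @ loadings
    score_dist = np.sqrt(np.sum(scores ** 2 / eigvals, axis=1))
    resid = (X - center) - scores @ loadings.T
    orth_dist = np.linalg.norm(resid, axis=1)
    return center, loadings, score_dist, orth_dist


if __name__ == "__main__":
    # Toy data: 95 regular observations plus 5 shifted ones.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(size=(95, 10)),
                   rng.normal(loc=6.0, size=(5, 10))])
    _, _, sd, od = simplified_robpca(X, k=2)
    # Points with large score and/or orthogonal distances are the candidates
    # a diagnostic plot of sd versus od would flag as outliers.
    print("largest orthogonal distances:", np.argsort(od)[-5:])

Directions through pairs of data points are a standard way to approximate the Stahel-Donoho outlyingness without searching over all of R^p, and using the MAD in the denominator keeps the scale estimate itself robust to the outliers being measured.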


