wbs

Wild binary segmentation for multiple change-point detection. We propose a new technique, called wild binary segmentation (WBS), for consistent estimation of the number and locations of multiple change-points in data. We assume that the number of change-points can increase to infinity with the sample size. Due to a certain random localisation mechanism, WBS works even for very short spacings between the change-points and/or very small jump magnitudes, unlike standard binary segmentation. On the other hand, despite its use of localisation, WBS does not require the choice of a window or span parameter, and does not lead to a significant increase in computational complexity. WBS is also easy to code. We propose two stopping criteria for WBS: one based on thresholding and the other based on what we term the 'strengthened Schwarz information criterion'. We provide default recommended values of the parameters of the procedure and show that it offers very good practical performance in comparison with the state of the art. The WBS methodology is implemented in the R package wbs, available on CRAN.

In addition, we provide a new proof of consistency of binary segmentation with improved rates of convergence, as well as a corresponding result for WBS.
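
To make the random localisation idea concrete, the following is a minimal R sketch of WBS with the thresholding stopping criterion: CUSUM statistics are maximised over randomly drawn sub-intervals of the current segment, the largest maximiser is declared a change-point if it exceeds a threshold of order sqrt(2 log n), and the procedure then recurses to its left and right. This is a simplified illustration rather than the implementation in the CRAN package wbs (which should be preferred in practice); the function names, the small default number of intervals M and the threshold constant are assumptions chosen for readability.

# CUSUM statistic of x[s:e] for a candidate change-point at b (s <= b < e)
cusum_stat <- function(x, s, e, b) {
  n  <- e - s + 1
  n1 <- b - s + 1
  n2 <- e - b
  sqrt(n2 / (n * n1)) * sum(x[s:b]) - sqrt(n1 / (n * n2)) * sum(x[(b + 1):e])
}

wbs_cpts <- function(x, M = 100, th = NULL) {
  n <- length(x)
  # threshold of order sqrt(2 log n) times an estimate of the noise level
  # (MAD of first differences, rescaled); the constant 1.0 is illustrative
  if (is.null(th)) th <- 1.0 * sqrt(2 * log(n)) * mad(diff(x)) / sqrt(2)
  # draw M random intervals; keep only those containing a valid split point
  a1 <- sample.int(n, M, replace = TRUE)
  a2 <- sample.int(n, M, replace = TRUE)
  lo <- pmin(a1, a2); hi <- pmax(a1, a2)
  ok <- hi - lo >= 1
  lo <- lo[ok]; hi <- hi[ok]

  recurse <- function(s, e) {
    if (e - s < 1) return(integer(0))
    # use the random intervals falling inside [s, e], plus [s, e] itself
    idx    <- which(lo >= s & hi <= e)
    starts <- c(lo[idx], s)
    ends   <- c(hi[idx], e)
    best_val <- -Inf; best_b <- NA_integer_
    for (i in seq_along(starts)) {
      cand <- starts[i]:(ends[i] - 1)
      vals <- vapply(cand,
                     function(bb) abs(cusum_stat(x, starts[i], ends[i], bb)),
                     numeric(1))
      j <- which.max(vals)
      if (vals[j] > best_val) { best_val <- vals[j]; best_b <- cand[j] }
    }
    # declare a change-point only if the largest CUSUM exceeds the threshold,
    # then recurse to the left and to the right of it
    if (best_val > th)
      sort(c(best_b, recurse(s, best_b), recurse(best_b + 1, e)))
    else
      integer(0)
  }
  recurse(1, n)
}

# Example: a piecewise-constant signal with change-points at 100 and 200
set.seed(1)
x <- c(rnorm(100, 0), rnorm(100, 2), rnorm(100, 0))
wbs_cpts(x)
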


References in zbMATH (referenced in 67 articles, 1 standard article)

Showing results 41 to 60 of 67, sorted by year (citations).
  1. Fryzlewicz, Piotr: Tail-greedy bottom-up data decompositions and fast multiple change-point detection (2018)
  2. Garreau, Damien; Arlot, Sylvain: Consistent change-point detection with kernels (2018)
  3. Hyun, Sangwon; G’sell, Max; Tibshirani, Ryan J.: Exact post-selection inference for the generalized Lasso path (2018)
  4. Jewell, Sean; Witten, Daniela: Exact spike train inference via (\ell_0) optimization (2018)
  5. Lee, Sokbae; Liao, Yuan; Seo, Myung Hwan; Shin, Youngki: Oracle estimation of a change point in high-dimensional quantile regression (2018)
  6. Ludkin, Matthew; Eckley, Idris; Neal, Peter: Dynamic stochastic block models: parameter estimation and detection of changes in community structure (2018)
  7. Ruggieri, Eric: A pruned recursive solution to the multiple change point problem (2018)
  8. Wang, Guanghui; Zou, Changliang; Yin, Guosheng: Change-point detection in multinomial data with a large number of categories (2018)
  9. Wang, Tengyao; Samworth, Richard J.: High dimensional change point estimation via sparse projection (2018)
  10. Galeano, Pedro; Wied, Dominik: Dating multiple change points in the correlation matrix (2017)
  11. Haynes, Kaylea; Fearnhead, Paul; Eckley, Idris A.: A computationally efficient nonparametric approach for changepoint detection (2017)
  12. Korkas, Karolos K.; Fryzlewicz, Piotr: Multiple change-point detection for non-stationary time series using wild binary segmentation (2017)
  13. Maidstone, Robert; Hocking, Toby; Rigaill, Guillem; Fearnhead, Paul: On optimal multiple changepoint algorithms for large data (2017)
  14. Messer, Michael; Costa, Kauê M.; Roeper, Jochen; Schneider, Gaby: Multi-scale detection of rate changes in spike trains with weak dependencies (2017)
  15. Messer, Michael; Schneider, Gaby: The shark fin function: asymptotic behavior of the filtered derivative for point processes in case of change points (2017)
  16. Soh, Yong Sheng; Chandrasekaran, Venkat: High-dimensional change-point estimation: combining filtering with convex optimization (2017)
  17. Zhang, Feipeng; Li, Qunhua: Robust bent line regression (2017)
  18. Biau, Gérard; Bleakley, Kevin; Mason, David M.: Long signal change-point detection (2016)
  19. Cho, Haeran: Change-point detection in panel data via double CUSUM statistic (2016)
  20. Davis, Richard A.; Hancock, Stacey A.; Yao, Yi-Ching: On consistency of minimum description length model selection for piecewise autoregressions (2016)