RCV1

RCV1: A New Benchmark Collection for Text Categorization Research. Reuters Corpus Volume I (RCV1) is an archive of over 800,000 manually categorized newswire stories recently made available by Reuters, Ltd. for research purposes. Use of this data for research on text categorization requires a detailed understanding of the real world constraints under which the data was produced. Drawing on interviews with Reuters personnel and access to Reuters documentation, we describe the coding policy and quality control procedures used in producing the RCV1 data, the intended semantics of the hierarchical category taxonomies, and the corrections necessary to remove errorful data. We refer to the original data as RCV1-v1, and the corrected data as RCV1-v2. We benchmark several widely used supervised learning methods on RCV1-v2, illustrating the collection’s properties, suggesting new directions for research, and providing baseline results for future studies. We make available detailed, per-category experimental results, as well as corrected versions of the category assignments and taxonomy structures, via online appendices.
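The RCV1-v2 data can be loaded directly in Python; the sketch below is a minimal illustration, assuming scikit-learn's fetch_rcv1 loader (which distributes the RCV1-v2 cosine-normalized log TF-IDF vectors, the 103 topic codes, and the LYRL2004 chronological train/test split). The choice of logistic regression and of the single CCAT category is illustrative only and is not the benchmark setup used in the paper.

    from sklearn.datasets import fetch_rcv1
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import f1_score

    # LYRL2004 chronological split: 23,149 training and 781,265 test documents.
    train = fetch_rcv1(subset="train")
    test = fetch_rcv1(subset="test")

    # Single topic code chosen for illustration; RCV1-v2 has 103 topic codes.
    topic = "CCAT"
    idx = list(train.target_names).index(topic)
    y_train = train.target[:, idx].toarray().ravel()
    y_test = test.target[:, idx].toarray().ravel()

    # Plain L2-regularized logistic regression as a simple baseline; the
    # paper's own benchmarks use methods such as SVMs and k-NN instead.
    clf = LogisticRegression(max_iter=1000)
    clf.fit(train.data, y_train)
    print(f"F1 for {topic}: {f1_score(y_test, clf.predict(test.data)):.3f}")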



References in zbMATH (referenced in 114 articles)

Showing results 1 to 20 of 114, sorted by year (citations):

  1. Jaffe, Ariel; Kluger, Yuval; Linderman, George C.; Mishne, Gal; Steinerberger, Stefan: Randomized near-neighbor graphs, giant components and applications in data science (2020)
  2. Jung, Jinhong; Sael, Lee: Fast and accurate pseudoinverse with sparse matrix reordering and incremental approach (2020)
  3. Loor, Marcelo; De Tré, Guy: Handling subjective information through augmented (fuzzy) computation (2020)
  4. Nakano, Felipe Kenji; Cerri, Ricardo; Vens, Celine: Active learning for hierarchical multi-label classification (2020)
  5. Yang, Tianbao; Zhang, Lijun; Lin, Qihang; Zhu, Shenghuo; Jin, Rong: High-dimensional model recovery from random sketched data by exploring intrinsic sparsity (2020)
  6. Yousefian, Farzad; Nedić, Angelia; Shanbhag, Uday V.: On stochastic and deterministic quasi-Newton methods for nonstrongly convex optimization: asymptotic convergence and rate analysis (2020)
  7. Yuan, Xiao-Tong; Li, Ping: On convergence of distributed approximate Newton methods: globalization, sharper bounds and beyond (2020)
  8. Yuan, Xiao-Tong; Liu, Bo; Wang, Lezi; Liu, Qingshan; Metaxas, Dimitris N.: Dual iterative hard thresholding (2020)
  9. Duchi, John; Namkoong, Hongseok: Variance-based regularization with convex objectives (2019)
  10. Fercoq, Olivier; Bianchi, Pascal: A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions (2019)
  11. Karakus, Can; Sun, Yifan; Diggavi, Suhas; Yin, Wotao: Redundancy techniques for straggler mitigation in distributed optimization and learning (2019)
  12. Krishnamurthy, Akshay; Agarwal, Alekh; Huang, Tzu-Kuo; Daumé III, Hal; Langford, John: Active learning for cost-sensitive classification (2019)
  13. Milzarek, Andre; Xiao, Xiantao; Cen, Shicong; Wen, Zaiwen; Ulbrich, Michael: A stochastic semismooth Newton method for nonsmooth nonconvex optimization (2019)
  14. Song, Yangqiu; Upadhyay, Shyam; Peng, Haoruo; Mayhew, Stephen; Roth, Dan: Toward any-language zero-shot topic classification of textual documents (2019)
  15. Bashar, Md Abul; Li, Yuefeng: Interpretation of text patterns (2018)
  16. Bottou, Léon; Curtis, Frank E.; Nocedal, Jorge: Optimization methods for large-scale machine learning (2018)
  17. Burkhardt, Sophie; Kramer, Stefan: Online multi-label dependency topic models for text classification (2018)
  18. Elenberg, Ethan R.; Khanna, Rajiv; Dimakis, Alexandros G.; Negahban, Sahand: Restricted strong convexity implies weak submodularity (2018)
  19. Charte, Francisco; Rivera, Antonio J.; Charte, David; del Jesus, María J.; Herrera, Francisco: Tips, guidelines and tools for managing multi-label datasets: the mldr.datasets R package and the Cometa data repository (2018) arXiv
  20. Gudivada, Venkat N.; Arbabifard, Kamyar: Open-source libraries, application frameworks, and workflow systems for NLP (2018)
