t-SNE

Visualizing Data using t-SNE. We present a new technique called "t-SNE" that visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map. The technique is a variation of Stochastic Neighbor Embedding (Hinton and Roweis, 2002) that is much easier to optimize, and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map. t-SNE is better than existing techniques at creating a single map that reveals structure at many different scales. This is particularly important for high-dimensional data that lie on several different, but related, low-dimensional manifolds, such as images of objects from multiple classes seen from multiple viewpoints. For visualizing the structure of very large data sets, we show how t-SNE can use random walks on neighborhood graphs to allow the implicit structure of all of the data to influence the way in which a subset of the data is displayed. We illustrate the performance of t-SNE on a wide variety of data sets and compare it with many other non-parametric visualization techniques, including Sammon mapping, Isomap, and Locally Linear Embedding. The visualizations produced by t-SNE are significantly better than those produced by the other techniques on almost all of the data sets.
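As a minimal usage sketch (not part of the original entry), the following assumes Python with scikit-learn's TSNE implementation and matplotlib; the parameter values shown (perplexity, PCA initialization, random seed) are illustrative choices, not settings prescribed by the paper:

```python
# Minimal sketch: embed the 64-dimensional digits data into a 2-D t-SNE map.
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

digits = load_digits()                 # 1797 samples, 64 features each

embedding = TSNE(
    n_components=2,                    # give each datapoint a location in a 2-D map
    perplexity=30.0,                   # effective number of neighbors per point
    init="pca",                        # PCA initialization tends to stabilize the layout
    random_state=0,
).fit_transform(digits.data)

plt.scatter(embedding[:, 0], embedding[:, 1], c=digits.target, s=5, cmap="tab10")
plt.title("t-SNE map of the digits data")
plt.show()
```

In a typical run, points belonging to the same digit class form visually separated clusters in the resulting map, illustrating the multi-scale structure the abstract describes.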


References in zbMATH (referenced in 95 articles, 2 standard articles)

Showing results 1 to 20 of 95.
Sorted by year (citations)


  1. Burkart, Nadia; Huber, Marco F.: A survey on the explainability of supervised machine learning (2021)
  2. Gao, Tingran; Brodzki, Jacek; Mukherjee, Sayan: The geometry of synchronization problems and learning group actions (2021)
  3. Guan, Leying; Chen, Xi; Hung Wong, Wing: Detecting strong signals in gene perturbation experiments: an adaptive approach with power guarantee and FDR control (2020)
  4. Hoiles, William; Krishnamurthy, Vikram; Pattanayak, Kunal: Rationally inattentive inverse reinforcement learning explains YouTube commenting behavior (2020)
  5. Horenko, Illia: On a scalable entropic breaching of the overfitting barrier for small data problems in machine learning (2020)
  6. Jaffe, Ariel; Kluger, Yuval; Linderman, George C.; Mishne, Gal; Steinerberger, Stefan: Randomized near-neighbor graphs, giant components and applications in data science (2020)
  7. Keshavarzzadeh, Vahid; Kirby, Robert M.; Narayan, Akil: Stress-based topology optimization under uncertainty via simulation-based Gaussian process (2020)
  8. Lang, Rongling; Lu, Ruibo; Zhao, Chenqian; Qin, Honglei; Liu, Guodong: Graph-based semi-supervised one class support vector machine for detecting abnormal lung sounds (2020)
  9. Lee, Kookjin; Carlberg, Kevin T.: Model reduction of dynamical systems on nonlinear manifolds using deep convolutional autoencoders (2020)
  10. Lee, O-Joun; Jung, Jason J.: Story embedding: learning distributed representations of stories based on character networks (2020)
  11. Lv, Shaoqing; Xiang, Ju; Feng, Jingyu; Wang, Honggang; Lu, Guangyue; Li, Min: Community enhancement network embedding based on edge reweighting preprocessing (2020)
  12. Boileau, Philippe; Hejazi, Nima; Dudoit, Sandrine: scPCA: a toolbox for sparse contrastive principal component analysis in R (2020)
  13. Ruiz, Francisco J. R.; Athey, Susan; Blei, David M.: SHOPPER: a probabilistic model of consumer choice with substitutes and complements (2020)
  14. Baharev, Ali; Neumaier, Arnold; Schichl, Hermann: A manifold-based approach to sparse global constraint satisfaction problems (2019)
  15. Bekkouch, Imad Eddine Ibrahim; Youssry, Youssef; Gafarov, Rustam; Khan, Adil; Khattak, Asad Masood: Triplet loss network for unsupervised domain adaptation (2019)
  16. Bugbee, Bruce; Bush, Brian W.; Gruchalla, Kenny; Potter, Kristin; Brunhart-Lupo, Nicholas; Krishnan, Venkat: Enabling immersive engagement in energy system models with deep learning (2019)
  17. Cai, Hongmin; Huang, Qinjian; Rong, Wentao; Song, Yan; Li, Jiao; Wang, Jinhua; Chen, Jiazhou; Li, Li: Breast microcalcification diagnosis using deep convolutional neural network from digital mammograms (2019)
  18. Chen, Mingjia; Zou, Qianfang; Wang, Changbo; Liu, Ligang: EdgeNet: deep metric learning for 3D shapes (2019)
  19. Chien, Vincent S. C.; Maess, Burkhard; Knösche, Thomas R.: A generic deviance detection principle for cortical on/off responses, omission response, and mismatch negativity (2019)
  20. Genctav, Asli; Tari, Sibel: Discrepancy: local/global shape characterization with a roundness bias (2019)
