Wasserstein GAN

We introduce a new algorithm named WGAN, an alternative to traditional GAN training. In this new model, we show that we can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, we show that the corresponding optimization problem is sound, and provide extensive theoretical work highlighting the deep connections to other distances between distributions.
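The critic objective at the heart of WGAN maximizes E[f(x_real)] − E[f(x_fake)] over 1-Lipschitz functions f, with the paper enforcing the Lipschitz constraint by clipping the critic's weights. A minimal sketch of that idea, under strong simplifying assumptions (a linear critic f(x) = w·x on 1-D data, plain gradient ascent; the variable names and constants are illustrative, not from the paper):

```python
import numpy as np

# Toy 1-D "real" and "generator" samples (illustrative only).
rng = np.random.default_rng(0)
real = rng.normal(3.0, 1.0, 1000)
fake = rng.normal(0.0, 1.0, 1000)

w, c, lr = 0.0, 0.01, 0.1  # critic weight, clip bound, learning rate
for _ in range(200):
    # Gradient ascent on the critic objective E[f(real)] - E[f(fake)];
    # for f(x) = w*x the gradient w.r.t. w is mean(real) - mean(fake).
    w += lr * (real.mean() - fake.mean())
    # Weight clipping keeps the critic in a bounded (Lipschitz) family.
    w = float(np.clip(w, -c, c))

# The clipped critic's value gap is a (scaled) Wasserstein-1 estimate.
w1_estimate = w * real.mean() - w * fake.mean()
print(w1_estimate)
```

With the means about 3 apart and the clip bound c = 0.01, the critic weight saturates at the bound, so the estimate is roughly c times the gap between the means; this is the same mechanism, in miniature, that makes the WGAN critic loss a meaningful learning curve.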

References in zbMATH (referenced in 159 articles)

Showing results 1 to 20 of 159.
Sorted by year (citations)


  1. Aldroubi, Akram; Diaz Martin, Rocio; Medri, Ivan; Rohde, Gustavo K.; Thareja, Sumati: The signed cumulative distribution transform for 1-D signal analysis and classification (2022)
  2. Amari, Shun-ichi; Matsuda, Takeru: Wasserstein statistics in one-dimensional location scale models (2022)
  3. Ambrosio, Luigi; Goldman, Michael; Trevisan, Dario: On the quadratic random matching problem in two-dimensional domains (2022)
  4. Baskerville, Nicholas P.; Keating, Jonathan P.; Mezzadri, Francesco; Najnudel, Joseph: A spin glass model for the loss surfaces of generative adversarial networks (2022)
  5. E, Weinan; Han, Jiequn; Jentzen, Arnulf: Algorithms for solving high dimensional PDEs: from nonlinear Monte Carlo to machine learning (2022)
  6. Friesecke, Gero; Schulz, Andreas S.; Vögler, Daniela: Genetic column generation: fast computation of high-dimensional multimarginal optimal transport problems (2022)
  7. Gao, Jia-Xing; Wang, Zhen-Yi; Zhang, Michael Q.; Qian, Min-Ping; Jiang, Da-Quan: A data-driven method to learn a jump diffusion process from aggregate biological gene expression data (2022)
  8. Gao, Yihang; Ng, Michael K.: Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks (2022)
  9. Hassanaly, Malik; Glaws, Andrew; Stengel, Karen; King, Ryan N.: Adversarial sampling of unknown and high-dimensional conditional distributions (2022)
  10. Heaton, Howard; Fung, Samy Wu; Lin, Alex Tong; Osher, Stanley; Yin, Wotao: Wasserstein-based projections with applications to inverse problems (2022)
  11. Huang, Kevin; Zhang, Junyu; Zhang, Shuzhong: Cubic regularized Newton method for the saddle point models: a global and local convergence analysis (2022)
  12. Jain, Niharika; Olmo, Alberto; Sengupta, Sailik; Manikonda, Lydia; Kambhampati, Subbarao: Imperfect imaGANation: implications of GANs exacerbating biases on facial data augmentation and snapchat face lenses (2022)
  13. Komarichev, Artem; Hua, Jing; Zhong, Zichun: Learning geometry-aware joint latent space for simultaneous multimodal shape generation (2022)
  14. Li, Hong-an; Zhang, Min; Yu, Zhenhua; Li, Zhanli; Li, Na: An improved pix2pix model based on Gabor filter for robust color image rendering (2022)
  15. Liu, Shu; Li, Wuchen; Zha, Hongyuan; Zhou, Haomin: Neural parametric Fokker-Planck equation (2022)
  16. Manole, Tudor; Balakrishnan, Sivaraman; Wasserman, Larry: Minimax confidence intervals for the sliced Wasserstein distance (2022)
  17. Meng, Xuhui; Yang, Liu; Mao, Zhiping; del Águila Ferrandis, José; Karniadakis, George Em: Learning functional priors and posteriors from data and physics (2022)
  18. Niles-Weed, Jonathan; Berthet, Quentin: Minimax estimation of smooth densities in Wasserstein distance (2022)
  19. Oh, Sehyeok; Lee, Seungcheol; Son, Myeonggyun; Kim, Jooha; Ki, Hyungson: Accurate prediction of the particle image velocimetry flow field and rotor thrust using deep learning (2022)
  20. Ong, Yong Zheng; Yang, Haizhao: Generative imaging and image processing via generative encoder (2022)
