OpenSeq2Seq: extensible toolkit for distributed and mixed precision training of sequence-to-sequence models. We present OpenSeq2Seq, an open-source toolkit for training sequence-to-sequence models. The main goal of our toolkit is to allow researchers to explore different sequence-to-sequence architectures as effectively as possible. This efficiency is achieved through full support for distributed and mixed-precision training. OpenSeq2Seq provides building blocks for training encoder-decoder models for neural machine translation and automatic speech recognition. We plan to extend it to other modalities in the future.
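The efficiency gain from mixed-precision training comes from computing in float16 while keeping master weights in float32, combined with loss scaling to prevent small gradients from underflowing. The following standalone NumPy sketch illustrates the loss-scaling idea only; it is not OpenSeq2Seq's actual API, and the scale factor of 1024 is an illustrative choice.

```python
import numpy as np

# Loss-scaling sketch: float16 cannot represent values below ~6e-8,
# so tiny gradients vanish. Multiplying the loss (and hence gradients)
# by a scale factor before the float16 cast, then dividing it back out
# in float32, preserves them.
SCALE = 1024.0
grad = 1e-8  # a tiny gradient value, typical late in training

naive = np.float16(grad)                 # underflows to 0.0 in float16
scaled = np.float16(grad * SCALE)        # now representable in float16
recovered = np.float32(scaled) / SCALE   # unscale in float32

print(naive)      # 0.0 -- the gradient is lost without scaling
print(recovered)  # close to the original 1e-8
```

In practice, frameworks adjust the scale factor dynamically, raising it when gradients are healthy and lowering it when overflows occur.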
References in zbMATH (referenced in 1 article, 1 standard article)
- Oleksii Kuchaiev; Boris Ginsburg; Igor Gitman; Vitaly Lavrukhin; Carl Case; Paulius Micikevicius: OpenSeq2Seq: extensible toolkit for distributed and mixed precision training of sequence-to-sequence models (2018) arXiv