1 Matching Annotations
- May 2023
www.semanticscholar.org
We propose a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages. Our solution requires no changes to the model architecture from a standard NMT system but instead introduces an artificial token at the beginning of the input sentence to specify the required target language. The rest of the model, which includes an encoder, decoder and attention module, remains unchanged and is shared across all languages. Using a shared wordpiece vocabulary, our approach enables Multilingual NMT using a single model without any increase in parameters, which is significantly simpler than previous proposals for Multilingual NMT. On the WMT'14 benchmarks, a single multilingual model achieves comparable performance for English→French and surpasses state-of-the-art results for English→German. Similarly, a single multilingual model surpasses state-of-the-art results for French→English and German→English on WMT'14 and WMT'15 benchmarks, respectively. On production corpora, multilingual models of up to twelve language pairs allow for better translation of many individual pairs. In addition to improving the translation quality of language pairs that the model was trained with, our models can also learn to perform implicit bridging between language pairs never seen explicitly during training, showing that transfer learning and zero-shot translation are possible for neural translation. Finally, we show analyses that hint at a universal interlingua representation in our models and show some interesting examples when mixing languages.
this could help me
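
The core mechanism in the quoted abstract, prepending an artificial target-language token to the source sentence, is simple enough to sketch. Below is a minimal illustration in Python; the `<2xx>` token spelling and the `add_target_token` helper name are illustrative assumptions, not the paper's actual code, and a real system would apply this step before wordpiece tokenization.

```python
# Minimal sketch of the multilingual NMT input convention described
# in the abstract: a single shared model is told which language to
# produce by an artificial token prepended to the source sentence.
# The "<2xx>" spelling and helper name here are assumptions for
# illustration only.

def add_target_token(source: str, target_lang: str) -> str:
    """Prepend an artificial token naming the desired target language.

    Everything downstream (shared wordpiece vocabulary, encoder,
    decoder, attention) is unchanged; only the input is modified.
    """
    return f"<2{target_lang}> {source}"

if __name__ == "__main__":
    sentence = "How are you?"
    # Same model, same parameters; only the leading token differs.
    print(add_target_token(sentence, "es"))  # <2es> How are you?
    print(add_target_token(sentence, "fr"))  # <2fr> How are you?
    # Zero-shot: a pair never seen together in training can still be
    # requested through the same token convention.
    print(add_target_token("Como vai?", "es"))  # <2es> Como vai?
```

Because the target language is carried entirely by the input token, adding a language pair changes only the training data, not the architecture or the parameter count, which is what enables the zero-shot bridging the abstract describes.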