Helping the Weak Makes You Strong: Simple Multi-Task Learning Improves Non-Autoregressive Translators
Published in EMNLP, 2022
A simple, model-agnostic multi-task training framework for non-autoregressive (NAR) translation: weak autoregressive decoders are attached during training to force the NAR decoder to learn more informative representations, and are discarded at inference.
Code: https://github.com/wxy-nlp/MultiTaskNAT
Keywords: non-autoregressive generation, neural machine translation, multi-task learning
Recommended citation: Xinyou Wang*, Zaixiang Zheng, and Shujian Huang. (2022). "Helping the Weak Makes You Strong: Simple Multi-Task Learning Improves Non-Autoregressive Translators." Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 5513-5519.
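The core idea described above can be sketched as follows. This is a minimal illustration, not the released code: the function names, the toy cross-entropy, and the auxiliary loss weight are all assumptions. The shared NAR decoder states feed extra weak autoregressive (AR) heads during training, and their losses are added to the NAR loss so that gradients push the shared representations to be more informative; at inference only the NAR decoder runs.

```python
import math

def cross_entropy(probs, target):
    """Negative log-likelihood of the gold index under one distribution."""
    return -math.log(probs[target])

def multi_task_loss(nar_probs, ar_head_probs, targets, weight=0.5):
    """Combine the NAR translation loss with auxiliary weak-AR losses.

    nar_probs:     per-position output distributions of the NAR decoder
    ar_head_probs: list of per-position distributions, one per weak AR head
    targets:       gold token indices
    weight:        scaling of the auxiliary losses (assumed hyperparameter)
    """
    nar_loss = sum(cross_entropy(p, t) for p, t in zip(nar_probs, targets))
    aux_loss = sum(
        cross_entropy(p, t)
        for head in ar_head_probs
        for p, t in zip(head, targets)
    )
    # The weak AR heads only shape training; inference uses nar_probs alone.
    return nar_loss + weight * aux_loss
```

Because the auxiliary heads are dropped after training, the framework is model-agnostic and leaves NAR decoding speed untouched.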
