Author of the publication

mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer.

, , , , , , , and . NAACL-HLT, pages 483-498. Association for Computational Linguistics, (2021)

Other publications of authors with the same name

Unsupervised Transfer Learning for Spoken Language Understanding in Intelligent Agents., , and . CoRR, (2018)
Supervised Contextual Embeddings for Transfer Learning in Natural Language Processing Tasks., , , , , and . CoRR, (2019)
Distilling Large Language Models into Tiny and Effective Students using pQRNN., , , and . CoRR, (2021)
DOCmT5: Document-Level Pretraining of Multilingual Language Models., , , and . NAACL-HLT (Findings), pages 425-437. Association for Computational Linguistics, (2022)
mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer., , , , , , , and . NAACL-HLT, pages 483-498. Association for Computational Linguistics, (2021)
DOCmT5: Document-Level Pretraining of Multilingual Language Models., , , and . CoRR, (2021)
XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalisation., , , , , and . ICML, volume 119 of Proceedings of Machine Learning Research, pages 4411-4421. PMLR, (2020)
Harnessing Multilinguality in Unsupervised Machine Translation for Rare Languages., , , and . CoRR, (2020)
mT5: A massively multilingual pre-trained text-to-text transformer., , , , , , , and . arXiv:2010.11934, (2020)
Harnessing Multilinguality in Unsupervised Machine Translation for Rare Languages., , , and . NAACL-HLT, pages 1126-1137. Association for Computational Linguistics, (2021)