mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations.

EMNLP (Findings), pages 1978-2008. Association for Computational Linguistics, 2023.

Other publications of authors with the same name

Can Wikipedia Help Offline Reinforcement Learning? CoRR, 2022.
Variational Inference for Learning Representations of Natural Language Edits. AAAI, pages 13552-13560. AAAI Press, 2021.
Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context (with 43 additional co-authors). CoRR, 2024.
On the Role of Parallel Data in Cross-lingual Transfer Learning. CoRR, 2022.
LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer. ACL/IJCNLP (Findings), volume ACL/IJCNLP 2021 of Findings of ACL, pages 3932-3944. Association for Computational Linguistics, 2021.
Variational Inference for Learning Representations of Natural Language Edits. CoRR, 2020.
Combining Pretrained High-Resource Embeddings and Subword Representations for Low-Resource Languages. AfricaNLP, 2020.
BUFFET: Benchmarking Large Language Models for Few-shot Cross-lingual Transfer. CoRR, 2023.
Learning to Model Editing Processes. EMNLP (Findings), pages 3822-3832. Association for Computational Linguistics, 2022.
Large Language Models are Zero-Shot Reasoners. NeurIPS, 2022.