Author of the publication

Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed. You can also use the button next to the name to display publications already assigned to the person.

No persons found for author name Sutawika, Lintang
Add a person with the name Sutawika, Lintang

Other publications of authors with the same name

Towards better structured and less noisy Web data: Oscar with Register annotations, and 3 other author(s). W-NUT@COLING, pages 215-221. Association for Computational Linguistics, (2022)

Crosslingual Generalization through Multitask Finetuning, and 9 other author(s). ACL (1), pages 15991-16111. Association for Computational Linguistics, (2023)

Samsung Research Philippines - Datasaur AI's Submission for the WMT22 Large Scale Multilingual Translation Task. WMT, pages 1034-1038. Association for Computational Linguistics, (2022)

BLOOM+1: Adding Language Support to BLOOM for Zero-Shot Prompting, and 5 other author(s). ACL (1), pages 11682-11703. Association for Computational Linguistics, (2023)

Multitask Prompted Training Enables Zero-Shot Task Generalization, and 30 other author(s). ICLR, OpenReview.net, (2022)

Emergent and Predictable Memorization in Large Language Models. CoRR, (2023)

Prompting Multilingual Large Language Models to Generate Code-Mixed Texts: The Case of South East Asian Languages, and 2 other author(s). CoRR, (2023)

What Language Model to Train if You Have One Million GPU Hours?, and 9 other author(s). CoRR, (2022)

What Language Model to Train if You Have One Million GPU Hours?, and 8 other author(s). EMNLP (Findings), pages 765-782. Association for Computational Linguistics, (2022)

Multitask Prompted Training Enables Zero-Shot Task Generalization, and 30 other author(s). International Conference on Learning Representations, (2022)