Do Transformer Modifications Transfer Across Implementations and Applications?

EMNLP (1), pages 5758-5773. Association for Computational Linguistics, (2021)


Other publications of authors with the same name

Rethinking embedding coupling in pre-trained language models. CoRR, (2020)

Demystifying the Better Performance of Position Encoding Variants for Transformer. CoRR, (2021)

Charformer: Fast Character Transformers via Gradient-based Subword Tokenization. ICLR, OpenReview.net, (2022)

Entropy Generation of Desalination Powered by Variable Temperature Waste Heat. Entropy, 17 (11): 7530-7566 (2015)

Language Models are Multilingual Chain-of-Thought Reasoners. CoRR, (2022)

Do Transformer Modifications Transfer Across Implementations and Applications? CoRR, (2021)

Large Language Models Encode Clinical Knowledge. CoRR, (2022)

Scale Efficiently: Insights from Pretraining and Finetuning Transformers. ICLR, OpenReview.net, (2022)

UL2: Unifying Language Learning Paradigms. ICLR, OpenReview.net, (2023)

Transcending Scaling Laws with 0.1% Extra Compute. EMNLP, pages 1471-1486. Association for Computational Linguistics, (2023)