Author of the publication

ED2LM: Encoder-Decoder to Language Model for Faster Document Re-ranking Inference.

, , , , , , , , , , and . ACL (Findings), page 3747-3758. Association for Computational Linguistics, (2022)


Other publications of authors with the same name

Transformer Memory as a Differentiable Search Index., , , , , , , , , and 3 other author(s). NeurIPS, (2022)

HyperPrompt: Prompt-based Task-Conditioning of Transformers., , , , , , , , , and 2 other author(s). ICML, volume 162 of Proceedings of Machine Learning Research, page 8678-8690. PMLR, (2022)

Personalized Online Spell Correction for Personal Search., , , and . WWW, page 2785-2791. ACM, (2019)

A New Generation of Perspective API: Efficient Multilingual Character-level Transformers., , , , , , and . KDD, page 3197-3207. ACM, (2022)

OmniNet: Omnidirectional Representations from Transformers., , , , , , , , and . ICML, volume 139 of Proceedings of Machine Learning Research, page 10193-10202. PMLR, (2021)

Charformer: Fast Character Transformers via Gradient-based Subword Tokenization., , , , , , , , , and . ICLR, OpenReview.net, (2022)

ED2LM: Encoder-Decoder to Language Model for Faster Document Re-ranking Inference., , , , , , , , , and 1 other author(s). ACL (Findings), page 3747-3758. Association for Computational Linguistics, (2022)

Are Pretrained Convolutions Better than Pretrained Transformers?, , , , , , and . ACL/IJCNLP (1), page 4349-4359. Association for Computational Linguistics, (2021)

ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning., , , , , , , , , and 4 other author(s). ICLR, OpenReview.net, (2022)

Charformer: Fast Character Transformers via Gradient-based Subword Tokenization., , , , , , , , , and . CoRR, (2021)