
Shall We Pretrain Autoregressive Language Models with Retrieval? A Comprehensive Study.

EMNLP, page 7763-7786. Association for Computational Linguistics, (2023)


Other publications of authors with the same name

An adaptive implicit-explicit scheme for the DNS and LES of compressible flows on unstructured grids. J. Comput. Phys., 229 (17): 5944-5965 (2010)

ChatQA: Building GPT-4 Level Conversational QA Models. CoRR, (2024)

Using DeepSpeed and Megatron to Train Megatron-Turing NLG 530B, A Large-Scale Generative Language Model. CoRR, (2022)

Retrieval meets Long Context Large Language Models. CoRR, (2023)

Context Generation Improves Open Domain Question Answering. EACL (Findings), page 781-796. Association for Computational Linguistics, (2023)

Adding Instructions during Pretraining: Effective Way of Controlling Toxicity in Language Models. EACL, page 2628-2643. Association for Computational Linguistics, (2023)

Exploring the Limits of Domain-Adaptive Training for Detoxifying Large-Scale Language Models. NeurIPS, (2022)

Neural ODEs for Image Segmentation with Level Sets. CoRR, (2019)

Stable and accurate schemes for the compressible Navier-Stokes equations. J. Comput. Phys., 227 (4): 2293-2316 (2008)

BioMegatron: Larger Biomedical Domain Language Model. EMNLP (1), page 4700-4706. Association for Computational Linguistics, (2020)