
Co-training and Co-distillation for Quality Improvement and Compression of Language Models.

EMNLP (Findings), pages 7458-7467. Association for Computational Linguistics, (2023)


Other publications of authors with the same name

Automated Multi-task Learning. University of California, San Diego, USA, (2017). base-search.net (ftcdlib:qt9410482w)
The Belebele Benchmark: a Parallel Reading Comprehension Dataset in 122 Language Variants. CoRR, (2023)
Adaptable Claim Rewriting with Offline Reinforcement Learning for Effective Misinformation Discovery. CoRR, (2022)
RoAST: Robustifying Language Models via Adversarial Perturbation with Selective Training. EMNLP (Findings), pages 3412-3444. Association for Computational Linguistics, (2023)
Improve Transformer Models with Better Relative Position Embeddings. EMNLP (Findings), volume EMNLP 2020 of Findings of ACL, pages 3327-3335. Association for Computational Linguistics, (2020)
XLM-V: Overcoming the Vocabulary Bottleneck in Multilingual Masked Language Models. EMNLP, pages 13142-13152. Association for Computational Linguistics, (2023)
Embedding-based Zero-shot Retrieval through Query Generation. arXiv:2009.10270, (2020)
Deep Automated Multi-task Learning. IJCNLP(2), pages 55-60. Asian Federation of Natural Language Processing, (2017)
Generating Hashtags for Short-form Videos with Guided Signals. ACL (1), pages 9482-9495. Association for Computational Linguistics, (2023)
Co-training and Co-distillation for Quality Improvement and Compression of Language Models. EMNLP (Findings), pages 7458-7467. Association for Computational Linguistics, (2023)