Author of the publication

Enhancing Chat Language Models by Scaling High-quality Instructional Conversations.

, , , , , , , and . EMNLP, pages 3029-3051. Association for Computational Linguistics, (2023)


Other publications of authors with the same name

Exploring Mode Connectivity for Pre-trained Language Models., , , , , , , , and . EMNLP, pages 6726-6746. Association for Computational Linguistics, (2022)

ProQA: Structural Prompt-based Pre-training for Unified Question Answering., , , , , , , , and . NAACL-HLT, pages 4230-4243. Association for Computational Linguistics, (2022)

ToolLLM: Facilitating Large Language Models to Master 16000+ Real-world APIs., , , , , , , , , and 8 other author(s). CoRR, (2023)

bert2BERT: Towards Reusable Pretrained Language Models., , , , , , , , , and . CoRR, (2021)

Different Tunes Played with Equal Skill: Exploring a Unified Optimization Subspace for Parameter-Efficient Tuning., , , , , , , , and . EMNLP (Findings), pages 3348-3366. Association for Computational Linguistics, (2022)

Pass off Fish Eyes for Pearls: Attacking Model Selection of Pre-trained Models., , , , , , and . ACL (1), pages 5060-5072. Association for Computational Linguistics, (2022)

Investigate-Consolidate-Exploit: A General Strategy for Inter-Task Agent Self-Evolution., , , , , , , , and . CoRR, (2024)

ProAgent: From Robotic Process Automation to Agentic Process Automation., , , , , , , , , and 2 other author(s). CoRR, (2023)

Parameter-efficient fine-tuning of large-scale pre-trained language models., , , , , , , , , and 10 other author(s). Nat. Mac. Intell., 5 (3): 220-235 (March 2023)