Author of the publication

Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed. You can also use the button next to the name to display some publications already assigned to that person.

Other publications of authors with the same name

Parameter-efficient fine-tuning of large-scale pre-trained language models., , , , , , , , , and 10 other author(s). Nat. Mac. Intell., 5 (3): 220-235 (March 2023)
AgentVerse: Facilitating Multi-Agent Collaboration and Exploring Emergent Behaviors in Agents., , , , , , , , , and 3 other author(s). CoRR, (2023)
Plug-and-Play Document Modules for Pre-trained Models., , , , , , , , , and . ACL (1), page 15713-15729. Association for Computational Linguistics, (2023)
ChatEval: Towards Better LLM-based Evaluators through Multi-Agent Debate., , , , , , , and . CoRR, (2023)
On Transferability of Prompt Tuning for Natural Language Understanding., , , , , , , , , and 1 other author(s). CoRR, (2021)
On Transferability of Prompt Tuning for Natural Language Processing., , , , , , , , , and 3 other author(s). NAACL-HLT, page 3949-3969. Association for Computational Linguistics, (2022)
Exploring the Impact of Model Scaling on Parameter-Efficient Tuning., , , , , , , , , and 2 other author(s). EMNLP, page 15062-15078. Association for Computational Linguistics, (2023)
Arbitrary Few Parameters are Good Enough for Adapting Large-scale Pre-trained Language Models., , , , , , , , , and . CoRR, (2023)
Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models., , , , , , , , , and 10 other author(s). CoRR, (2022)