On Transferability of Prompt Tuning for Natural Language Processing.

NAACL-HLT, pages 3949-3969. Association for Computational Linguistics, (2022)


Other publications of authors with the same name

StableToolBench: Towards Stable Large-Scale Benchmarking on Tool Learning of Large Language Models. CoRR, (2024)

Large Language Model as Autonomous Decision Maker. CoRR, (2023)

AgentVerse: Facilitating Multi-Agent Collaboration and Exploring Emergent Behaviors in Agents. CoRR, (2023)

Learning to Annotate: Modularizing Data Augmentation for Text Classifiers with Natural Language Explanations. CoRR, (2019)

Parameter-efficient fine-tuning of large-scale pre-trained language models. Nat. Mac. Intell., 5 (3): 220-235 (March 2023)

Comparing Three Methods of Selecting Training Samples in Supervised Classification of Multispectral Remote Sensing Images. Sensors, 23 (20): 8530 (October 2023)

ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning. ACL/IJCNLP (1), pages 3350-3363. Association for Computational Linguistics, (2021)

FPT: Improving Prompt Tuning Efficiency via Progressive Training. EMNLP (Findings), pages 6877-6887. Association for Computational Linguistics, (2022)

Tell Me More! Towards Implicit User Intention Understanding of Language Model Driven Agents. CoRR, (2024)

bert2BERT: Towards Reusable Pretrained Language Models. CoRR, (2021)