
Other publications by persons with the same name

GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model., , , , , , , , , and 2 other author(s). CoRR, (2023)
MRT: Tracing the Evolution of Scientific Publications., , , and . IEEE Trans. Knowl. Data Eng., 35 (1): 711-724 (2023)
AlignBench: Benchmarking Chinese Alignment of Large Language Models., , , , , , , , , and 7 other author(s). CoRR, (2023)
GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model., , , , , , and . ACL (industry), pp. 134-148. Association for Computational Linguistics, (2023)
GLM-130B: An Open Bilingual Pre-trained Model., , , , , , , , , and 8 other author(s). CoRR, (2022)
OAG-Bench: A Human-Curated Benchmark for Academic Graph Mining., , , , , , , , , and 12 other author(s). CoRR, (2024)
GLM-130B: An Open Bilingual Pre-trained Model., , , , , , , , , and 9 other author(s). ICLR, OpenReview.net, (2023)
Parameter-Efficient Prompt Tuning Makes Generalized and Calibrated Neural Text Retrievers., , , , , , , , and . CoRR, (2022)
Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method., , , , , , and . ACL (Findings), pp. 9678-9696. Association for Computational Linguistics, (2023)