
Other publications by people with the same name

EVA2.0: Investigating Open-domain Chinese Dialogue Systems with Large-scale Pre-training., , , , , , , , , and 1 other author. Mach. Intell. Res., 20 (2): 207-219 (April 2023)

Knowledge Distillation of Large Language Models., , , and . CoRR, (2023)

PPT: Pre-trained Prompt Tuning for Few-shot Learning., , , and . CoRR, (2021)

Pre-Trained Models: Past, Present and Future., , , , , , , , , and 12 other authors. CoRR, (2021)

CPM-2: Large-scale cost-effective pre-trained language models., , , , , , , , , and 9 other authors. AI Open, (2021)

Pre-trained models: Past, present and future., , , , , , , , , and 14 other authors. AI Open, (2021)

Towards Optimal Learning of Language Models., , , , , and . CoRR, (2024)

CUGE: A Chinese Language Understanding and Generation Evaluation Benchmark., , , , , , , , , and 25 other authors. CoRR, (2021)

Train No Evil: Selective Masking for Task-Guided Pre-Training., , , , and . EMNLP (1), pp. 6966-6974. Association for Computational Linguistics, (2020)

Learning Instructions with Unlabeled Data for Zero-Shot Cross-Task Generalization., , , and . EMNLP, pp. 1617-1634. Association for Computational Linguistics, (2022)