
Other publications by persons with the same name

Prefix Language Models are Unified Modal Learners., , , and . CoRR, (2022)

LoraRetriever: Input-Aware LoRA Retrieval and Composition for Mixed Tasks in the Wild., , , , , , and . CoRR, (2024)

Struc-Bench: Are Large Language Models Really Good at Generating Complex Structured Data?, , , , , , and . CoRR, (2023)

Pre-training Text-to-Text Transformers for Concept-centric Common Sense., , , , , and . CoRR, (2020)

Meta Learning for Knowledge Distillation., , and . CoRR, (2021)

Agents: An Open-source Framework for Autonomous Language Agents., , , , , , , , , and 7 other author(s). CoRR, (2023)

Scaling-up medical vision-and-language representation learning with federated learning., , , and . Eng. Appl. Artif. Intell., 126 (Part D): 107037 (November 2023)

To Repeat or Not To Repeat: Insights from Scaling LLM under Token-Crisis., , , , and . CoRR, (2023)

Scheduled DropHead: A Regularization Method for Transformer Models., , , , and . EMNLP (Findings), volume EMNLP 2020 of Findings of ACL, pp. 1971-1980. Association for Computational Linguistics, (2020)

Beyond Preserved Accuracy: Evaluating Loyalty and Robustness of BERT Compression., , , , , and . EMNLP (1), pp. 10653-10659. Association for Computational Linguistics, (2021)