Please choose the person to whom this publication relates.

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

Other publications by persons with the same name

Continuation KD: Improved Knowledge Distillation through the Lens of Continuation Optimization. EMNLP (Findings), pp. 5260-5269. Association for Computational Linguistics, (2022)

Do we need Label Regularization to Fine-tune Pre-trained Language Models? EACL, pp. 166-177. Association for Computational Linguistics, (2023)

Towards Understanding Label Regularization for Fine-tuning Pre-trained Language Models. CoRR, (2022)

HAGRID: A Human-LLM Collaborative Dataset for Generative Information-Seeking with Attribution. CoRR, (2023)

How to Select One Among All? An Extensive Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding. CoRR, (2021)

Annealing Knowledge Distillation. EACL, pp. 2493-2504. Association for Computational Linguistics, (2021)

Segmentation Approach for Coreference Resolution Task. CoRR, (2020)

How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding. EMNLP (Findings), pp. 750-762. Association for Computational Linguistics, (2021)

Improved knowledge distillation by utilizing backward pass knowledge in neural networks. CoRR, (2023)

Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher. CoRR, (2021)