Author of the publication

HAGRID: A Human-LLM Collaborative Dataset for Generative Information-Seeking with Attribution.

CoRR, (2023)

Other publications of authors with the same name

Do we need Label Regularization to Fine-tune Pre-trained Language Models? EACL, pages 166-177. Association for Computational Linguistics, (2023)
Continuation KD: Improved Knowledge Distillation through the Lens of Continuation Optimization. EMNLP (Findings), pages 5260-5269. Association for Computational Linguistics, (2022)
Towards Understanding Label Regularization for Fine-tuning Pre-trained Language Models. CoRR, (2022)
HAGRID: A Human-LLM Collaborative Dataset for Generative Information-Seeking with Attribution. CoRR, (2023)
How to Select One Among All? An Extensive Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding. CoRR, (2021)
Annealing Knowledge Distillation. EACL, pages 2493-2504. Association for Computational Linguistics, (2021)
Segmentation Approach for Coreference Resolution Task. CoRR, (2020)
How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding. EMNLP (Findings), pages 750-762. Association for Computational Linguistics, (2021)
Improved knowledge distillation by utilizing backward pass knowledge in neural networks. CoRR, (2023)
Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher. CoRR, (2021)