Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed.

Other publications by persons with the same name

ODIN: A Single Model for 2D and 3D Perception., , , , , , , and . CoRR, (2024)
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages., , , , , , , , , and 7 other author(s). CoRR, (2021)
AmericasNLI: Machine translation and natural language inference systems for Indigenous languages of the Americas., , , , , , , , , and 8 other author(s). Frontiers Artif. Intell., (2022)
MLQE-PE: A Multilingual Quality Estimation and Post-Editing Dataset., , , , , , , , and . CoRR, (2020)
Self-training Improves Pre-training for Natural Language Understanding., , , , , , , and . NAACL-HLT, pp. 5408-5418. Association for Computational Linguistics, (2021)
Findings of the WMT 2020 Shared Task on Parallel Corpus Filtering and Alignment., , , , , and . WMT@EMNLP, pp. 726-742. Association for Computational Linguistics, (2020)
Language Is Not All You Need: Aligning Perception with Language Models., , , , , , , , , and 8 other author(s). NeurIPS, (2023)
DUBLIN: Visual Document Understanding By Language-Image Network., , , , , , , , , and . EMNLP (Industry Track), pp. 693-706. Association for Computational Linguistics, (2023)
A Massive Collection of Cross-Lingual Web-Document Pairs., , , and . CoRR, (2019)
Self-training Improves Pre-training for Natural Language Understanding., , , , , , , and . CoRR, (2020)