Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.


Other publications by persons with the same name

XLM-E: Cross-lingual Language Model Pre-training via ELECTRA., , , , , , , and . CoRR, (2021)
Dispersion Based Similarity for Mining Similar Papers in Citation Network., and . ICDM Workshops, pp. 524-531. IEEE Computer Society, (2015)
Language Is Not All You Need: Aligning Perception with Language Models., , , , , , , , , and 8 other authors. NeurIPS, (2023)
Bootstrapping a high quality multilingual multimodal dataset for Bletchley., , , , , and . ACML, vol. 189 of Proceedings of Machine Learning Research, pp. 738-753. PMLR, (2022)
On the Representation Collapse of Sparse Mixture of Experts., , , , , , , , , and 2 other authors. NeurIPS, (2022)
Language Is Not All You Need: Aligning Perception with Language Models., , , , , , , , , and 8 other authors. CoRR, (2023)
Image as a Foreign Language: BEIT Pretraining for Vision and Vision-Language Tasks., , , , , , , , , and 1 other author. CVPR, pp. 19175-19186. IEEE, (2023)
XLM-E: Cross-lingual Language Model Pre-training via ELECTRA., , , , , , , , , and 1 other author. ACL (1), pp. 6170-6182. Association for Computational Linguistics, (2022)
Beyond English-Centric Bitexts for Better Multilingual Language Representation Learning., , , , , , , and . ACL (1), pp. 15354-15373. Association for Computational Linguistics, (2023)
InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training., , , , , , , , , and . NAACL-HLT, pp. 3576-3588. Association for Computational Linguistics, (2021)