
Please choose a person to relate this publication to.

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed.

 

Other publications by persons with the same name

Self-training Improves Pre-training for Natural Language Understanding., , , , , , , and . CoRR, (2020)

Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning., , , and . (2020) cite arxiv:2011.01403.

SentEval: An Evaluation Toolkit for Universal Sentence Representations., and . LREC, European Language Resources Association (ELRA), (2018)

CCNet: Extracting High Quality Monolingual Datasets from Web Crawl Data., , , , , , and . LREC, pp. 4003-4012. European Language Resources Association, (2020)

Multilingual Speech Translation from Efficient Finetuning of Pretrained Models., , , , , , , , and . ACL/IJCNLP (1), pp. 827-838. Association for Computational Linguistics, (2021)

Emerging Cross-lingual Structure in Pretrained Language Models., , , , and . CoRR, (2019)

Meta-Prod2Vec - Product Embeddings Using Side-Information for Recommendation., , and . CoRR, (2016)

Self-training and Pre-training are Complementary for Speech Recognition., , , , , , , and . CoRR, (2020)

XLS-R: Self-supervised Cross-lingual Speech Representation Learning at Scale., , , , , , , , , and 3 other authors. CoRR, (2021)

XTREME-S: Evaluating Cross-lingual Speech Representations., , , , , , , , , and 9 other authors. INTERSPEECH, pp. 3248-3252. ISCA, (2022)