
MixOut: A Simple Yet Effective Data Augmentation Scheme for Slot-Filling.

, and . IWSDS, volume 704 of Lecture Notes in Electrical Engineering, pp. 279-288. Springer, (2020)


Other publications by persons with the same name

Unsupervised Transfer Learning for Spoken Language Understanding in Intelligent Agents., , and . CoRR, (2018)

Supervised Contextual Embeddings for Transfer Learning in Natural Language Processing Tasks., , , , , and . CoRR, (2019)

Distilling Large Language Models into Tiny and Effective Students using pQRNN., , , and . CoRR, (2021)

DOCmT5: Document-Level Pretraining of Multilingual Language Models., , , and . NAACL-HLT (Findings), pp. 425-437. Association for Computational Linguistics, (2022)

mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer., , , , , , , and . NAACL-HLT, pp. 483-498. Association for Computational Linguistics, (2021)

DOCmT5: Document-Level Pretraining of Multilingual Language Models., , , and . CoRR, (2021)

XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalisation., , , , , and . ICML, volume 119 of Proceedings of Machine Learning Research, pp. 4411-4421. PMLR, (2020)

Harnessing Multilinguality in Unsupervised Machine Translation for Rare Languages., , , and . CoRR, (2020)

mT5: A massively multilingual pre-trained text-to-text transformer, , , , , , , and . (2020) cite arxiv:2010.11934.

Harnessing Multilinguality in Unsupervised Machine Translation for Rare Languages., , , and . NAACL-HLT, pp. 1126-1137. Association for Computational Linguistics, (2021)