Author of the publication

LuxemBERT: Simple and Practical Data Augmentation in Language Model Pre-Training for Luxembourgish.

LREC, pages 5080-5089. European Language Resources Association, (2022)


Other publications of authors with the same name

Soft Prompt Tuning for Cross-Lingual Transfer: When Less is More. CoRR, (2024)
A Comparison of Pre-Trained Language Models for Multi-Class Text Classification in the Financial Domain. WWW (Companion Volume), pages 260-268. ACM / IW3C2, (2021)
Comparing Pre-Training Schemes for Luxembourgish BERT Models. KONVENS, pages 17-27. Association for Computational Linguistics, (2023)
Comparing MultiLingual and Multiple MonoLingual Models for Intent Classification and Slot Filling. NLDB, volume 12801 of Lecture Notes in Computer Science, pages 367-375. Springer, (2021)
Evaluating Parameter-Efficient Finetuning Approaches for Pre-trained Models on the Financial Domain. EMNLP (Findings), pages 15482-15491. Association for Computational Linguistics, (2023)
Enhancing Text-to-SQL Translation for Financial System Design. CoRR, (2023)
Evaluating Pretrained Transformer-based Models on the Task of Fine-Grained Named Entity Recognition. COLING, pages 3750-3760. International Committee on Computational Linguistics, (2020)
Evaluating Data Augmentation Techniques for the Training of Luxembourgish Language Models. KONVENS, pages 174-179. Association for Computational Linguistics, (2023)
LuxemBERT: Simple and Practical Data Augmentation in Language Model Pre-Training for Luxembourgish. LREC, pages 5080-5089. European Language Resources Association, (2022)
Evaluating the Impact of Text De-Identification on Downstream NLP Tasks. NoDaLiDa, pages 10-16. University of Tartu Library, (2023)