
One-Shot Neural Cross-Lingual Transfer for Paradigm Completion.

ACL (1), pages 1993-2003. Association for Computational Linguistics, 2017.


Other publications of authors with the same name

Weakly Supervised POS Taggers Perform Poorly on Truly Low-Resource Languages. AAAI, pages 8066-8073. AAAI Press, 2020.

Meeting the Needs of Low-Resource Languages: The Value of Automatic Alignments via Pretrained Models. EACL, pages 3894-3908. Association for Computational Linguistics, 2023.

The SIGMORPHON 2020 Shared Task on Unsupervised Morphological Paradigm Completion. SIGMORPHON, pages 51-62. Association for Computational Linguistics, 2020.

The LMU System for the CoNLL-SIGMORPHON 2017 Shared Task on Universal Morphological Reinflection. CoNLL Shared Task (1), pages 40-48. Association for Computational Linguistics, 2017.

Unlabeled Data for Morphological Generation With Character-Based Sequence-to-Sequence Models. SWCN@EMNLP, pages 76-81. Association for Computational Linguistics, 2017.

A Major Obstacle for NLP Research: Let's Talk about Time Allocation! EMNLP, pages 8959-8969. Association for Computational Linguistics, 2022.

The World of an Octopus: How Reporting Bias Influences a Language Model's Perception of Color. EMNLP (1), pages 823-835. Association for Computational Linguistics, 2021.

Open-domain Dialogue Generation: What We Can Do, Cannot Do, And Should Do Next. ConvAI@ACL, pages 148-165. Association for Computational Linguistics, 2022.

Navigating Wanderland: Highlighting Off-Task Discussions in Classrooms. AIED, volume 13916 of Lecture Notes in Computer Science, pages 727-732. Springer, 2023.

How to Adapt Your Pretrained Multilingual Model to 1600 Languages. ACL/IJCNLP (1), pages 4555-4567. Association for Computational Linguistics, 2021.