
Curate and Generate: A Corpus and Method for Joint Control of Semantics and Style in Neural NLG.

, , , and . ACL (1), pp. 5938-5951. Association for Computational Linguistics, (2019)


Other publications by persons with the same name

AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages., , , , , , , , , and 7 other author(s). ACL (1), pp. 6279-6299. Association for Computational Linguistics, (2022)
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages., , , , , , , , , and 7 other author(s). CoRR, (2021)
AmericasNLI: Machine translation and natural language inference systems for Indigenous languages of the Americas., , , , , , , , , and 8 other author(s). Frontiers Artif. Intell., (2022)
Athena: Constructing Dialogues Dynamically with Discourse Constraints., , , , , , , , , and 4 other author(s). CoRR, (2020)
Curate and Generate: A Corpus and Method for Joint Control of Semantics and Style in Neural NLG., , , and . ACL (1), pp. 5938-5951. Association for Computational Linguistics, (2019)
Findings of the Second AmericasNLP Competition on Speech-to-Text Translation., , , , , , , , , and 24 other author(s). NeurIPS (Competition and Demos), volume 220 of Proceedings of Machine Learning Research, pp. 217-232. PMLR, (2021)
Meeting the Needs of Low-Resource Languages: The Value of Automatic Alignments via Pretrained Models., , , , , , , and . EACL, pp. 3894-3908. Association for Computational Linguistics, (2023)
Open-domain Dialogue Generation: What We Can Do, Cannot Do, And Should Do Next., , , , and . ConvAI@ACL, pp. 148-165. Association for Computational Linguistics, (2022)
How to Adapt Your Pretrained Multilingual Model to 1600 Languages., and . ACL/IJCNLP (1), pp. 4555-4567. Association for Computational Linguistics, (2021)
Since the Scientific Literature Is Multilingual, Our Models Should Be Too., and . CoRR, (2024)