Which *BERT? A Survey Organizing Contextualized Encoders

Patrick Xia, Shijie Wu, and Benjamin Van Durme. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7516--7533. Association for Computational Linguistics, November 2020
DOI: 10.18653/v1/2020.emnlp-main.608

Abstract

Pretrained contextualized text encoders are now a staple of the NLP community. We present a survey on language representation learning with the aim of consolidating a series of shared lessons learned across a variety of recent efforts. While significant advancements continue at a rapid pace, we find that enough has now been discovered, in different directions, that we can begin to organize advances according to common themes. Through this organization, we highlight important considerations when interpreting recent contributions and choosing which model to use.
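
As a concrete illustration of the closing point about "choosing which model to use": the encoders the survey organizes are largely interchangeable behind a common interface, so comparing candidates often amounts to swapping a model identifier. The sketch below is not from the paper; it assumes the HuggingFace transformers library with a PyTorch backend, and the model name is only an example.

    # Minimal sketch (assumes transformers and torch are installed).
    # Swapping one contextualized encoder for another is often just a
    # different model identifier, e.g. "roberta-base" or "albert-base-v2".
    from transformers import AutoModel, AutoTokenizer

    model_name = "bert-base-uncased"  # example choice, not a recommendation

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)

    inputs = tokenizer(
        "Pretrained contextualized text encoders are now a staple of NLP.",
        return_tensors="pt",
    )
    outputs = model(**inputs)

    # One contextualized vector per input token:
    # shape [1, sequence_length, hidden_size].
    print(outputs.last_hidden_state.shape)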

Description

Which *BERT? A Survey Organizing Contextualized Encoders - ACL Anthology

Tags

community

  • @jaeschke
  • @rikbose
  • @nosebrain
  • @dblp