Inproceedings

bert2BERT: Towards Reusable Pretrained Language Models.

ACL (1), pages 2134–2148. Association for Computational Linguistics, 2022.
