Article

Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression.

ACM Trans. Asian Low Resour. Lang. Inf. Process., 23 (2): 32:1-32:19 (February 2024)
