Article

ScaLA: Accelerating Adaptation of Pre-Trained Transformer-Based Language Models via Efficient Large-Batch Adversarial Noise.

, , and .
CoRR (2022)
