Article

ScaLA: Accelerating Adaptation of Pre-Trained Transformer-Based Language Models via Efficient Large-Batch Adversarial Noise.

, , and .
CoRR (2022)
