
Training Model by Knowledge Distillation for Image-text Matching

Uses knowledge distillation to compress pre-trained models for image-text matching tasks. Designs lightweight models and applies knowledge distillation so that models which previously performed poorly achieve better results after training.
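The abstract describes the standard knowledge-distillation setup: a lightweight student is trained to mimic a large pre-trained teacher. As a minimal sketch of the generic distillation objective (the paper's actual loss for image-text matching is not given here, so this shows only Hinton-style temperature-softened KL divergence, with all names chosen for illustration):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax with max-subtraction for numerical stability."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the teacher's softened distribution and the
    student's, scaled by T^2 so gradients keep a comparable magnitude."""
    p = softmax(teacher_logits, T)  # soft targets from the large teacher
    q = softmax(student_logits, T)  # predictions of the lightweight student
    return float((T ** 2) * np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
```

In practice this term is combined with the task loss (here, an image-text matching loss over similarity scores), weighted by a mixing coefficient.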

, , and . ICAICE, pages 476-481. ACM, (2023)
