
Training Model by Knowledge Distillation for Image-text Matching

Abstract: This work applies knowledge distillation to compress pre-trained models for image-text matching tasks. Lightweight models are designed and trained with knowledge distillation, enabling previously ineffective models to achieve better results after training.

, , and . ICAICE, pages 476-481. ACM, (2023)

Other publications of authors with the same name

Dynamic Cross-Correlations between Participants' Attentions to P2P Lending and Offline Loan in the Private Lending Market., , and . Complex., (2019)
Analysis Method of Power Consumption Characteristics of Residents in Low-Voltage Stations Based on Clustering Algorithm., , , , , and . ICITEE, pages 493-498. ACM, (2020)
NTIRE 2022 Spectral Demosaicing Challenge and Data Set., , , , , , , , , and 27 other author(s). CVPR Workshops, pages 881-895. IEEE, (2022)
A self-stabilizing MSA algorithm in high-dimension data stream., , and . Neural Networks, 23 (7): 865-871 (2010)
A weighted information criterion for multiple minor components and its adaptive extraction algorithms., , , and . Neural Networks, (2017)
Vecnet: A Spectral and Multi-Scale Spatial Fusion Deep Network for Pixel-Level Cloud Type Classification in Himawari-8 Imagery., , , , , , and . IGARSS, pages 4083-4086. IEEE, (2021)
Unified and Coupled Self-Stabilizing Algorithms for Minor and Principal Eigen-pairs Extraction., , , and . Neural Processing Letters, 45 (1): 197-222 (2017)
Collaborative Deep Reinforcement Learning for Joint Object Search., , , and . CVPR, pages 7072-7081. IEEE Computer Society, (2017)
Training Model by Knowledge Distillation for Image-text Matching., , and . ICAICE, pages 476-481. ACM, (2023)
Enhancing Emotion Recognition in Incomplete Data: A Novel Cross-Modal Alignment, Reconstruction, and Refinement Framework., , , , , , , , , and . CoRR, (2024)