Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed.


Other publications by persons with the same name

DistiLLM: Towards Streamlined Distillation for Large Language Models., , , and . ICML, OpenReview.net, (2024)

DiffBlender: Scalable and Composable Multimodal Text-to-Image Diffusion Models., , , , and . CoRR, (2023)

Real-time and Explainable Detection of Epidemics with Global News Data., , , , and . Healthcare AI and COVID-19 Workshop, volume 184 of Proceedings of Machine Learning Research, pp. 73-90. PMLR, (2022)

Self-Contrastive Learning., , , , , and . CoRR, (2021)

ReFine: Re-randomization before Fine-tuning for Cross-domain Few-shot Learning., , , , , and . CIKM, pp. 4359-4363. ACM, (2022)

Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning., , and . CVPR, pp. 7537-7547. IEEE, (2023)

Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation., , , and . INTERSPEECH, pp. 316-320. ISCA, (2023)

Learning Video Temporal Dynamics with Cross-Modal Attention for Robust Audio-Visual Speech Recognition., , , , and . CoRR, (2024)

Self-Contrastive Learning: Single-Viewed Supervised Contrastive Framework Using Sub-network., , , , , and . AAAI, pp. 197-205. AAAI Press, (2023)

Understanding Cross-Domain Few-Shot Learning Based on Domain Similarity and Few-Shot Difficulty., , , , , and . NeurIPS, (2022)