Robust Diversified Graph Contrastive Network for Incomplete Multi-view Clustering.

, , , , , , and . ACM Multimedia, page 3936-3944. ACM, (2022)

Other publications of authors with the same name

Video super-resolution based on nonlinear mapping and patch similarity., , , , and . CCIS, page 48-56. IEEE, (2016)

Self-adaptive spatial image denoising model based on scale correlation and SURE-LET in the nonsubsampled contourlet transform domain., , and . Sci. China Inf. Sci., 57 (9): 1-15 (2014)

Mining and searching association relation of scientific papers based on deep learning., , , , and . CoRR, (2022)

Cross-Media Scientific Research Achievements Retrieval Based on Deep Language Model., , , and . CoRR, (2022)

A Hierarchical Multi-label Classification Algorithm for Scientific Papers Based on Graph Attention Networks., , , , , and . CICAI, volume 13069 of Lecture Notes in Computer Science, page 735-746. Springer, (2021)

Semantic Structure Enhanced Contrastive Adversarial Hash Network for Cross-media Representation Learning., , , , , , and . ACM Multimedia, page 277-285. ACM, (2022)

Dynamic Self-adaptive Multiscale Distillation from Pre-trained Multimodal Large Model for Efficient Cross-modal Representation Learning., , , , and . CoRR, (2024)

Study on food safety semantic retrieval system based on domain ontology., , and . CCIS, page 40-44. IEEE, (2011)

Video super-resolution reconstruction based on correlation learning and spatio-temporal nonlocal similarity., , and . Multimedia Tools Appl., 75 (17): 10241-10269 (2016)

Cross-Media Semantic Correlation Learning Based on Deep Hash Network and Semantic Expansion for Social Network Cross-Media Search., , , , , , and . IEEE Trans. Neural Networks Learn. Syst., 31 (9): 3634-3648 (2020)