Author of the publication

A Survey on Neural Topic Models: Methods, Applications, and Challenges.

Artif. Intell. Rev., 57 (2): 18 (February 2024)


Other publications of authors with the same name

Updating Language Models with Unstructured Facts: Towards Practical Knowledge Editing. CoRR, (2024)
Adaptive Contrastive Learning on Multimodal Transformer for Review Helpfulness Predictions. CoRR, (2022)
Mitigating Data Sparsity for Short Text Topic Modeling by Topic-Semantic Contrastive Learning. EMNLP, page 2748-2760. Association for Computational Linguistics, (2022)
Learning Multilingual Topics with Neural Variational Inference. NLPCC (1), volume 12430 of Lecture Notes in Computer Science, page 840-851. Springer, (2020)
On the Affinity, Rationality, and Diversity of Hierarchical Topic Modeling. AAAI, page 19261-19269. AAAI Press, (2024)
Vision-and-Language Pretraining. CoRR, (2022)
Short Text Topic Modeling with Topic Distribution Quantization and Negative Sampling Decoder. EMNLP (1), page 1772-1782. Association for Computational Linguistics, (2020)
DemaFormer: Damped Exponential Moving Average Transformer with Energy-Based Modeling for Temporal Language Grounding. EMNLP (Findings), page 3635-3649. Association for Computational Linguistics, (2023)
A Survey on Neural Topic Models: Methods, Applications, and Challenges. CoRR, (2024)
DemaFormer: Damped Exponential Moving Average Transformer with Energy-Based Modeling for Temporal Language Grounding. CoRR, (2023)