Latent Beta-Liouville Probabilistic Modeling for Bursty Topic Discovery in Textual Data
DOI: https://doi.org/10.32473/flairs.37.1.135043
Keywords: Topic Modeling, Word Burstiness, Beta-Liouville Distribution, Dirichlet Compound Multinomial Distribution, Natural Language Processing
Abstract
Topic modeling has become a fundamental technique for uncovering latent thematic structures within large collections of textual data. However, conventional models often struggle to capture the burstiness of topics. This characteristic, where the occurrence of a word increases its likelihood of subsequent appearances in the same document, is fundamental in natural language processing. To address this gap, we introduce a novel topic modeling framework that integrates Beta-Liouville and Dirichlet Compound Multinomial distributions. Our approach, named Beta-Liouville Dirichlet Compound Multinomial Latent Dirichlet Allocation (BLDCMLDA), is designed to explicitly model word burstiness while supporting a wide range of adaptable topic proportion patterns. Experiments on diverse benchmark text datasets show that BLDCMLDA outperforms conventional models, and its improved perplexity and coherence scores confirm its effectiveness in capturing the nuances of word usage dynamics in natural language.
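The word burstiness the abstract refers to can be illustrated with a minimal simulation sketch. This is not the authors' BLDCMLDA model; it only contrasts a plain multinomial document model with a Dirichlet Compound Multinomial (Pólya) draw, using a hypothetical NumPy setup and an assumed small concentration parameter to show how the compound distribution concentrates probability mass and produces repeated (bursty) words.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_multinomial_doc(probs, length):
    """Draw a document of `length` tokens i.i.d. from fixed word probabilities."""
    return rng.choice(len(probs), size=length, p=probs)

def sample_dcm_doc(alpha, length):
    """Draw a document from a Dirichlet Compound Multinomial:
    first sample per-document word probabilities from a Dirichlet,
    then draw tokens. Small alpha concentrates mass on a few words,
    so once a word appears it tends to reappear (burstiness)."""
    probs = rng.dirichlet(alpha)
    return rng.choice(len(alpha), size=length, p=probs)

vocab_size, doc_length = 1000, 100
uniform_probs = np.full(vocab_size, 1.0 / vocab_size)
small_alpha = np.full(vocab_size, 0.05)  # assumed value for illustration only

multi_doc = sample_multinomial_doc(uniform_probs, doc_length)
dcm_doc = sample_dcm_doc(small_alpha, doc_length)

# The DCM document typically contains far fewer distinct words,
# i.e. more repeated occurrences of the same words.
print("distinct words, multinomial:", len(set(multi_doc)))
print("distinct words, DCM:", len(set(dcm_doc)))
```

Under these assumed settings, the multinomial document rarely repeats a word, whereas the DCM document reuses a small set of words heavily, which is the burstiness behavior the proposed model is built to capture.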
License
Copyright (c) 2024 Shadan Ghadimi, Hafsa Ennajari, Nizar Bouguila
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.