Latent Beta-Liouville Probabilistic Modeling for Bursty Topic Discovery in Textual Data
DOI: https://doi.org/10.32473/flairs.37.1.135043
Keywords: Topic Modeling, Word Burstiness, Beta-Liouville Distribution, Dirichlet Compound Multinomial Distribution, Natural Language Processing
Abstract
Topic modeling has become a fundamental technique for uncovering latent thematic structures within large collections of textual data. However, conventional models often struggle to capture the burstiness of topics. This characteristic, where the occurrence of a word increases its likelihood of subsequent appearances in a document, is fundamental in natural language processing. To address this gap, we introduce a novel topic modeling framework integrating Beta-Liouville and Dirichlet Compound Multinomial distributions. Our approach, named Beta-Liouville Dirichlet Compound Multinomial Latent Dirichlet Allocation (BLDCMLDA), is designed to explicitly model word burstiness and to support a wide range of flexible topic-proportion patterns. In experiments on diverse benchmark text datasets, the BLDCMLDA model demonstrates superior performance over conventional models. Our promising results in terms of perplexity and coherence scores demonstrate the effectiveness of BLDCMLDA in capturing the nuances of word usage dynamics in natural language.
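As background (this formula is standard and is not quoted from the paper itself), the Dirichlet Compound Multinomial distribution referenced in the abstract is usually written, for a document with word-count vector \(\mathbf{x} = (x_1, \dots, x_V)\) over a vocabulary of size \(V\) and parameter vector \(\boldsymbol{\alpha}\), as

\[
P(\mathbf{x} \mid \boldsymbol{\alpha}) =
\frac{n!}{\prod_{w=1}^{V} x_w!}\,
\frac{\Gamma\!\left(\sum_{w=1}^{V} \alpha_w\right)}{\Gamma\!\left(n + \sum_{w=1}^{V} \alpha_w\right)}
\prod_{w=1}^{V} \frac{\Gamma(x_w + \alpha_w)}{\Gamma(\alpha_w)},
\qquad n = \sum_{w=1}^{V} x_w .
\]

Since \(\Gamma(x_w + \alpha_w)/\Gamma(\alpha_w) = \alpha_w(\alpha_w + 1)\cdots(\alpha_w + x_w - 1)\), each additional occurrence of a word increases the weight attached to its next occurrence (the Pólya-urn effect), which is what lets this family of distributions capture word burstiness. In BLDCMLDA, this per-topic DCM emission is presumably paired with a Beta-Liouville prior over document-level topic proportions, as the abstract describes.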
License
© Shadan Ghadimi, Hafsa Ennajari, Nizar Bouguila 2024
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.