A hierarchical count data clustering based on Multinomial Nested Dirichlet Mixture using the Minorization-Maximization framework
DOI: https://doi.org/10.32473/flairs.37.1.135263

Keywords: Hierarchical feature learning, mixture model clustering, Minorization-Maximization (MM), Multinomial Nested Dirichlet Mixture (MNDM), Minimum Message Length (MML), Spatial Pyramid Matching (SPM)

Abstract
Despite the widespread adoption of mixture models among researchers, obtaining good results involves several challenges. In this paper, we address two of the main ones: parameter estimation and data representation.
Expectation-Maximization (EM) is a widely used framework for parameter estimation. However, many factors complicate the process, such as intractable calculations of the posterior distribution and of the parameter updates. Minorization-Maximization (MM) is an alternative framework that relaxes these complications and requirements of EM. This paper adopts the MM framework for the Multinomial Nested Dirichlet Mixture (MNDM) in a hierarchical manner. The hierarchical nature of the MNDM is exploited through a Hierarchical Feature Learning (HFL) framework, where the data representation is produced by the well-known Spatial Pyramid Matching method.
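As background on the representation step, the following is a minimal sketch of spatial-pyramid pooling over quantized local features in the spirit of Spatial Pyramid Matching. The function name, the standard level weighting, and the assumption that keypoint coordinates are normalized to [0, 1) are illustrative choices, not details taken from this paper:

```python
import numpy as np

def spm_histogram(codes, coords, vocab_size, levels=2):
    """Spatial-pyramid representation: concatenate per-cell visual-word
    histograms over a (2^l x 2^l) grid at each pyramid level l = 0..levels,
    using the standard SPM level weights (1/2^L for level 0, 1/2^(L-l+1) above).

    codes:  (N,) int array of visual-word indices, one per keypoint
    coords: (N, 2) float array of keypoint (x, y) positions in [0, 1)
    """
    feats = []
    for l in range(levels + 1):
        cells = 2 ** l
        weight = 1.0 / (2 ** levels) if l == 0 else 1.0 / (2 ** (levels - l + 1))
        # Assign each keypoint to a grid cell at this resolution.
        ix = np.minimum((coords[:, 0] * cells).astype(int), cells - 1)
        iy = np.minimum((coords[:, 1] * cells).astype(int), cells - 1)
        for cx in range(cells):
            for cy in range(cells):
                mask = (ix == cx) & (iy == cy)
                hist = np.bincount(codes[mask], minlength=vocab_size)
                feats.append(weight * hist)
    return np.concatenate(feats)
```

For L levels the output length is vocab_size times the total number of cells (1 + 4 + 16 = 21 cells for L = 2), which is the kind of hierarchical count vector a mixture model over count data can then cluster.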
Moreover, the number of mixture components is determined by the Minimum Message Length (MML) criterion. This paper therefore presents an HFL framework for the data representation of the MNDM, whose learning is based on the MM framework and whose model selection is based on MML. The two proposed improvements are validated on three visual datasets using recall and precision as performance metrics.
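For context on the estimation problem discussed above, the sketch below shows standard EM for a plain multinomial mixture over count data. It is not the paper's MM-based MNDM learning (the nested Dirichlet prior and the MM surrogate are what the paper contributes); the function name and initialization scheme are illustrative assumptions:

```python
import numpy as np

def em_multinomial_mixture(X, K, n_iter=100, seed=0):
    """Standard EM for a K-component multinomial mixture over count data X (N x D)."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    pi = np.full(K, 1.0 / K)                   # mixing weights
    theta = rng.dirichlet(np.ones(D), size=K)  # per-component category probabilities
    for _ in range(n_iter):
        # E-step: responsibilities from log-likelihoods (up to a constant in X).
        log_r = np.log(pi) + X @ np.log(theta).T        # shape (N, K)
        log_r -= log_r.max(axis=1, keepdims=True)       # stabilize before exp
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of weights and category probabilities.
        pi = r.mean(axis=0)
        theta = r.T @ X + 1e-9                          # avoid log(0)
        theta /= theta.sum(axis=1, keepdims=True)
    return pi, theta, r
```

EM's E-step requires the posterior responsibilities in closed form; when the component density makes that posterior or the M-step intractable, as in richer models like the MNDM, an MM scheme instead maximizes a tractable surrogate that minorizes the log-likelihood, which is the relaxation the abstract refers to.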
Copyright (c) 2024 Fares Alkhawaja, Manar Amayri, Nizar Bouguila
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.