Simultaneous count data feature selection and clustering using Multinomial Nested Dirichlet Mixture
DOI: https://doi.org/10.32473/flairs.37.1.135262
Keywords: mixture model clustering, Expectation-Maximization (EM), feature saliences, feature selection, Multinomial Nested Dirichlet Mixture (MNDM), Minimum Message Length (MML)
Abstract
The curse of dimensionality makes clustering count data a challenging task. This paper addresses the problem by adopting the concept of feature saliency as a feature selection method within the Multinomial Nested Dirichlet Mixture (MNDM). The MNDM is a generalization of the Dirichlet Compound Multinomial (DCM), which suffers from several limitations. Model learning is accomplished through the expectation-maximization (EM) algorithm, and the Minimum Message Length (MML) criterion is used to simultaneously determine the best number of mixture components together with the selected features. At the price of longer convergence times, the results show better performance across different metrics, as the model selects the salient features and tunes away the non-salient, anomalous features.
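To make the learning procedure concrete, the sketch below shows a plain multinomial mixture fitted by EM and scored with a simple MML-style penalty to choose the number of components. It is an illustrative simplification under stated assumptions, not the paper's MNDM with feature saliencies; the function names (em_multinomial_mixture, mml_like_score) and the penalty form are hypothetical placeholders.

```python
# Sketch: EM for a plain multinomial mixture over count vectors, with an
# MML-style penalised score used to pick the number of components K.
# NOT the paper's MNDM with feature saliencies; only the EM / model-selection
# loop described in the abstract is illustrated here.
import numpy as np

def em_multinomial_mixture(X, K, n_iter=100, eps=1e-10, seed=0):
    """Fit a K-component multinomial mixture to a count matrix X (N x D)."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    pi = np.full(K, 1.0 / K)                   # mixing weights
    theta = rng.dirichlet(np.ones(D), size=K)  # per-component category probabilities
    for _ in range(n_iter):
        # E-step: responsibilities (multinomial log-likelihood up to a constant)
        log_r = np.log(pi + eps) + X @ np.log(theta + eps).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and per-component probabilities
        Nk = r.sum(axis=0) + eps
        pi = Nk / N
        theta = (r.T @ X) + eps
        theta /= theta.sum(axis=1, keepdims=True)
    # Log-likelihood (log-sum-exp for stability); the multinomial coefficient
    # is omitted since it is constant across K and cancels in model comparison.
    log_p = np.log(pi + eps) + X @ np.log(theta + eps).T
    m = log_p.max(axis=1, keepdims=True)
    loglik = float(np.sum(m.ravel() + np.log(np.exp(log_p - m).sum(axis=1))))
    return pi, theta, loglik

def mml_like_score(loglik, K, D, N):
    """Penalised score in the spirit of MML: message length grows with the number
    of free parameters ((K - 1) mixing weights + K * (D - 1) multinomial
    parameters). Lower is better; the exact MML expression in the paper differs."""
    n_params = (K - 1) + K * (D - 1)
    return -loglik + 0.5 * n_params * np.log(N)

# Toy usage: select K by minimising the penalised score over a small range.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.multinomial(50, p, size=100)
                   for p in ([0.6, 0.2, 0.1, 0.1], [0.1, 0.1, 0.2, 0.6])])
    scores = {}
    for K in range(1, 5):
        _, _, ll = em_multinomial_mixture(X, K)
        scores[K] = mml_like_score(ll, K, X.shape[1], X.shape[0])
    print("selected K:", min(scores, key=scores.get))
```

In the paper's setting, the same outer loop would additionally update per-feature saliency weights in the M-step and use the full MML message length, so that component count and salient features are determined simultaneously.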
License
Copyright (c) 2024 Fares Alkhawaja, Manar Amayri, Nizar Bouguila
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.