Simultaneous count data feature selection and clustering using Multinomial Nested Dirichlet Mixture

Authors

  • Fares Alkhawaja Concordia Institute for Information Systems Engineering (CIISE)
  • Manar Amayri Concordia Institute for Information Systems Engineering (CIISE)
  • Nizar Bouguila Concordia Institute for Information Systems Engineering (CIISE)

DOI:

https://doi.org/10.32473/flairs.37.1.135262

Keywords:

mixture model clustering, Expectation-Maximization (EM), feature saliencies, feature selection, Multinomial Nested Dirichlet Mixture (MNDM), Minimum Message Length (MML)

Abstract

The curse of dimensionality makes clustering count data an increasingly challenging task. This paper addresses the problem by adopting feature saliency as a feature selection mechanism within the Multinomial Nested Dirichlet Mixture (MNDM). The MNDM is a generalization of the Dirichlet Compound Multinomial (DCM), which suffers from several limitations. Model learning is carried out with the Expectation-Maximization (EM) algorithm. The Minimum Message Length (MML) criterion is used to determine the best number of mixture components simultaneously with the updated selected features. At the cost of longer convergence times, the results show better performance across different metrics, as the model selects the salient features and down-weights the non-salient, noise-like ones.
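To illustrate the feature-saliency idea the abstract refers to, the following is a minimal sketch of an EM loop with per-feature saliencies on a deliberately simpler count model (independent Poisson features rather than the MNDM, and without the MML term). All function and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.special import gammaln, logsumexp


def poisson_logpmf(x, lam):
    """Element-wise log of the Poisson pmf (lam broadcasts against x)."""
    return x * np.log(lam) - lam - gammaln(x + 1)


def em_feature_saliency(X, K, n_iter=200, seed=0, eps=1e-10):
    """EM for a mixture of independent per-feature Poisson distributions with
    feature saliencies rho_j: each feature is drawn from a component-specific
    rate with probability rho_j, or from a shared background rate otherwise.
    A simplified stand-in for the MNDM learning described in the paper
    (no MML penalty is included here)."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    N, D = X.shape

    pi = np.full(K, 1.0 / K)                              # mixing weights
    rho = np.full(D, 0.5)                                 # feature saliencies
    lam = X.mean(0) * (0.5 + rng.random((K, D))) + eps    # component rates
    bg = X.mean(0) + eps                                  # common background rates

    for _ in range(n_iter):
        # E-step: per-feature log densities under the "relevant" and "irrelevant" branches
        log_rel = np.log(rho) + poisson_logpmf(X[:, None, :], lam[None, :, :])          # (N, K, D)
        log_irr = np.log(1.0 - rho) + poisson_logpmf(X[:, None, :], bg[None, None, :])  # (N, 1, D)
        log_mix = np.logaddexp(log_rel, log_irr)                                        # (N, K, D)

        # component responsibilities
        log_joint = np.log(pi)[None, :] + log_mix.sum(axis=2)                           # (N, K)
        r = np.exp(log_joint - logsumexp(log_joint, axis=1, keepdims=True))

        # joint posterior that feature j is relevant and the sample belongs to component k
        u = r[:, :, None] * np.exp(log_rel - log_mix)                                   # (N, K, D)
        v = r[:, :, None] - u                                                           # irrelevant part

        # M-step
        pi = r.mean(axis=0)
        lam = np.maximum((u * X[:, None, :]).sum(axis=0) / (u.sum(axis=0) + eps), eps)
        bg = np.maximum((v * X[:, None, :]).sum(axis=(0, 1)) / (v.sum(axis=(0, 1)) + eps), eps)
        rho = np.clip(u.sum(axis=(0, 1)) / N, eps, 1.0 - eps)

    return pi, lam, rho, r


if __name__ == "__main__":
    # Toy count data: two clusters differ only in the first 5 of 10 features.
    rng = np.random.default_rng(1)
    A = rng.poisson([8, 8, 8, 8, 8, 3, 3, 3, 3, 3], size=(150, 10))
    B = rng.poisson([1, 1, 1, 1, 1, 3, 3, 3, 3, 3], size=(150, 10))
    X = np.vstack([A, B])
    pi, lam, rho, r = em_feature_saliency(X, K=2)
    print("mixing weights:", np.round(pi, 2))
    print("feature saliencies:", np.round(rho, 2))  # first 5 features should dominate
```

In the actual model, the per-feature Poisson densities would be replaced by the MNDM likelihood, and the learning objective would include the MML penalty so that the number of components and the selected features are determined simultaneously, as the abstract describes.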


Published

2024-05-13

Section

Special Track: Neural Networks and Data Mining