Simultaneous count data feature selection and clustering using Multinomial Nested Dirichlet Mixture
DOI: https://doi.org/10.32473/flairs.37.1.135262

Keywords:
mixture model clustering, Expectation-Maximization (EM), feature saliences, feature selection, Multinomial Nested Dirichlet Mixture (MNDM), Minimum Message Length (MML)

Abstract
The curse of dimensionality makes clustering count data a challenging task. This paper addresses the problem by adopting the concept of feature saliency as a feature selection mechanism within the Multinomial Nested Dirichlet Mixture (MNDM), a generalization of the Dirichlet Compound Multinomial (DCM), which suffers from several limitations. Model learning is accomplished through the Expectation-Maximization (EM) algorithm, and the Minimum Message Length (MML) criterion is used to determine the best number of mixture components simultaneously with the selected features. At the cost of longer convergence times, the results show improved performance across different metrics, as the model selects the salient features and tunes out the non-salient, anomalous ones.
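To make the feature-saliency idea concrete, the sketch below implements a simplified feature-saliency EM for count data. It is not the paper's MNDM: per-feature Poisson components stand in for the nested Dirichlet machinery, and the MML-driven pruning of components is omitted. The names and parameter choices (K, rho, the shared background rates lam_bg) are illustrative assumptions; each feature is modeled as a saliency-weighted mixture of a cluster-specific distribution and a common background distribution, and EM alternates between updating responsibilities, parameters, and saliencies.

```python
# Minimal feature-saliency EM sketch for count data (Poisson stand-in, not the MNDM).
import numpy as np
from scipy.stats import poisson

def feature_saliency_em(X, K=3, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    N, D = X.shape
    pi = np.full(K, 1.0 / K)                                   # mixing weights
    rho = np.full(D, 0.5)                                      # feature saliencies
    lam = X.mean(axis=0) * rng.uniform(0.5, 1.5, size=(K, D))  # cluster-specific rates
    lam_bg = X.mean(axis=0)                                    # common "background" rates

    for _ in range(n_iter):
        # E-step: per-feature likelihoods under the relevant and irrelevant branches
        rel = poisson.pmf(X[None, :, :], lam[:, None, :])      # (K, N, D)
        irr = poisson.pmf(X, lam_bg)                           # (N, D)
        mix = rho * rel + (1.0 - rho) * irr                    # saliency-weighted mixture
        log_comp = np.log(pi)[:, None] + np.log(mix + 1e-12).sum(axis=2)  # (K, N)
        log_comp -= log_comp.max(axis=0, keepdims=True)
        r = np.exp(log_comp)
        r /= r.sum(axis=0, keepdims=True)                      # responsibilities (K, N)
        u = (rho * rel) / (mix + 1e-12)                        # P(feature relevant | k, x_n)

        # M-step: mixing weights, rates, and saliencies
        pi = r.mean(axis=1)
        w = r[:, :, None] * u                                  # (K, N, D)
        lam = (w * X).sum(axis=1) / (w.sum(axis=1) + 1e-12)
        v = r[:, :, None] * (1.0 - u)
        lam_bg = (v * X).sum(axis=(0, 1)) / (v.sum(axis=(0, 1)) + 1e-12)
        rho = w.sum(axis=(0, 1)) / N                           # updated saliencies in [0, 1]

    return pi, lam, rho

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two synthetic clusters that differ only in the first 3 of 6 count features.
    X = np.vstack([rng.poisson([9, 1, 9, 4, 4, 4], size=(100, 6)),
                   rng.poisson([1, 9, 1, 4, 4, 4], size=(100, 6))])
    pi, lam, rho = feature_saliency_em(X, K=2)
    print("saliencies:", np.round(rho, 2))  # informative features (first 3) should score higher
```

In the full model, the MML criterion would additionally penalize model complexity so that the number of components and the retained features are selected jointly during this same EM loop.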
License: CC BY-NC 4.0