Arun, R and Suresh, Vommina and Madhavan, Veni CE and Murty, Narasimha M (2010) On Finding the Natural Number of Topics with Latent Dirichlet Allocation: Some Observations. In: 14th Pacific-Asia Conference on Knowledge Discovery and Data Mining, JUN 21-24, 2010, Hyderabad, pp. 391-402.
Full text not available from this repository.

Abstract
It is important to identify the "correct" number of topics in mechanisms like Latent Dirichlet Allocation (LDA), as this determines the quality of the features presented to classifiers such as SVM. In this work we propose a measure to identify the correct number of topics and offer empirical evidence in its favor, in terms of classification accuracy and the number of topics naturally present in the corpus. We show the merit of the measure by applying it to real-world as well as synthetic data sets (both text and images). In proposing this measure, we view LDA as a matrix factorization mechanism, wherein a given corpus C is split into two matrix factors M1 and M2, as given by C_{d×w} = M1_{d×t} × M2_{t×w}, where d is the number of documents in the corpus and w is the size of the vocabulary. The quality of the split depends on "t", the right number of topics chosen. The measure is computed in terms of the symmetric KL-divergence of salient distributions derived from these matrix factors. We observe that the divergence values are higher for non-optimal numbers of topics; this shows up as a "dip" at the right value of "t".
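The measure described in the abstract can be sketched in a few lines of NumPy. This is a minimal, hedged reconstruction, not the authors' code: it assumes the two "salient distributions" are (a) the singular values of the topic-word factor M2 and (b) the document-length-weighted column sums of the document-topic factor M1, both treated as discrete distributions and compared via symmetric KL-divergence. The function names (`symmetric_kl`, `arun_measure`) and the small smoothing constant `eps` are illustrative choices.

```python
import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    """Symmetric KL-divergence between two discrete distributions."""
    p = p / p.sum()
    q = q / q.sum()
    return (np.sum(p * np.log((p + eps) / (q + eps)))
            + np.sum(q * np.log((q + eps) / (p + eps))))

def arun_measure(M1, M2, doc_lengths):
    """Divergence score for one candidate topic count t.

    M1: (d, t) document-topic matrix, M2: (t, w) topic-word matrix,
    doc_lengths: (d,) vector of document lengths.
    A lower score (the 'dip' over a sweep of t) indicates a better t.
    """
    # Distribution 1: singular values of the topic-word factor M2 (t of them).
    sv = np.sort(np.linalg.svd(M2, compute_uv=False))[::-1]
    # Distribution 2: topic mass aggregated over documents, weighted by length.
    agg = np.sort(doc_lengths @ M1)[::-1]
    return symmetric_kl(sv, agg)
```

Over a sweep of candidate values of t (refitting LDA each time), one would plot `arun_measure` against t and look for the dip the abstract describes.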
Item Type: Conference Paper
Publication: Lecture Notes in Computer Science
Series: Lecture Notes in Artificial Intelligence
Publisher: Springer
Additional Information: Copyright of this article belongs to Springer.
Keywords: LDA; Topic; SVD; KL-Divergence
Department/Centre: Division of Electrical Sciences > Computer Science & Automation
Date Deposited: 27 Sep 2010 11:33
Last Modified: 27 Sep 2010 11:33
URI: http://eprints.iisc.ac.in/id/eprint/32489