Information Theoretic Text Classification Using the Ziv-Merhav Method

Antão, D. P. C.; Figueiredo, M. A. T.

Information Theoretic Text Classification Using the Ziv-Merhav Method, Proc. Iberian Conf. on Pattern Recognition and Image Analysis, Estoril, Portugal, Vol. II, pp. 355-362, June 2005.

Abstract
Most approaches to text classification rely on some measure of (dis)similarity between sequences of symbols. Information theoretic measures have the advantage of making very few assumptions about the models presumed to have generated the sequences, and have been the focus of recent interest. This paper addresses the use of the Ziv-Merhav method (ZMM) for estimating relative entropy (Kullback-Leibler divergence) from sequences of symbols as a tool for text classification. We describe an implementation of the ZMM based on a modified version of the Lempel-Ziv algorithm (LZ77). An assessment on synthetic Markov sequences shows that the ZMM yields good estimates of the Kullback-Leibler divergence. Finally, we apply the method to a text classification problem (more specifically, authorship attribution), outperforming a previously proposed, also information theoretic, method.
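To illustrate the idea behind the ZMM, the following is a minimal Python sketch of the Ziv-Merhav estimator: a sequence z is cross-parsed with respect to x (each phrase being the longest prefix of the remaining part of z that occurs somewhere in x), z is also self-parsed incrementally (LZ78-style), and the difference of the two scaled phrase counts estimates the Kullback-Leibler divergence. This is a brute-force sketch of the general method under our own assumptions, not the paper's modified-LZ77 implementation; the function names are illustrative only.

```python
import math


def cross_parse_count(z, x):
    """Number of phrases in the Ziv-Merhav cross-parsing of z with respect to x.

    Each phrase is the longest prefix of the remaining part of z that occurs
    as a contiguous substring of x (a single symbol if no match exists).
    """
    phrases = 0
    i = 0
    n = len(z)
    while i < n:
        # Grow the candidate phrase while it still occurs somewhere in x.
        length = 1
        while i + length <= n and z[i:i + length] in x:
            length += 1
        # The loop overshoots by one symbol (or the symbol never matched).
        phrase_len = max(1, length - 1)
        i += phrase_len
        phrases += 1
    return phrases


def lz78_parse_count(z):
    """Number of phrases in the incremental (LZ78-style) self-parsing of z.

    Each new phrase is the shortest prefix of the remaining sequence that has
    not appeared as a phrase before.
    """
    seen = set()
    phrases = 0
    i = 0
    n = len(z)
    while i < n:
        length = 1
        while i + length <= n and z[i:i + length] in seen:
            length += 1
        seen.add(z[i:i + length])
        i += length
        phrases += 1
    return phrases


def zm_divergence_estimate(z, x):
    """Ziv-Merhav estimate (in bits per symbol) of D(Q || P), where z is
    assumed to be drawn from source Q and x from source P."""
    n = len(z)
    c_cross = cross_parse_count(z, x)
    c_self = lz78_parse_count(z)
    return (c_cross - c_self) * math.log2(n) / n
```

As a usage illustration (our own, not necessarily the paper's exact protocol), authorship attribution would compute zm_divergence_estimate(test_text, candidate_text) against a reference text for each candidate author and attribute the test text to the candidate yielding the smallest estimated divergence.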