Nonextensive Entropic Kernels

Aguiar, P.; Figueiredo, M. A. T.

Nonextensive Entropic Kernels, Proc. International Conf. on Machine Learning - ICML, Helsinki, Finland, July 2008.

Abstract
Positive definite kernels on probability measures have recently been applied to structured data classification problems. Some of these kernels are related to classic information-theoretic quantities, such as mutual information and the Jensen-Shannon divergence. Meanwhile, driven by recent advances in Tsallis statistics, nonextensive generalizations of Shannon's information theory have been proposed. This paper bridges these two trends. We introduce the Jensen-Tsallis q-difference, a generalization of the Jensen-Shannon divergence. We then define a new family of nonextensive mutual information kernels, which allow weights to be assigned to their arguments, and which includes the Boolean, Jensen-Shannon, and linear kernels as particular cases. We illustrate the performance of these kernels on text categorization tasks.
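To make the central quantity concrete, the following is a minimal sketch of the Jensen-Tsallis q-difference, assuming the Tsallis entropy S_q(p) = (1 - Σ p_i^q)/(q - 1) and the weighted form T_q(p1, p2) = S_q(w1·p1 + w2·p2) - w1^q·S_q(p1) - w2^q·S_q(p2), which recovers the (weighted) Jensen-Shannon divergence as q → 1. The function names and the choice of uniform default weights are illustrative, not taken from the paper.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p); reduces to Shannon entropy (in nats) at q = 1."""
    if q == 1.0:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def jensen_tsallis_q_difference(p1, p2, q, w=(0.5, 0.5)):
    """Weighted Jensen-Tsallis q-difference:
    T_q(p1, p2) = S_q(w1*p1 + w2*p2) - w1**q * S_q(p1) - w2**q * S_q(p2).
    At q = 1 this coincides with the weighted Jensen-Shannon divergence."""
    mix = [w[0] * a + w[1] * b for a, b in zip(p1, p2)]
    return (tsallis_entropy(mix, q)
            - w[0] ** q * tsallis_entropy(p1, q)
            - w[1] ** q * tsallis_entropy(p2, q))

# Disjoint distributions at q = 1 give the Jensen-Shannon value log 2.
print(jensen_tsallis_q_difference([1.0, 0.0], [0.0, 1.0], q=1.0))  # -> ~0.6931
```

Note that for q ≠ 1 the q-difference of two identical distributions is in general nonzero, i.e. T_q(p, p) = (1 - 2^(1-q)) S_q(p) under uniform weights, which is one way the nonextensive setting departs from the Jensen-Shannon case.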