ePrints@IISc

Learning With Jensen-Tsallis Kernels

Ghoshdastidar, Debarghya and Adsul, Ajay P and Dukkipati, Ambedkar (2016) Learning With Jensen-Tsallis Kernels. In: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 27 (10). pp. 2108-2119.

PDF: IEEE_Tra_Neu_Net_Ler_Sys_27-10_2108_2016.pdf - Published Version, 2MB (Restricted to Registered users only)
Official URL: http://dx.doi.org/10.1109/TNNLS.2016.2550578

Abstract

Jensen-type kernels [Jensen-Shannon (JS) and Jensen-Tsallis kernels] were first proposed by Martins et al. (2009). These kernels are based on JS divergences, which originated in information theory. In this paper, we extend the Jensen-type kernels on probability measures to define positive-definite kernels on Euclidean space. We show that special cases of these kernels include dot-product kernels. Since Jensen-type divergences are multidistribution divergences, we propose their multipoint variants and study spectral clustering and kernel methods based on them. We also provide experimental studies on benchmark image and gene expression databases that show the benefits of the proposed kernels compared with existing kernels. The clustering experiments also demonstrate the usefulness of constructing multipoint similarities.
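For readers unfamiliar with the divergence-based construction the abstract refers to, the following is a minimal sketch (assuming NumPy) of the basic Jensen-Shannon kernel on probability vectors, k(p, q) = ln 2 - JS(p, q), shown to be positive definite by Martins et al. (2009). It only illustrates the underlying idea; it is not the paper's extension to Euclidean space, its Jensen-Tsallis variant, or its multipoint kernels.

    import numpy as np

    def shannon_entropy(p, eps=1e-12):
        """Shannon entropy (in nats) of a probability vector p."""
        p = np.clip(p, eps, 1.0)
        return -np.sum(p * np.log(p))

    def js_divergence(p, q):
        """Jensen-Shannon divergence between probability vectors p and q."""
        m = 0.5 * (p + q)
        return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

    def js_kernel(p, q):
        """Jensen-Shannon kernel: k(p, q) = ln 2 - JS(p, q)."""
        return np.log(2.0) - js_divergence(p, q)

    # Example: similarity between two normalized histograms
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    print(js_kernel(p, q))

The kernel value is largest (ln 2) when the two distributions coincide and decreases as they diverge, which is why it can be plugged into standard kernel methods as a similarity measure on histograms or other normalized feature vectors.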

Item Type: Journal Article
Publication: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Additional Information: Copyright for this article belongs to the IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 445 HOES LANE, PISCATAWAY, NJ 08855-4141 USA
Department/Centre: Division of Electrical Sciences > Computer Science & Automation
Date Deposited: 03 Dec 2016 06:04
Last Modified: 03 Dec 2016 06:04
URI: http://eprints.iisc.ac.in/id/eprint/55243
