Information theoretic learning with infinitely divisible kernels

Luis G. Sanchez Giraldo, Jose C. Principe

Research output: Contribution to conference › Paper › peer-review


Abstract

In this paper, we develop a framework for information theoretic learning based on infinitely divisible matrices. We formulate an entropy-like functional on positive definite matrices based on Renyi's axiomatic definition of entropy and examine some key properties of this functional that lead to the concept of infinite divisibility. The proposed formulation avoids plug-in density estimation and brings along the representation power of reproducing kernel Hilbert spaces. As an application example, we derive a supervised metric learning algorithm using a matrix-based analogue to conditional entropy, achieving results comparable to the state of the art.
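For readers of this record, the following is a minimal sketch of the kind of matrix-based entropy functional the abstract describes, assuming the unit-trace eigenvalue form the authors later published for infinitely divisible kernels; the kernel choices, function names, and regularization constants below are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def matrix_renyi_entropy(K, alpha=2.0):
    """Entropy-like functional on a positive definite Gram matrix K:
    normalize K to unit trace and apply Renyi's formula to its
    eigenvalue spectrum (assumed form; see the published paper)."""
    A = K / np.trace(K)                 # unit-trace normalization
    eigvals = np.linalg.eigvalsh(A)     # spectrum of a symmetric matrix
    eigvals = eigvals[eigvals > 1e-12]  # guard against numerical zeros
    return (1.0 / (1.0 - alpha)) * np.log2(np.sum(eigvals ** alpha))

def matrix_conditional_entropy(Kx, Ky, alpha=2.0):
    """Matrix-based analogue of conditional entropy H(X|Y): the
    Hadamard (entrywise) product of Gram matrices plays the role of
    a joint representation for infinitely divisible kernels."""
    Kxy = Kx * Ky  # Hadamard product; renormalized inside the entropy
    return matrix_renyi_entropy(Kxy, alpha) - matrix_renyi_entropy(Ky, alpha)

# Usage: a Gaussian kernel on inputs X and a label-agreement kernel on Y
# (both hypothetical choices for illustration).
X = np.random.randn(50, 3)
Kx = np.exp(-0.5 * np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))
Y = (np.random.rand(50) > 0.5).astype(float)
Ky = (Y[:, None] == Y[None, :]).astype(float) + 1e-3 * np.eye(50)
print(matrix_conditional_entropy(Kx, Ky))
```

A quantity of this form could serve as the objective in the supervised metric learning application the abstract mentions, since it is computed directly from Gram matrices without any density estimate.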

Original language: English
State: Published - 2013
Event: 1st International Conference on Learning Representations, ICLR 2013 - Scottsdale, United States
Duration: May 2, 2013 - May 4, 2013

Conference

Conference: 1st International Conference on Learning Representations, ICLR 2013
Country/Territory: United States
City: Scottsdale
Period: 5/2/13 - 5/4/13

Bibliographical note

Publisher Copyright:
© 2013 International Conference on Learning Representations, ICLR. All rights reserved.

ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
