
Information theoretic learning with infinitely divisible kernels

Research output: Paper › peer-review

17 Citations (Scopus)

Abstract

In this paper, we develop a framework for information theoretic learning based on infinitely divisible matrices. We formulate an entropy-like functional on positive definite matrices based on Rényi's axiomatic definition of entropy and examine some key properties of this functional that lead to the concept of infinite divisibility. The proposed formulation avoids the plug-in estimation of densities and brings along the representation power of reproducing kernel Hilbert spaces. As an application example, we derive a supervised metric learning algorithm using a matrix-based analogue to conditional entropy, achieving results comparable with the state of the art.
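The matrix-based entropy functional described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the functional takes the form S_α(A) = (1/(1-α)) log₂ tr(Aᵅ), where A is a Gram matrix from an infinitely divisible kernel (here, the Gaussian kernel) normalized to unit trace so that its eigenvalues behave like a probability distribution.

```python
import numpy as np

def matrix_renyi_entropy(X, alpha=2.0, sigma=1.0):
    """Matrix-based Renyi-style entropy of a sample X (n x d).

    Sketch only: Gaussian kernel Gram matrix, normalized to unit
    trace, entropy computed from its eigenvalue spectrum.
    """
    # Pairwise squared distances and Gaussian (infinitely divisible) kernel
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))
    # Normalize so eigenvalues are nonnegative and sum to 1
    A = K / np.trace(K)
    lam = np.linalg.eigvalsh(A)
    lam = lam[lam > 1e-12]  # discard numerical zeros
    return np.log2(np.sum(lam ** alpha)) / (1 - alpha)
```

With all points identical the normalized Gram matrix has a single unit eigenvalue and the entropy is 0; with n well-separated points it approaches log₂ n, mirroring the behavior of a discrete Rényi entropy without any density estimation step.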

Original language: English
Status: Published - 2013
Event: 1st International Conference on Learning Representations, ICLR 2013 - Scottsdale, United States
Duration: May 2, 2013 – May 4, 2013

Conference

Conference: 1st International Conference on Learning Representations, ICLR 2013
Country/Territory: United States
City: Scottsdale
Period: 5/2/13 – 5/4/13

Bibliographical note

Publisher Copyright:
© 2013 International Conference on Learning Representations, ICLR. All rights reserved.

ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
