Abstract
The matrix-based Rényi's α-order entropy functional was recently introduced using the normalized eigenspectrum of a Hermitian matrix of the projected data in a reproducing kernel Hilbert space (RKHS). However, the current theory of the matrix-based Rényi's α-order entropy functional only defines the entropy of a single variable or the mutual information between two random variables. In the information theory and machine learning communities, one is also frequently interested in multivariate information quantities, such as the multivariate joint entropy and various interaction quantities among multiple variables. In this paper, we first define the matrix-based Rényi's α-order joint entropy among multiple variables. We then show how this definition eases the estimation of information quantities that measure the interactions among multiple variables, such as interactive information and total correlation. We finally present an application to feature selection to show how our definition provides a simple yet powerful way to estimate from data a quantity that is widely acknowledged to be intractable. A real-world example on hyperspectral image (HSI) band selection is also provided.
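To make the quantities mentioned in the abstract concrete, the following is a minimal NumPy sketch, not code from the paper. It assumes the standard matrix-based definitions: the per-variable entropy is computed from the eigenspectrum of a trace-normalized kernel Gram matrix, and the multivariate joint entropy from the trace-normalized Hadamard product of the per-variable Gram matrices. The Gaussian kernel, the bandwidth `sigma`, and the value of `alpha` are illustrative assumptions, not choices prescribed here.

```python
import numpy as np

def normalized_gram(X, sigma=1.0):
    """Gaussian-kernel Gram matrix of the samples in X, normalized to unit trace.
    The Gaussian kernel and bandwidth are illustrative assumptions."""
    sq = np.sum(X**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise squared distances
    K = np.exp(-D / (2.0 * sigma**2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=1.01):
    """Matrix-based Rényi's α-order entropy: (1/(1-α)) * log2(Σ_i λ_i(A)^α),
    where λ_i(A) are the eigenvalues of the trace-normalized Hermitian matrix A."""
    eigvals = np.linalg.eigvalsh(A)
    eigvals = np.clip(eigvals, 0.0, None)            # guard against tiny negative values
    return np.log2(np.sum(eigvals**alpha)) / (1.0 - alpha)

def joint_entropy(mats, alpha=1.01):
    """Multivariate joint entropy: entropy of the trace-normalized
    Hadamard (element-wise) product of the Gram matrices."""
    H = mats[0].copy()
    for A in mats[1:]:
        H = H * A
    H = H / np.trace(H)
    return renyi_entropy(H, alpha)

# Usage sketch: total correlation as the sum of marginal entropies minus the joint entropy.
rng = np.random.default_rng(0)
X1, X2, X3 = (rng.normal(size=(100, 2)) for _ in range(3))
A1, A2, A3 = (normalized_gram(X) for X in (X1, X2, X3))
total_correlation = sum(renyi_entropy(A) for A in (A1, A2, A3)) - joint_entropy([A1, A2, A3])
```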
Original language | English
---|---
Article number | 8787866
Pages (from-to) | 2960-2966
Number of pages | 7
Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume | 42
Issue number | 11
DOIs | 
State | Published - Nov 1, 2020
Bibliographical note
Publisher Copyright: © 1979-2012 IEEE.
Keywords
- Rényi's α-order entropy functional
- feature selection
- multivariate information quantities
ASJC Scopus subject areas
- Software
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics
- Artificial Intelligence
- Applied Mathematics