Abstract
We review recent advances and emerging opportunities in analyzing deep neural networks (DNNs) with information-theoretic methods. We first discuss popular information-theoretic quantities and their estimators. We then introduce recent developments in information-theoretic learning principles (e.g., loss functions, regularizers, and objectives) and their parameterization with DNNs. Finally, we briefly review current uses of information-theoretic concepts in a few modern machine learning problems and list several emerging opportunities.
Original language | English |
---|---|
Title of host publication | Proceedings of the 30th International Joint Conference on Artificial Intelligence, IJCAI 2021 |
Editors | Zhi-Hua Zhou |
Pages | 4669-4678 |
Number of pages | 10 |
ISBN (Electronic) | 9780999241196 |
DOIs | |
State | Published - 2021 |
Event | 30th International Joint Conference on Artificial Intelligence, IJCAI 2021 - Virtual/Online, Canada. Duration: Aug 19, 2021 → Aug 27, 2021 |
Publication series
Name | IJCAI International Joint Conference on Artificial Intelligence |
---|---|
ISSN (Print) | 1045-0823 |
Conference
Conference | 30th International Joint Conference on Artificial Intelligence, IJCAI 2021 |
---|---|
Country/Territory | Canada |
City | Virtual, Online |
Period | 8/19/21 → 8/27/21 |
Bibliographical note
Publisher Copyright: © 2021 International Joint Conferences on Artificial Intelligence. All rights reserved.
ASJC Scopus subject areas
- Artificial Intelligence