Computing entropy rate of symbol sources & a distribution-free limit theorem

Research output: Contribution to conference › Paper › peer-review

5 Scopus citations

Abstract

The entropy rate of a sequential data stream naturally quantifies the complexity of its generative process. Fluctuations in the entropy rate could therefore serve as a tool for recognizing dynamical perturbations in signal sources, potentially without explicit characterization of the background noise. However, state-of-the-art algorithms for estimating the entropy rate converge markedly slowly, making such entropic approaches non-viable in practice. We present a fundamentally new approach to estimating entropy rates, which is demonstrated to converge significantly faster in terms of input data length, and is shown to be effective in diverse applications ranging from estimating the entropy rate of English text to estimating the complexity of chaotic dynamical systems. Additionally, the convergence rate of entropy estimates does not follow from any standard limit theorem, and previously reported algorithms fail to provide confidence bounds on the computed values. Exploiting a connection to the theory of probabilistic automata, we establish a convergence rate of O(log|s| / ∛|s|) as a function of the input length |s|, which then yields explicit uncertainty estimates, as well as the data lengths required to satisfy pre-specified confidence bounds.
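The abstract quotes a concrete convergence rate, O(log|s| / ∛|s|), which can be inverted to estimate how much data a pre-specified tolerance demands. The sketch below is illustrative only: `block_entropy_rate` is a standard conditional block-entropy baseline (the slowly converging kind the abstract criticizes, not the paper's probabilistic-automata estimator), and `required_length` inverts the quoted rate under the assumption of a unit constant factor; both function names are hypothetical.

```python
from collections import Counter
from math import log2, log

def block_entropy_rate(s: str, k: int = 4) -> float:
    """Naive baseline: conditional block entropy H(X_k | X_1..X_{k-1}),
    a standard (slowly converging) entropy-rate estimate -- NOT the
    faster-converging estimator proposed in the paper."""
    blocks_k = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    blocks_km1 = Counter(s[i:i + k - 1] for i in range(len(s) - k + 2))
    n_k = sum(blocks_k.values())
    n_km1 = sum(blocks_km1.values())
    h_k = -sum(c / n_k * log2(c / n_k) for c in blocks_k.values())
    h_km1 = -sum(c / n_km1 * log2(c / n_km1) for c in blocks_km1.values())
    return h_k - h_km1  # bits per symbol

def required_length(eps: float, n0: int = 2) -> int:
    """Smallest |s| with log|s| / |s|**(1/3) <= eps, i.e. the abstract's
    O(log|s|/∛|s|) rate with an assumed constant factor of 1.
    Doubling search terminates because log|s| grows slower than ∛|s|."""
    n = n0
    while log(n) / n ** (1 / 3) > eps:
        n *= 2
    return n

if __name__ == "__main__":
    print(block_entropy_rate("abab" * 500))  # ~0 bits/symbol for a periodic stream
    print(required_length(0.1))              # data length for a 0.1 tolerance
```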

Original language: English
State: Published - 2014
Event: 2014 48th Annual Conference on Information Sciences and Systems, CISS 2014 - Princeton, NJ, United States
Duration: Mar 19, 2014 – Mar 21, 2014

Conference

Conference: 2014 48th Annual Conference on Information Sciences and Systems, CISS 2014
Country/Territory: United States
City: Princeton, NJ
Period: 3/19/14 – 3/21/14

Keywords

  • Entropy rate
  • Probabilistic automata
  • Stochastic processes
  • Symbolic dynamics

ASJC Scopus subject areas

  • Information Systems
