Computing Entropy Rate Of Symbol Sources & A Distribution-free Limit Theorem

Research output: Working paper › Preprint


Abstract

The entropy rate of a sequential data stream naturally quantifies the complexity of its generative process. Fluctuations in the entropy rate could thus serve as a tool to recognize dynamical perturbations in signal sources, potentially without explicit characterization of background noise. However, state-of-the-art algorithms for estimating the entropy rate converge markedly slowly, making such entropic approaches non-viable in practice. We present here a fundamentally new approach to estimating entropy rates, which is demonstrated to converge significantly faster as a function of input data length, and is shown to be effective in diverse applications, ranging from estimating the entropy rate of English text to quantifying the complexity of chaotic dynamical systems. Additionally, the convergence rate of existing entropy estimates does not follow from any standard limit theorem, and reported algorithms fail to provide confidence bounds on the computed values. Exploiting a connection to the theory of probabilistic automata, we establish a convergence rate of $O(\log \vert s \vert/\sqrt[3]{\vert s \vert})$ as a function of the input length $\vert s \vert$, which yields explicit uncertainty estimates, as well as the data lengths required to satisfy pre-specified confidence bounds.
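As a concrete illustration (this is not the paper's algorithm, which is not reproduced in the abstract), the Python sketch below shows two things: a naive plug-in block-entropy estimator of the kind such work typically improves on, and a routine that inverts the quoted $O(\log \vert s \vert/\sqrt[3]{\vert s \vert})$ rate to find a data length sufficient for a given error tolerance. The constant `C` is a hypothetical placeholder, since the paper's explicit constants are not stated in the abstract.

```python
import math
from collections import Counter

def block_entropy_rate(s, k):
    """Naive plug-in estimate of the entropy rate (bits/symbol): H_k / k,
    where H_k is the empirical entropy of length-k blocks of s.
    This baseline is known to converge slowly in both k and |s|."""
    blocks = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    n = sum(blocks.values())
    H_k = -sum((c / n) * math.log2(c / n) for c in blocks.values())
    return H_k / k

def required_length(eps, C=1.0):
    """Smallest |s| with C * log(|s|) / |s|**(1/3) <= eps, following the
    O(log|s| / |s|^(1/3)) rate quoted in the abstract. C is a placeholder
    constant, not a value from the paper."""
    n = 8
    while C * math.log(n) / n ** (1 / 3) > eps:  # double until the bound is met
        n *= 2
    lo, hi = n // 2, n  # tighten by binary search (bound is decreasing here)
    while lo < hi:
        mid = (lo + hi) // 2
        if C * math.log(mid) / mid ** (1 / 3) <= eps:
            hi = mid
        else:
            lo = mid + 1
    return lo

if __name__ == "__main__":
    # Periodic source: true entropy rate is 0, but the plug-in estimate
    # returns H_k/k = 1/k (~0.33 for k=3), illustrating the slow convergence.
    print(block_entropy_rate("ababababab" * 50, k=3))
    # Data length needed for an error tolerance of 0.1 under the quoted rate.
    print(required_length(eps=0.1))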
Original language: Undefined/Unknown
State: Published - Jan 3 2014

Keywords

  • cs.IT
  • cs.LG
  • math.IT
  • math.PR
  • stat.CO
  • stat.ML

