Abstract
The penalty incurred by imposing a finite delay constraint in lossless source coding of a memoryless source is investigated. It is well known that for the so-called block-to-variable and variable-to-variable codes, the redundancy decays at best polynomially with the delay, where the delay is identified with the source block length or the maximal source phrase length, respectively. In stark contrast, it is shown that for sequential codes (e.g., a delay-limited arithmetic code) the redundancy can be made to decay exponentially with the delay constraint. The corresponding redundancy-delay exponent is shown to be at least as good as the Rényi entropy of order 2 of the source, but (for almost all sources) not better than a quantity depending on the minimal source symbol probability and the alphabet size.
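As a rough illustration of the contrast described above (not taken from the paper), the sketch below computes the order-2 Rényi entropy of an example memoryless source and compares a generic polynomial redundancy decay with an exponential decay at rate H₂. The example distribution and the particular polynomial form (log d)/(2d) are illustrative assumptions only; the paper's actual bounds and constants differ.

```python
import numpy as np

def renyi_entropy_order2(p):
    """Rényi entropy of order 2 (in bits) of a memoryless source with pmf p."""
    p = np.asarray(p, dtype=float)
    return -np.log2(np.sum(p ** 2))

# Qualitative comparison (constants are placeholders, not the paper's bounds):
# - block-to-variable / variable-to-variable coding: redundancy decaying
#   polynomially in the delay d
# - sequential (e.g., delay-limited arithmetic) coding: redundancy decaying
#   exponentially in d, with exponent at least H_2 of the source
p = [0.5, 0.25, 0.25]                        # example source distribution (assumed)
H2 = renyi_entropy_order2(p)
print(f"H_2 = {H2:.3f} bits")
for d in (8, 16, 32, 64):
    poly_redundancy = np.log2(d) / (2 * d)   # ~ (log d)/d, polynomial decay
    exp_redundancy = 2.0 ** (-H2 * d)        # ~ 2^(-H_2 d), exponential decay
    print(f"d={d:3d}  polynomial ≈ {poly_redundancy:.3e}  exponential ≈ {exp_redundancy:.3e}")
```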
Original language | English |
---|---|
Article number | 6846353 |
Pages (from-to) | 5470-5485 |
Number of pages | 16 |
Journal | IEEE Transactions on Information Theory |
Volume | 60 |
Issue number | 9 |
DOIs | |
State | Published - Sep 2014 |
Keywords
- Lossless source coding
- arithmetic coding
- coding delay
- redundancy
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences