Universal code (data compression)

In data compression, a universal code for integers is a prefix code that maps the positive integers onto binary codewords, with the additional property that whatever the true probability distribution on integers, as long as the distribution is monotonic (i.e., p(i) ≥ p(i+1) for all positive i), the expected lengths of the codewords are within a constant factor of the expected lengths that the optimal code for that probability distribution would have assigned. A universal code is "asymptotically optimal" if the ratio between actual and optimal expected lengths is bounded by a function of the information entropy of the code that, in addition to being bounded, approaches 1 as entropy approaches infinity.

In general, most prefix codes for integers assign longer codewords to larger integers. Such a code can be used to efficiently communicate a message drawn from a set of possible messages, by simply ordering the set of messages by decreasing probability and then sending the index of the intended message. Universal codes are generally not used for precisely known probability distributions, and no universal code is known to be optimal for any distribution used in practice.

These are some universal codes for integers; an asterisk (*) indicates a code that can be trivially restated in lexicographical order, while a double dagger (‡) indicates a code that is asymptotically optimal:
* Elias gamma coding *
* Elias delta coding * ‡
* Elias omega coding * ‡
* Exp-Golomb coding *, which has Elias gamma coding as a special case. (Used in H.264/MPEG-4 AVC)
* Fibonacci coding
* Levenstein (Левенштейн) coding * ‡, the original universal coding technique [http://www.compression.ru/download/articles/int/levenstein_1968_on_the_redundancy_and_delay.pdf]
* Byte coding ‡, also known as comma coding, where a special bit pattern (with at least two bits) is used to mark the end of the code. For example, if an integer is encoded as a sequence of nibbles representing digits in base 15 instead of the more natural base 16, then the highest nibble value (i.e., a sequence of four ones in binary) can be used to indicate the end of the integer; a sketch of this scheme and of Elias gamma coding follows this list.
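
To make two of the listed constructions concrete, below is a minimal Python sketch of Elias gamma coding and of the base-15 comma scheme just described. The function names are illustrative only, and the comma variant here simply emits the value's own base-15 digits, one nibble each, before the terminator.

    def elias_gamma(n: int) -> str:
        """Elias gamma code: floor(log2 n) zeros, then n in binary."""
        if n < 1:
            raise ValueError("defined for positive integers only")
        binary = bin(n)[2:]                  # binary digits of n, no leading zeros
        return "0" * (len(binary) - 1) + binary

    def comma_code_base15(n: int) -> str:
        """Comma code: base-15 digits of n, one nibble per digit,
        terminated by the reserved nibble 1111."""
        digits = []
        while True:
            digits.append(n % 15)
            n //= 15
            if n == 0:
                break
        digits.reverse()
        return "".join(format(d, "04b") for d in digits) + "1111"

    for n in (1, 2, 9, 100):                 # codeword lengths grow ~ log n
        print(n, elias_gamma(n), comma_code_base15(n))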

These are non-universal ones:

* Unary coding, which is used in Elias codes
* Rice coding, which is used in the FLAC audio codec and which has unary coding as a special case
* Golomb coding, which has Rice coding and unary coding as special cases (a sketch of a Golomb/Rice encoder follows this list).
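
To make these relationships concrete, here is a minimal sketch of a Golomb encoder in Python. The function name and the bit conventions (a ones-then-zero unary quotient and a truncated-binary remainder) are choices made here for illustration; setting m = 2**k yields a Rice code, and m = 1 yields plain unary coding.

    import math

    def golomb(n: int, m: int) -> str:
        """Golomb code for n >= 0 with parameter m."""
        q, r = divmod(n, m)
        unary = "1" * q + "0"                 # quotient in unary
        if m == 1:
            return unary                      # m = 1: pure unary, no remainder
        b = math.ceil(math.log2(m))
        threshold = (1 << b) - m              # count of short (b-1 bit) remainders
        if r < threshold:
            return unary + format(r, "0{}b".format(b - 1))
        return unary + format(r + threshold, "0{}b".format(b))

    for n in range(6):                        # Rice code with k = 2, i.e. m = 4
        print(n, golomb(n, 4))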

Their nonuniversality can be observed by noticing that, if any of these codes is used to encode the Gauss–Kuzmin distribution or the Zeta distribution with parameter s = 2, the expected codeword length is infinite. For example, using unary coding on the Zeta distribution yields an expected length of

E(\ell) = \frac{6}{\pi^2} \sum_{\ell=1}^{\infty} \frac{1}{\ell} = \infty.
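
The divergence can be checked numerically; a minimal sketch, truncating the infinite sum (6/pi^2 is the normalizing constant of the Zeta distribution with s = 2, and unary coding assigns length l to the integer l):

    import math

    # Partial sums of the expected unary codeword length under the
    # Zeta distribution with s = 2, i.e. p(l) = (6 / pi**2) / l**2.
    for n_terms in (10**2, 10**4, 10**6):
        partial = (6 / math.pi**2) * sum(1 / l for l in range(1, n_terms + 1))
        print(n_terms, round(partial, 3))
    # The partial sums grow like (6 / pi**2) * ln(N), without bound.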

On the other hand, using the universal Elias gamma coding for the Gauss-Kuzmin distribution results in an expected codeword length (about 3.51 bits) near entropy (about 3.43 bits) [http://scholar.google.com/scholar?cluster=13442560459874106744] .
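
This figure can be reproduced numerically. A sketch, assuming the Gauss–Kuzmin pmf p(k) = -log2(1 - 1/(k+1)^2) and the Elias gamma codeword length 2*floor(log2 k) + 1, with the infinite sums truncated:

    import math

    def gauss_kuzmin(k: int) -> float:
        return -math.log2(1 - 1 / (k + 1) ** 2)   # Gauss-Kuzmin pmf

    def gamma_length(k: int) -> int:
        return 2 * (k.bit_length() - 1) + 1       # Elias gamma codeword length

    N = 10**5                                     # truncation; the tail is negligible
    ks = range(1, N + 1)
    expected = sum(gauss_kuzmin(k) * gamma_length(k) for k in ks)
    entropy = -sum(gauss_kuzmin(k) * math.log2(gauss_kuzmin(k)) for k in ks)
    print(round(expected, 2), round(entropy, 2))  # about 3.51 and 3.43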

A universal code should not be confused with universal source coding, in which the data compression method need not be a fixed prefix code and the ratio between actual and optimal expected lengths must approach one. However, note that an asymptotically optimal universal code can be used on independent identically-distributed sources, by using increasingly large blocks, as a method of universal source coding.

Relationship to practical compression

Huffman coding and arithmetic encoding (when they can be used) give compression at least as good as, and often better than, any universal code.

However, universal codes are useful when Huffman coding cannot be used — for example, when one does not know the exact probability of each message, but only knows the rankings of their probabilities.

Universal codes are also useful when Huffman codes are inconvenient. For example, when the transmitter but not the receiver knows the probabilities of the messages, Huffman coding requires an overhead of transmitting those probabilities to the receiver. Using a universal code does not have that overhead.

Each universal code, like every other self-delimiting (prefix) binary code, has its own "implied probability distribution" given by p(i) = 2^{-l(i)}, where l(i) is the length of the ith codeword and p(i) is the corresponding symbol's probability. If the actual message probabilities are q(i) and the Kullback–Leibler divergence D_{KL}(q‖p) is minimized by the code with lengths l(i), then the optimal Huffman code for that set of messages will be equivalent to that code. Likewise, how close a given code is to optimal can be measured by this divergence. Since universal codes are simpler and faster to encode and decode than Huffman codes (which are, in turn, simpler and faster than arithmetic encoding), a universal code is preferable in cases where D_{KL}(q‖p) is sufficiently small. [http://www.cs.tut.fi/~albert/Dev/pucrunch/]
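
As a sketch of how this divergence is computed, the snippet below pairs the implied distribution of the Elias gamma code with an illustrative power-law source q(i) proportional to 1/i^2; both distributions are renormalized over a truncated support so the sums are finite.

    import math

    def gamma_length(i: int) -> int:
        return 2 * (i.bit_length() - 1) + 1       # Elias gamma codeword length

    N = 1000                                      # truncated support 1..N
    p = [2.0 ** -gamma_length(i) for i in range(1, N + 1)]   # implied distribution
    q = [1.0 / i ** 2 for i in range(1, N + 1)]              # assumed source
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]
    q = [x / sq for x in q]
    dkl = sum(qi * math.log2(qi / pi) for qi, pi in zip(q, p))
    print(round(dkl, 3))  # a small value: the gamma code is near-optimal for this q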

For any geometric distribution (an exponential distribution on integers), a Golomb code is optimal. With universal codes, the implicit distribution is approximately a power law such as 1/n^2 (more precisely, a Zipf distribution). For the Fibonacci code, the implicit distribution is approximately 1/n^q, with

q = \frac{1}{\log_2 \varphi} \approx 1.44,

where \varphi is the golden ratio. For the ternary comma code (i.e., encoding in base 3, represented with 2 bits per symbol), the implicit distribution is a power law with q = 1 + \log_3(4/3) \approx 1.26. These distributions thus have near-optimal codes with their respective power laws.
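
A quick numerical check of these exponents (note that 1 + log_3(4/3) = log_3 4, reflecting the 2 bits spent per base-3 symbol):

    import math

    phi = (1 + math.sqrt(5)) / 2              # the golden ratio
    q_fibonacci = 1 / math.log2(phi)
    q_comma = 1 + math.log(4 / 3, 3)
    print(round(q_fibonacci, 4), round(q_comma, 4))   # 1.4404 1.2619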

The figures below compare Elias gamma, Elias delta, Fibonacci, and various Rice codings by the number of bits used to encode integer values. A baseline, "direct binary" (fixed-length binary encoding where the size is already known), is also included.

References

* David J. C. MacKay, [http://www.inference.phy.cam.ac.uk/mackay/itila/book.html Information Theory, Inference, and Learning Algorithms], Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1.
* [http://www.ics.uci.edu/~dan/pubs/DC-Sec3.html]

External links

* [http://www.inference.phy.cam.ac.uk/mackay/itila/ On-line textbook: Information Theory, Inference, and Learning Algorithms], by David MacKay, has a chapter on codes for integers, including an accessible introduction to Elias codes.

* [http://www-lat.compression.ru/download/integers.html Кодирование целых чисел] ("Integer coding", in Russian) has mostly English-language papers on universal and other integer codes.

