Z-channel (information theory)
In information theory, a Z-channel with crossover probability "p" is a binary-input, binary-output channel that flips the input bit 0 to 1 with probability "p", but always maps the input bit 1 to 1. The Z-channel has a capacity of
: C = \log_2\!\left(1 + (1 - p)\, p^{p/(1-p)}\right)
bits per channel use. For small "p", the capacity is approximated by
: C \approx 1 - \tfrac{1}{2} H(p),
where "H"("p") is the binary entropy function.