Binary symmetric channel

A binary symmetric channel (or BSC) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver receives a bit. It is assumed that the bit is "usually" transmitted correctly, but that it will be "flipped" with a small probability (the "crossover probability"). This channel is used frequently in information theory because it is one of the simplest channels to analyze.

Description

The BSC is a "binary channel"; that is, it can transmit only one of two symbols (usually called 0 and 1). (A non-binary channel would be capable of transmitting more than two symbols, possibly even infinitely many.) The transmission is not perfect, and occasionally the receiver gets the wrong bit.

This channel is often used by theorists because it is one of the simplest noisy channels to analyze. Many problems in communication theory can be reduced to a BSC. Conversely, being able to transmit effectively over the BSC can give rise to solutions for more complicated channels.

Definition

A binary symmetric channel with crossover probability "p" is a channel with binary input and binary output and probability of error "p"; that is, if "X" is the transmitted random variable and "Y" the received variable, then the channel is characterized by the conditional probabilities:

Pr("Y" = 0 | "X" = 0) = 1 − "p"
Pr("Y" = 0 | "X" = 1) = "p"
Pr("Y" = 1 | "X" = 0) = "p"
Pr("Y" = 1 | "X" = 1) = 1 − "p"

It is assumed that 0 ≤ "p" ≤ 1/2. If "p" > 1/2, then the receiver can swap the output (interpret 1 when it sees 0, and vice versa) and obtain an equivalent channel with crossover probability 1 − "p" ≤ 1/2.
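The channel defined above is straightforward to simulate: flip each transmitted bit independently with probability "p". The following is a minimal sketch; the function name and the `seed` parameter are illustrative choices, not standard API.

```python
import random

def bsc_transmit(bits, p, seed=0):
    """Pass a sequence of bits through a BSC with crossover probability p.

    Each bit is flipped independently with probability p; the seed makes
    the simulation reproducible.
    """
    rng = random.Random(seed)
    return [bit ^ 1 if rng.random() < p else bit for bit in bits]

sent = [0, 1, 1, 0, 1, 0, 0, 1]
received = bsc_transmit(sent, p=0.1)
errors = sum(s != r for s, r in zip(sent, received))
```

With "p" = 0 the output equals the input, and with "p" = 1 every bit is flipped, matching the conditional probabilities above.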

Capacity of the BSC

The capacity of the channel is 1 - H("p"), where H("p") is the binary entropy function.
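The capacity formula can be evaluated directly from the definition of the binary entropy function, H("p") = −"p" log₂ "p" − (1 − "p") log₂(1 − "p"). A short sketch (the function names are illustrative):

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity 1 - H(p) of a BSC with crossover probability p, in bits per channel use."""
    return 1 - binary_entropy(p)
```

A noiseless channel ("p" = 0) has capacity 1 bit per use; at "p" = 1/2 the output is independent of the input and the capacity drops to 0.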

The converse can be shown by a sphere-packing argument. Given a codeword, there are roughly 2^("n"H("p")) typical output sequences. There are 2^"n" total possible outputs, and the input chooses from a codebook of size 2^("nR"). Therefore, the receiver would partition the space into "spheres" with 2^"n" / 2^("nR") = 2^("n"(1 − "R")) potential outputs each. If "R" > 1 − H("p"), then 2^("n"H("p")) > 2^("n"(1 − "R")): the spheres are asymptotically packed too tightly, and the receiver cannot identify the correct codeword with vanishing error probability.
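The counting in the sphere-packing argument can be checked numerically by comparing the two quantities in the exponent (base-2 log), which avoids astronomically large numbers. The helper `spheres_fit` is an illustrative name for this comparison, not standard terminology.

```python
from math import log2

def binary_entropy(p):
    """H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def spheres_fit(n, p, R):
    """True iff the ~2^(nH(p)) typical outputs per codeword fit inside
    the 2^(n(1-R)) outputs available to each decoding sphere."""
    return n * binary_entropy(p) <= n * (1 - R)

n, p = 1000, 0.11                 # H(0.11) is about 0.5, so capacity is about 0.5
below = spheres_fit(n, p, R=0.3)  # rate below capacity: spheres fit
above = spheres_fit(n, p, R=0.7)  # rate above capacity: packed too tightly
```

For "R" below capacity the spheres fit with room to spare; above capacity they overlap, which is exactly the failure mode the converse describes.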

References

* David J. C. MacKay. "[http://www.inference.phy.cam.ac.uk/mackay/itila/book.html Information Theory, Inference, and Learning Algorithms]". Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1.
* Thomas M. Cover, Joy A. Thomas. "Elements of Information Theory", 1st Edition. New York: Wiley-Interscience, 1991. ISBN 0-471-06259-6.

See also

* Binary erasure channel


Wikimedia Foundation. 2010.
