- Binary symmetric channel

A **binary symmetric channel** (or BSC) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver receives a bit. It is assumed that the bit is usually transmitted correctly, but that it will be "flipped" with a small probability (the "crossover probability"). This channel is used frequently in information theory because it is one of the simplest channels to analyze.

**Description**

The BSC is a "binary channel"; that is, it can transmit only one of two symbols (usually called 0 and 1). (A non-binary channel would be capable of transmitting more than two symbols, possibly even an infinite number of choices.) The transmission is not perfect, and occasionally the receiver gets the wrong bit.

This channel is often used by theorists because it is one of the simplest noisy channels to analyze. Many problems in communication theory can be reduced to a BSC. Conversely, being able to transmit effectively over the BSC can give rise to solutions for more complicated channels.

**Definition**

A **binary symmetric channel with crossover probability "p"** is a channel with binary input and binary output and probability of error "p"; that is, if "X" is the transmitted random variable and "Y" the received variable, then the channel is characterized by the conditional probabilities:
: Pr( "Y" = 0 | "X" = 0 ) = 1 − "p"
: Pr( "Y" = 0 | "X" = 1 ) = "p"
: Pr( "Y" = 1 | "X" = 0 ) = "p"
: Pr( "Y" = 1 | "X" = 1 ) = 1 − "p"
It is assumed that 0 ≤ "p" ≤ 1/2. If "p" > 1/2, then the receiver can swap the output (interpret 1 when it sees 0, and vice versa) and obtain an equivalent channel with crossover probability 1 − "p" ≤ 1/2.
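The channel defined above, including the output-swapping trick for "p" > 1/2, can be demonstrated with a short simulation. The following is a minimal sketch, assuming Python; the function name `bsc` and the parameters are illustrative, not taken from any source cited here:

```python
import random

def bsc(bits, p, seed=0):
    """Pass bits through a binary symmetric channel: each bit is
    flipped independently with crossover probability p.
    (Illustrative helper, not from the cited references.)"""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(42)
x = [rng.randint(0, 1) for _ in range(100_000)]

# The empirical crossover rate should be close to p = 0.1.
y = bsc(x, p=0.1, seed=1)
errors = sum(a != b for a, b in zip(x, y)) / len(x)

# For p > 1/2 the receiver swaps the output, obtaining an
# equivalent channel with crossover probability 1 - p.
y_noisy = bsc(x, p=0.9, seed=2)
y_swapped = [1 - b for b in y_noisy]
errors_swapped = sum(a != b for a, b in zip(x, y_swapped)) / len(x)

print(errors, errors_swapped)  # both close to 0.1
```

With 100,000 transmitted bits, both measured error rates concentrate near the effective crossover probability 0.1, as the definition predicts.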

**Capacity of the BSC**

The capacity of the channel is 1 − H("p"), where H("p") is the binary entropy function.

The converse can be shown by a sphere packing argument. Given a codeword, there are roughly 2^{"n"H("p")} typical output sequences. There are 2^{"n"} total possible outputs, and the input chooses from a codebook of size 2^{"nR"}. Therefore, the receiver would partition the space into "spheres" with 2^{"n"} / 2^{"nR"} = 2^{"n"(1−"R")} potential outputs each. If "R" > 1 − H("p"), then the spheres will be packed too tightly asymptotically and the receiver will not be able to identify the correct codeword with vanishing error probability.

**References**

* David J. C. MacKay. "[http://www.inference.phy.cam.ac.uk/mackay/itila/book.html Information Theory, Inference, and Learning Algorithms]". Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1.

* Thomas M. Cover, Joy A. Thomas. "Elements of Information Theory", 1st Edition. New York: Wiley-Interscience, 1991. ISBN 0-471-06259-6.

**See also**

* Binary erasure channel
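As a numerical illustration of the capacity formula 1 − H("p") and the sphere-packing exponents from the capacity section above (a minimal sketch, assuming Python; the helper names `binary_entropy` and `bsc_capacity` are hypothetical, not drawn from the references):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))             # noiseless channel: 1.0 bit per use
print(bsc_capacity(0.5))             # output independent of input: 0.0
print(round(bsc_capacity(0.11), 3))  # close to 0.5

# Sphere-packing count from the converse: 2^{nR} codewords, each needing
# about 2^{nH(p)} typical outputs, fit inside the 2^n possible outputs
# exactly when the exponents satisfy nR + nH(p) <= n, i.e. R <= 1 - H(p).
p, R = 0.11, 0.4
assert R + binary_entropy(p) <= 1.0
```

At "p" = 0.11 the capacity is roughly half a bit per channel use, so a rate of "R" = 0.4 still leaves room for the spheres, while any "R" above 1 − H(0.11) ≈ 0.5 would pack them too tightly.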

*Wikimedia Foundation. 2010.*