Interleaving

In computer science, interleaving is a way to arrange data in a non-contiguous way in order to increase performance. It is used in:
* time-division multiplexing (TDM) in telecommunications
* computer memory
* disk storage

Interleaving is mainly used in data communication, multimedia file formats, radio transmission (for example in satellites), and ADSL. The term multiplexing is sometimes used to refer to the interleaving of digital signal data. Interleaving is also used for multidimensional data structures; see Z-order (curve).

Interleaving in disk storage
Historically, interleaving was used in ordering block storage on disk-based storage devices such as the floppy disk and the hard disk. The primary purpose of interleaving was to adjust for the timing difference between when the computer was ready to transfer data and when that data was actually arriving at the drive head to be read. Interleaving was very common prior to the 1990s, but faded from use as processing speeds increased. Modern disk storage is not interleaved.

Interleaving was used to arrange the sectors in the most efficient manner possible, so that after reading a sector, time would be permitted for processing, and the next sector in sequence would then be ready to be read just as the computer became ready to do so. Matching the sector interleave to the processing speed therefore accelerates the data transfer, but an incorrect interleave can make the system perform markedly slower.
Example Usage
Information is commonly stored on disk storage in very small pieces referred to as "sectors" or "blocks". These are arranged in concentric rings referred to as "tracks" or "cylinders" across the surface of each disk. While it may seem easiest to order these blocks in direct serial order in each track, such as 1 2 3 4 5 6 7 8 9, for early computing devices this ordering was not practical.
Data to be written or read was put into a special region of reusable memory referred to as a "buffer". When data needed to be written, it was moved into the buffer and then written from the buffer to the disk. When data was read, the reverse took place: the data was transferred first into the buffer and then moved to where it was needed. Most early computers were not fast enough to read a sector, move the data from the buffer to somewhere else, and be ready to read the next sector by the time that next sector was appearing under the read head.
When sectors were arranged in direct serial order, after the first sector was read the computer might spend the time it takes for three sectors to pass by before it was ready to receive data again. With the sectors in direct order, however, sectors two, three, and four had already passed by. The computer did not need sectors 5, 6, 7, 8, 9, or 1, yet had to wait for all of these to pass by before reading sector two. Waiting for the disk to spin around to the right spot slows the data transfer rate.
To correct for the processing delays, the ideal interleave for this system would be 1:4, ordering the sectors like this: 1 8 6 4 2 9 7 5 3. The drive reads sector 1, the computer processes it while sectors 8, 6, and 4 pass by, and sector 2 arrives under the read head just as the computer is ready for it.
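This ordering can be generated with a simple modular calculation. The following Python sketch (the function name interleaved_layout is hypothetical, used only for illustration, not part of any real drive firmware) places each logical sector a fixed number of physical positions after the previous one.

 def interleaved_layout(sectors_per_track, step):
     # Place logical sector n at physical slot ((n - 1) * step) % sectors_per_track.
     # Assumes step and sectors_per_track share no common factor, so every slot is filled.
     layout = [None] * sectors_per_track
     for logical in range(1, sectors_per_track + 1):
         layout[((logical - 1) * step) % sectors_per_track] = logical
     return layout

 # 9 sectors per track; the computer needs three sector times for processing (1:4 interleave)
 print(interleaved_layout(9, 4))   # [1, 8, 6, 4, 2, 9, 7, 5, 3]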
Modern disk storage does not need interleaving, since buffer space is now much larger. Data is now more commonly stored in clusters, which are groups of sectors, and the data buffer is sufficiently large to allow all sectors in a block to be read at once without any delay between sectors.
Interleaving in data transmission
Interleaving is used in digital data transmission technology to protect the transmission against burst errors. Burst errors overwrite many bits in a row, so a typical error correction scheme that expects errors to be more uniformly distributed can be overwhelmed. Interleaving is used to help prevent this.
Data is often transmitted with error control bits that enable the receiver to correct a certain number of errors that occur during transmission. If a burst error occurs, too many errors can be made in one code word, and that codeword cannot be correctly decoded. To reduce the effect of such burst errors, the bits of a number of codewords are interleaved before being transmitted. This way, a burst error affects only a correctable number of bits in each codeword, and the decoder can decode the codewords correctly.
This method is popular because it is a less complex and cheaper way to handle burst errors than directly increasing the power of the error correction scheme.
Let's look at an example. We apply an error correcting code so that the channel codeword has four bits and one-bit errors can be corrected. The channel codewords are put into a block like this: aaaabbbbccccddddeeeeffffgggg.
Consider transmission without interleaving:
 Error-free message:              aaaabbbbccccddddeeeeffffgggg
 Transmission with a burst error: aaaabbbbccc____deeeeffffgggg
The codeword dddd is altered in three bits, so either it cannot be decoded at all (decoding failure) or it might be decoded into the wrong codeword (false decoding). Which of the two happens depends on the error correcting code applied.
Now, let's do the same with interleaving:
 Error-free code words:                    aaaabbbbccccddddeeeeffffgggg
 Interleaved:                              abcdefgabcdefgabcdefgabcdefg
 Transmission with a burst error:          abcdefgabcd____bcdefgabcdefg
 Received code words after deinterleaving: aa_abbbbccccdddde_eef_ffg_gg
In each of the codewords aaaa, eeee, ffff, and gggg, only one bit is altered, so the one-bit error-correcting code decodes everything correctly.
Of course, interleaving increases latency, because the second bit of codeword aaaa cannot be sent until the first bit of every codeword up to gggg has been sent.
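A minimal sketch of this block interleaver in Python (the names interleave and deinterleave are hypothetical; they simply write the codewords as rows and transmit column by column):

 def interleave(codewords):
     # Transmit bit 1 of every codeword, then bit 2 of every codeword, and so on.
     return "".join("".join(column) for column in zip(*codewords))

 def deinterleave(stream, num_codewords):
     # Every num_codewords-th symbol of the received stream belongs to the same codeword.
     return [stream[i::num_codewords] for i in range(num_codewords)]

 codewords = ["aaaa", "bbbb", "cccc", "dddd", "eeee", "ffff", "gggg"]
 sent = interleave(codewords)                   # abcdefgabcdefgabcdefgabcdefg
 received = sent[:11] + "____" + sent[15:]      # a burst error erases four symbols in a row
 print(deinterleave(received, len(codewords)))  # ['aa_a', 'bbbb', 'cccc', 'dddd', 'e_ee', 'f_ff', 'g_gg']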
For a different example, consider a meaningful sentence like: ThisIsAnExampleOfInterleaving, and suppose we get a burst error corrupting six letters. First, let us see what the sentence looks like without interleaving.
Consider transmission without interleaving:
 Original transmitted sentence:        ThisIsAnExampleOfInterleaving
 Received sentence with a burst error: ThisIs______pleOfInterleaving
We find that the term "AnExample" is lost or unintelligible.
Now we repeat this example, but interleave the sentence prior to transmission. The message is interleaved by transmitting every fourth letter starting at the first letter, then every fourth letter starting at the second, and so on. To make the message a multiple of four letters, three dots have been added to the end. (This is an example of block interleaving.)
Consider transmission with interleaving:
 Transmitted sentence:                   ThisIsAnExampleOfInterleaving...
 Error-free transmission:                TIEpfeaghsxlIrv.iAaenli.snmOten.
 Received sentence with a burst error:   TIEpfe______Irv.iAaenli.snmOten.
 Received sentence after deinterleaving: T_isI_AnE_amp_eOfInterle_vin_...
No single word is completely lost, and the missing letters can easily be recovered.
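The same procedure applied to the sentence example, again as a Python sketch with hypothetical names; the padding dots and the burst position follow the text above:

 def interleave(text, depth):
     # Send every depth-th letter starting at position 0, then at position 1, and so on.
     return "".join(text[start::depth] for start in range(depth))

 def deinterleave(text, depth):
     # Split the received stream back into depth equal slices and read them column by column.
     rows = [text[i * len(text) // depth:(i + 1) * len(text) // depth] for i in range(depth)]
     return "".join("".join(column) for column in zip(*rows))

 message = "ThisIsAnExampleOfInterleaving..."  # padded to a multiple of four letters
 sent = interleave(message, 4)                 # TIEpfeaghsxlIrv.iAaenli.snmOten.
 received = sent[:6] + "______" + sent[12:]    # a burst error corrupts six letters
 print(deinterleave(received, 4))              # T_isI_AnE_amp_eOfInterle_vin_...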
See also [http://media.dsl-magazin.de/24/25/722.gif graphical representation of interleaving].
Disadvantages of interleaving
Use of interleaving techniques increases latency. This is because the entire interleaved block must be received before the critical data can be returned.
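For instance, in the seven-codeword example above, the fourth and final bit of codeword aaaa is the 22nd bit of the interleaved stream, so the receiver must wait for 22 bits before it can decode the first codeword, whereas without interleaving it could decode aaaa after only 4 bits.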