Offset (computer science)
For other uses, see Offset.
In computer science, an offset within an array or other data structure object is an integer indicating the distance (displacement) from the beginning of the object to a given element or point within the same object. Such an offset is meaningful only if all elements of the object are the same size (typically given in bytes or words).
For example, given an array of characters A containing "abcdef", the element containing the letter 'c' has an offset of 2 from the start of A.
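A minimal C sketch of this example (the array name A and the offset value are taken from the text above):

    #include <stdio.h>

    int main(void) {
        char A[] = "abcdef";      /* array of characters */
        size_t offset = 2;        /* distance from the start of A */
        printf("A[%zu] = '%c'\n", offset, A[offset]);  /* prints: A[2] = 'c' */
        return 0;
    }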
In computer engineering and low-level programming (such as assembly language), an offset usually denotes the number of address locations added to a base address in order to get to a specific absolute address. In this original sense, an offset is expressed in basic addressing units, usually 8-bit bytes. In this context an offset is sometimes called a relative address.
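The base-plus-offset calculation can be sketched in C with pointer arithmetic; the array standing in for a region of memory and the particular offset value are illustrative assumptions:

    #include <stdio.h>

    int main(void) {
        unsigned char memory[16] = {0};  /* stand-in for a region of memory */
        unsigned char *base = memory;    /* the base address */
        size_t offset = 5;               /* offset in address units (bytes) */

        unsigned char *absolute = base + offset;  /* absolute = base + offset */
        *absolute = 0xAB;

        printf("memory[5] = 0x%02X\n", memory[5]); /* prints: memory[5] = 0xAB */
        return 0;
    }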
In IBM System/360 instructions, a 12-bit offset embedded within certain instructions provided a range of 0 to 4095 bytes. For example, within an unconditional branch instruction (X'47F0Fxxx'), the xxx 12-bit hexadecimal offset provided the byte offset from the base register (15) to branch to. An odd offset would cause a program check (unless the base register itself also contained an odd address), since instructions had to be aligned on half-word boundaries to execute without a program or hardware interrupt.
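As a sketch of how the fields of such an instruction word break down, the following C snippet extracts the 12-bit offset from an example word; the layout follows the RX instruction format (opcode, register/mask, index, base, displacement), and the particular displacement value 0x123 is an assumed example, not taken from the text:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Example word in the X'47F0Fxxx' pattern, with xxx = 123 assumed. */
        uint32_t insn = 0x47F0F123;

        unsigned opcode = (insn >> 24) & 0xFF;  /* 0x47: branch on condition */
        unsigned mask   = (insn >> 20) & 0xF;   /* 0xF: branch always */
        unsigned index  = (insn >> 16) & 0xF;   /* index register (0 = none) */
        unsigned base   = (insn >> 12) & 0xF;   /* base register 15 */
        unsigned disp   =  insn        & 0xFFF; /* 12-bit offset: 0..4095 */

        printf("opcode=%02X mask=%X index=%u base=R%u offset=%u bytes\n",
               opcode, mask, index, base, disp);
        return 0;
    }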