- Linear time
In computational complexity theory, an algorithm is said to take linear time, or O("n") time, if the asymptotic upper bound for the time it requires is proportional to the size of the input, which is usually denoted "n". Informally, the running time increases linearly with the size of the input. For example, a procedure that adds up all elements of a list requires time proportional to the length of the list. This description is slightly inaccurate, since the running time can deviate significantly from precise proportionality, especially for small values of "n". For more information, see the article on big O notation.

Linear time is often viewed as a desirable attribute for an algorithm, and much research has been invested in creating algorithms that exhibit (nearly) linear time or better. This research includes both software and hardware methods. In the case of hardware, some algorithms which, mathematically speaking, can never achieve linear time under standard computation models are able to run in linear time in practice. Several hardware technologies exploit parallelism to provide this; an example is content-addressable memory. Linear time also arises in string-matching algorithms such as the Boyer–Moore algorithm and Ukkonen's algorithm.
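The list-summing example above can be sketched as a short procedure; the function name here is illustrative, not from the article. The loop performs one constant-time step per element, so the total running time grows in proportion to the length "n" of the list:

```python
def sum_elements(items):
    """Add up all elements of a list in O(n) time."""
    total = 0
    for x in items:  # one constant-time addition per element -> O(n) overall
        total += x
    return total

print(sum_elements([1, 2, 3, 4, 5]))  # 15
```

Doubling the length of the list roughly doubles the number of loop iterations, which is exactly the informal sense in which the running time "increases linearly" with the input size.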
See also
* Polynomial time
* Exponential time
* Constant time
Wikimedia Foundation. 2010.