Lochs' theorem
In number theory, Lochs' theorem is a theorem concerning the rate of convergence of the continued fraction expansion of a typical real number. The theorem was proved by G. Lochs in 1964. [Lochs, G. "Abh. Hamburg Univ. Math. Sem." 27, 142–144, 1964]

The theorem states that for almost all real numbers in the interval (0,1), the number of terms "m" of the number's continued fraction expansion that are required to determine the first "n" places of the number's decimal expansion behaves asymptotically as follows: [MathWorld|urlname=LochsTheorem|title=Lochs' Theorem] [OEIS|id=A086819]

: \lim_{n\to\infty} \frac{m}{n} = \frac{6 \ln 2 \, \ln 10}{\pi^2} \approx 0.97027011
As this limit is only slightly smaller than 1, this can be interpreted as saying that each additional term in the continued fraction representation of a "typical" real number increases the accuracy of the representation by approximately one decimal place.
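The limit can be illustrated numerically. The sketch below is not part of the original article: it uses a pseudorandom rational with many digits as a stand-in for a "typical" real number, and uses the negative base-10 logarithm of the m-th convergent's error as a proxy for the number of decimal places n that the first m partial quotients determine. The ratio m/n should drift toward 6 ln 2 ln 10 / π².

```python
# Numerical illustration of Lochs' theorem (a sketch, under the assumptions above).
import math
import random
from fractions import Fraction

random.seed(1)
DIGITS = 1200                                           # precision of the sample number
# A pseudorandom rational in (0,1) stands in for an "almost every" real number.
x = Fraction(random.randrange(1, 10**DIGITS), 10**DIGITS)

lochs_constant = 6 * math.log(2) * math.log(10) / math.pi**2   # ~ 0.9702701

def partial_quotients(frac, count):
    """First `count` continued fraction terms of `frac`, via the Euclidean algorithm."""
    a, b = frac.numerator, frac.denominator
    terms = []
    for _ in range(count):
        if b == 0:
            break
        q, r = divmod(a, b)
        terms.append(q)
        a, b = b, r
    return terms

terms = partial_quotients(x, 400)

# Convergents p_m/q_m from the standard recurrence p_m = a_m p_{m-1} + p_{m-2}, etc.
p_prev, p = 1, terms[0]        # p_{-1}, p_0  (a_0 = 0 since x is in (0,1))
q_prev, q = 0, 1               # q_{-1}, q_0
for m, a_m in enumerate(terms[1:], start=1):
    p, p_prev = a_m * p + p_prev, p
    q, q_prev = a_m * q + q_prev, q
    err = abs(x - Fraction(p, q))
    # -log10(err) approximates how many decimal places the m-th convergent pins down.
    n = math.floor(math.log10(err.denominator) - math.log10(err.numerator))
    if m % 100 == 0:
        print(f"m = {m:4d}   n ≈ {n:4d}   m/n = {m / n:.6f}   (limit ≈ {lochs_constant:.6f})")
```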
The reciprocal of this limit
: \frac{\pi^2}{6 \ln 2 \, \ln 10} \approx 1.0306408 [OEIS|id=A062542]
is twice the base-10 logarithm of Lévy's constant.
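This relationship can be checked numerically; the short sketch below (not part of the original article) compares the reciprocal of the limit with twice the base-10 logarithm of Lévy's constant, e^{π²/(12 ln 2)}.

```python
# Quick numerical check (a sketch) of the relation to Lévy's constant.
import math

lochs = 6 * math.log(2) * math.log(10) / math.pi**2          # ~ 0.9702701
reciprocal = 1 / lochs                                       # ~ 1.0306408
levy = math.exp(math.pi**2 / (12 * math.log(2)))             # Lévy's constant, ~ 3.2758229
print(reciprocal, 2 * math.log10(levy))                      # both ~ 1.0306408
```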