Inverse quadratic interpolation

In numerical analysis, inverse quadratic interpolation is a root-finding algorithm, meaning that it is an algorithm for solving equations of the form f(x) = 0. The idea is to use quadratic interpolation to approximate the inverse of f. This algorithm is rarely used on its own, but it is important because it forms part of the popular Brent's method.

The method

The inverse quadratic interpolation algorithm is defined by the recurrence relation

: x_{n+1} = \frac{f_{n-1}f_n}{(f_{n-2}-f_{n-1})(f_{n-2}-f_n)}\, x_{n-2} + \frac{f_{n-2}f_n}{(f_{n-1}-f_{n-2})(f_{n-1}-f_n)}\, x_{n-1} + \frac{f_{n-2}f_{n-1}}{(f_n-f_{n-2})(f_n-f_{n-1})}\, x_n,

where "f""k" = "f"("x""k"). As can be seen from the recurrence relation, this method requires three initial values, "x"0, "x"1 and "x"2.

Explanation of the method

We use the three preceding iterates, x_{n-2}, x_{n-1} and x_n, with their function values, f_{n-2}, f_{n-1} and f_n. Applying the Lagrange interpolation formula to do quadratic interpolation on the inverse of f yields

: f^{-1}(y) = \frac{(y-f_{n-1})(y-f_n)}{(f_{n-2}-f_{n-1})(f_{n-2}-f_n)}\, x_{n-2} + \frac{(y-f_{n-2})(y-f_n)}{(f_{n-1}-f_{n-2})(f_{n-1}-f_n)}\, x_{n-1} + \frac{(y-f_{n-2})(y-f_{n-1})}{(f_n-f_{n-2})(f_n-f_{n-1})}\, x_n.

We are looking for a root of f, so we substitute y = f(x) = 0 in the above equation, which yields the recurrence formula stated above.
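
As a sanity check on this substitution, the following sketch (using the third-party SymPy library; the symbol names are arbitrary) builds the quadratic Lagrange interpolant of the inverse through the three points (f_k, x_k) and confirms that evaluating it at y = 0 reproduces the recurrence stated above:

```python
import sympy as sp

# Symbolic iterates and their (pairwise distinct) function values.
x0, x1, x2, f0, f1, f2, y = sp.symbols('x0 x1 x2 f0 f1 f2 y')

# Quadratic Lagrange interpolant of the inverse: it passes through
# (f0, x0), (f1, x1) and (f2, x2), viewed as a function of y.
inverse = (x0 * (y - f1) * (y - f2) / ((f0 - f1) * (f0 - f2))
           + x1 * (y - f0) * (y - f2) / ((f1 - f0) * (f1 - f2))
           + x2 * (y - f0) * (y - f1) / ((f2 - f0) * (f2 - f1)))

# The recurrence for x_{n+1}, written with the same symbols.
recurrence = (f1 * f2 / ((f0 - f1) * (f0 - f2)) * x0
              + f0 * f2 / ((f1 - f0) * (f1 - f2)) * x1
              + f0 * f1 / ((f2 - f0) * (f2 - f1)) * x2)

# Substituting y = 0 into the interpolant should give the recurrence exactly.
print(sp.simplify(inverse.subs(y, 0) - recurrence))  # prints 0
```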

Behaviour

The asymptotic behaviour is very good: generally, the iterates x_n converge quickly to the root once they get close. However, performance is often quite poor if the iteration does not start very close to the actual root. For instance, if two of the function values f_{n-2}, f_{n-1} and f_n coincide, the denominators in the recurrence vanish and the algorithm fails completely. Thus, inverse quadratic interpolation is seldom used as a stand-alone algorithm.
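
A small, self-contained illustration of both points (the test functions and starting values below are arbitrary choices for demonstration):

```python
import math

def iqi_step(x0, x1, x2, f0, f1, f2):
    """One step of the recurrence; raises ZeroDivisionError when any two of
    the function values coincide, which is the failure mode described above."""
    return (f1 * f2 / ((f0 - f1) * (f0 - f2)) * x0
            + f0 * f2 / ((f1 - f0) * (f1 - f2)) * x1
            + f0 * f1 / ((f2 - f0) * (f2 - f1)) * x2)

def f(x):
    return math.cos(x) - x          # root near 0.739085

# Starting close to the root: the iterates home in within a few steps.
xs = [0.5, 0.8, 1.0]
for _ in range(4):
    fs = [f(x) for x in xs]
    xs = xs[1:] + [iqi_step(*xs, *fs)]
    print(xs[-1])

def g(x):
    return x * x - 4                # even function, so g(-1) == g(1)

# Two coinciding function values make a denominator vanish.
try:
    iqi_step(-1.0, 0.5, 1.0, g(-1.0), g(0.5), g(1.0))
except ZeroDivisionError:
    print("step undefined: two of the function values coincide")
```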

Comparison with other root-finding methods

As noted in the introduction, inverse quadratic interpolation is used in Brent's method.

Inverse quadratic interpolation is also closely related to some other root-finding methods. Using linear interpolation instead of quadratic interpolation gives the secant method. Interpolating f instead of the inverse of f gives Muller's method.
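
For comparison, interpolating the inverse linearly through only the two most recent points, (f_{n-1}, x_{n-1}) and (f_n, x_n), and again setting y = 0 gives

: x_{n+1} = \frac{f_n\, x_{n-1} - f_{n-1}\, x_n}{f_n - f_{n-1}} = x_n - f_n\, \frac{x_n - x_{n-1}}{f_n - f_{n-1}},

which is exactly the secant update.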

See also

* Successive parabolic interpolation is a related method that uses parabolas to find extrema rather than roots.

References

* Cleve Moler, [http://www.mathworks.com/moler/chapters.html Numerical Computing with MATLAB], Section 4.5, Society for Industrial and Applied Mathematics (SIAM), 2004. ISBN 0-89871-560-1.

