Nelder-Mead method

:"See simplex algorithm for the numerical solution of the linear programming problem."

The Nelder-Mead method (also called the downhill simplex method or amoeba method) is a commonly used nonlinear optimization algorithm. It is due to John Nelder and R. Mead (1965) and is a numerical method for minimizing an objective function in a many-dimensional space.

Overview

The method uses the concept of a simplex, which is a polytope of N + 1 vertices in N dimensions: a line segment on a line, a triangle on a plane, a tetrahedron in three-dimensional space, and so forth.
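
As an illustration, the following sketch (assuming NumPy is available; the offset h is an arbitrary choice, not part of the method's definition) constructs such a simplex of N + 1 vertices around a given starting point by perturbing one coordinate at a time.

    import numpy as np

    def initial_simplex(x0, h=0.1):
        """Build an (N+1) x N array of simplex vertices around x0.
        The first vertex is x0 itself; vertex i+1 offsets coordinate i
        by h (an arbitrary illustrative step, not part of the method)."""
        x0 = np.asarray(x0, dtype=float)
        n = x0.size
        vertices = np.tile(x0, (n + 1, 1))
        for i in range(n):
            vertices[i + 1, i] += h
        return vertices

    # In three dimensions the result is a tetrahedron with 4 vertices.
    print(initial_simplex([0.0, 0.0, 0.0]))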

The method approximately finds a locally optimal solution to a problem with N variables when the objective function varies smoothly. For example, a suspension bridge engineer has to choose how thick each strut, cable, and pier must be. These elements all interact, but it is not easy to visualize the impact of changing any specific one. The engineer can use the Nelder-Mead method to generate trial designs which are then tested on a large computer model. As each run of the simulation is expensive, it is important to make good decisions about where to look. Nelder-Mead generates a new test position by extrapolating the behavior of the objective function measured at the existing test points, which are arranged as a simplex. The algorithm then replaces one of these test points with the new one, and so the search progresses.
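
In practice the method is usually invoked through a library routine. The sketch below (assuming SciPy is installed; the quadratic objective is only a cheap stand-in for the engineer's expensive simulation) runs SciPy's Nelder-Mead implementation from an initial guess.

    import numpy as np
    from scipy.optimize import minimize

    # Cheap stand-in for an expensive simulation: a smooth function of three design variables.
    def objective(x):
        return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2 + (x[2] - 2.0) ** 2

    x0 = np.zeros(3)  # initial design guess
    result = minimize(objective, x0, method='Nelder-Mead',
                      options={'xatol': 1e-6, 'fatol': 1e-6})
    print(result.x, result.fun, result.nfev)  # minimizer, value, number of evaluations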

The simplest step is to replace the worst point with a point reflected through the centroid of the remaining N points. If this point is better than the best current point, then we can try stretching exponentially out along this line. On the other hand, if this new point isn't much better than the previous value, then we are stepping across a valley, so we shrink the simplex towards the best point.
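
A minimal sketch of this basic step (assuming NumPy; vertices is taken to be an (N + 1) x N array ordered so that its last row is the worst point):

    import numpy as np

    def reflect_worst(vertices, alpha=1.0):
        """Reflect the worst (last) vertex through the centroid of the remaining N vertices.
        Assumes the rows of `vertices` are already ordered from best to worst."""
        centroid = vertices[:-1].mean(axis=0)
        worst = vertices[-1]
        return centroid + alpha * (centroid - worst)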

Like all general-purpose multidimensional optimization algorithms, Nelder-Mead occasionally gets stuck in a rut. The standard approach to handle this is to restart the algorithm with a new simplex starting at the current best value. This can be extended, in a manner similar to simulated annealing, to escape small local minima.
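
One way to realize such restarts, as a sketch (assuming the SciPy routine shown above; the fixed number of restarts is an arbitrary choice):

    from scipy.optimize import minimize

    def nelder_mead_with_restarts(objective, x0, restarts=5):
        """Run Nelder-Mead repeatedly, restarting a fresh simplex at the best point found so far.
        The restart budget is an arbitrary illustrative choice."""
        best = minimize(objective, x0, method='Nelder-Mead')
        for _ in range(restarts):
            trial = minimize(objective, best.x, method='Nelder-Mead')
            if trial.fun >= best.fun:
                break  # the restart brought no improvement; stop
            best = trial
        return best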

Many variations exist depending on the actual nature of the problem being solved. The most common, perhaps, is to use a constant-size, small simplex that climbs local gradients to local maxima; visualize a small triangle on an elevation map flip-flopping its way up a hill to a local peak. This, however, tends to perform poorly against the method described in this article, because it makes small, unnecessary steps in areas of little interest.
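
A sketch of this fixed-size variant for maximization (assuming NumPy; the stopping rule, which simply gives up as soon as reflecting the worst vertex no longer improves it, is a simplification):

    import numpy as np

    def fixed_simplex_climb(f, vertices, max_iter=200):
        """Climb toward a local maximum of f by reflecting the worst vertex of a
        constant-size simplex through the centroid of the others; with no expansion,
        contraction or shrink, the simplex keeps its shape and size."""
        vertices = np.array(vertices, dtype=float)
        values = np.array([f(v) for v in vertices])
        for _ in range(max_iter):
            worst = np.argmin(values)  # vertex with the lowest value
            centroid = (vertices.sum(axis=0) - vertices[worst]) / (len(vertices) - 1)
            candidate = centroid + (centroid - vertices[worst])  # point reflection
            cand_val = f(candidate)
            if cand_val <= values[worst]:
                break  # no improvement: a local peak at this simplex size (simplified stopping rule)
            vertices[worst] = candidate
            values[worst] = cand_val
        return vertices[np.argmax(values)]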

This method is also known as the Flexible Polyhedron Method.

One possible variation of the NM algorithm

* 1. Order according to the values at the vertices: $f(\mathbf{x}_{1}) \leq f(\mathbf{x}_{2}) \leq \cdots \leq f(\mathbf{x}_{n+1})$.

* 2. Reflection: compute the reflected point $\mathbf{x}_{r} = \mathbf{x}_{o} + \alpha (\mathbf{x}_{o} - \mathbf{x}_{n+1})$, where $\mathbf{x}_{o}$ is the center of gravity (centroid) of all points except $\mathbf{x}_{n+1}$. If $f(\mathbf{x}_{1}) \leq f(\mathbf{x}_{r}) < f(\mathbf{x}_{n})$, then obtain a new simplex by replacing the worst point $\mathbf{x}_{n+1}$ with $\mathbf{x}_{r}$, and go to step 1.

* 3. Expansion: if $f(\mathbf{x}_{r}) < f(\mathbf{x}_{1})$, compute the expanded point $\mathbf{x}_{e} = \mathbf{x}_{o} + \gamma (\mathbf{x}_{o} - \mathbf{x}_{n+1})$. If $f(\mathbf{x}_{e}) < f(\mathbf{x}_{r})$, obtain a new simplex by replacing $\mathbf{x}_{n+1}$ with $\mathbf{x}_{e}$ and go to step 1; otherwise obtain a new simplex by replacing $\mathbf{x}_{n+1}$ with $\mathbf{x}_{r}$ and go to step 1.

* 4. Contraction: if $f(\mathbf{x}_{r}) \geq f(\mathbf{x}_{n})$, compute the contracted point $\mathbf{x}_{c} = \mathbf{x}_{n+1} + \rho (\mathbf{x}_{o} - \mathbf{x}_{n+1})$. If $f(\mathbf{x}_{c}) \leq f(\mathbf{x}_{n+1})$, obtain a new simplex by replacing $\mathbf{x}_{n+1}$ with $\mathbf{x}_{c}$ and go to step 1; otherwise go to step 5.

* 5. Shrink: replace the $n$ vertices other than the best by $\mathbf{x}_{i} = \mathbf{x}_{1} + \sigma (\mathbf{x}_{i} - \mathbf{x}_{1})$ for all $i \in \{2, \dots, n+1\}$, and go to step 1.

Note: $\alpha$, $\gamma$, $\rho$ and $\sigma$ are respectively the reflection, expansion, contraction and shrink coefficients. Standard values are $\alpha = 1$, $\gamma = 2$, $\rho = 1/2$ and $\sigma = 1/2$.
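
A self-contained sketch of the five steps above (assuming NumPy; the fixed iteration budget and the coordinate-offset starting simplex are simplifications, not part of the published method):

    import numpy as np

    def nelder_mead(f, x0, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5, h=0.1, max_iter=500):
        """Minimize f with the Nelder-Mead variation described above."""
        x0 = np.asarray(x0, dtype=float)
        n = x0.size
        # Starting simplex: x0 plus n vertices, each offsetting one coordinate by h (illustrative choice).
        vertices = np.tile(x0, (n + 1, 1))
        for i in range(n):
            vertices[i + 1, i] += h
        values = np.array([f(v) for v in vertices])

        # Fixed iteration budget: a simplification of a real convergence test.
        for _ in range(max_iter):
            # 1. Order the vertices so that f(x_1) <= f(x_2) <= ... <= f(x_{n+1}).
            order = np.argsort(values)
            vertices, values = vertices[order], values[order]

            # 2. Reflection through the centroid x_o of all points except x_{n+1}.
            x_o = vertices[:-1].mean(axis=0)
            x_r = x_o + alpha * (x_o - vertices[-1])
            f_r = f(x_r)
            if values[0] <= f_r < values[-2]:
                vertices[-1], values[-1] = x_r, f_r
                continue

            # 3. Expansion: the reflected point is the best seen so far.
            if f_r < values[0]:
                x_e = x_o + gamma * (x_o - vertices[-1])
                f_e = f(x_e)
                if f_e < f_r:
                    vertices[-1], values[-1] = x_e, f_e
                else:
                    vertices[-1], values[-1] = x_r, f_r
                continue

            # 4. Contraction: f(x_r) >= f(x_n), so pull x_{n+1} toward the centroid.
            x_c = vertices[-1] + rho * (x_o - vertices[-1])
            f_c = f(x_c)
            if f_c <= values[-1]:
                vertices[-1], values[-1] = x_c, f_c
                continue

            # 5. Shrink every vertex except the best toward x_1.
            vertices[1:] = vertices[0] + sigma * (vertices[1:] - vertices[0])
            values[1:] = np.array([f(v) for v in vertices[1:]])

        best = np.argmin(values)
        return vertices[best], values[best]

    # Example: minimize a smooth two-dimensional bowl.
    x_min, f_min = nelder_mead(lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2, [0.0, 0.0])
    print(x_min, f_min)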

For the reflection: since $\mathbf{x}_{n+1}$ is the vertex with the highest associated value among the vertices, we can expect to find a lower value at the reflection of $\mathbf{x}_{n+1}$ through the opposite face formed by all the vertices $\mathbf{x}_{i}$ except $\mathbf{x}_{n+1}$.

For the expansion: if the reflected point $\mathbf{x}_{r}$ is the new minimum among the vertices, we can expect to find interesting values further along the direction from $\mathbf{x}_{o}$ to $\mathbf{x}_{r}$.

For the contraction: if $f(\mathbf{x}_{r}) > f(\mathbf{x}_{n})$, we can expect that a better value will be found inside the simplex formed by all the vertices $\mathbf{x}_{i}$.

The initial simplex is important: a simplex that is too small can lead to a purely local search, and the Nelder-Mead method can then get stuck more easily. The initial simplex should therefore be chosen according to the nature of the problem.
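
With SciPy's implementation, for instance, a problem-scaled starting simplex can be supplied explicitly through the initial_simplex option (a sketch; the particular scales and objective below are arbitrary illustrations):

    import numpy as np
    from scipy.optimize import minimize

    def scaled_simplex(x0, scale):
        """Starting simplex whose edge lengths reflect the scale of each variable."""
        x0 = np.asarray(x0, dtype=float)
        n = x0.size
        sim = np.tile(x0, (n + 1, 1))
        for i in range(n):
            sim[i + 1, i] += scale[i]  # one extra vertex per coordinate direction
        return sim

    x0 = np.array([10.0, 0.001])                       # variables of very different magnitudes
    sim = scaled_simplex(x0, scale=[1.0, 0.0001])      # arbitrary illustrative scales
    result = minimize(lambda x: (x[0] - 12.0) ** 2 + ((x[1] - 0.002) / 0.001) ** 2, x0,
                      method='Nelder-Mead', options={'initial_simplex': sim})
    print(result.x)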

See also

* Conjugate gradient method
* Broyden-Fletcher-Goldfarb-Shanno or BFGS method
* Differential evolution

Further reading

* J. A. Nelder and R. Mead, "A simplex method for function minimization", Computer Journal, 1965, vol. 7, pp. 308-313. [http://citeseer.ist.psu.edu/context/23725/0] (text not online)
* Avriel, Mordecai (2003). Nonlinear Programming: Analysis and Methods. Dover Publishing. ISBN 0-486-43227-0.
* K. I. M. McKinnon, "Convergence of the Nelder-Mead simplex method to a non-stationary point", SIAM Journal on Optimization, 1999, vol. 9, pp. 148-158. [http://citeseer.ist.psu.edu/15874.html] (algorithm summary online)

External links

* [http://www3.imperial.ac.uk/people/j.nelder John Nelder FRS]
* [http://www.boomer.org/c/p3/c11/c1106.html Nelder-Mead (Simplex) Method]
* [http://math.fullerton.edu/mathews/n2003/NelderMeadMod.html Nelder-Mead Search for a Minimum]

