BRST algorithm

The Boender-Rinnooy-Stougie-Timmer (BRST) algorithm is an optimization algorithm suitable for finding the global optimum of black-box functions. In their paper, Boender et al. (1982) describe their method as a stochastic method involving a combination of sampling, clustering and local search, terminating with a range of confidence intervals on the value of the global minimum.

The algorithm of Boender et al. has been modified by Timmer (1984). Timmer considered several clustering methods; based on experiments, a method named "multi level single linkage" was deemed most accurate.
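
As an illustration of the multi level single linkage idea, the Python sketch below starts a local search from a sample point only if no other sample point within a critical distance has a lower function value. The critical-distance formula and the sigma value used here are the commonly quoted ones and should be read as assumptions rather than as Timmer's exact implementation.

```python
import numpy as np
from scipy.special import gamma

def mlsl_start_indices(X, fX, k, N, measure, sigma=4.0):
    """Multi level single linkage start-point rule (sketch): start a local
    search from sample point x only if no other sample point within the
    critical distance r_k has a lower function value.  The r_k formula and
    the sigma value are assumptions based on the usual presentation."""
    n = X.shape[1]                     # problem dimension
    kN = k * N                         # total number of points sampled so far
    r_k = np.pi ** -0.5 * (gamma(1 + n / 2) * measure
                           * sigma * np.log(kN) / kN) ** (1 / n)
    starts = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)      # distances to all sample points
        better_nearby = (d <= r_k) & (fX < fX[i]) # lower-valued points within r_k
        if not better_nearby.any():
            starts.append(i)
    return starts, r_k
```

As the number of sampled points kN grows, r_k shrinks, so in later iterations local searches are started only from points that are locally the best in an increasingly small neighbourhood.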

Csendes' algorithms (Csendes 1988) are implementations of the algorithm of Boender et al. and originated the public-domain software product GLOBAL. The local algorithms used are a random-direction linear search algorithm, also used by Törn, and a quasi-Newton algorithm that does not use the derivative of the function. The results show how strongly the outcome depends on the auxiliary local search algorithm used.
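
For illustration only (this is not the GLOBAL code), a quasi-Newton local search that does not require an analytic derivative can be run with numerical gradients. In the sketch below, scipy's BFGS approximates the gradient by finite differences because none is supplied, and the Rosenbrock-style objective is an assumed test function.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed test objective (Rosenbrock); BFGS falls back to finite-difference
# gradient estimates because no analytic derivative (jac) is supplied.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
result = minimize(f, x0=np.array([-1.2, 1.0]), method="BFGS")
print(result.x, result.fun)
```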

Background

Extending the class of functions to include multimodal functions makes the global optimization problem unsolvable in general. For the problem to be solvable, some smoothness condition on the function, in addition to continuity, must be known.

The existence of several local minima and unsolvability in general are important characteristics of global optimization. Unsolvability here means that a solution cannot be guaranteed in a finite number of steps.

There are two ways to deal with this unsolvability. First, "a priori" conditions on f and A are posed which turn the problem into a solvable one, or at least make it possible to tell for sure that a solution has been found; this restricts the function class that is considered.

The second approach, which allows a larger class of objective functions to be considered, is to give up the solvability requirement and only try to obtain an estimate of the global minimum. In this "probabilistic" approach it would also be desirable to obtain some results on the goodness of an obtained estimate. Some of the solvable problems may fall into this class, because the number of steps required for a guaranteed solution might be prohibitively large.

When relaxing the requirement on solvability, it seems rational to require that the probability of obtaining a solution approaches 1 if the procedure is allowed to continue forever. An obvious probabilistic global search procedure is to use a local algorithm starting from several points distributed over the whole optimization region. This procedure is named "Multistart" and is certainly one of the earliest global procedures used. It has even been used in local optimization to increase the confidence in the obtained solution. One drawback of Multistart is that when many starting points are used, the same minimum will eventually be determined several times; to improve the efficiency of Multistart, this should be avoided.
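
A minimal Multistart sketch in Python is given below; the uniform sampling, the Nelder-Mead local search and the test objective are illustrative assumptions, not a prescribed part of the procedure.

```python
import numpy as np
from scipy.optimize import minimize

def multistart(f, bounds, n_starts=50, seed=0):
    """Run a local search from many uniformly sampled start points
    and keep the best local minimum found."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T      # bounds: list of (low, high) pairs
    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        x0 = rng.uniform(lo, hi)                  # random start point in the region
        res = minimize(f, x0, method="Nelder-Mead")  # derivative-free local search
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
    return best_x, best_f

# Example: a simple multimodal test function (Rastrigin, assumed)
f = lambda x: 10 * len(x) + sum(x**2 - 10 * np.cos(2 * np.pi * x))
x_best, f_best = multistart(f, bounds=[(-5.12, 5.12)] * 2)
```

With many start points, several of the local searches above will converge to the same minimum, which is exactly the redundancy that the clustering steps described next are meant to remove.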

Clustering methods are used to avoid this repeated determination of local minima. This is realized in three steps, which may be applied iteratively:

* (a) Sample points in the region of interest.
* (b) Transform the sample to obtain points grouped around the local minima.
* (c) Use a clustering technique to recognize these groups (i.e. neighbourhoods of the local minima).

If the procedure employing these steps is successful, then starting a single local optimization from each cluster would determine the local minima and thus also the global minimum. The advantage of this approach is that the work saved by computing each minimum just once can be spent on computations in (a) and (b), which increases the probability that the global minimum will be found.
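
A rough sketch of steps (a)-(c) followed by one local search per cluster is shown below; the reduced-sample transformation in step (b), the fixed single-linkage distance r, and the Nelder-Mead local search are simplifying assumptions and do not reproduce the multi level single linkage rule exactly.

```python
import numpy as np
from scipy.optimize import minimize

def cluster_and_search(f, bounds, n_sample=200, keep_frac=0.1, r=0.5, seed=0):
    """Sketch of steps (a)-(c): sample uniformly, keep the best fraction of
    points (a simple 'reduced sample' transformation), group the kept points
    by single linkage (a point joins a cluster if it lies within distance r
    of any of its members), then start one local search per cluster."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_sample, len(bounds)))      # step (a)
    fX = np.array([f(x) for x in X])
    keep = X[np.argsort(fX)[: int(keep_frac * n_sample)]]      # step (b)

    clusters = []                                              # step (c)
    for x in keep:
        for c in clusters:
            if min(np.linalg.norm(x - y) for y in c) <= r:
                c.append(x)
                break
        else:
            clusters.append([x])

    # One local search per cluster instead of one per sample point.
    results = [minimize(f, c[0], method="Nelder-Mead") for c in clusters]
    best = min(results, key=lambda res: res.fun)
    return best.x, best.fun, len(clusters)
```

Because the kept points with the lowest function values concentrate around local minima, each cluster typically yields one distinct minimizer, so the number of local searches is roughly the number of detected minima rather than the number of sample points.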

Being clustering methods, these algorithms are most effective for low-dimensional problems; they become less effective for problems with a few hundred variables.

References

* Boender, C.G.E.; Rinnooy Kan, A.H.G.; Stougie, L.; Timmer, G.T. (1982). "A stochastic method for global optimization". Mathematical Programming 22: 125–140. doi:10.1007/BF01581033.
* Csendes, T. (1988). "Nonlinear parameter estimation by global optimization—Efficiency and reliability". Acta Cybernetica 8: 361–370.
* Timmer, G.T. (1984). Global optimization: A stochastic approach (Ph.D. thesis). Erasmus University Rotterdam.

External links

* [http://www.abo.fi/~atorn/Globopt.html http://www.abo.fi/~atorn/Globopt.html] With the author's permission, text has been copied verbatim from this page.
* [http://www.mat.univie.ac.at/~vpk/math/gopt_eng.html Janka] Compares various global optimization algorithms, among which BRST shows superior performance.
* [http://www.mat.univie.ac.at/~vpk/math/dix_sze_eng.html Janka] Presents the number of function evaluations performed on the Dixon-Szegö test set. Along with the MCS algorithm, BRST requires the lowest number of evaluations.

