Optimal substructure
Figure 1. Finding the shortest path using optimal substructure. Numbers represent the length of each path; straight lines indicate single edges, while wavy lines indicate shortest paths between the vertices they connect (other vertices on those paths are not shown).

In computer science, a problem is said to have optimal substructure if an optimal solution can be constructed efficiently from optimal solutions to its subproblems.[1] This property is used to determine the usefulness of dynamic programming and greedy algorithms in a problem.

Typically, a greedy algorithm is used to solve a problem with optimal substructure if it can be proved by induction that the choice made at each step is optimal (Cormen et al. pp. 381–2). Otherwise, provided the problem also exhibits overlapping subproblems, dynamic programming is used. If there is no appropriate greedy algorithm and the problem fails to exhibit overlapping subproblems, a lengthy but straightforward search of the solution space is often the best alternative.
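
As a brief, hypothetical illustration (not taken from the article or from Cormen et al.), the Python sketch below shows a greedy algorithm exploiting optimal substructure in the classic activity-selection problem; the interval data are invented for the example.

    # Hypothetical example (intervals invented): greedy activity selection.
    # Optimal substructure: after taking the earliest-finishing activity, what
    # remains of an optimal schedule is an optimal schedule for the activities
    # that start after it finishes, so the greedy choice can be repeated.
    def select_activities(activities):
        """Return a maximum-size set of mutually compatible (start, finish) intervals."""
        chosen = []
        last_finish = float("-inf")
        for start, finish in sorted(activities, key=lambda a: a[1]):
            if start >= last_finish:          # compatible with everything chosen so far
                chosen.append((start, finish))
                last_finish = finish
        return chosen

    print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (8, 11)]))
    # [(1, 4), (5, 7), (8, 11)]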

In the application of dynamic programming to mathematical optimization, Richard Bellman's Principle of Optimality is based on the idea that in order to solve a dynamic optimization problem from some starting period t to some ending period T, one implicitly has to solve subproblems starting from later dates s, where t<s<T. This is an example of optimal substructure. The Principle of Optimality is used to derive the Bellman equation, which shows how the value of the problem starting from t is related to the value of the problem starting from s.
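
As a sketch of this idea (not from the article; the states, actions, reward function r and transition function f below are made-up placeholders), backward induction computes the value of the problem at period t from the already-optimal values of the subproblem at period t + 1, following the Bellman equation V_t(x) = max_a [ r(x, a) + V_{t+1}(f(x, a)) ].

    # Hypothetical sketch: backward induction for a finite-horizon problem.
    # The Bellman equation  V_t(x) = max_a [ r(x, a) + V_{t+1}(f(x, a)) ]  defines
    # the value at period t through the already-solved subproblem at period t + 1.
    def solve(states, actions, r, f, T):
        """Return the period-0 value function and a per-period greedy policy."""
        V = {x: 0.0 for x in states}          # V_T(x) = 0: nothing left to decide
        policy = []
        for t in reversed(range(T)):
            V_next = V
            V, best = {}, {}
            for x in states:
                # Optimal substructure: reuse the optimal values of the later subproblem.
                V[x], best[x] = max((r(x, a) + V_next[f(x, a)], a) for a in actions)
            policy.insert(0, best)
        return V, policy

    # Toy data (invented): reward equals the current state; move right or stay on {0,...,3}.
    states, actions = [0, 1, 2, 3], ["right", "stay"]
    r = lambda x, a: x
    f = lambda x, a: min(x + 1, 3) if a == "right" else x
    V0, policy = solve(states, actions, r, f, T=3)
    print(V0[0])   # 3.0: rewards 0 + 1 + 2 collected in periods 0, 1 and 2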

Example

Consider finding a shortest path for travelling between two cities by car, as illustrated in Figure 1. Such an example is likely to exhibit optimal substructure: if the shortest route from Seattle to Los Angeles passes through Portland and then Sacramento, then the shortest route from Portland to Los Angeles must pass through Sacramento too. In other words, the problem of how to get from Portland to Los Angeles is nested inside the problem of how to get from Seattle to Los Angeles. (The wavy lines in the graph represent solutions to the subproblems.)
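
The following Python sketch makes this nesting explicit (the road network and mileages are invented for illustration, not data from the article): a memoized recursion computes the shortest route out of each city, and the answer for Seattle reuses the already-optimal answer for the Portland subproblem.

    # Hypothetical sketch (cities and mileages invented): the shortest route out of
    # each city is a subproblem whose optimal answer is reused by larger problems.
    from functools import lru_cache

    roads = {                                  # directed edges: city -> {neighbour: miles}
        "Seattle":     {"Portland": 175, "Boise": 500},
        "Portland":    {"Sacramento": 580, "Boise": 430},
        "Boise":       {"Sacramento": 550},
        "Sacramento":  {"Los Angeles": 385},
        "Los Angeles": {},
    }

    @lru_cache(maxsize=None)
    def shortest(city, goal="Los Angeles"):
        """Length of the shortest path from city to goal (inf if unreachable)."""
        if city == goal:
            return 0
        # Optimal substructure: the best route from `city` extends the best route
        # from whichever neighbouring city it passes through first.
        return min(
            (miles + shortest(nxt, goal) for nxt, miles in roads[city].items()),
            default=float("inf"),
        )

    print(shortest("Seattle"))    # 1140 = 175 (to Portland) + 965
    print(shortest("Portland"))   # 965, the already-solved Portland subproblem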

As an example of a problem that is unlikely to exhibit optimal substructure, consider the problem of finding the cheapest airline ticket from Buenos Aires to Moscow. Even if that ticket involves stops in Miami and then London, we can't conclude that the cheapest ticket from Miami to Moscow stops in London, because the price at which an airline sells a multi-flight trip is usually not the sum of the prices at which it would sell the individual flights in the trip.

Definition

A slightly more formal definition of optimal substructure can be given. Let a "problem" be a collection of "alternatives", and let each alternative have an associated cost, c(a). The task is to find an alternative that minimizes c(a). Suppose that the alternatives can be partitioned into subsets, i.e., each alternative belongs to only one subset, and that each subset has its own cost function. The minima of each of these cost functions can be found, as can the minima of the global cost function restricted to the same subsets. If these minima match for each subset, then it is almost obvious that a global minimum can be picked not out of the full set of alternatives, but out of only the set that consists of the minima of the smaller, local cost functions we have defined. If minimizing the local functions is a problem of "lower order", and (in particular) if, after a finite number of these reductions, the problem becomes trivial, then the problem has optimal substructure.
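
A minimal Python sketch of this definition follows, assuming for simplicity that each subset's local cost function is just the global cost restricted to that subset; the alternatives, cost function, and partition in the toy call are invented.

    # Hypothetical sketch of the definition, assuming for simplicity that each
    # subset's local cost function is the global cost restricted to the subset.
    def global_minimum(cost, subsets):
        """cost: global cost function; subsets: a partition of the alternatives."""
        # Lower-order subproblems: minimise the cost over each subset separately.
        local_minima = [min(subset, key=cost) for subset in subsets]
        # Optimal substructure: the global minimum can be picked out of the small
        # set of local minima instead of the full set of alternatives.
        return min(local_minima, key=cost)

    # Toy data (invented): alternatives are integers, cost is distance to 7,
    # and the alternatives are partitioned into evens and odds.
    print(global_minimum(lambda a: abs(a - 7), [[2, 4, 6, 10], [1, 5, 9, 13]]))   # 6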

Problems with optimal substructure

Problems that exhibit optimal substructure include the shortest path problem discussed above, the longest common subsequence problem, the longest increasing subsequence problem, and the construction of minimum spanning trees. More generally, any problem that can be solved by dynamic programming has this property.

References

  1. Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L.; Stein, Clifford (2001). Introduction to Algorithms (2nd ed.). p. 327. ISBN 0-262-03293-7.
