Convex optimization

Convex minimization, a subfield of optimization, studies the problem of minimizing convex functions over convex sets. Given a real vector space $X$ together with a convex, real-valued function $f:\mathcal{X}\to \mathbb{R}$ defined on a convex subset $\mathcal{X}$ of $X$, the problem is to find a point $x^\ast$ in $\mathcal{X}$ at which $f$ attains its smallest value, i.e., a point $x^\ast$ such that $f(x^\ast) \le f(x)$ for all $x \in \mathcal{X}$. Advanced treatments also consider convex functions that can attain the value positive infinity; for example, the indicator function of convex analysis is zero for every $x\in\mathcal{X}$ and positive infinity otherwise.

The convexity of f makes the powerful tools of convex analysis applicable: the Hahn–Banach theorem and the theory of subgradients lead to a particularly satisfying theory of necessary and sufficient conditions for optimality, a duality theory generalizing that for linear programming, and effective computational methods.

Convex minimization has applications in a wide range of disciplines, such as automatic control systems, estimation and signal processing, communications and networks, electronic circuit design, data analysis and modeling, statistics (optimal design), and finance. With recent improvements in computing and in optimization theory, convex minimization is nearly as straightforward as linear programming. Many optimization problems can be reformulated as convex minimization problems. For example, the problem of maximizing a concave function f can be re-formulated equivalently as a problem of minimizing the function -f, which is convex.
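
The reformulation mentioned above can be sketched in a few lines of code. This is a minimal illustration, not a production method: the concave function $f(x) = -(x-2)^2 + 3$, the starting point, and the step size are all choices made here for the example.

```python
# Maximizing the concave function f(x) = -(x - 2)**2 + 3 by minimizing
# its negation -f, which is convex. Plain gradient descent on -f.

def neg_f(x):
    # -f(x) = (x - 2)**2 - 3, a convex function
    return (x - 2) ** 2 - 3

def grad_neg_f(x):
    return 2 * (x - 2)

x = 10.0          # arbitrary starting point
step = 0.1        # fixed step size, small enough for this quadratic
for _ in range(200):
    x -= step * grad_neg_f(x)

# x converges to 2, the maximizer of f (equivalently, the minimizer of -f)
```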

Theory

The following statements are true about the convex minimization problem:

• If a local minimum exists, then it is a global minimum.
• The set of all (global) minima is convex.
• If the function is strictly convex and attains a minimum, then the minimizer is unique.

These results are used by the theory of convex minimization along with geometric notions from functional analysis such as the Hilbert projection theorem, the separating hyperplane theorem, and Farkas' lemma.
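
The uniqueness property for strictly convex functions can be illustrated numerically. In this sketch the strictly convex function $f(x) = x^2 + e^x$, the starting points, and the step schedule are all choices made for the example; gradient descent from every start reaches the same minimizer.

```python
# For the strictly convex function f(x) = x**2 + exp(x), gradient descent
# from several different starting points converges to the same (unique)
# minimizer, illustrating the uniqueness property stated above.

import math

def grad(x):
    # derivative of f(x) = x**2 + exp(x)
    return 2 * x + math.exp(x)

def descend(x, step=0.05, iters=500):
    for _ in range(iters):
        x -= step * grad(x)
    return x

minima = [descend(x0) for x0 in (-5.0, 0.0, 3.0)]
# all three runs agree to high precision on the unique minimizer
```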

Standard form

Standard form is the usual and most intuitive form of describing a convex minimization problem. It consists of the following three parts:

• A convex function $f: \mathbb{R}^n \to \mathbb{R}$ to be minimized over the variable $x$
• Inequality constraints of the form $g_i(x) \leq 0$, where the functions $g_i$ are convex
• Equality constraints of the form $h_i(x) = 0$, where the functions $h_i$ are affine. In practice, the terms "linear" and "affine" are often used interchangeably. Such constraints can be expressed in the form $h_i(x) = a_i^T x + b_i$, where $a_i$ is a column vector and $b_i$ a scalar.

A convex minimization problem is thus written as

\begin{align}
&\underset{x}{\operatorname{minimize}}& & f(x) \\
&\operatorname{subject\;to} & &g_i(x) \leq 0, \quad i = 1,\dots,m \\
&&&h_i(x) = 0, \quad i = 1, \dots,p.
\end{align}

Note that every equality constraint h(x) = 0 can be equivalently replaced by a pair of inequality constraints $h(x)\leq 0$ and $-h(x)\leq 0$. Therefore, for theoretical purposes, equality constraints are redundant; however, it can be beneficial to treat them specially in practice.

Following from this fact, it is easy to see why $h_i(x) = 0$ has to be affine rather than merely convex. If $h_i$ is convex, the set defined by $h_i(x) \leq 0$ is convex, but $-h_i$ is concave, so the set defined by $-h_i(x) \leq 0$ need not be convex. Therefore, the only way for the set defined by $h_i(x) = 0$ to be guaranteed convex is for $h_i$ to be affine.
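
A small standard-form instance can be solved by hand with projected gradient descent. The method, the objective, and the single halfspace constraint below are choices made here purely for illustration (the Methods section lists the approaches actually used in practice):

```python
# Standard-form instance:
#   minimize   f(x) = (x1 - 1)**2 + (x2 - 2)**2
#   subject to g(x) = x1 + x2 - 2 <= 0
# solved by projected gradient descent.

def grad_f(x):
    return [2 * (x[0] - 1), 2 * (x[1] - 2)]

def project(x):
    # Euclidean projection onto the feasible halfspace {x : x1 + x2 <= 2}
    s = x[0] + x[1]
    if s <= 2:
        return x
    shift = (s - 2) / 2
    return [x[0] - shift, x[1] - shift]

x = [0.0, 0.0]
for _ in range(1000):
    g = grad_f(x)
    x = project([x[0] - 0.1 * g[0], x[1] - 0.1 * g[1]])

# x approaches (0.5, 1.5): the unconstrained minimizer (1, 2) is
# infeasible, so the solution lies on the constraint boundary
```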

Examples

Many standard problems are convex minimization problems, or can be transformed into convex minimization problems via a change of variables; examples include least squares, linear programming, convex quadratic minimization with linear constraints, second-order cone programming, semidefinite programming, and geometric programming (after a logarithmic change of variables).
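
Least squares is perhaps the most familiar of these: minimizing the convex quadratic $\|Ax - b\|^2$ reduces to solving the normal equations $A^T A x = A^T b$. A tiny hand-rolled sketch (the three data points below are an illustrative choice and lie exactly on a line):

```python
# Least squares as convex minimization: fit a line y = c + m*t to three
# points by solving the 2x2 normal equations directly.

points = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # lie exactly on y = 1 + 2t

# Accumulate the entries of A^T A and A^T b for rows [1, t] of A
n = len(points)
s_t = sum(t for t, _ in points)
s_tt = sum(t * t for t, _ in points)
s_y = sum(y for _, y in points)
s_ty = sum(t * y for t, y in points)

# Solve [[n, s_t], [s_t, s_tt]] @ [c, m] = [s_y, s_ty] by Cramer's rule
det = n * s_tt - s_t * s_t
c = (s_y * s_tt - s_t * s_ty) / det
m = (n * s_ty - s_t * s_y) / det
# recovers intercept c = 1 and slope m = 2
```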

Lagrange multipliers

Consider a convex minimization problem given in standard form by a cost function $f(x)$ and inequality constraints $g_i(x)\leq 0$ for $i=1,\ldots,m$. Then the feasible set $\mathcal{X}$ is: $\mathcal{X} = \left\lbrace x\in X \mid g_1(x)\le 0, \ldots, g_m(x)\le 0 \right\rbrace.$

The Lagrangian function for the problem is

$L(x,\lambda_0,\lambda_1,\ldots,\lambda_m) = \lambda_0 f(x) + \lambda_1 g_1(x) + \cdots + \lambda_m g_m(x).$

For each point $x$ in $\mathcal{X}$ that minimizes $f$ over $\mathcal{X}$, there exist real numbers $\lambda_0, \ldots, \lambda_m$, called Lagrange multipliers, that satisfy these conditions simultaneously:

1. $x$ minimizes $L(y, \lambda_0, \lambda_1, \ldots, \lambda_m)$ over all $y$ in $X$,
2. $\lambda_0 \geq 0, \lambda_1 \geq 0, \ldots, \lambda_m \geq 0$, with at least one $\lambda_k > 0$,
3. $\lambda_1 g_1(x) = 0, \ldots, \lambda_m g_m(x) = 0$ (complementary slackness).

If there exists a "strictly feasible point", i.e., a point $z$ satisfying

$g_1(z) < 0, \ldots, g_m(z) < 0$ (Slater's condition),

then the statement above can be strengthened to assert that $\lambda_0 = 1$.

Conversely, if some $x$ in $X$ satisfies conditions 1–3 for scalars $\lambda_0, \ldots, \lambda_m$ with $\lambda_0 = 1$, then $x$ is certain to minimize $f$ over $\mathcal{X}$.
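
Conditions 1–3 (with $\lambda_0 = 1$) can be checked concretely on a one-dimensional instance. The problem, minimizer, and multiplier below are worked out by hand for illustration:

```python
# Check the multiplier conditions for:
#   minimize f(x) = x**2   subject to   g(x) = 1 - x <= 0.
# The minimizer is x = 1 with Lagrange multiplier lambda1 = 2.

f = lambda x: x ** 2
g = lambda x: 1 - x

x_star, lam1 = 1.0, 2.0

# Condition 1: x_star minimizes L(y) = f(y) + lam1 * g(y) over all real y.
# L(y) = y**2 + 2 - 2y = (y - 1)**2 + 1, so its stationary point is y = 1.
L = lambda y: f(y) + lam1 * g(y)
stationarity = 2 * x_star - lam1      # dL/dy at x_star; equals 0

# Condition 2: lambda0 = 1 > 0 and lam1 = 2 >= 0.
# Condition 3: complementary slackness -- the constraint is active at x = 1.
slackness = lam1 * g(x_star)          # equals 0

# Strict feasibility holds (z = 2 gives g(z) = -1 < 0), which is what
# licenses taking lambda0 = 1 in the first place.
```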

Methods

Convex minimization problems can be solved by the following contemporary methods:

• "Bundle methods" (Wolfe, Lemaréchal), and
• Interior-point methods (Nemirovskii and Nesterov).
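
The barrier idea behind interior-point methods can be shown on a toy problem. This sketch is illustrative only, not the actual algorithms cited above; the example problem, the barrier schedule, and the bisection inner solver are all choices made here:

```python
# Toy log-barrier iteration for:  minimize x**2  subject to  x >= 1.
# Each outer step minimizes x**2 - (1/t) * log(x - 1); as t grows the
# barrier weakens and the iterate approaches the constrained minimizer x = 1.

def barrier_grad(x, t):
    # derivative of the barrier objective x**2 - (1/t) * log(x - 1)
    return 2 * x - 1 / (t * (x - 1))

def inner_minimize(t, lo=1.0 + 1e-12, hi=10.0):
    # barrier_grad is strictly increasing on (1, inf), so its unique
    # root can be found by bisection
    for _ in range(200):
        mid = (lo + hi) / 2
        if barrier_grad(mid, t) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

x = None
for t in (1.0, 10.0, 100.0, 1000.0):
    x = inner_minimize(t)
# x stays strictly feasible (x > 1) and ends within about 1/(2*t) of x = 1
```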

Other methods of interest include subgradient methods, which can be implemented simply and so are widely used.
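
A minimal subgradient-method sketch on a nondifferentiable convex function (the function $f(x) = |x-3|$, the starting point, and the diminishing $1/k$ step schedule are illustrative choices):

```python
# Subgradient method on the nondifferentiable convex function
# f(x) = |x - 3|: at each step take any subgradient and use a
# diminishing 1/k step size, which drives the iterate to the minimizer
# even though f has no gradient there.

def subgradient(x):
    # any element of the subdifferential of |x - 3|
    if x > 3:
        return 1.0
    if x < 3:
        return -1.0
    return 0.0   # 0 lies in the subdifferential [-1, 1] at x = 3

x = 10.0
for k in range(1, 5000):
    x -= (1.0 / k) * subgradient(x)
# x oscillates around and converges to 3, the minimizer of |x - 3|
```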

Maximizing convex functions

Besides convex minimization, the field of convex optimization also considers the far more difficult problem of maximizing convex functions:

Solving even close-to-convex problems can be computationally difficult. For example, minimizing a (nonconvex) multivariate quadratic polynomial over a cube is NP-hard; indeed, quadratic minimization is already NP-hard when the matrix of the quadratic form has just one negative eigenvalue.

Generalized convexity

Extensions of convex functions include pseudo-convex and quasi-convex functions. Partial extensions of the theory of convex analysis and iterative methods for approximately solving non-convex minimization problems occur in the field of generalized convexity ("abstract convex analysis").

Software

Although most general-purpose nonlinear optimization solvers, such as LSSOL, LOQO, MINOS, and Lancelot, handle convex problems well, many software packages dealing exclusively with convex minimization problems are also available:

Convex minimization solvers

• MOSEK (commercial, stand-alone software and Matlab interface)
• solver.com (commercial)
• SeDuMi (GPLv2, Matlab package)
• SDPT3 (GPLv2, Matlab package)
• OBOE

Wikimedia Foundation. 2010.
