Entropy maximization

An entropy maximization problem is a convex optimization problem of the form

maximize f_0(x) = -\sum_{i=1}^n x_i \log x_i
subject to Ax \leq b, \quad \mathbf{1}^T x = 1

where x \in \mathbb{R}^n_{++} is the optimization variable, A \in \mathbb{R}^{m \times n} and b \in \mathbb{R}^m are problem parameters, and \mathbf{1} denotes the vector whose components are all 1.
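Because the objective is concave and the constraints are affine, the problem can be handed to any general-purpose convex or nonlinear solver by minimizing the negative entropy. Below is a minimal numerical sketch using SciPy's SLSQP solver; the function name `max_entropy` and the small example data (A, b) are illustrative choices, not part of the source.

```python
# Sketch: solving a small entropy maximization problem numerically.
# Assumes SciPy is available; SLSQP is one of several suitable methods.
import numpy as np
from scipy.optimize import minimize

def max_entropy(A, b, n):
    """Maximize -sum_i x_i log x_i  s.t.  Ax <= b, 1^T x = 1, x > 0."""
    # Minimize the negative entropy: the entropy is concave, so its
    # negative is convex, and the feasible set is a polyhedron.
    def neg_entropy(x):
        return np.sum(x * np.log(x))

    cons = [{"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},  # 1^T x = 1
            {"type": "ineq", "fun": lambda x: b - A @ x}]        # Ax <= b
    x0 = np.full(n, 1.0 / n)                # feasible interior start
    res = minimize(neg_entropy, x0, method="SLSQP",
                   bounds=[(1e-9, 1.0)] * n,  # keep x strictly positive
                   constraints=cons)
    return res.x

# With a non-binding inequality constraint, the maximizer is the
# uniform distribution (entropy log n).
n = 3
A = np.ones((1, n))     # illustrative constraint: sum(x) <= 2, never binding
b = np.array([2.0])
x = max_entropy(A, b, n)
print(np.round(x, 4))   # approximately uniform
```

When the inequality constraints are inactive, the unique solution on the probability simplex is the uniform distribution, which is a useful sanity check for any solver setup.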

See also

* Principle of maximum entropy

External links

* Boyd, Stephen; Vandenberghe, Lieven (2004). Convex Optimization. Cambridge University Press. p. 362. ISBN 0521833787. http://www.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf. Retrieved 2008-08-24.

