- Probability
**Probability** is the likelihood or chance that something is the case or will happen. Probability theory is used extensively in areas such as statistics, mathematics, science and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.

**Interpretations**

The word "probability" does not have a consistent direct definition. In fact, there are two broad categories of **probability interpretations**:

# Frequentists talk about probabilities only when dealing with well-defined random experiments. The probability of a random event denotes the "relative frequency of occurrence" of an experiment's outcome when repeating the experiment. Frequentists consider probability to be the relative frequency "in the long run" of outcomes. [*The Logic of Statistical Inference*, Ian Hacking, 1965]

# Bayesians, however, assign probabilities to any statement whatsoever, even when no random process is involved. Probability, for a Bayesian, is a way to represent an individual's "degree of belief" in a statement, given the evidence.

**Prehistory and Etymology**

Probability has an interesting etymology. Its meaning today is almost the opposite of the meaning of the word from which it originated. Before the seventeenth century, legal evidence in Europe was considered to carry greater weight if a person testifying had "probity". "Empirical evidence" was barely a concept. Probity was a measure of authority, so evidence came from authority. A noble person had probity. Yet today, probability is the very measure of the weight of empirical evidence in science, arrived at from inductive or statistical inference. [*The Emergence of Probability: A Philosophical Study of Early Ideas about Probability, Induction and Statistical Inference*, Ian Hacking, Cambridge University Press, 2006, ISBN 0521685575, 9780521685573] [*The Cambridge History of Seventeenth-century Philosophy*, Daniel Garber, 2003]

**History**

The scientific study of probability is a modern development.

Gambling shows that there has been an interest in quantifying the ideas of probability for millennia, but exact mathematical descriptions of use in those problems only arose much later. According to Richard Jeffrey, "Before the middle of the seventeenth century, the term 'probable' (Latin "probabilis") meant "approvable", and was applied in that sense, univocally, to opinion and to action. A probable action or opinion was one such as sensible people would undertake or hold, in the circumstances." [Jeffrey, R.C., "Probability and the Art of Judgment," Cambridge University Press (1992), pp. 54-55. ISBN 0-521-39459-7]

Aside from some elementary considerations made by Girolamo Cardano in the 16th century, the doctrine of probabilities dates to the correspondence of Pierre de Fermat and Blaise Pascal (1654). Christiaan Huygens (1657) gave the earliest known scientific treatment of the subject. Jakob Bernoulli's "Ars Conjectandi" (posthumous, 1713) and Abraham de Moivre's "Doctrine of Chances" (1718) treated the subject as a branch of mathematics. See Ian Hacking's "The Emergence of Probability" for a history of the early development of the very concept of mathematical probability.

The theory of errors may be traced back to Roger Cotes's "Opera Miscellanea" (posthumous, 1722), but a memoir prepared by Thomas Simpson in 1755 (printed 1756) first applied the theory to the discussion of errors of observation. The reprint (1757) of this memoir lays down the axioms that positive and negative errors are equally probable, and that there are certain assignable limits within which all errors may be supposed to fall; continuous errors are discussed and a probability curve is given.

Pierre-Simon Laplace (1774) made the first attempt to deduce a rule for the combination of observations from the principles of the theory of probabilities. He represented the law of probability of errors by a curve $y = \phi(x)$, $x$ being any error and $y$ its probability, and laid down three properties of this curve:

# it is symmetric as to the $y$-axis;

# the $x$-axis is an asymptote, the probability of the error $\infty$ being 0;

# the area enclosed is 1, it being certain that an error exists.

He also gave (1781) a formula for the law of facility of error (a term due to Lagrange, 1774), but one which led to unmanageable equations. Daniel Bernoulli (1778) introduced the principle of the maximum product of the probabilities of a system of concurrent errors.

The method of least squares is due to Adrien-Marie Legendre (1805), who introduced it in his "Nouvelles méthodes pour la détermination des orbites des comètes" ("New Methods for Determining the Orbits of Comets"). In ignorance of Legendre's contribution, an Irish-American writer, Robert Adrain, editor of "The Analyst" (1808), first deduced the law of facility of error,

:$\phi(x) = ce^{-h^2 x^2},$

$h$ being a constant depending on precision of observation, and $c$ a scale factor ensuring that the area under the curve equals 1. He gave two proofs, the second being essentially the same as John Herschel's (1850). Gauss gave the first proof which seems to have been known in Europe (the third after Adrain's) in 1809. Further proofs were given by Laplace (1810, 1812), Gauss (1823), James Ivory (1825, 1826), Hagen (1837), Friedrich Bessel (1838), W. F. Donkin (1844, 1856), and Morgan Crofton (1870). Other contributors were Ellis (1844), De Morgan (1864), Glaisher (1872), and Giovanni Schiaparelli (1875). Peters's (1856) formula for $r$, the probable error of a single observation, is well known.
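As a supplementary remark (not part of the historical record above), the normalization requirement fixes the scale factor in Adrain's law: since $\int_{-\infty}^{\infty} e^{-h^2 x^2}\,dx = \sqrt{\pi}/h$, demanding unit area gives

:$\int_{-\infty}^{\infty} c\, e^{-h^2 x^2}\, dx = \frac{c\sqrt{\pi}}{h} = 1,$ hence $c = h/\sqrt{\pi}$.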

In the nineteenth century, authors on the general theory included Laplace, Sylvestre Lacroix (1816), Littrow (1833), Adolphe Quetelet (1853), Richard Dedekind (1860), Helmert (1872), Hermann Laurent (1873), Liagre, Didion, and Karl Pearson. Augustus De Morgan and George Boole improved the exposition of the theory. On the geometric side (see

integral geometry), contributors to "The Educational Times" were influential (Miller, Crofton, McColl, Wolstenholme, Watson, and Artemas Martin).

**Mathematical treatment**

In mathematics, the probability of an event "A" is represented by a real number in the range from 0 to 1 and written as P("A"), p("A") or Pr("A"). An impossible event has a probability of 0, and a certain event has a probability of 1. However, the converses are not always true: probability 0 events are not always impossible, nor are probability 1 events always certain. The rather subtle distinction between "certain" and "probability 1" is treated at greater length in the article on "almost surely".

The "opposite" or "complement" of an event "A" is the event [not "A"] (that is, the event of "A" not occurring); its probability is given by P(not "A") = 1 - P("A"). As an example, the chance of not rolling a six on a six-sided die is 1 - (chance of rolling a six) = $1 - \frac{1}{6} = \frac{5}{6}$. See Complementary event for a more complete treatment.

If both the events "A" and "B" occur on a single performance of an experiment, this is called the intersection or joint probability of "A" and "B", denoted as $P(A \cap B)$. If two events "A" and "B" are independent, then the joint probability is

:$P(A \mbox{ and } B) = P(A \cap B) = P(A)\,P(B);$

for example, if two coins are flipped the chance of both being heads is $\frac{1}{2} \times \frac{1}{2} = \frac{1}{4}$.
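These two rules can be checked numerically. The short Python sketch below is purely illustrative (the variable names are our own); it encodes the die and coin examples from the text using exact fractions:

```python
from fractions import Fraction

# Complement rule: P(not A) = 1 - P(A), for the event "roll a six" on a fair die.
p_six = Fraction(1, 6)
p_not_six = 1 - p_six
print(p_not_six)          # 5/6

# Independent events: P(A and B) = P(A) * P(B), for two fair coin flips.
p_heads = Fraction(1, 2)
p_both_heads = p_heads * p_heads
print(p_both_heads)       # 1/4
```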

If either event "A" or event "B" or both events occur on a single performance of an experiment, this is called the union of the events "A" and "B", denoted as $P(A \cup B)$. If two events are mutually exclusive, then the probability of either occurring is

:$P(A \mbox{ or } B) = P(A \cup B) = P(A) + P(B).$

For example, the chance of rolling a 1 or 2 on a six-sided die is $P(1 \mbox{ or } 2) = P(1) + P(2) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$.

If the events are not mutually exclusive, then

:$P(A \mbox{ or } B) = P(A) + P(B) - P(A \mbox{ and } B).$

For example, when drawing a single card at random from a regular deck of cards, the chance of getting a heart or a face card (J, Q, K) (or one that is both) is $\frac{13}{52} + \frac{12}{52} - \frac{3}{52} = \frac{11}{26}$, because of the 52 cards in a deck 13 are hearts, 12 are face cards, and 3 are both: the possibilities included in the "3 that are both" are counted in each of the "13 hearts" and the "12 face cards" but should only be counted once.
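The addition rules can likewise be verified by direct enumeration. The Python sketch below is illustrative only (the deck representation is our own); it counts hearts and face cards in a 52-card deck and compares the counted union with the inclusion-exclusion formula:

```python
from fractions import Fraction

# Mutually exclusive events: P(1 or 2) = P(1) + P(2) on a fair die.
print(Fraction(1, 6) + Fraction(1, 6))            # 1/3

# General addition rule, checked on a standard 52-card deck.
suits = ["hearts", "diamonds", "clubs", "spades"]
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
deck = [(rank, suit) for suit in suits for rank in ranks]

hearts = {card for card in deck if card[1] == "hearts"}
faces = {card for card in deck if card[0] in ("J", "Q", "K")}

def p(event):
    """Probability of an event under the uniform measure on the deck."""
    return Fraction(len(event), len(deck))

print(p(hearts | faces))                          # 11/26, counted directly
print(p(hearts) + p(faces) - p(hearts & faces))   # 11/26, by inclusion-exclusion
```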

"

Conditional probability " is theprobability of some event "A", given the occurrence of some other event "B".Conditional probability is written "P"("A"|"B"), and is read "the probability of "A", given "B". It is defined by:$P(A\; mid\; B)\; =\; frac\{P(A\; cap\; B)\}\{P(B)\}.,$If $P(B)=0$ then $P(A\; mid\; B)$ is undefined.**Theory**Like other theories, the theory of probability is a representation of probabilistic concepts in formal terms—that is, in terms that can be considered separately from their meaning. These formal terms are manipulated by the rules of mathematics and logic, and any results are then interpreted or translated back into the problem domain.

There have been at least two successful attempts to formalize probability, namely the Kolmogorov formulation and the Cox formulation. In Kolmogorov's formulation (see probability space), sets are interpreted as events and probability itself as a measure on a class of sets. In Cox's theorem, probability is taken as a primitive (that is, not further analyzed) and the emphasis is on constructing a consistent assignment of probability values to propositions. In both cases, the laws of probability are the same, except for technical details.
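As a concrete illustration of the Kolmogorov viewpoint (a minimal sketch of our own, not a standard library interface), a finite probability space can be written down directly: the sample space is a set, events are its subsets, and probability is a measure on those subsets. For a fair die:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}   # sample space: outcomes of one fair die roll

def P(event):
    """Uniform probability measure: events are subsets of the sample space."""
    assert event <= omega, "events must be subsets of the sample space"
    return Fraction(len(event), len(omega))

A = {2, 4, 6}   # the event "the roll is even"
B = {1, 2, 3}   # the event "the roll is at most three"

print(P(omega))                 # 1: the certain event
print(P(set()))                 # 0: the impossible event
print(P(A | B))                 # 5/6, measured directly
print(P(A) + P(B) - P(A & B))   # 5/6, consistent with the addition rule
```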

There are other methods for quantifying uncertainty, such as the Dempster-Shafer theory or possibility theory, but those are essentially different and not compatible with the laws of probability as they are usually understood.

**Applications**

Two major applications of probability theory in everyday life are in risk assessment and in trade on commodity markets. Governments typically apply probabilistic methods in environmental regulation, where it is called "pathway analysis", often measuring well-being using methods that are stochastic in nature, and choosing projects to undertake based on statistical analyses of their probable effect on the population as a whole. It is not correct to say that statistics are involved in the modelling itself, as typically the assessments of risk are one-time and thus require more fundamental probability models, e.g. "the probability of another 9/11". A law of small numbers tends to apply to all such choices and perceptions of the effect of such choices, which makes probability measures a political matter.

A good example is the effect of the perceived probability of any widespread Middle East conflict on oil prices, which has ripple effects in the economy as a whole. An assessment by a commodity trader that a war is more likely or less likely sends prices up or down, and signals other traders of that opinion. Accordingly, the probabilities are not assessed independently nor necessarily very rationally.

The theory of behavioral finance emerged to describe the effect of such groupthink on pricing, on policy, and on peace and conflict.

It can reasonably be said that the discovery of rigorous methods to assess and combine probability assessments has had a profound effect on modern society. Accordingly, it may be of some importance to most citizens to understand how odds and probability assessments are made, and how they contribute to reputations and to decisions, especially in a democracy.

Another significant application of probability theory in everyday life is reliability. Many consumer products, such as automobiles and consumer electronics, utilize reliability theory in the design of the product in order to reduce the probability of failure. The probability of failure may be closely associated with the product's warranty.

**Relation to randomness**

In a deterministic universe, based on Newtonian concepts, there is no probability if all conditions are known. In the case of a roulette wheel, if the force of the hand and the period of that force were known, then the number on which the ball would stop would be a certainty. Of course, this also assumes knowledge of the inertia and friction of the wheel, the weight, smoothness and roundness of the ball, variations in hand speed during the turning, and so forth. A probabilistic description can thus be more useful than Newtonian mechanics for analysing the pattern of outcomes of repeated rolls of a roulette wheel.

Physicists face the same situation in the kinetic theory of gases, where the system, while deterministic "in principle", is so complex (with the number of molecules typically of the order of the Avogadro constant, $6 \cdot 10^{23}$) that only a statistical description of its properties is feasible.

A revolutionary discovery of 20th-century physics was the random character of all physical processes that occur at microscopic scales and are governed by the laws of quantum mechanics. The wave function itself evolves deterministically as long as no observation is made, but, according to the prevailing Copenhagen interpretation, the randomness caused by the wave function collapsing when an observation is made is fundamental. This means that probability theory is required to describe nature. Others never came to terms with the loss of determinism. Albert Einstein famously remarked in a letter to Max Born: "Jedenfalls bin ich überzeugt, daß der Alte nicht würfelt." ("I am convinced that God does not play dice"). Although alternative viewpoints exist, such as that of quantum decoherence being the cause of an "apparent" random collapse, at present there is a firm consensus among physicists that probability theory is necessary to describe quantum phenomena.

**See also**

*Decision theory

*Equiprobable

*Fuzzy measure theory

*Game theory

*Information theory

* Important publications in probability

*List of scientific journals in probability

*Measure theory

*Negative probability

*Probabilistic argumentation

*Probabilistic logic

*Random fields

*Random variable

*Statistics

*List of statistical topics

*Stochastic process

*Wiener process

*Black Swan theory

*Calculus of predispositions

**Footnotes**

**Sources**

* Olav Kallenberg, "Probabilistic Symmetries and Invariance Principles". Springer-Verlag, New York (2005). 510 pp. ISBN 0-387-25115-4

* Kallenberg, O., "Foundations of Modern Probability," 2nd ed. Springer Series in Statistics. (2002). 650 pp. ISBN 0-387-95313-2

**Quotations**

* Damon Runyon, "It may be that the race is not always to the swift, nor the battle to the strong - but that is the way to bet."

*Pierre-Simon Laplace "It is remarkable that a science which began with the consideration of games of chance should have become the most important object of human knowledge." "Théorie Analytique des Probabilités", 1812.

*Richard von Mises, "The unlimited extension of the validity of the exact sciences was a characteristic feature of the exaggerated rationalism of the eighteenth century" (in reference to Laplace). "Probability, Statistics, and Truth," p. 9. Dover edition, 1981 (republication of second English edition, 1957).

**External links**

* Edwin Thompson Jaynes, "Probability Theory: The Logic of Science". Preprint: Washington University, (1996). — [*http://omega.albany.edu:8008/JaynesBook.html HTML index with links to PostScript files*] and [*http://bayes.wustl.edu/etj/prob/book.pdf PDF*]

* [*http://etext.lib.virginia.edu/cgi-local/DHI/dhi.cgi?id=dv1-43 "Dictionary of the History of Ideas":*] Certainty in Seventeenth-Century Thought

* [*http://etext.lib.virginia.edu/cgi-local/DHI/dhi.cgi?id=dv1-44 "Dictionary of the History of Ideas":*] Certainty since the Seventeenth Century

* [*http://www.economics.soton.ac.uk/staff/aldrich/Figures.htm Figures from the History of Probability and Statistics (Univ. of Southampton)*]

* [*http://www.economics.soton.ac.uk/staff/aldrich/Probability%20Earliest%20Uses.htm Probability and Statistics on the Earliest Uses Pages (Univ. of Southampton)*]

* [*http://members.aol.com/jeff570/stat.html Earliest Uses of Symbols in Probability and Statistics*] on [*http://members.aol.com/jeff570/mathsym.html Earliest Uses of Various Mathematical Symbols*]

* [*http://www.celiagreen.com/charlesmccreery/statistics/bayestutorial.pdf A tutorial on probability and Bayes’ theorem devised for first-year Oxford University students*]

* [*http://ubu.com/historical/young/index.html pdf file of An Anthology of Chance Operations (1963)*] at UbuWeb
