Mathematical notation
For information on rendering mathematical formulas in Wikipedia, see Help:Formula.
See also: Table of mathematical symbols
Mathematical notation is a system of symbolic representations of mathematical objects and ideas. Mathematical notations are used in mathematics, the physical sciences, engineering, and economics. Mathematical notations include relatively simple symbolic representations, such as the numbers 0, 1 and 2; function symbols such as sin and +; conceptual symbols, such as lim, dy/dx, equations and variables; and complex diagrammatic notations such as Penrose graphical notation and Coxeter-Dynkin diagrams.
Definition
A mathematical notation is a writing system used for recording concepts in mathematics.
- The notation uses symbols or symbolic expressions which are intended to have a precise semantic meaning.
- In the history of mathematics, these symbols have denoted numbers, shapes, patterns, and change. The notation can also include symbols for parts of the conventional discourse between mathematicians, when viewing mathematics as a language.
The media used for writing are recounted below, but common materials currently include paper and pencil, board and chalk (or dry-erase marker), and electronic media. Systematic adherence to agreed-upon conventions is fundamental to mathematical notation. (See also some related concepts: Logical argument, Mathematical logic, and Model theory.)
Expressions
A mathematical expression is a sequence of symbols which can be evaluated. For example, if the symbols represent numbers, the expression is evaluated according to a conventional order of operations: any expressions within parentheses are calculated first, followed by exponents and roots, then multiplications and divisions, and finally additions and subtractions, working from left to right. In a computer language, these rules are implemented by compilers and interpreters. For more on expression evaluation, see the computer science topics: eager evaluation, lazy evaluation, and evaluation operator.
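To see these precedence rules in action, the short Python sketch below (the particular expressions are chosen only for illustration) evaluates a few expressions and shows how parentheses, exponents, and left-to-right grouping change the result:

```python
# Conventional order of operations, as implemented by Python's parser:
# parentheses first, then exponents, then multiplication and division,
# then addition and subtraction, with ties grouped left to right.

with_parens    = (2 + 3) * 4 ** 2   # parentheses force 2 + 3 = 5 first
without_parens = 2 + 3 * 4 ** 2     # exponent, then product, then sum

print(with_parens)      # 5 * 16 = 80
print(without_parens)   # 2 + 3 * 16 = 50

# Subtraction associates left to right:
print(10 - 4 - 3)       # (10 - 4) - 3 = 3, not 10 - (4 - 3) = 9
```

A compiler or interpreter turns such source text into an expression tree whose shape encodes exactly this precedence before any calculation takes place.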
Precise semantic meaning
Modern mathematics needs to be precise, because ambiguous notation does not allow formal proofs. Suppose that we have statements, denoted by some formal sequence of symbols, about some objects (for example, numbers, shapes, patterns). Until the statements can be shown to be valid, their meaning is not yet resolved. While reasoning, we might let the symbols refer to those denoted objects, perhaps in a model. The semantics of those objects has a heuristic side and a deductive side. In either case, we might want to know the properties of the objects, which we might then list in an intensional definition.
Those properties might then be expressed by some well-known and agreed-upon symbols from a table of mathematical symbols. This mathematical notation might include annotations such as the following (written out symbolically in the sketch after the list):
- "All x", "No x", "There is an x" (or its equivalent, "Some x"), "A set", "A function"
- "A mapping from the real numbers to the complex numbers"
In different contexts, the same symbol or notation can be used to represent different concepts. Therefore, to fully understand a piece of mathematical writing, it is important to first check the definitions that an author gives for the notations that are being used. This may be problematic if the author assumes the reader is already familiar with the notation in use.
History
Main article: History of mathematical notation
Counting
It is believed that a mathematical notation to represent counting was first developed at least 50,000 years ago.[1] Early mathematical ideas such as finger counting[2] have also been represented by collections of rocks, sticks, bone, clay, stone, wood carvings, and knotted ropes. The tally stick is a timeless way of counting. Perhaps the oldest known mathematical texts are those of ancient Sumer. The Census Quipu of the Andes and the Ishango Bone from Africa both used the tally mark method of accounting for numerical concepts.
The development of zero as a number is one of the most important developments in early mathematics. It was used as a placeholder by the Babylonians and by Greek astronomers in Egypt, and then as an integer by the Mayans, Indians and Arabs. (See History of zero for more information.)
Geometry becomes analytic
The mathematical viewpoints in geometry did not lend themselves well to counting. The natural numbers, their relationship to fractions, and the identification of continuous quantities took millennia to take form, and even longer to allow for the development of notation. It was not until the invention of analytic geometry by René Descartes that geometry became more subject to a numerical notation. Some symbolic shortcuts for mathematical concepts came to be used in the publication of geometric proofs. Moreover, the power and authority of geometry's theorem and proof structure greatly influenced non-geometric treatises, such as Isaac Newton's Principia Mathematica.
Counting is mechanized
After the rise of Boolean algebra and the development of positional notation, it became possible to mechanize simple circuits for counting, first by mechanical means, such as gears and rods, using rotation and translation to represent changes of state, then by electrical means, using changes in voltage and current to represent the analogs of quantity. Today, computers use standard circuits to both store and change quantities, which represent not only numbers but pictures, sound, motion, and control.
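As a rough sketch of the idea, in Python rather than hardware and with helper names of my own choosing, counting can be mechanized by combining Boolean operations with positional binary notation:

```python
# Illustrative sketch: counting mechanized from Boolean algebra and
# positional (binary) notation. A half adder is built from two gates,
# and adding one ripples a carry through the bit positions.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum bit, carry bit) for two binary digits."""
    return a ^ b, a & b          # XOR gives the sum, AND gives the carry

def increment(bits: list[int]) -> list[int]:
    """Add one to a number held in positional binary notation
    (least significant bit first), as a chain of half adders would."""
    carry = 1
    result = []
    for bit in bits:
        s, carry = half_adder(bit, carry)
        result.append(s)
    if carry:
        result.append(carry)     # a new, more significant position
    return result

# Count from zero to five: each state is a list of bits, LSB first.
state = [0]
for _ in range(5):
    state = increment(state)
print(state)                      # [1, 0, 1] -> binary 101 -> five
```

Physical counters realize the same carry scheme with gears, relays, or transistors instead of Python functions; the positional notation is what keeps the circuitry simple.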
Modern notation
The 18th and 19th centuries saw the creation and standardization of mathematical notation as used today. Euler was responsible for many of the notations in use today: the use of a, b, c for constants and x, y, z for unknowns, e for the base of the natural logarithm, sigma (Σ) for summation, i for the imaginary unit, and the functional notation f(x). He also popularized the use of π for Archimedes' constant (due to William Jones' proposal for the use of π in this way, based on the earlier notation of William Oughtred). Many fields of mathematics bear the imprint of their creators in their notation: the differential operator is due to Leibniz,[3] the cardinal infinities to Georg Cantor (in addition to the lemniscate (∞) of John Wallis), the congruence symbol (≡) to Gauss, and so forth.
Computerized notation
The rise of expression evaluators such as calculators and slide rules was only part of what was required to mathematicize civilization. Today, keyboard-based notations are used to exchange mathematical expressions by e-mail and in Internet shorthand. The wide use of programming languages, which teach their users the need for rigor in the statement of a mathematical expression (or else the compiler will not accept the formula), also contributes to a more mathematical viewpoint across all walks of life. Mathematically oriented markup languages such as TeX, LaTeX and, more recently, MathML are powerful enough that they qualify as mathematical notations in their own right.
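As a small, self-contained illustration (the quadratic formula is chosen arbitrarily), the LaTeX source below is itself a keyboard-based mathematical notation that a typesetting system renders into conventional symbols:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Keyboard-entered markup for the quadratic formula:
\[
  x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}
\]
\end{document}
```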
For some people, computerized visualizations have been a boon to comprehending mathematics in ways that symbolic notation alone could not provide. They benefit from the wide availability of devices that offer graphical, visual, aural, and tactile feedback.
Ideographic notation
In the history of writing, ideographic symbols arose first, as more-or-less direct renderings of some concrete item. This has come full circle with the rise of computer visualization systems, which can be applied to abstract visualizations as well, such as for rendering some projections of a Calabi-Yau manifold.
Examples of abstract visualization which properly belong to the mathematical imagination can be found in computer graphics. The need for such models abounds, for example, when the measures for the subject of study are actually random variables and not really ordinary mathematical functions.
Non-Latin-based mathematical notation
Modern Arabic mathematical notation is based mostly on the Arabic alphabet and is used widely in the Arab world, especially in pre-university levels of education.
Some mathematical notations are mostly diagrammatic, and so are almost entirely script independent. Examples are Penrose graphical notation and Coxeter-Dynkin diagrams.
Braille-based mathematical notations used by blind people include Nemeth Braille and GS8 Braille.
See also
- Abuse of notation
- Begriffsschrift
- Bourbaki dangerous bend symbol
- History of mathematical notation
- ISO 31-11
- ISO/IEC 80000-2
- Mathematical Alphanumeric Symbols
- Notation in probability
- Scientific notation
- Table of mathematical symbols
- Typographical conventions in mathematical formulae
- Modern Arabic mathematical notation
Notes
- ^ An Introduction to the History of Mathematics (6th Edition) by Howard Eves (1990), p. 9.
- ^ Georges Ifrah notes that humans learned to count on their hands. Ifrah shows, for example, a picture of Boethius (who lived 480–524 or 525) reckoning on his fingers in Ifrah 2000, p. 48.
- ^ Gottfried Wilhelm Leibniz
References
- Florian Cajori, A History of Mathematical Notations (1929), 2 volumes. ISBN 0-486-67766-4
- Ifrah, Georges (2000), The Universal History of Numbers: From Prehistory to the Invention of the Computer, John Wiley and Sons, p. 48, ISBN 0-471-39340-1. Translated from the French by David Bellos, E.F. Harding, Sophie Wood and Ian Monk. Ifrah supports his thesis by quoting idiomatic phrases from languages across the entire world.
External links
- Earliest Uses of Various Mathematical Symbols
- Mathematical ASCII Notation: how to type math notation in any text editor.
- Mathematics as a Language at cut-the-knot
- Stephen Wolfram: Mathematical Notation: Past and Future. October 2000. Transcript of a keynote address presented at MathML and Math on the Web: MathML International Conference.