Asymptotic computational complexity
In computational complexity theory, asymptotic computational complexity is the use of asymptotic analysis to estimate the computational complexity of algorithms and computational problems, commonly associated with the use of big O notation.
With respect to the two most commonly estimated computational resources, one speaks of asymptotic time complexity and asymptotic space complexity. Other asymptotically estimated resources include circuit complexity and various measures of parallel computation, such as the number of (parallel) processors.
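For illustration, a minimal Python sketch (the function name and test values are illustrative, not part of the original article) shows how both resources are stated asymptotically for a simple algorithm: linear search takes O(n) time and O(1) additional space in the worst case.

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent.

    Asymptotic time complexity: O(n) -- in the worst case every element
    is inspected once, where n = len(items).
    Asymptotic space complexity: O(1) -- only a constant number of extra
    variables are used, regardless of the input size.
    """
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1


if __name__ == "__main__":
    print(linear_search([3, 1, 4, 1, 5, 9], 5))  # -> 4
    print(linear_search([3, 1, 4, 1, 5, 9], 7))  # -> -1
```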
Since the seminal 1965 paper by Hartmanis and Stearns [J. Hartmanis and R. E. Stearns, "On the computational complexity of algorithms," Transactions of the American Mathematical Society, vol. 117, 1965, pp. 285–306] and the 1979 book by Garey and Johnson on NP-completeness [M. R. Garey and D. S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness, W. H. Freeman & Co., New York, 1979], the term "computational complexity" (of algorithms) most commonly refers to asymptotic computational complexity.
Further, unless specified otherwise, the term "computational complexity" usually refers to an upper bound on the asymptotic computational complexity of an algorithm or a problem, which is usually written in terms of the "big O" notation, e.g., O(n²). Other types of (asymptotic) computational complexity estimates are lower bounds ("big Omega" notation; e.g., Ω(n)) and asymptotically tight estimates, when the asymptotic upper and lower bounds coincide (written using the "big Theta" notation; e.g., Θ(n log n)).
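For reference, the three notations used above can be stated in the standard way (a conventional formulation, not quoted from the article), for functions f, g from the natural numbers to the non-negative reals:

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0 \in \mathbb{N}\ \text{such that}\ f(n) \le c\, g(n)\ \text{for all}\ n \ge n_0,\\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 \in \mathbb{N}\ \text{such that}\ f(n) \ge c\, g(n)\ \text{for all}\ n \ge n_0,\\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n))\ \text{and}\ f(n) = \Omega(g(n)).
\end{align*}
```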
A further tacit assumption is that worst-case analysis of computational complexity is meant unless stated otherwise. An alternative approach is the probabilistic (average-case) analysis of algorithms.
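The distinction can be made concrete with a small sketch (helper names and parameters are illustrative, not from the article): for linear search over n distinct elements, the worst case inspects all n elements, while for a uniformly random target the expected number of inspections is about (n + 1)/2; both are Θ(n), but the analyses answer different questions.

```python
import random

def comparisons_for_search(items, target):
    """Count how many elements linear search inspects before finding target."""
    count = 0
    for value in items:
        count += 1
        if value == target:
            break
    return count

n = 1000
items = list(range(n))

worst = comparisons_for_search(items, items[-1])            # worst case: all n elements inspected
avg = sum(comparisons_for_search(items, random.choice(items))
          for _ in range(10_000)) / 10_000                  # average over uniformly random targets

print(worst)        # 1000
print(round(avg))   # roughly (n + 1) / 2, i.e. about 500
```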
In most practical cases, deterministic algorithms or randomized algorithms are discussed, although theoretical computer science also considers nondeterministic algorithms and other advanced models of computation.
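As a brief sketch of a randomized algorithm whose complexity is stated asymptotically in expectation (a standard textbook example, not taken from the article; the function name is illustrative), randomized quicksort runs in expected O(n log n) time on any input because the pivot is chosen at random, whereas a deterministic fixed-pivot variant degrades to Θ(n²) on adversarial inputs such as already-sorted lists.

```python
import random

def randomized_quicksort(items):
    """Sort a list using quicksort with a uniformly random pivot.

    Expected running time is O(n log n) for every input ordering,
    since the random pivot makes badly unbalanced splits unlikely.
    """
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)
    less    = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)


print(randomized_quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```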