Vapnik-Chervonenkis theory
Vapnik–Chervonenkis theory (also known as VC theory) was developed between 1960 and 1990 by Vladimir Vapnik and Alexey Chervonenkis. The theory is a form of computational learning theory that attempts to explain the learning process from a statistical point of view.
VC theory is related to statistical learning theory and to empirical processes. Richard M. Dudley and Vladimir Vapnik himself, among others, have applied VC theory to empirical processes.
VC theory covers at least four parts (as explained in "The Nature of Statistical Learning Theory"):
*Theory of consistency of learning processes
**What are (necessary and sufficient) conditions for consistency of a learning process based on the empirical risk minimization principle?
*Nonasymptotic theory of the rate of convergence of learning processes
**How fast is the rate of convergence of the learning process?
*Theory of controlling the generalization ability of learning processes
**How can one control the rate of convergence (the generalization ability) of the learning process?
*Theory of constructing learning machines
**How can one construct algorithms that can control the generalization ability?
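The empirical risk minimization principle named above can be made concrete with a toy example. The sketch below (all data and the hypothesis class are illustrative choices, not part of the original text) picks, from the class of 1-D threshold classifiers h_t(x) = 1 if x ≥ t else 0, the hypothesis with the lowest error on the training sample:

```python
def empirical_risk(threshold, xs, ys):
    """Fraction of training points misclassified by the threshold rule."""
    return sum((x >= threshold) != y for x, y in zip(xs, ys)) / len(xs)

def erm_threshold(xs, ys):
    """ERM over thresholds: only thresholds at data points (plus one
    beyond the maximum) can change the labeling, so scan those."""
    candidates = sorted(xs) + [max(xs) + 1.0]
    return min(candidates, key=lambda t: empirical_risk(t, xs, ys))

# Illustrative linearly separable sample
xs = [0.1, 0.4, 0.35, 0.8, 0.9, 0.7]
ys = [0, 0, 0, 1, 1, 1]
t = erm_threshold(xs, ys)
print(t, empirical_risk(t, xs, ys))  # threshold 0.7 achieves zero empirical risk
```

VC theory asks when minimizing this empirical risk also drives down the true (expected) risk as the sample grows, and how fast.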
In addition, VC theory and VC dimension are instrumental in the theory of empirical processes, in the case of processes indexed by VC classes.
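The nonasymptotic part of the theory yields distribution-free generalization bounds in terms of the VC dimension. One standard form states that, with probability at least 1 − η, the true risk is at most the empirical risk plus a confidence term depending on the sample size n and the VC dimension h. A minimal sketch of that computation (the specific constants follow one common statement of the bound; treat the numbers as illustrative):

```python
import math

def vc_bound(emp_risk, n, h, eta=0.05):
    """Upper bound on true risk: empirical risk plus the VC confidence
    term sqrt((h*(ln(2n/h) + 1) + ln(4/eta)) / n)."""
    confidence = math.sqrt((h * (math.log(2 * n / h) + 1) + math.log(4 / eta)) / n)
    return emp_risk + confidence

# Illustrative numbers: 5% training error, 10,000 samples, VC dimension 50
print(vc_bound(0.05, n=10_000, h=50))
```

The bound is uniform over the hypothesis class and makes no assumption about the data distribution, which is what "controlling the generalization ability" refers to above; structural risk minimization trades the two terms off by varying h.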
The last part of VC theory introduced a well-known learning algorithm: the support vector machine.
VC theory contains important concepts such as the VC dimension and structural risk minimization. This theory is related to mathematical subjects such as:
* reproducing kernel Hilbert spaces
* regularization networks
* kernels
* empirical processes
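The VC dimension itself is defined through shattering: the largest set of points on which the hypothesis class realizes every possible binary labeling. A brute-force illustration (the function names are ours, not from the original text) for 1-D thresholds, which shatter any single point but no pair, so their VC dimension is 1:

```python
from itertools import product

def threshold_labelings(points):
    """All labelings of `points` realizable by some threshold rule
    h_t(x) = [x >= t]; only cuts at the points themselves matter."""
    cuts = sorted(points) + [max(points) + 1.0]
    return {tuple(p >= t for p in points) for t in cuts}

def shatters(points):
    """True iff thresholds realize all 2^n labelings of `points`."""
    all_labelings = set(product([False, True], repeat=len(points)))
    return threshold_labelings(points) == all_labelings

print(shatters([0.5]))        # True: a single point is shattered
print(shatters([0.2, 0.8]))   # False: no threshold labels 0.2 positive and 0.8 negative
```

Because the labeling (positive, negative) is unreachable for any two points, no pair is shattered, confirming VC dimension 1 for this class.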
References
* Vapnik, Vladimir N. (2000). *The Nature of Statistical Learning Theory*. Information Science and Statistics. Springer-Verlag. ISBN 978-0-387-98780-4.
* Vapnik, Vladimir N. (1998). *Statistical Learning Theory*. Wiley-Interscience. ISBN 0-471-03003-1.
* See also the references in the articles on Richard M. Dudley, empirical processes, and shattering.