Quadratic classifier

A quadratic classifier is used in machine learning to separate measurements of two or more classes of objects or events by a quadric surface. It is a more general version of the linear classifier.

The classification problem

Statistical classification considers a set of vectors of observations x of an object or event, each of which has a known class label y. This set is referred to as the training set. The problem is then to determine, for a given new observation vector, the class it most likely belongs to. For a quadratic classifier, the separating surface is assumed to be quadratic in the measurements, so y will be decided based on

: \mathbf{x}^T A \mathbf{x} + \mathbf{b}^T \mathbf{x} + c

In the special case where each observation consists of two measurements, this means that the surfaces separating the classes will be conic sections (i.e. a line, a circle or ellipse, a parabola, or a hyperbola).
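Such a decision rule can be sketched in a few lines of NumPy. The coefficients A, b, and c below are made-up values for illustration; in practice they would be learned from the training set:

```python
import numpy as np

# Hypothetical coefficients for illustration only; a real classifier
# would fit A, b, c to the training set.
A = np.array([[1.0, 0.0],
              [0.0, -1.0]])   # quadratic term (here yielding a hyperbolic boundary)
b = np.array([0.0, 0.0])      # linear term
c = -1.0                      # constant offset

def quadratic_score(x):
    """Evaluate x^T A x + b^T x + c for an observation vector x."""
    return x @ A @ x + b @ x + c

def classify(x):
    """Assign class 1 if the score is positive, class 0 otherwise."""
    return 1 if quadratic_score(x) > 0 else 0

print(classify(np.array([2.0, 0.0])))  # score 3.0 -> class 1
print(classify(np.array([0.0, 2.0])))  # score -5.0 -> class 0
```

With two measurements, as here, the set of points where the score is exactly zero traces out the conic section separating the classes.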

Quadratic discriminant analysis

Quadratic discriminant analysis (QDA) is closely related to linear discriminant analysis (LDA), where it is assumed that there are only two classes of points (so y \in \{0, 1\}), and that the measurements are normally distributed. Unlike LDA, however, QDA does not assume that the covariances of the classes are identical. Under these assumptions, the best possible test for the hypothesis that a given measurement is from a given class is the likelihood ratio test. Suppose the means of each class are known to be \mu_{y=0}, \mu_{y=1} and the covariances \Sigma_{y=0}, \Sigma_{y=1}. Then the likelihood ratio will be given by

: \text{Likelihood ratio} = \frac{ \sqrt{2\pi |\Sigma_{y=1}|}^{-1} \exp\left( -\frac{1}{2}(x - \mu_{y=1})^T \Sigma_{y=1}^{-1} (x - \mu_{y=1}) \right) }{ \sqrt{2\pi |\Sigma_{y=0}|}^{-1} \exp\left( -\frac{1}{2}(x - \mu_{y=0})^T \Sigma_{y=0}^{-1} (x - \mu_{y=0}) \right) } < t

for some threshold t. After some rearrangement, it can be shown that the resulting separating surface between the classes is a quadratic.
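A minimal sketch of this test, assuming hypothetical class means and covariances (the values below are illustrative, not fitted from data). Working with log densities avoids numerical underflow, and the shared (2\pi)^{-k/2} factor cancels in the ratio:

```python
import numpy as np

# Assumed class-conditional Gaussian parameters, for illustration only.
mu0, mu1 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
Sigma0 = np.eye(2)                      # identity covariance for class 0
Sigma1 = np.array([[2.0, 0.3],
                   [0.3, 1.0]])         # a different covariance for class 1

def log_gaussian(x, mu, Sigma):
    """Log density of a multivariate normal, up to the shared (2*pi)^(-k/2) factor."""
    d = x - mu
    _, logdet = np.linalg.slogdet(Sigma)
    return -0.5 * (logdet + d @ np.linalg.solve(Sigma, d))

def qda_classify(x, log_t=0.0):
    """Assign class 1 iff the log likelihood ratio exceeds the log threshold."""
    llr = log_gaussian(x, mu1, Sigma1) - log_gaussian(x, mu0, Sigma0)
    return 1 if llr > log_t else 0

print(qda_classify(np.array([2.0, 2.0])))  # near mu1 -> class 1
print(qda_classify(np.array([0.0, 0.0])))  # near mu0 -> class 0
```

Expanding the two quadratic forms in the log likelihood ratio shows that their difference is itself quadratic in x, which is where the quadratic separating surface comes from.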

Other quadratic classifiers

While QDA is the most commonly used method for obtaining a quadratic classifier, other methods are also possible. One such method is to create a longer measurement vector from the old one by adding all pairwise products of individual measurements. For instance, the vector

: [x_1, \; x_2, \; x_3]

would become

: [x_1, \; x_2, \; x_3, \; x_1^2, \; x_1 x_2, \; x_1 x_3, \; x_2^2, \; x_2 x_3, \; x_3^2].

Finding a quadratic classifier for the original measurements would then become the same as finding a linear classifier based on the expanded measurement vector. For linear classifiers based only on dot products, these expanded measurements need not actually be computed, since the dot product in the higher-dimensional space is simply related to that in the original space. This is an example of the so-called kernel trick, which can be applied to linear discriminant analysis as well as the support vector machine.
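The expansion above can be sketched as follows; the helper `expand_quadratic` is illustrative, not a standard library function:

```python
import numpy as np
from itertools import combinations_with_replacement

def expand_quadratic(x):
    """Append all pairwise products x_i * x_j (with i <= j) to the vector x."""
    products = [x[i] * x[j]
                for i, j in combinations_with_replacement(range(len(x)), 2)]
    return np.concatenate([x, products])

x = np.array([2.0, 3.0, 4.0])
# Order of the appended terms: x1^2, x1*x2, x1*x3, x2^2, x2*x3, x3^2
print(expand_quadratic(x))
```

Any linear classifier trained on these expanded vectors draws a linear boundary in the 9-dimensional space, which corresponds to a quadratic boundary in the original 3-dimensional space.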


Wikimedia Foundation. 2010.
