Multiclass classification

In machine learning, multiclass or multinomial classification is the problem of classifying instances into one of more than two classes.

While some classification algorithms naturally permit the use of more than two classes, others are by nature binary; these can, however, be turned into multinomial classifiers by a variety of strategies. One such strategy is one-vs.-all (or one-vs.-rest, OvA or OvR), in which a single binary classifier is trained per class to distinguish that class from all other classes. Prediction is then performed by applying every binary classifier to the instance and choosing the class whose classifier reports the highest confidence score (e.g., the highest class probability from a probabilistic classifier such as naive Bayes).
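
For concreteness, here is a minimal sketch of the one-vs.-rest strategy in Python, using scikit-learn's GaussianNB as the per-class binary classifier; the synthetic dataset and the helper function names are illustrative assumptions, not part of the original entry.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

def fit_one_vs_rest(X, y):
    """Train one binary classifier per class: class k versus all other classes."""
    classifiers = {}
    for k in np.unique(y):
        binary_target = (y == k).astype(int)   # 1 for class k, 0 for every other class
        classifiers[k] = GaussianNB().fit(X, binary_target)
    return classifiers

def predict_one_vs_rest(classifiers, X):
    """Predict by choosing the class whose binary classifier is most confident."""
    classes = sorted(classifiers)
    # Column j holds each sample's probability of belonging to classes[j].
    scores = np.column_stack(
        [classifiers[k].predict_proba(X)[:, 1] for k in classes]
    )
    return np.array(classes)[scores.argmax(axis=1)]

# Tiny synthetic three-class dataset, purely for illustration.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(20, 2)) for c in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 20)

models = fit_one_vs_rest(X, y)
print(predict_one_vs_rest(models, X[:5]))
```

The same wrapper works with any binary classifier that exposes a confidence score; using a probabilistic score simply makes the per-class comparisons directly comparable.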

Multiclass classification should not be confused with multi-label classification, in which multiple labels may be assigned to each problem instance.
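
As a purely illustrative contrast (the label names below are invented for this example), the two settings differ in what a single target looks like:

```python
# Hypothetical targets for three instances, for illustration only.
multiclass_targets = ["cat", "dog", "cat"]                    # exactly one class per instance
multilabel_targets = [{"animal", "pet"}, {"animal"}, set()]   # a (possibly empty) label set per instance
```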

