Instance-based learning
In machine learning, instance-based learning or memory-based learning is a family of learning algorithms that, instead of performing explicit generalization, compare new problem instances with instances seen in training, which have been stored in memory. Instance-based learning is a kind of lazy learning.
It is called instance-based because it constructs hypotheses directly from the training instances themselves.[1] This means that the hypothesis complexity can grow with the data:[1] in the worst case, the hypothesis is a list of all n training instances, and classifying a single new instance takes O(n) time. One advantage that instance-based learning has over other methods of machine learning is its ability to adapt its model to previously unseen data: whereas other methods generally require the entire set of training data to be re-examined when one instance is changed, an instance-based learner may simply store the new instance or throw an old instance away.[citation needed]
A simple example of an instance-based learning algorithm is the k-nearest neighbor algorithm. Daelemans and Van den Bosch describe variations of this algorithm for use in natural language processing (NLP), claiming that memory-based learning is both more psychologically realistic than other machine-learning schemes and more effective in practice.[2]
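The behaviour described above can be sketched in a few lines of Python. The class below is a hypothetical illustration (its names are not from the cited sources): learning simply stores instances, updating the learner means adding or removing a stored pair, and classification is a linear scan over the memory (hence the O(n) cost per query) followed by a majority vote among the k nearest neighbours.

```python
import math
from collections import Counter

class MemoryBasedClassifier:
    """Minimal k-nearest-neighbour sketch; names are illustrative only."""

    def __init__(self, k=3):
        self.k = k
        self.memory = []  # stored (feature_vector, label) pairs; the memory itself is the hypothesis

    def add_instance(self, x, label):
        # Learning is just storing the instance; there is no explicit generalization step.
        self.memory.append((list(x), label))

    def remove_instance(self, x, label):
        # Forgetting an instance is equally cheap: no model has to be rebuilt.
        self.memory.remove((list(x), label))

    def classify(self, x):
        # Classification scans the whole memory, so it costs O(n) per query.
        neighbours = sorted(self.memory, key=lambda m: math.dist(m[0], x))[: self.k]
        votes = Counter(label for _, label in neighbours)
        return votes.most_common(1)[0][0]


# Example usage
clf = MemoryBasedClassifier(k=3)
for x, y in [((1.0, 1.0), "a"), ((1.2, 0.9), "a"), ((5.0, 5.0), "b"), ((5.1, 4.8), "b")]:
    clf.add_instance(x, y)
print(clf.classify((1.1, 1.0)))  # -> "a"
```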
References
- ^ a b Stuart Russell and Peter Norvig (2003). Artificial Intelligence: A Modern Approach, second edition, p. 733. Prentice Hall. ISBN 0-13-080302-2
- ^ Walter Daelemans and Antal van den Bosch (2005). Memory-Based Language Processing. Cambridge University Press.
External links
- TiMBL, the Tilburg Memory Based Learner, is an instance-based learning package geared toward NLP.