Adaptive resonance theory

Adaptive Resonance Theory (ART) is a neural network architecture developed by Stephen Grossberg and Gail Carpenter.

Learning model
The basic ART system is an unsupervised learning model. It typically consists of a comparison field and a recognition field composed of neurons, a vigilance parameter, and a reset module. The vigilance parameter has considerable influence on the system: higher vigilance produces highly detailed memories (many, fine-grained categories), while lower vigilance results in more general memories (fewer, more-general categories).

The comparison field takes an input vector (a one-dimensional array of values) and transfers it to its best match in the recognition field. Its best match is the single neuron whose set of weights (weight vector) most closely matches the input vector. Each recognition field neuron outputs a negative signal (proportional to that neuron's quality of match to the input vector) to each of the other recognition field neurons and inhibits their output accordingly. In this way the recognition field exhibits lateral inhibition, allowing each neuron in it to represent a category to which input vectors are classified.

After the input vector is classified, the reset module compares the strength of the recognition match to the vigilance parameter. If the vigilance threshold is met, training commences. Otherwise, if the match level does not meet the vigilance parameter, the firing recognition neuron is inhibited until a new input vector is applied; training commences only upon completion of a search procedure. In the search procedure, recognition neurons are disabled one by one by the reset function until the vigilance parameter is satisfied by a recognition match. If no committed recognition neuron's match meets the vigilance threshold, then an uncommitted neuron is committed and adjusted towards matching the input vector.
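The match-and-search cycle described above can be sketched as follows. This is a simplified illustration, not the published algorithm: the hypothetical `art_classify` helper ranks categories by raw overlap, whereas real ART 1 uses a slightly different choice function with its own parameters.

```python
import numpy as np

def art_classify(inp, weights, vigilance):
    """One simplified ART 1-style presentation of a binary input vector.

    `weights` is a list of committed category weight vectors;
    `vigilance` is the vigilance parameter in [0, 1].
    Hypothetical helper for illustration only.
    """
    # Rank committed recognition neurons by match quality; lateral
    # inhibition in the recognition field leaves one winner at a time.
    order = sorted(range(len(weights)),
                   key=lambda j: -np.minimum(inp, weights[j]).sum())
    for j in order:
        # Reset module's vigilance test: |inp AND w_j| / |inp| >= vigilance
        match = np.minimum(inp, weights[j]).sum() / inp.sum()
        if match >= vigilance:
            weights[j] = np.minimum(inp, weights[j])  # train the winner
            return j
        # Otherwise this neuron is inhibited and the search continues
        # with the next-best recognition neuron.
    # No committed neuron satisfied vigilance: commit an uncommitted one.
    weights.append(inp.copy())
    return len(weights) - 1
```

With high vigilance this commits many fine-grained categories; with low vigilance, inputs are absorbed into fewer, more general ones.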
Training
There are two basic methods of training ART-based neural networks: slow and fast. In the slow learning method, the degree to which the recognition neuron's weights are trained towards the input vector is calculated to continuous values with differential equations, and is thus dependent on the length of time the input vector is presented. With fast learning, algebraic equations are used to calculate the degree of weight adjustment to be made, and binary values are used. While fast learning is effective and efficient for a variety of tasks, the slow learning method is more biologically plausible and can be used with continuous-time networks (i.e. when the input vector can vary continuously).
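The contrast can be illustrated with two toy update rules. The functional forms, learning rate, and time step here are illustrative assumptions, not the exact published equations:

```python
import numpy as np

def fast_update(w, inp):
    # Fast learning: a one-shot algebraic update; with binary vectors
    # the weights jump straight to the intersection of w and the input.
    return np.minimum(w, inp)

def slow_update(w, inp, rate=0.1, dt=1.0):
    # Slow learning: one Euler step of a differential equation such as
    # dw/dt = rate * (min(inp, w) - w), so the final weights depend on
    # how long the input vector is presented (how many steps are taken).
    return w + dt * rate * (np.minimum(inp, w) - w)
```

Repeatedly applying `slow_update` while an input is held converges towards the same intersection that `fast_update` reaches in a single step.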
Types of ART
ART 1
ART 1 (Carpenter, G.A. & Grossberg, S. (2003), [http://cns.bu.edu/Profiles/Grossberg/CarGro2003HBTNN2.pdf Adaptive Resonance Theory], in M.A. Arbib (Ed.), The Handbook of Brain Theory and Neural Networks, Second Edition (pp. 87-90), Cambridge, MA: MIT Press; Grossberg, S. (1987), [http://www.cns.bu.edu/Profiles/Grossberg/Gro1987CogSci.pdf Competitive learning: From interactive activation to adaptive resonance], Cognitive Science, 11, 23-63) is the simplest variety of ART networks, accepting only binary inputs.

ART 2
ART 2 (Carpenter, G.A. & Grossberg, S. (1987), [http://cns-web.bu.edu/Profiles/Grossberg/CarGro1987AppliedOptics.pdf ART 2: Self-organization of stable category recognition codes for analog input patterns], Applied Optics, 26(23), 4919-4930) extends network capabilities to support continuous inputs.

ART 2-A
ART 2-A (Carpenter, G.A., Grossberg, S., & Rosen, D.B. (1991a), [http://cns.bu.edu/Profiles/Grossberg/CarGroRos1991NNART2A.pdf ART 2-A: An adaptive resonance algorithm for rapid category learning and recognition], Neural Networks, 4, 493-504) is a streamlined form of ART 2 with a drastically accelerated runtime; its qualitative results are only rarely inferior to those of the full ART 2 implementation.

ART 3
ART 3 (Carpenter, G.A. & Grossberg, S. (1990), [http://cns.bu.edu/Profiles/Grossberg/CarGro1990NN.pdf ART 3: Hierarchical search using chemical transmitters in self-organizing pattern recognition architectures], Neural Networks, 3, 129-152) builds on ART 2 by simulating rudimentary neurotransmitter regulation of synaptic activity, incorporating simulated sodium (Na+) and calcium (Ca2+) ion concentrations into the system's equations. This results in a more physiologically realistic means of partially inhibiting categories that trigger mismatch resets.

Fuzzy ART
Fuzzy ART (Carpenter, G.A., Grossberg, S., & Rosen, D.B. (1991b), [http://cns.bu.edu/Profiles/Grossberg/CarGroRos1991NNFuzzyART.pdf Fuzzy ART: Fast stable learning and categorization of analog patterns by an adaptive resonance system], Neural Networks, 4, 759-771) incorporates fuzzy logic into ART's pattern recognition, thus enhancing generalizability. An optional (and very useful) feature of fuzzy ART is complement coding, a means of incorporating the absence of features into pattern classifications: an input vector a with components in [0, 1] is presented as (a, 1 − a), so that both the presence and the absence of each feature are represented. This goes a long way towards preventing inefficient and unnecessary category proliferation.

ARTMAP
Wikimedia Foundation. 2010.