Liquid state machine
A liquid state machine (LSM) is a computational construct, like a neural network. An LSM consists of a large collection of units (called "nodes" or "neurons"). Each node receives time-varying input from external sources (the inputs) as well as from other nodes. Nodes are randomly connected to each other. The recurrent nature of the connections turns the time-varying input into a spatio-temporal pattern of activations in the network nodes. These spatio-temporal patterns of activation are read out by linear discriminant units.

The soup of recurrently connected nodes ends up computing a large variety of nonlinear functions of the input. Given a large enough variety of such nonlinear functions, it is theoretically possible to obtain linear combinations (using the readout units) that perform whatever mathematical operation is needed for a given task, such as speech recognition or computer vision. (A minimal code sketch of this architecture appears after the list below.)

The word "liquid" in the name comes from the analogy of dropping a stone into a still body of water or other liquid: the falling stone generates ripples in the liquid, so the input (the motion of the falling stone) is converted into a spatio-temporal pattern of liquid displacement (the ripples).

LSMs have been put forward as a way to explain the operation of brains. LSMs are argued to be an improvement over the theory of artificial neural networks because:
# Circuits are not hand-coded to perform a specific task.
# Continuous-time inputs are handled "naturally".
# Computations on various time scales can be done using the same network.
# The same network can perform multiple computations.
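The following is a minimal, self-contained sketch of this architecture, not the exact model of Maass et al.: a randomly connected reservoir of leaky integrate-and-fire units is driven by a time-varying input, its low-pass-filtered spike trains serve as the liquid state, and two independent linear readouts are trained by ridge regression on the same states, illustrating the last point above (one network, multiple computations). All parameter values, targets, and helper names here are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 200, 2000, 1e-3         # neurons, time steps, step size in seconds

# Sparse random recurrent weights and input weights (assumed scales)
W = rng.normal(0, 0.1, (N, N)) * (rng.random((N, N)) < 0.1)
W_in = rng.normal(0, 1.0, N)

u = rng.uniform(-1, 1, T)          # time-varying external input

tau_m, tau_s = 0.03, 0.05          # membrane / synaptic time constants
v = np.zeros(N)                    # membrane potentials
trace = np.zeros(N)                # low-pass filtered spike trains
states = np.zeros((T, N))          # the "liquid state" over time

for t in range(T):
    # Leaky integrate-and-fire update, driven by input and recurrent activity
    drive = W_in * u[t] + W @ trace
    v += dt / tau_m * (-v + drive)
    spikes = v > 1.0               # threshold crossing
    v[spikes] = 0.0                # reset spiking neurons
    trace += dt / tau_s * -trace   # synaptic trace decays...
    trace[spikes] += 1.0           # ...and jumps on each spike
    states[t] = trace              # readouts only see this filtered activity

# Two unrelated target functions of the SAME input stream
y1 = np.roll(u, 10)                                          # a 10-step delay
y2 = np.tanh(np.convolve(u, np.ones(20) / 20, mode="same"))  # smoothed average

# Independent linear readouts trained by ridge regression on identical states
def ridge(X, y, lam=1e-3):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w1, w2 = ridge(states, y1), ridge(states, y2)
print("delay-task MSE:    ", np.mean((states @ w1 - y1) ** 2))
print("smoothing-task MSE:", np.mean((states @ w2 - y2) ** 2))

Note that the reservoir itself is never trained; only the two linear readout vectors w1 and w2 are fitted, which is the defining design choice of the LSM approach.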
Criticisms of LSMs as used in computational neuroscience are that:
# LSMs don't actually explain how the brain functions; at best they can replicate some parts of brain functionality.
# There is no guaranteed way to dissect a working network and figure out how or what computations are being performed.
# There is very little control over the process.
# They are inefficient from an implementation point of view, because they require many computations compared to custom-designed circuits, or even to ordinary neural networks.
Universal function approximation

If a reservoir has fading memory and input separability, then with the help of a powerful readout it can be proven that a liquid state machine is a universal function approximator, using the Stone–Weierstrass theorem (Maass & Markram 2004).
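The theorem is existential, but its two ingredients can be illustrated empirically. The sketch below uses a simplified, non-spiking reservoir for brevity (all parameters are arbitrary): a random recurrent network whose memory of past inputs fades is driven by a signal, and a readout over quadratic features of the state is fitted to a nonlinear target filter, a product of two delayed inputs that no function of the instantaneous input alone could produce.

import numpy as np

rng = np.random.default_rng(1)
N, T = 100, 3000

# Random recurrent weights, scaled down so memory of past inputs fades
W = rng.normal(0, 1, (N, N)) * 0.9 / np.sqrt(N)
W_in = rng.normal(0, 1, N)

u = rng.uniform(-1, 1, T)          # input signal
x = np.zeros(N)
X = np.zeros((T, N))               # reservoir state at each step
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Nonlinear target filter: product of two delayed copies of the input
y = np.roll(u, 2) * np.roll(u, 5)

# "Powerful readout": least squares over quadratic features of the state
feats = np.hstack([X, X ** 2, np.ones((T, 1))])
w, *_ = np.linalg.lstsq(feats, y, rcond=None)
print("target variance:", np.var(y))
print("readout MSE:   ", np.mean((feats @ w - y) ** 2))

A low readout error relative to the target variance indicates that the reservoir states separate different input histories well enough for the readout to recover the nonlinear filter; it is an illustration of the theorem's ingredients, not a proof.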
See also

* Echo state network: a similar concept in recurrent neural networks.

References
* Maass, Wolfgang; Markram, Henry (2004), "On the Computational Power of Recurrent Circuits of Spiking Neurons", Journal of Computer and System Sciences 69 (4): 593–616, doi:10.1016/j.jcss.2004.04.001.
* Maass, Wolfgang; Natschläger, Thomas; Markram, Henry (2002), "Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations", Neural Computation 14 (11): 2531–2560, http://neco.mitpress.org/cgi/content/abstract/14/11/2531.
* Maass, Wolfgang; Natschläger, Thomas; Markram, Henry (2004), "Computational Models for Generic Cortical Microcircuits", in Computational Neuroscience: A Comprehensive Approach, Ch. 18, pp. 575–605.
* Fernando, Chrisantha; Sojakka, Sampsa (2003), "Pattern Recognition in a Bucket", in Advances in Artificial Life (ECAL 2003), Lecture Notes in Computer Science, pp. 588–597, http://www.springerlink.com/content/xlnymhf0qp946rce/.