=Inspiration for morphogenic network=

==Genetic algorithm with branching==

This theory was inspired by an attempt to apply the genetic algorithm to a classifier system. The traditional genetic algorithm fails to take into account that the gene pool is not fully mixed: a cat won't mate with a dog; indeed, it can't, because the phenotypes are incompatible. In general, the more "local" two phenotypes are to each other, the higher the chance of sexual reproduction between them. This is apt enough, as mating a refrigerator with a car won't do much good, but mating roller skates with ice skates might produce roller blades.
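As a minimal sketch of this point (none of the following code is from the original; the gating function mate_probability and its scale parameter are illustrative assumptions), a genetic algorithm can make the chance of crossover fall off with the Hamming distance between parents, so that only "local" phenotypes tend to mix:

```python
import random

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def mate_probability(a, b, scale=2.0):
    """Chance of mating falls off with phenotype distance:
    identical strings always mate, distant ones rarely do."""
    return 2.0 ** (-hamming(a, b) / scale)

def crossover(a, b):
    """Single-point crossover of two bit strings."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def step(population, scale=2.0):
    """One generation: pick random pairs, but mate them only
    if their phenotypes are close enough."""
    offspring = []
    while len(offspring) < len(population):
        a, b = random.sample(population, 2)
        if random.random() < mate_probability(a, b, scale):
            offspring.append(crossover(a, b))
    return offspring

population = ["111010", "111000", "111011", "011010"] * 3
print(step(population))
```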

==Morphogenesis==

The morphospace of evolution of a given gene is not "flat" and invariant; it is approximately defined by the genes of its nearest neighbors. That is, if a bit string is 111010, and the population contains five copies of 111000, two of 111011, and one each of the other four strings one Hamming distance away (differing by one bit), then the a posteriori probability of the string evolving to 111000 is 5/(5+2+4) = 5/11. The difference in information this represents is −log(5/11), not −log(1/6). A string should evolve by taking a random walk, at a constant bit rate, through a posteriori (not a priori) probability space.
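A worked version of the example above (the choice of log base 2 is an assumption; the text does not fix a base):

```python
import math

# Observed population counts of the six strings one Hamming
# distance away from 111010 (the example in the text).
neighbor_counts = {
    "111000": 5,
    "111011": 2,
    "011010": 1,
    "101010": 1,
    "110010": 1,
    "111110": 1,
}

total = sum(neighbor_counts.values())              # 11
p_posteriori = neighbor_counts["111000"] / total   # 5/11

# Information difference of that step, in bits:
info_posteriori = -math.log2(p_posteriori)  # -log(5/11), about 1.14 bits
info_priori = -math.log2(1 / 6)             # -log(1/6), about 2.58 bits: the "flat" value
print(p_posteriori, info_posteriori, info_priori)
```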

In a fixed-topology neural network, the problem is posed differently: neurons cannot die or reproduce. However, it is still entirely possible for each neuron to evolve according to an a posteriori probability space. The surrounding neurons represent different directions in state space. The uncorrelated variance, or "information deviation", of their output signals represents their Hamming distance from each other in a priori state space. Their relative "health" determines the population growth rate of strings in that state.
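One possible reading of "uncorrelated variance" as a pairwise quantity is the variance of one neuron's output left unexplained by a linear fit on another's. The following sketch, including the name information_deviation and the residual-variance formula, is an assumption, not a formula from the original:

```python
import numpy as np

def information_deviation(x, y):
    """Variance of y left unexplained by a linear fit on x:
    one reading of the "uncorrelated variance" between two
    neurons' output signals."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    return float(np.var(y) * (1.0 - r ** 2))

# Two neighboring neurons' recent output traces (made-up data):
a = np.sin(np.linspace(0, 6, 50))
b = a + 0.3 * np.random.randn(50)   # mostly correlated with a
c = np.cos(np.linspace(0, 6, 50))   # largely uncorrelated with a

print(information_deviation(a, b))  # small: signals carry similar information
print(information_deviation(a, c))  # larger: signals are more disparate
```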

==Far-from-equilibrium systems; non-equilibrium thermodynamics==

Later contributing to the idea was the concept of dissipative structures in non-equilibrium thermodynamics. This concept includes the Gibbs free energy of formation, which is related, aptly enough, to information entropy through thermodynamic entropy. As the ensemble of neurons evolves towards the a priori probability density, it thereby evolves towards a lower-energy state, towards "thermodynamic equilibrium".
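For reference, the chain of standard relations this paragraph leans on (textbook definitions, not derived in the original; the final identity assumes uniform microstate probabilities and base-2 logarithms):

```latex
G = H - TS \qquad \text{(Gibbs free energy)}
S = k_B \ln \Omega \qquad \text{(Boltzmann's thermodynamic entropy)}
H_{\mathrm{Shannon}} = -\sum_i p_i \log_2 p_i \qquad \text{(information entropy)}
p_i = 1/\Omega \;\Rightarrow\; S = (k_B \ln 2)\, H_{\mathrm{Shannon}}
```

At fixed enthalpy and temperature, raising the entropy lowers G, which is the sense in which evolving towards the a priori (maximum-entropy) density is evolving towards a lower-energy state.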

==Self-organizing maps==

This theory can be viewed as an extension of Kohonen's self-organizing maps, where the extension is the addition of a cofactor of information deviation to the lateral feedback.

Two advantages that this cofactor presents are (a code sketch follows the list):

*It allows information to flow faster: it allows neurons with different output signals to exist physically close to each other without interfering with one another. A neuron "on top" of them would take their outputs as input, and if the received signals are more disparate, the receiving neuron is ipso facto receiving more information.
*It allows neurons to cooperatively explore state space: if neurons physically close to each other have similar output signals, they will follow the local energy field among them to lower-energy states, like a swarm. This I owe also to the concepts of emergence and swarm intelligence.
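A minimal sketch of such an update, assuming a Gaussian lateral kernel and an exponential form for the cofactor (the original specifies neither; deviation here is a placeholder matrix of pairwise information deviations):

```python
import numpy as np

def som_step(weights, positions, x, eta=0.1, sigma=1.0, deviation=None):
    """One Kohonen update; if `deviation` (pairwise information
    deviation between neurons' outputs) is given, it scales the
    lateral feedback, as the extension above proposes."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best-matching unit
    dist2 = np.sum((positions - positions[bmu]) ** 2, axis=1)
    h = np.exp(-dist2 / (2 * sigma ** 2))                  # standard lateral kernel
    if deviation is not None:
        # Cofactor: neighbors whose outputs already diverge from the
        # BMU's are pulled less, letting disparate neurons coexist nearby.
        h = h * np.exp(-deviation[bmu])
    return weights + eta * h[:, None] * (x - weights)

# Tiny 1-D map of 5 neurons with 2-D inputs (made-up data):
rng = np.random.default_rng(0)
weights = rng.random((5, 2))
positions = np.arange(5, dtype=float)[:, None]
deviation = rng.random((5, 5))        # placeholder pairwise deviations
weights = som_step(weights, positions, np.array([0.2, 0.7]), deviation=deviation)
```

Making the cofactor multiplicative keeps the standard Kohonen update as the special case of zero information deviation.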

==Nonlinear dynamics==

To nonlinear dynamics, this idea owes much: the concepts of phase space, attractors, and the contraction of volume within phase space. Nonlinear dynamics also helped me figure out what assumptions to make about the nature of the signals processed by the neural network. Also important is an insight from John Nash: nonlinear phase space can be locally linearized. How do you solve a nonlinear problem with a neural network? By solving an ensemble of locally linear problems!
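The closing idea can be illustrated generically (this is LOESS-style locally weighted linear regression, not the author's network; the bandwidth parameter and Gaussian weighting are assumptions):

```python
import numpy as np

def local_linear_predict(x_train, y_train, x_query, bandwidth=0.3):
    """Approximate a nonlinear map by solving, at each query point,
    a small weighted *linear* least-squares problem over nearby data:
    an ensemble of locally linear problems standing in for one
    nonlinear one."""
    preds = []
    for xq in x_query:
        w = np.exp(-((x_train - xq) ** 2) / (2 * bandwidth ** 2))
        A = np.stack([np.ones_like(x_train), x_train - xq], axis=1)
        # Weighted least squares: (A^T W A) beta = A^T W y
        AtW = A.T * w
        beta = np.linalg.solve(AtW @ A, AtW @ y_train)
        preds.append(beta[0])   # intercept = local prediction at xq
    return np.array(preds)

x = np.linspace(-3, 3, 200)
y = np.tanh(x) + 0.05 * np.random.default_rng(1).standard_normal(200)
print(local_linear_predict(x, y, np.array([-1.0, 0.0, 1.0])))
```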

==Nonequilibrium statistical mechanics==

This idea is, in retrospect, inspired by the belief that an information-theoretic approach to non-equilibrium statistical mechanics (the empirical physics of complex systems such as ecologies, after dispensing with the distinction between "living" and "non-living" matter) should provide the necessary formalism for an ideal artificial neural network.

The reasoning for applying principles of nonequilibrium statistical mechanics to the design of artificial neural networks is this: insofar as the neural network is coupled with a "physical" system, processes "information" from that system, and is "itself" a physical system, an "information"-theoretic formalism of "physics" is the appropriate formalism to use when designing a neural network.

