PHYSICS: Neural networks organize themselves
Neural networks dynamically organize themselves to operate in a range that is optimal for information processing, according to a theoretical model. The results help explain how neurons interact with one another and how they can build efficient networks.
Neurons signal to each other through junctions known as synapses. Using these connections, they can build extended networks. Computer simulations of neural networks indicate that, for specific connection patterns, properties such as computational power or memory capacity are maximized. This picture is supported by experimental findings in cell cultures. However, how neural networks reach this optimal setting has remained an open question.
Michael Herrmann and colleagues argue that no fine-tuning is needed. They account for the fact that synapses are not static: the efficiency of transmission through a synapse depends on how frequently it is used. With this ingredient, their model shows that neural networks dynamically organize themselves to operate in a favourable range.
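The mechanism can be illustrated with a toy simulation. The sketch below is not the authors' actual model; it is a minimal assumed setup in which each neuron's outgoing synapses carry a resource that is depleted when the neuron fires and slowly recovers afterwards. Transmission probability scales with the available resource, so a network that starts out over-excitable (average branching ratio above one) depletes its synapses until activity settles near a balanced regime, without any parameter being tuned by hand. All parameters (`N`, `K`, `u`, `tau_rec`) are illustrative choices, not values from the paper.

```python
import random

random.seed(0)

N = 200          # number of neurons (illustrative)
K = 10           # outgoing connections per neuron
u = 0.2          # fraction of synaptic resource consumed per spike
tau_rec = 50.0   # recovery timescale, in units of avalanches

# resource level of each neuron's outgoing synapses (1.0 = fully recovered)
res = [1.0] * N

# fixed random wiring: each neuron projects to K random targets
targets = [[random.randrange(N) for _ in range(K)] for _ in range(N)]

def avalanche():
    """Drive one random neuron and propagate activity until it dies out.

    Each firing neuron transmits to each target with probability
    res[i] * u, then loses a fraction u of its synaptic resource.
    Returns the avalanche size (number of neurons that fired).
    """
    active = [random.randrange(N)]
    fired = set(active)
    size = 0
    while active:
        nxt = []
        for i in active:
            size += 1
            p = res[i] * u        # transmission weakens as resource depletes
            res[i] *= (1.0 - u)   # firing consumes synaptic resource
            for j in targets[i]:
                if j not in fired and random.random() < p:
                    fired.add(j)
                    nxt.append(j)
        active = nxt
    return size

sizes = []
for _ in range(5000):
    sizes.append(avalanche())
    # slow recovery of synaptic resources between avalanches
    for i in range(N):
        res[i] += (1.0 - res[i]) / tau_rec
```

With these numbers the initial branching ratio is roughly `K * u = 2`, i.e. supercritical; depletion and recovery then negotiate a steady state in which large and small avalanches coexist. The point of the sketch is only the feedback loop itself: use weakens synapses, rest restores them, and the network drifts toward an intermediate operating range on its own.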
Author contact:
Michael Herrmann (Universität Göttingen, Germany)
Tel: +49 551 517 6424; E-mail: michael@nld.ds.mpg.de
Monday, November 19, 2007