Computing with dynamic attractors in neural networks

Biosystems. 1995;34(1-3):173-95. doi: 10.1016/0303-2647(94)01451-c.

Abstract

In this paper we report on some new architectures for neural computation, motivated in part by biological considerations. One of our goals is to demonstrate that it is just as easy for a neural net to compute with arbitrary attractors, oscillatory or chaotic, as with the more usual asymptotically stable fixed points. The advantages (if any) of such architectures are currently being investigated, but it seems reasonable that the much richer dynamics of recurrent networks, so obvious in recordings of brain activity, must be useful for something. On the other hand, the constraints of computing with biological wet-ware may make chaotic dynamics unavoidable in complex nervous systems. We also hypothesize that the as yet unrivaled capabilities of the human brain derive from an ability to integrate, at the ground level of its hardware, both analog intuitive pattern-recognition operations and digital symbolic logical operations. To investigate these possibilities, we have constructed a parallel distributed processing architecture inspired by the structure and dynamics of cerebral cortex. The construction assumes that cortex is a set of coupled associative memories with dynamic attractors. It is also guided by a particular concept of the physical structure required of macroscopic computational systems in general for reliable computation in the presence of noise. Our challenge is to accomplish real tasks that brains can do, using ordinary differential equations, in networks that are as faithful as possible to the known anatomy and dynamics of cortex.
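
For readers who want a concrete picture of what computing with attractors in an ODE network can look like, the following is a minimal sketch, not the authors' cortical architecture. It uses the standard continuous-time rate equation dx/dt = -x + W tanh(x); the Hebbian weight matrix, the rotational coupling, and all parameter values are illustrative assumptions rather than anything taken from the paper. The first network relaxes to an asymptotically stable fixed point that recalls a stored pattern from a noisy cue (an associative memory); the second sustains a limit-cycle oscillation once the coupling gain exceeds the stability threshold of the resting state.

```python
# Minimal sketch (not the authors' model): two recurrent networks governed by
# dx/dt = -x + W @ tanh(x), differing only in the weight matrix W.
import numpy as np
from scipy.integrate import solve_ivp

def simulate(W, x0, t_end=60.0, n_steps=600):
    """Integrate dx/dt = -x + W @ tanh(x) from the initial state x0."""
    rhs = lambda t, x: -x + W @ np.tanh(x)
    t_eval = np.linspace(0.0, t_end, n_steps)
    return solve_ivp(rhs, (0.0, t_end), x0, t_eval=t_eval).y

rng = np.random.default_rng(0)

# --- Fixed-point attractor: symmetric Hebbian weights storing one pattern.
n = 16
pattern = rng.choice([-1.0, 1.0], size=n)
W_mem = 2.0 * np.outer(pattern, pattern) / n   # gain 2.0 is an arbitrary choice
np.fill_diagonal(W_mem, 0.0)
cue = pattern + 0.8 * rng.normal(size=n)       # noisy version of the stored pattern
traj = simulate(W_mem, cue)
recalled = np.sign(traj[:, -1])
print("pattern recalled:", np.array_equal(recalled, pattern))
print("late-time movement:", np.ptp(traj[:, -50:], axis=1).max())   # ~0 => fixed point

# --- Oscillatory attractor: 2-unit network with rotational coupling.
g, theta = 2.5, np.pi / 3                      # g*cos(theta) > 1 destabilizes the rest state
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
traj2 = simulate(g * R, np.array([0.1, 0.0]))
print("late-time movement:", np.ptp(traj2[:, -200:], axis=1).max())  # >0 => sustained oscillation
```

The only difference between the two runs is the weight matrix, which loosely illustrates the abstract's point that oscillatory attractors need no machinery beyond what a fixed-point network already uses.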

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Algorithms
  • Animals
  • Cerebral Cortex / physiology
  • Humans
  • Mathematics
  • Models, Neurological
  • Nerve Net / physiology
  • Neural Networks, Computer*
  • Nonlinear Dynamics
  • Oscillometry