A model neural network with stochastic elements in its millisecond dynamics is investigated. The network consists of neuronal units modelled in close analogy to physiological neurons. The dynamical variables of the network are the cellular potentials, axonic currents, and synaptic efficacies. The dynamics of the synapses obey a modified Hebbian rule and, as proposed by v. d. Malsburg (1981, 1985), develop on a time scale of a tenth of a second. In a previous publication (Buhmann and Schulten 1986) we confirmed that the resulting noiseless autoassociative network is capable of the well-known computational tasks of formal associative networks (Cooper 1973; Kohonen et al. 1984, 1981; Hopfield 1982). In the present paper we demonstrate that random fluctuations of the membrane potential improve the performance of the network: in comparison to a deterministic network, a noisy neural network can learn at lower input frequencies and with lower average neural firing rates. The electrical activity of a noisy network is strongly reminiscent of that observed in physiological recordings. We demonstrate furthermore that associative storage reduces the effective dimension of the phase space in which the electrical activity of the network evolves.
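The two time scales described above — fast, noisy millisecond dynamics of the potentials and a slower Hebbian drift of the synaptic efficacies on the order of a tenth of a second — can be illustrated with a minimal sketch. This is not the authors' model: the sigmoid rate function, the leaky-integrator form, the noise amplitude `sigma`, and all numerical constants are illustrative assumptions; only the separation of time scales (`dt` = 1 ms vs. `tau_w` = 0.1 s) and the Hebbian outer-product form are taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50        # number of neuronal units (illustrative size)
dt = 1e-3     # integration step, 1 ms ("millisecond dynamics")
tau_u = 5e-3  # membrane time constant (assumed value)
tau_w = 0.1   # synaptic time scale, a tenth of a second, as in the text
sigma = 0.2   # amplitude of membrane-potential fluctuations (assumed)

u = np.zeros(N)                         # cellular potentials
W = 0.01 * rng.standard_normal((N, N))  # synaptic efficacies
np.fill_diagonal(W, 0.0)                # no self-coupling (assumption)

def step(u, W, inp):
    """One millisecond update: noisy membrane potentials plus a
    Hebbian synaptic drift on the slower time scale tau_w."""
    # firing rates as a sigmoid of the potential (modelling choice)
    rate = 1.0 / (1.0 + np.exp(-u))
    # leaky integration of recurrent input plus Gaussian fluctuations
    noise = sigma * np.sqrt(dt) * rng.standard_normal(N)
    u = u + dt / tau_u * (-u + W @ rate + inp) + noise
    # Hebbian rule: outer-product growth with decay toward zero
    W = W + dt / tau_w * (np.outer(rate, rate) - W)
    np.fill_diagonal(W, 0.0)
    return u, W

inp = rng.random(N)       # static input pattern
for _ in range(200):      # simulate 200 ms of activity
    u, W = step(u, W, inp)
```

With noise switched off (`sigma = 0`) the same loop gives the deterministic limit, which is the comparison the abstract draws: the stochastic version explores more of the state space at a given input strength.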