Learning rate and attractor size of the single-layer perceptron

Martin S. Singleton, Alfred W. Hübler

Research output: Contribution to journal › Article › peer-review

Abstract

We study the simplest possible order-one single-layer perceptron with two inputs, using the delta rule with online learning, in order to derive closed-form expressions for the mean convergence rates. We investigate the rate of convergence in weight space of the weight vectors corresponding to each of the 14 (out of 16) linearly separable rules. These vectors follow zigzagging lines through the piecewise constant vector field to their respective attractors. Based on our studies, we conclude that a single-layer perceptron with N inputs will converge in an average number of steps given by an Nth-order polynomial in t/l, where t is the threshold and l is the size of the initial weight distribution. Exact values for these averages are provided for the five linearly separable classes with N=2. We also demonstrate that the learning rate is determined by the attractor size, and that the attractors of a single-layer perceptron with N inputs partition R^N ⊗ R^N.
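For readers who want to experiment with the setup, below is a minimal Python sketch of delta-rule online learning for a two-input perceptron with a fixed threshold t and initial weights drawn uniformly from [-l, l]. The bipolar {-1, +1} input encoding, the step-output convention, the learning rate eta, and the convergence check are illustrative assumptions, not details taken from the paper, which derives the mean convergence steps analytically.

```python
import random

def train(rule, t=1.0, l=1.0, eta=0.5, max_presentations=100_000, rng=random):
    """Online delta-rule training on one Boolean rule; returns the number of
    weight updates until a full error-free pass, or None on no convergence."""
    patterns = [(-1, -1), (-1, 1), (1, -1), (1, 1)]   # bipolar encoding (assumed)
    w = [rng.uniform(-l, l), rng.uniform(-l, l)]      # initial weights in [-l, l]

    def out(x):
        # Unit-step output against the fixed threshold t.
        return 1 if w[0] * x[0] + w[1] * x[1] >= t else 0

    updates = 0
    for step in range(max_presentations):
        x = patterns[step % 4]
        error = rule[x] - out(x)                      # delta-rule error term
        if error:
            # Nudge the weight vector along the input, scaled by the error.
            # Keeping eta < t avoids limit cycles in this toy setup.
            w[0] += eta * error * x[0]
            w[1] += eta * error * x[1]
            updates += 1
        if step % 4 == 3 and all(out(p) == rule[p] for p in patterns):
            return updates
    return None

# Example: the AND rule, one of the linearly separable Boolean rules.
AND = {(-1, -1): 0, (-1, 1): 0, (1, -1): 0, (1, 1): 1}

for t, l in [(1.0, 1.0), (2.0, 1.0)]:
    runs = [train(AND, t=t, l=l) for _ in range(1000)]
    done = [r for r in runs if r is not None]
    print(f"t/l = {t / l:.1f}: mean updates = {sum(done) / len(done):.2f}")
```

Averaging the returned update counts over many random initializations, as in the loop above, gives a rough empirical analogue of the paper's mean convergence rates and their dependence on t/l.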

Original language: English (US)
Article number: 026704
Journal: Physical Review E - Statistical, Nonlinear, and Soft Matter Physics
Volume: 75
Issue number: 2
State: Published - Feb 26, 2007

ASJC Scopus subject areas

  • Statistical and Nonlinear Physics
  • Statistics and Probability
  • Condensed Matter Physics
