The N300: An Index for Predictive Coding of Complex Visual Objects and Scenes

Research output: Contribution to journal › Article › peer-review


Predictive coding models can simulate known perceptual or neuronal phenomena, but there have been fewer attempts to identify a reliable neural signature of predictive coding for complex stimuli. In a pair of studies, we test whether the N300 component of the event-related potential, occurring 250–350 ms post-stimulus onset, has the response properties expected for such a signature of perceptual hypothesis testing at the level of whole objects and scenes. We show that N300 amplitudes are smaller in response to representative (“good exemplars”) compared with less representative (“bad exemplars”) items from natural scene categories. Integrating these results with patterns observed for objects, we establish that, across a variety of visual stimuli, the N300 is responsive to statistical regularity, or the degree to which the input is “expected” (either explicitly or implicitly) based on prior knowledge, with statistically regular images evoking a reduced response. Moreover, we show that the measure exhibits context dependency; that is, we find N300 sensitivity to category representativeness when stimuli are congruent with, but not when they are incongruent with, a category pre-cue. Thus, we argue that the N300 is the best candidate to date for an index of perceptual hypothesis testing for complex visual objects and scenes.
Original language: English (US)
Article number: tgab030
Journal: Cerebral Cortex Communications
Issue number: 2
State: Published - Apr 2021


Keywords:
  • N300
  • visual perception
  • statistical regularities
  • predictive coding
