Abstract
Human syntactic processing shows many signs of taking place within a general-purpose short-term memory. But this kind of memory is known to have a severely limited storage capacity, possibly as few as three or four distinct elements. This article describes a model of syntactic processing that operates successfully within these severe constraints by recognizing constituents in a right-corner transformed representation (a variant of left-corner parsing) and mapping this representation to random variables in a Hierarchical Hidden Markov Model, a factored time-series model which probabilistically models the contents of a bounded memory store over time. Evaluations of the coverage of this model on a large syntactically annotated corpus of English sentences, and of the accuracy of a bounded-memory parsing strategy based on this model, suggest that this model may be cognitively plausible.
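To make the bounded-memory idea concrete, the following is a minimal, hypothetical sketch (not the article's implementation) of an incremental recognizer whose hidden state is a store of at most three or four incomplete constituents, advanced one word at a time. The category labels, scores, and store operations are toy placeholders assumed for illustration only.

```python
import itertools
from collections import defaultdict

# Hypothetical sketch: the hidden state at each word is a stack of at most
# MAX_DEPTH incomplete constituents, echoing the "three or four elements"
# capacity limit. Grammar, scores, and operations below are toy values.

MAX_DEPTH = 4

def toy_expand(store, word):
    # Push a new incomplete constituent (placeholder category and score).
    yield store + (f"NP/{word}",), -1.0

def toy_attach(store, word):
    # Complete the deepest incomplete constituent and pop it from the store.
    yield store[:-1], -0.5

def step(beam, word):
    """Advance every bounded store state by one observed word."""
    new_beam = defaultdict(lambda: float("-inf"))
    for store, score in beam.items():
        moves = itertools.chain(
            toy_expand(store, word) if len(store) < MAX_DEPTH else (),
            toy_attach(store, word) if store else (),
        )
        for new_store, move_score in moves:
            new_beam[new_store] = max(new_beam[new_store], score + move_score)
    return dict(new_beam)

beam = {(): 0.0}                      # empty store before the first word
for w in "the cat sat".split():
    beam = step(beam, w)
print(max(beam.items(), key=lambda kv: kv[1]))   # best-scoring store state
```

Because every store state is capped at MAX_DEPTH elements, the state space stays bounded regardless of sentence length, which is the property the factored time-series formulation exploits.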
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1-30 |
| Number of pages | 30 |
| Journal | Computational Linguistics |
| Volume | 36 |
| Issue number | 1 |
| DOIs | |
| State | Published - Mar 2010 |
| Externally published | Yes |
ASJC Scopus subject areas
- Language and Linguistics
- Linguistics and Language
- Computer Science Applications
- Artificial Intelligence