TY - JOUR
T1 - Depth-Bounded Statistical PCFG Induction as a Model of Human Grammar Acquisition
AU - Jin, Lifeng
AU - Schwartz, Lane
AU - Doshi-Velez, Finale
AU - Miller, Timothy
AU - Schuler, William
N1 - The authors would like to thank the anonymous reviewers and the editor for their helpful comments. Computations for this project were partly run on the Ohio Supercomputer Center. This research was funded by Defense Advanced Research Projects Agency award HR0011-15-2-0022 and by National Science Foundation grant 1816891. The content of the information does not necessarily reflect the position or the policy of the government, and no official endorsement should be inferred.
PY - 2021/4/21
Y1 - 2021/4/21
AB - This article describes a simple PCFG induction model with a fixed category domain that predicts a large majority of attested constituent boundaries, and predicts labels consistent with nearly half of attested constituent labels on a standard evaluation data set of child-directed speech. The article then explores the idea that the difference between simple grammars exhibited by child learners and fully recursive grammars exhibited by adult learners may be an effect of increasing working memory capacity, where the shallow grammars are constrained images of the recursive grammars. An implementation of these memory bounds as limits on center embedding in a depth-specific transform of a recursive grammar yields a significant improvement over an equivalent but unbounded baseline, suggesting that this arrangement may indeed confer a learning advantage.
UR - http://www.scopus.com/inward/record.url?scp=85128407477&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85128407477&partnerID=8YFLogxK
DO - 10.1162/coli_a_00399
M3 - Article
SN - 0891-2017
VL - 47
SP - 181
EP - 216
JO - Computational Linguistics
JF - Computational Linguistics
IS - 1
ER -