A comparison of grammatical proficiency measures in the automated assessment of spontaneous speech

Su Youn Yoon, Suma Bhat

Research output: Contribution to journal › Article

Abstract

We developed new measures that assess the level of grammatical proficiency for an automated speech proficiency scoring system. The new measures assess the range and sophistication of grammar usage based on natural language processing technology and a large corpus of learners' spoken responses. First, we automatically identified a set of grammatical expressions associated with each proficiency level from the corpus. Next, we predicted the level of grammatical proficiency based on the similarity in the grammatical expression distribution between a learner's response and the corpus. We evaluated the strength of the association between the new measures and proficiency levels using spontaneous responses from an international English language assessment. The Pearson correlation test results showed that, compared to commonly used syntactic complexity measures, the proposed measures had stronger relationships with proficiency. We also explored the impact of system errors from a multi-stage automated process and found that the new measures were robust against these errors. Finally, we developed an automated scoring model that predicted holistic oral proficiency scores. The new measures led to a statistically significant improvement in agreement between human and machine scores over the previous system.
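The core idea described in the abstract, comparing the distribution of grammatical expressions in a learner's response against level-specific corpus distributions, can be illustrated with a minimal sketch. The paper does not specify its similarity function or feature extraction, so the cosine similarity, the toy expression labels, and the helper names below are all illustrative assumptions, not the authors' implementation:

```python
from collections import Counter
import math


def distribution(expressions):
    """Normalize raw counts of grammatical expressions into relative frequencies."""
    counts = Counter(expressions)
    total = sum(counts.values())
    return {expr: c / total for expr, c in counts.items()}


def cosine_similarity(p, q):
    """Cosine similarity between two sparse frequency distributions."""
    keys = set(p) | set(q)
    dot = sum(p.get(k, 0.0) * q.get(k, 0.0) for k in keys)
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0


def predict_level(response_exprs, level_corpora):
    """Assign the proficiency level whose corpus distribution is most
    similar to the distribution observed in the learner's response."""
    resp = distribution(response_exprs)
    return max(
        level_corpora,
        key=lambda lvl: cosine_similarity(resp, distribution(level_corpora[lvl])),
    )


# Hypothetical usage: expressions here are placeholder phrase labels, not
# the actual grammatical features mined from the learner corpus.
level_corpora = {
    "low": ["NP", "NP", "VP"],
    "high": ["SBAR", "SBAR", "VP"],
}
print(predict_level(["SBAR", "SBAR", "VP"], level_corpora))
```

A response rich in subordinate-clause-like expressions is matched to the level whose corpus shows a similar distribution; in the actual system the per-level expression sets are mined automatically from scored spoken responses.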

Original language: English (US)
Pages (from-to): 221-230
Number of pages: 10
Journal: Speech Communication
Volume: 99
State: Published - May 2018

Keywords

  • Automated scoring
  • Grammatical development
  • Natural language processing
  • Similarity measures
  • Syntactic complexity measures

ASJC Scopus subject areas

  • Software
  • Modeling and Simulation
  • Communication
  • Language and Linguistics
  • Linguistics and Language
  • Computer Vision and Pattern Recognition
  • Computer Science Applications