A validated scoring rubric for explain-in-plain-english questions

Binglin Chen, Sushmita Azad, Rajarshi Haldar, Matthew West, Craig Zilles

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Previous research has identified the ability to read code and understand its high-level purpose as an important developmental skill: for a given piece of code, it is harder than executing the code in one's head for a given input (code tracing), but easier than writing the code. Prior work on code reading ("Explain in plain English") problems has used a scoring rubric inspired by the SOLO taxonomy, but we found that rubric difficult to employ because it did not adequately handle the three dimensions of answer quality: correctness, level of abstraction, and ambiguity. In this paper, we describe a 7-point rubric that we developed for scoring student responses to Explain in plain English questions, and we validate this rubric through four means. First, we find that the scale can be applied reliably, with a median Krippendorff's alpha (inter-rater reliability) of 0.775. Second, we report on an experiment to assess the validity of our scale. Third, we find that a survey consisting of 12 code reading questions had high internal consistency (Cronbach's alpha = 0.954). Last, we find that our scores for code reading questions in a large-enrollment (N = 452) data structures course are correlated (Pearson's R = 0.555) with code writing performance to a similar degree as found in previous work.

Original language: English (US)
Title of host publication: SIGCSE 2020 - Proceedings of the 51st ACM Technical Symposium on Computer Science Education
Publisher: Association for Computing Machinery
Pages: 563-569
Number of pages: 7
ISBN (Electronic): 9781450367936
DOI: https://doi.org/10.1145/3328778.3366879
State: Published - Feb 26 2020
Event: 51st ACM SIGCSE Technical Symposium on Computer Science Education, SIGCSE 2020 - Portland, United States
Duration: Mar 11 2020 - Mar 14 2020

Publication series

Name: Annual Conference on Innovation and Technology in Computer Science Education, ITiCSE
ISSN (Print): 1942-647X

Conference

Conference: 51st ACM SIGCSE Technical Symposium on Computer Science Education, SIGCSE 2020
Country: United States
City: Portland
Period: 3/11/20 - 3/14/20

Keywords

  • Code reading
  • CS1
  • Experience report
  • Reliability
  • Validity

ASJC Scopus subject areas

  • Management of Technology and Innovation
  • Education


Cite this

    Chen, B., Azad, S., Haldar, R., West, M., & Zilles, C. (2020). A validated scoring rubric for explain-in-plain-english questions. In SIGCSE 2020 - Proceedings of the 51st ACM Technical Symposium on Computer Science Education (pp. 563-569). (Annual Conference on Innovation and Technology in Computer Science Education, ITiCSE). Association for Computing Machinery. https://doi.org/10.1145/3328778.3366879