Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity

Alan T. Sherman, Geoffrey L. Herman, Linda Oliva, Peter A.H. Peterson, Enis Golaszewski, Seth Poulsen, Travis Scheponik, Akshita Gorti

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better. The CATS Project is creating and validating two concept inventories—conceptual tests of understanding—that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of various difficulties that target the same five core concepts, but the CCA assumes greater technical background. Key steps include defining project scope, identifying the core concepts, uncovering student misconceptions, creating scenarios, drafting question stems, developing distractor answer choices, generating educational materials, performing expert reviews, recruiting student subjects, organizing workshops, building community acceptance, forming a team and nurturing collaboration, adopting tools, and obtaining and using funding. Creating effective MCQs is difficult and time-consuming, and cybersecurity presents special challenges. Because cybersecurity issues are often subtle, where the adversarial model and details matter greatly, it is challenging to construct MCQs for which there is exactly one best but non-obvious answer. We hope that our experiences and lessons learned may help others create more effective concept inventories and assessments in STEM.

Original language: English (US)
Title of host publication: National Cyber Summit Research Track, NCS 2020
Editors: Kim-Kwang Raymond Choo, Tommy Morris, Eric Imsand, Gilbert L. Peterson
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 3-34
Number of pages: 32
ISBN (Print): 9783030587024
DOIs: https://doi.org/10.1007/978-3-030-58703-1_1
State: Published - 2021
Event: National Cyber Summit, NCS 2020 - Huntsville, United States
Duration: Jun 2, 2020 - Jun 4, 2020

Publication series

Name: Advances in Intelligent Systems and Computing
Volume: 1271 AISC
ISSN (Print): 2194-5357
ISSN (Electronic): 2194-5365

Conference

Conference: National Cyber Summit, NCS 2020
Country: United States
City: Huntsville
Period: 6/2/20 - 6/4/20

Keywords

  • Computer science education
  • Concept inventories
  • Cryptography
  • Cybersecurity Assessment Tools (CATS)
  • Cybersecurity Concept Inventory (CCI)
  • Cybersecurity Curriculum Assessment (CCA)
  • Cybersecurity education
  • Multiple-choice questions

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science(all)


Cite this

    Sherman, A. T., Herman, G. L., Oliva, L., Peterson, P. A. H., Golaszewski, E., Poulsen, S., Scheponik, T., & Gorti, A. (2021). Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity. In K-K. R. Choo, T. Morris, E. Imsand, & G. L. Peterson (Eds.), National Cyber Summit Research Track, NCS 2020 (pp. 3-34). (Advances in Intelligent Systems and Computing; Vol. 1271 AISC). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58703-1_1