TY - GEN
T1 - Psychometric Evaluation of the Cybersecurity Curriculum Assessment
AU - Herman, Geoffrey L.
AU - Huang, Shan
AU - Peterson, Peter A.
AU - Oliva, Linda
AU - Golaszewski, Enis
AU - Sherman, Alan T.
N1 - We thank the many people who contributed to the CATS project as Delphi experts, interview subjects, Hackathon participants, expert reviewers, student subjects, and former team members, including Michael Neary, Spencer Offenberger, Geet Parekh, Konstantinos Patsourakos, Dhananjay Phatak, and Julia Thompson. Support for this research was provided in part by the U.S. Department of Defense under CAE-R grants H98230-15-1-0294, H98230-15-1-0273, H98230-17-1-0349, H98230-17-1-0347; and by the National Science Foundation under UMBC SFS grants DGE-1241576, 1753681, and SFS Capacity Grants DGE-1819521, 1820531.
PY - 2023/3/2
Y1 - 2023/3/2
N2 - We present a psychometric evaluation of the Cybersecurity Curriculum Assessment (CCA), completed by 193 students from seven colleges and universities. The CCA builds on our prior work developing and validating a Cybersecurity Concept Inventory (CCI), which measures students' conceptual understanding of cybersecurity after a first course in the area. The CCA raises the expected conceptual complexity and technical depth, assessing the conceptual knowledge of students who have completed multiple courses in cybersecurity. We review our development of the CCA and present our evaluation of the instrument using Classical Test Theory and Item Response Theory. The CCA is a difficult assessment that provides reliable measurements of student knowledge and deeper information about high-performing students.
AB - We present a psychometric evaluation of the Cybersecurity Curriculum Assessment (CCA), completed by 193 students from seven colleges and universities. The CCA builds on our prior work developing and validating a Cybersecurity Concept Inventory (CCI), which measures students' conceptual understanding of cybersecurity after a first course in the area. The CCA raises the expected conceptual complexity and technical depth, assessing the conceptual knowledge of students who have completed multiple courses in cybersecurity. We review our development of the CCA and present our evaluation of the instrument using Classical Test Theory and Item Response Theory. The CCA is a difficult assessment that provides reliable measurements of student knowledge and deeper information about high-performing students.
KW - classical test theory
KW - concept inventories
KW - cybersecurity assessment tools (CATS)
KW - cybersecurity curriculum assessment
KW - cybersecurity education
KW - item response theory
KW - psychometrics
UR - http://www.scopus.com/inward/record.url?scp=85149828266&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85149828266&partnerID=8YFLogxK
U2 - 10.1145/3545945.3569762
DO - 10.1145/3545945.3569762
M3 - Conference contribution
AN - SCOPUS:85149828266
T3 - SIGCSE 2023 - Proceedings of the 54th ACM Technical Symposium on Computer Science Education
SP - 228
EP - 234
BT - SIGCSE 2023 - Proceedings of the 54th ACM Technical Symposium on Computer Science Education
PB - Association for Computing Machinery
T2 - 54th ACM Technical Symposium on Computer Science Education, SIGCSE 2023
Y2 - 15 March 2023 through 18 March 2023
ER -