TY - JOUR
T1 - Judgments of physics problem difficulty among experts and novices
AU - Fakcharoenphol, Witat
AU - Morphew, Jason W.
AU - Mestre, José P.
N1 - Publisher Copyright:
© 2015 authors. Published by the American Physical Society.
PY - 2015/10/23
Y1 - 2015/10/23
AB - Students' ability to effectively study for an exam, or to manage their time during an exam, is related to their metacognitive capacity. Prior research has demonstrated that the effective use of metacognitive strategies during learning and retrieval is related to content expertise. Students also make judgments of their own learning and of problem difficulty to guide their studying. This study extends prior research by investigating the accuracy of novices' and experts' ability to judge problem difficulty across two experiments; here "accuracy" refers to whether or not their judgments of problem difficulty correspond with actual exam performance in an introductory physics course on mechanics. In the first experiment, physics education research (PER) experts judged the difficulty of introductory physics problems and provided the rationales behind their judgments. Findings indicate that experts use a number of different problem features to make predictions of problem difficulty. While experts are relatively accurate in judging problem difficulty, their content expertise may interfere with their ability to predict student performance on some question types. In the second experiment, novices and "near experts" (graduate TAs) judged which question from a problem pair (taken from a real exam) was more difficult. The results indicate that judgments of problem difficulty are more accurate for those with greater content expertise, suggesting that the ability to predict problem difficulty is a trait of expertise that develops with experience.
UR - http://www.scopus.com/inward/record.url?scp=84951111972&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84951111972&partnerID=8YFLogxK
U2 - 10.1103/PhysRevSTPER.11.020128
DO - 10.1103/PhysRevSTPER.11.020128
M3 - Article
AN - SCOPUS:84951111972
SN - 1554-9178
VL - 11
JO - Physical Review Special Topics - Physics Education Research
JF - Physical Review Special Topics - Physics Education Research
IS - 2
M1 - 020128
ER -