TY - JOUR
T1 - Automatic Detection of Mind Wandering from Video in the Lab and in the Classroom
AU - Bosch, Nigel
AU - D'Mello, Sidney K.
N1 - Funding Information:
This research was supported by the National Science Foundation (NSF) (DRL 1235958 and IIS 1523091).
Publisher Copyright:
© 2010-2012 IEEE.
PY - 2021
Y1 - 2021
N2 - We report two studies that used facial features to automatically detect mind wandering, a ubiquitous phenomenon whereby attention drifts from the current task to unrelated thoughts. In a laboratory study, university students (N = 152) read a scientific text, whereas in a classroom study high school students (N = 135) learned biology from an intelligent tutoring system. Mind wandering was measured using validated self-report methods. In the lab, we recorded face videos and analyzed these at six levels of granularity: (1) upper-body movement; (2) head pose; (3) facial textures; (4) facial action units (AUs); (5) co-occurring AUs; and (6) temporal dynamics of AUs. Due to privacy constraints, videos were not recorded in the classroom. Instead, we extracted head pose, AUs, and AU co-occurrences in real-time. Machine learning models, consisting of support vector machines (SVM) and deep neural networks, achieved F1 scores of .478 and .414 (25.4 and 20.9 percent above-chance improvements, both with SVMs) for detecting mind wandering in the lab and classroom, respectively. The lab-based detectors achieved 8.4 percent improvement over the previous state-of-the-art; no comparison is available for classroom detectors. We discuss how the detectors can integrate into intelligent interfaces to increase engagement and learning by responding to wandering minds.
AB - We report two studies that used facial features to automatically detect mind wandering, a ubiquitous phenomenon whereby attention drifts from the current task to unrelated thoughts. In a laboratory study, university students (N = 152) read a scientific text, whereas in a classroom study high school students (N = 135) learned biology from an intelligent tutoring system. Mind wandering was measured using validated self-report methods. In the lab, we recorded face videos and analyzed these at six levels of granularity: (1) upper-body movement; (2) head pose; (3) facial textures; (4) facial action units (AUs); (5) co-occurring AUs; and (6) temporal dynamics of AUs. Due to privacy constraints, videos were not recorded in the classroom. Instead, we extracted head pose, AUs, and AU co-occurrences in real-time. Machine learning models, consisting of support vector machines (SVM) and deep neural networks, achieved F1 scores of .478 and .414 (25.4 and 20.9 percent above-chance improvements, both with SVMs) for detecting mind wandering in the lab and classroom, respectively. The lab-based detectors achieved 8.4 percent improvement over the previous state-of-the-art; no comparison is available for classroom detectors. We discuss how the detectors can integrate into intelligent interfaces to increase engagement and learning by responding to wandering minds.
KW - Affective computing
KW - computer vision
KW - educational technology
KW - human-computer interaction
UR - http://www.scopus.com/inward/record.url?scp=85120527467&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85120527467&partnerID=8YFLogxK
U2 - 10.1109/TAFFC.2019.2908837
DO - 10.1109/TAFFC.2019.2908837
M3 - Article
AN - SCOPUS:85120527467
SN - 1949-3045
VL - 12
SP - 974
EP - 988
JO - IEEE Transactions on Affective Computing
JF - IEEE Transactions on Affective Computing
IS - 4
ER -