Abstract
This paper explores possible solutions to the problem of detecting affective states from facial expressions during text/diagram comprehension, a context devoid of interactive events that can be used to infer affect. These data present an interesting challenge for face-based affect detection because the likely locations of affective facial expressions within videos of students' faces are entirely unknown. In the current study, students engaged in a text/diagram comprehension activity, after which they self-reported their levels of confusion, frustration, and engagement. Data were sampled from various locations within the videos, and texture-based facial features were extracted to build affect detectors. Varying amounts of data were also used to determine an appropriate analysis window for each affect detector. Detector performance was measured with the Area Under the ROC Curve (AUC), where chance level is .5 and perfect classification is 1. Confusion (AUC = .637), engagement (AUC = .554), and frustration (AUC = .609) were detected at above-chance levels. Prospects for improving the method of finding likely positions of affective states are also discussed.
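The pipeline the abstract describes (sample a window of face frames from a video, extract texture-based features, train a supervised detector, score it with AUC) can be sketched as below. This is a minimal illustration only: the LBP features, random-forest classifier, window length, and synthetic stand-in data are assumptions for demonstration, not the authors' exact method.

```python
# Minimal sketch of a texture-feature affect-detection pipeline scored by AUC.
# Assumptions (not from the paper): LBP texture features, a random forest,
# 10-frame windows of 48x48 face crops, and random stand-in data/labels.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def texture_features(frames, n_points=8, radius=1):
    """Summarize a window of grayscale face frames as a mean LBP histogram."""
    n_bins = n_points + 2  # "uniform" LBP yields values in [0, n_points + 1]
    hists = []
    for frame in frames:
        lbp = local_binary_pattern(frame, n_points, radius, method="uniform")
        hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
        hists.append(hist)
    return np.mean(hists, axis=0)

# Stand-in data: 200 windows of 10 random 48x48 "face crops" each, with
# binary self-report labels (e.g., confused vs. not confused).
X = np.array([
    texture_features(rng.integers(0, 256, (10, 48, 48)).astype(np.uint8))
    for _ in range(200)
])
y = rng.integers(0, 2, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]
print(f"AUC = {roc_auc_score(y_te, scores):.3f}")  # ~.5 here, since labels are random
```

On real data, the window location and length would be varied as in the study to find where affective expressions are most detectable; on the random labels above, AUC hovers around the .5 chance level, which is the baseline the reported detectors exceed.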
| Original language | English (US) |
| --- | --- |
| Number of pages | 4 |
| State | Published - 2015 |
| Externally published | Yes |
| Event | 2015 International Conference on Educational Data Mining (EDM 2015), Madrid, Spain |
| Event duration | Jun 26 2015 → Jun 29 2015 |
| Conference number | 8 |
Conference
| Conference | 2015 International Conference on Educational Data Mining |
| --- | --- |
| Abbreviated title | EDM 2015 |
| Country/Territory | Spain |
| City | Madrid |
| Period | 6/26/15 → 6/29/15 |
Keywords
- affect detection
- facial expression recognition
- reading