Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate

Hamed Monkaresi, Nigel Bosch, Rafael A. Calvo, Sidney K. D'Mello

Research output: Contribution to journal › Article › peer-review

Abstract

We explored how computer vision techniques can be used to detect engagement while students (N = 22) completed a structured writing activity (draft-feedback-review) similar to activities encountered in educational settings. Students provided engagement annotations both concurrently during the writing activity and retrospectively from videos of their faces after the activity. We used computer vision techniques to extract three sets of features from videos: heart rate, Animation Units (from the Microsoft Kinect Face Tracker), and local binary patterns in three orthogonal planes (LBP-TOP). These features were used in supervised learning for detection of concurrent and retrospective self-reported engagement. Area under the ROC curve (AUC) was used to evaluate classifier accuracy using leave-several-students-out cross-validation. We achieved an AUC = .758 for concurrent annotations and AUC = .733 for retrospective annotations. The Kinect Face Tracker features produced the best results among the individual channels, but the overall best results were found using a fusion of channels.
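The evaluation protocol described above (leave-several-students-out cross-validation scored by AUC) can be sketched as follows. This is an illustrative outline, not the authors' code: the classifier, fold count, and synthetic data are assumptions, and `GroupKFold` from scikit-learn stands in for "leave-several-students-out" by keeping each student's windows entirely in either the train or test split.

```python
# Hedged sketch: grouped cross-validation with AUC scoring, loosely mirroring
# the paper's "leave-several-students-out" setup. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)

# Synthetic stand-in: one feature row per video window, a binary
# engaged/not-engaged label, and a student ID per row (22 students).
n_windows, n_features, n_students = 660, 10, 22
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, 2, size=n_windows)
groups = rng.integers(0, n_students, size=n_windows)  # student ID per window

aucs = []
# Each fold holds out several students at once; no student appears in both
# the training and test partitions of any fold.
for train_idx, test_idx in GroupKFold(n_splits=4).split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    scores = clf.predict_proba(X[test_idx])[:, 1]  # probability of class 1
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"mean AUC over folds: {np.mean(aucs):.3f}")
```

With random labels the mean AUC hovers near chance (0.5); on real engagement annotations the same protocol yielded the AUCs reported in the abstract.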

Original language: English (US)
Article number: 7373578
Pages (from-to): 15-28
Number of pages: 14
Journal: IEEE Transactions on Affective Computing
Volume: 8
Issue number: 1
DOIs
State: Published - Jan 1 2017
Externally published: Yes

Keywords

  • Engagement detection
  • facial expression
  • remote heart rate measurement
  • writing task

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
