TY - GEN
T1 - Multimodal affect detection in the wild
T2 - ACM International Conference on Multimodal Interaction, ICMI 2015
AU - Bosch, Nigel
N1 - Funding Information:
I would like to thank my advisor, Sidney D'Mello, for his guidance on this research. This research was supported by the National Science Foundation (NSF) (DRL 1235958) and the Bill & Melinda Gates Foundation. Any opinions, findings and conclusions, or recommendations expressed in this paper do not necessarily reflect the views of the NSF or the Bill & Melinda Gates Foundation.
PY - 2015/11/9
Y1 - 2015/11/9
N2 - Affect detection is an important component of computerized learning environments that adapt the interface and materials to students' affect. This paper proposes a plan for developing and testing multimodal affect detectors that generalize across differences in data that are likely to occur in practical applications (e.g., time, demographic variables). Facial features and interaction log features are considered as modalities for affect detection in this scenario, each with their own advantages. Results are presented for completed work evaluating the accuracy of individual-modality face- and interaction-based detectors, the accuracy and availability of a multimodal combination of these modalities, and initial steps toward generalization of face-based detectors. Additional data collection needed for cross-cultural generalization testing has also been completed. Challenges and possible solutions for the proposed cross-cultural generalization testing of multimodal detectors are also discussed.
AB - Affect detection is an important component of computerized learning environments that adapt the interface and materials to students' affect. This paper proposes a plan for developing and testing multimodal affect detectors that generalize across differences in data that are likely to occur in practical applications (e.g., time, demographic variables). Facial features and interaction log features are considered as modalities for affect detection in this scenario, each with their own advantages. Results are presented for completed work evaluating the accuracy of individual-modality face- and interaction-based detectors, the accuracy and availability of a multimodal combination of these modalities, and initial steps toward generalization of face-based detectors. Additional data collection needed for cross-cultural generalization testing has also been completed. Challenges and possible solutions for the proposed cross-cultural generalization testing of multimodal detectors are also discussed.
KW - Affect detection
KW - Classroom data
KW - Detector generalization
KW - In the wild
UR - http://www.scopus.com/inward/record.url?scp=84959303293&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84959303293&partnerID=8YFLogxK
U2 - 10.1145/2818346.2823316
DO - 10.1145/2818346.2823316
M3 - Conference contribution
AN - SCOPUS:84959303293
T3 - ICMI 2015 - Proceedings of the 2015 ACM International Conference on Multimodal Interaction
SP - 645
EP - 649
BT - ICMI 2015 - Proceedings of the 2015 ACM International Conference on Multimodal Interaction
PB - Association for Computing Machinery, Inc
Y2 - 9 November 2015 through 13 November 2015
ER -