Emotional expressions in audiovisual human computer interaction

L. S. Chen, T. S. Huang

Research output: Contribution to conference › Paper › peer-review

Abstract

Visual and auditory modalities are two of the most commonly used media in interactions between humans. In the present paper, we describe a system to continuously monitor the user's voice and facial motions for recognizing emotional expressions. Such an ability is crucial for intelligent computers that take on a social role such as a tutor or a companion. We outline methods to extract audio and visual features useful for classifying emotions. Audio and visual information must be handled appropriately in single-modal and bimodal situations. We report audio-only and video-only emotion recognition on the same subjects, in person-dependent and person-independent fashions, and outline methods to handle bimodal recognition.
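The abstract mentions combining audio-only and video-only recognition into bimodal recognition but does not publish the fusion method. As an illustration only, the sketch below shows one common decision-level approach: weighting the per-class scores from each single-modal classifier and picking the highest fused score. All names here (the emotion list, `fuse_bimodal`, the example scores) are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of decision-level (late) bimodal fusion.
# The emotion categories and weights are illustrative assumptions,
# not the paper's actual configuration.

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "dislike"]

def fuse_bimodal(audio_scores, video_scores, audio_weight=0.5):
    """Combine per-emotion scores from the two modalities by a
    weighted sum, then return the highest-scoring emotion."""
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = (audio_weight * audio_scores[emotion]
                          + (1.0 - audio_weight) * video_scores[emotion])
    return max(fused, key=fused.get)

# Example: audio evidence favors anger, video favors happiness.
audio = {"happiness": 0.20, "sadness": 0.10, "anger": 0.50,
         "fear": 0.10, "surprise": 0.05, "dislike": 0.05}
video = {"happiness": 0.60, "sadness": 0.05, "anger": 0.20,
         "fear": 0.05, "surprise": 0.05, "dislike": 0.05}
print(fuse_bimodal(audio, video))  # → happiness (0.40 vs. 0.35 for anger)
```

A weighted sum is only one option; the relative weight could itself be tuned per subject, which matters given the person-dependent versus person-independent distinction the abstract draws.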

Original language: English (US)
Pages: 423-426
Number of pages: 4
State: Published - Dec 1 2000
Externally published: Yes
Event: 2000 IEEE International Conference on Multimedia and Expo (ICME 2000) - New York, NY, United States
Duration: Jul 30 2000 - Aug 2 2000


ASJC Scopus subject areas

  • Engineering (all)
