Recognizing emotion in virtual agent, synthetic human, and human facial expressions

Jenay M. Beer, Arthur D. Fisk, Wendy A. Rogers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


A growing interest in the HCI community is the design and development of embodied agents in virtual environments. In virtual environments where social interaction is needed, an agent's facial expression may communicate emotive state to both younger and older users. However, younger and older adults differ in how they label human facial expressions (Ruffman et al., 2008). Such age-related differences in labeling virtual agent expressions may affect the user's social experience in a virtual environment. The purpose of the current research was to investigate age-related differences in emotion recognition for several on-screen characters of varying degrees of human-likeness. Participants performed a recognition task with three characters, each demonstrating four basic emotions or a neutral expression. The results indicated age-related differences for all character types. Older adults commonly mislabeled the human and synthetic human emotions of anger, fear, sadness, and neutral. For the virtual agent face, older adults commonly mislabeled the emotions of anger, fear, happiness, and neutral.

Original language: English (US)
Title of host publication: 54th Human Factors and Ergonomics Society Annual Meeting 2010, HFES 2010
Publisher: Human Factors and Ergonomics Society Inc.
Number of pages: 5
ISBN (Print): 9781617820885
State: Published - 2010
Externally published: Yes

Publication series

Name: Proceedings of the Human Factors and Ergonomics Society
ISSN (Print): 1071-1813

ASJC Scopus subject areas

  • Human Factors and Ergonomics


