Real-time humanoid avatar for multimodal human-machine interaction

Yun Fu, Renxiang Li, Thomas S. Huang, Mike Danielsen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A novel framework for multimodal human-machine or human-human interaction via real-time humanoid avatar communication is proposed for real-world mobile applications. It integrates audio-visual analysis and synthesis modules to realize real-time head tracking, multichannel and runtime animations, visual TTS, and real-time viseme detection and rendering. The 3-D avatar provides customized modeling for low-bit-rate virtual communication by adopting the M3G standard, and it supports MPEG-4 FAPs. A robust user head tracker and an associated head pose and motion estimation scheme are developed for real-time avatar animation control at remote locations. The framework is recognized as an effective design for realistic industrial products for human-to-human mobile communication.

Original language: English (US)
Title of host publication: Proceedings of the 2007 IEEE International Conference on Multimedia and Expo, ICME 2007
Publisher: IEEE Computer Society
Pages: 991-994
Number of pages: 4
ISBN (Print): 1424410177, 9781424410170
DOIs
State: Published - Jan 1 2007
Event: IEEE International Conference on Multimedia and Expo, ICME 2007 - Beijing, China
Duration: Jul 2 2007 - Jul 5 2007

Publication series

Name: Proceedings of the 2007 IEEE International Conference on Multimedia and Expo, ICME 2007

Other

Other: IEEE International Conference on Multimedia and Expo, ICME 2007
Country/Territory: China
City: Beijing
Period: 7/2/07 - 7/5/07

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Software
