In this paper, an MPEG-4 real-time performance-driven avatar is proposed. The facial motion parameters, namely the head rotation angles and translation distances together with a set of action-unit weights, are estimated from live video on a per-frame basis by a two-step robust facial motion tracking algorithm. The estimated parameters are then converted into MPEG-4 facial animation parameter (FAP) values by solving an overdetermined constrained linear least-squares problem. These FAP values can drive an MPEG-4-compliant face model to generate, on the fly, realistic facial animation that mimics the facial actions of the performer in the video. Such a real-time performance-driven avatar has important applications in a broad range of areas, including very-low-bit-rate video communication, the movie and game industries, and human-computer intelligent interaction.
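The parameter-conversion step can be illustrated with a minimal sketch of an overdetermined, bound-constrained linear least-squares solve. The matrix `A`, the target vector `b`, and the FAP bounds below are hypothetical stand-ins (the paper's actual basis and constraint ranges are not reproduced here); the sketch only shows the shape of the optimization, here solved with SciPy's `lsq_linear`.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical setup: A maps candidate FAP values to facial displacements,
# and b is the displacement target assembled from the per-frame tracking
# output (head pose and action-unit weights). More rows than unknowns
# makes the system overdetermined.
rng = np.random.default_rng(0)
n_constraints, n_faps = 40, 10
A = rng.standard_normal((n_constraints, n_faps))
true_fap = rng.uniform(-0.5, 0.5, n_faps)   # synthetic "ground-truth" FAPs
b = A @ true_fap

# Constrained linear least squares: bound each FAP to an allowed range
# (stand-ins for the MPEG-4 FAP value ranges).
res = lsq_linear(A, b, bounds=(-1.0, 1.0))
fap_values = res.x
```

In this synthetic case the target is consistent and lies within the bounds, so the solver recovers it exactly; with noisy per-frame tracking data the bounds keep the recovered FAP values inside their valid ranges.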