A fast 2D shape recovery approach by fusing features and appearance

Jianke Zhu, Michael R. Lyu, Thomas S. Huang

Research output: Contribution to journal › Article › peer-review


In this paper, we present a fusion approach to solve the nonrigid shape recovery problem, which takes advantage of both the appearance information and the local features. We have two major contributions. First, we propose a novel progressive finite Newton optimization scheme for the feature-based nonrigid surface detection problem, which is reduced to solving only a set of linear equations. The key is to formulate the nonrigid surface detection as an unconstrained quadratic optimization problem that has a closed-form solution for a given set of observations. Second, we propose a deformable Lucas-Kanade algorithm that triangulates the template image into small patches and constrains the deformation through the second-order derivatives of the mesh vertices. We formulate it as a sparse regularized least squares problem, which reduces both the computational cost and the memory requirement. The inverse compositional algorithm is applied to solve the optimization problem efficiently. We have conducted extensive experiments for performance evaluation in various environments; the promising results show that the proposed algorithm is both efficient and effective.
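The core idea described in the abstract — casting shape recovery as an unconstrained quadratic objective with a smoothness regularizer, so that the minimizer comes from a single sparse linear system — can be illustrated with a minimal sketch. This is not the authors' implementation: the 1-D "mesh", the observation matrix `M`, the second-difference operator `K`, and the weight `lam` are all simplified stand-ins for the paper's triangulated 2-D mesh and second-order vertex constraints.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def solve_shape(M, u, K, lam):
    """Closed-form minimizer of ||M v - u||^2 + lam ||K v||^2:
    set the gradient to zero and solve (M^T M + lam K^T K) v = M^T u,
    a single sparse symmetric linear system."""
    A = (M.T @ M + lam * (K.T @ K)).tocsc()
    return spsolve(A, M.T @ u)

# Toy 1-D "mesh" of n vertex coordinates with sparse feature observations.
n = 50
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, np.pi, n))          # ground-truth deformation
obs_idx = rng.choice(n, size=15, replace=False)   # indices with feature matches
M = sparse.csr_matrix((np.ones(15), (np.arange(15), obs_idx)), shape=(15, n))
u = truth[obs_idx] + 0.01 * rng.standard_normal(15)  # noisy observations

# Second-order difference operator: penalizes curvature of the recovered mesh,
# analogous to constraining second-order derivatives of the mesh vertices.
K = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))

v = solve_shape(M, u, K, lam=1e-2)
err = np.abs(v - truth).max()
print(f"max recovery error: {err:.3f}")
```

Because the objective is quadratic in `v`, no iterative descent is needed for a fixed set of observations; the progressive scheme in the paper re-solves such systems as the set of trusted correspondences grows.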

Original language: English (US)
Pages (from-to): 1210-1224
Number of pages: 15
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Issue number: 7
State: Published - 2009
Externally published: Yes


Keywords

  • Image processing and computer vision
  • Medical image registration
  • Nonrigid augmented reality
  • Nonrigid detection
  • Real-time deformable registration

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics

