Integration of frequency and space for multiple motion estimation and shape-independent object segmentation

Alexia Briassouli, Narendra Ahuja

Research output: Contribution to journal › Article › peer-review


A video containing multiple objects undergoing independent translational and rotational motions is analyzed through a combination of spatial- and frequency-domain representations. The Fourier transform of the sequence is used to estimate the multiple translations and rotations in a computationally efficient manner that is also robust to local inaccuracies and global illumination changes. A novel algorithm is presented for the simultaneous extraction of the background and all objects undergoing translation, via a least-squares technique that takes place entirely in the Fourier domain. Spatial information is then combined with the frequency-domain object extraction results to further refine them. For the case of rotational, or combined rotational and translational, motions, the moving objects are segmented using purely spatial information. We show that the combined analysis takes advantage of the strengths of both representations by providing reliable and computationally efficient motion estimates and object segmentation. The proposed algorithm is shown to be robust to local noise and occlusion because of its global nature. Experiments are performed on synthetic and real video sequences to demonstrate the capabilities of our approach.
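The frequency-domain translation estimation described above rests on the Fourier shift property: a spatial translation between frames appears as a linear phase difference between their spectra, so normalizing away the magnitudes and inverting yields a sharp peak at the displacement. As a minimal illustration of that property (not the paper's full multi-object algorithm, which handles several simultaneous motions and rotations), a single-motion phase-correlation sketch in NumPy might look like:

```python
import numpy as np

def phase_correlation_shift(f1, f2):
    """Estimate the integer translation taking f2 to f1 via phase correlation.

    A shift in space is a linear phase ramp in frequency, so the
    normalized cross-power spectrum inverts to a delta at the shift.
    """
    F1 = np.fft.fft2(f1)
    F2 = np.fft.fft2(f2)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12  # keep phase only; insensitive to global gain
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates to signed displacements
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# demo: translate a random frame by (5, -3) and recover the shift
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
shifted = np.roll(frame, shift=(5, -3), axis=(0, 1))
print(phase_correlation_shift(shifted, frame))  # -> (5, -3)
```

Because the estimate comes from the phase of the whole spectrum rather than local patches, it degrades gracefully under local noise and global illumination changes, which is the robustness property the abstract highlights.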

Original language: English (US)
Article number: 4454233
Pages (from-to): 657-669
Number of pages: 13
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Issue number: 5
State: Published - May 2008


Keywords

  • Fourier transform (FT)
  • Motion segmentation
  • Phase-based motion estimation
  • Video analysis

ASJC Scopus subject areas

  • Media Technology
  • Electrical and Electronic Engineering
