Video event detection: From subvolume localization to spatiotemporal path search

Du Tran, Junsong Yuan, David Forsyth

Research output: Contribution to journal › Article › peer-review


Although sliding-window approaches have been quite successful at detecting objects in images, extending them to detecting events in videos is not trivial. We propose to search for spatiotemporal paths for video event detection. This new formulation can accurately detect and locate video events in cluttered and crowded scenes, and is robust to camera motion. It also handles the scale, shape, and intraclass variations of the event well. Compared to event detection using spatiotemporal sliding windows, the spatiotemporal paths correspond to the event trajectories in the video space and thus can better handle events composed of moving objects. We prove that the proposed search algorithm achieves the globally optimal solution with the lowest complexity. Experiments are conducted on realistic video data sets with different event detection tasks, such as anomalous event detection, walking-person detection, and running detection. Our proposed method is compatible with different types of video features or object detectors and is robust to false and missed local detections. It significantly improves the overall detection and localization accuracy over state-of-the-art methods.
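To make the path-search idea concrete, the following is a minimal, hypothetical sketch of a max-path dynamic-programming search over a per-frame detection score volume. It is not the authors' implementation: the `radius` smoothness constraint, the score layout, and the brute-force window scan are illustrative assumptions, and the paper's actual algorithm is more efficient. A path picks one location per frame, may move at most `radius` pixels between consecutive frames, and may start and end at any frame; the search returns the path with the highest accumulated score.

```python
import numpy as np

def max_path_search(scores, radius=1):
    """Find the spatiotemporal path with the highest accumulated score.

    scores: (T, H, W) array; scores[t, y, x] is a local detection score
    (positive = evidence for the event, negative = background).
    Returns (best_score, path) where path is a list of (t, y, x) tuples.
    """
    T, H, W = scores.shape
    back = np.full((T, H, W, 2), -1, dtype=int)  # (-1, -1) marks a path start
    acc = scores[0].copy()                       # best path score ending at each cell
    best_val = float(acc.max())
    best_end = (0,) + tuple(np.unravel_index(acc.argmax(), acc.shape))
    for t in range(1, T):
        prev = acc
        acc = np.empty((H, W))
        for y in range(H):
            for x in range(W):
                # best predecessor within `radius` pixels in the previous frame
                y0, y1 = max(0, y - radius), min(H, y + radius + 1)
                x0, x1 = max(0, x - radius), min(W, x + radius + 1)
                win = prev[y0:y1, x0:x1]
                i = np.unravel_index(np.argmax(win), win.shape)
                if win[i] > 0:                   # extend the best incoming path
                    acc[y, x] = win[i] + scores[t, y, x]
                    back[t, y, x] = (y0 + i[0], x0 + i[1])
                else:                            # start a new path at this cell
                    acc[y, x] = scores[t, y, x]
        if acc.max() > best_val:
            best_val = float(acc.max())
            best_end = (t,) + tuple(np.unravel_index(acc.argmax(), acc.shape))
    # trace back from the best end cell to the start of its path
    t, y, x = best_end
    path = [(t, y, x)]
    while t > 0 and back[t, y, x][0] >= 0:
        y, x = back[t, y, x]
        t -= 1
        path.append((t, y, x))
    path.reverse()
    return best_val, path
```

This naive scan costs O(T·H·W·radius²); the point of the sketch is only the Kadane-style restart rule (start a new path whenever no positive-score predecessor exists), which lets the path's temporal extent be optimized jointly with its spatial trajectory.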

Original language: English (US)
Article number: 6567857
Pages (from-to): 404-416
Number of pages: 13
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Issue number: 2
State: Published - Feb 2014


Keywords

  • Event detection
  • action detection
  • dynamic programming
  • max-path search
  • multiple event detection

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics


