Long-Term Pedestrian Trajectory Prediction Using Mutable Intention Filter and Warp LSTM

Zhe Huang, Aamir Hasan, Kazuki Shin, Ruohua Li, Katherine Driggs-Campbell

Research output: Contribution to journal › Article › peer-review

Abstract

Trajectory prediction is one of the key capabilities for robots to safely navigate and interact with pedestrians. Critical insights from human intention and behavioral patterns need to be integrated to effectively forecast long-term pedestrian behavior. Thus, we propose a framework incorporating a mutable intention filter and a Warp LSTM (MIF-WLSTM) to simultaneously estimate human intention and perform trajectory prediction. The mutable intention filter is inspired by particle filtering and genetic algorithms, where particles represent intention hypotheses that can be mutated throughout the pedestrian's motion. Instead of predicting sequential displacement over time, our Warp LSTM learns to generate offsets on a full trajectory predicted by a nominal intention-aware linear model, which considers the intention hypotheses during the filtering process. Through experiments on a publicly available dataset, we show that our method outperforms baseline approaches, and we demonstrate its robust performance under abnormal intention-changing scenarios.
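The two ingredients described above can be illustrated with a simplified sketch. This is not the authors' implementation: the cosine-similarity likelihood, the mutation scheme, and the constant-speed nominal model below are illustrative assumptions standing in for the paper's learned components (in particular, the Warp LSTM that would produce the per-step offsets on the nominal path).

```python
import numpy as np

def intention_filter_step(particles, weights, pos, vel, goals,
                          mutation_rate=0.05, rng=None):
    """One step of a particle-filter-style intention estimator (sketch).

    particles: (N,) int array of goal indices (intention hypotheses)
    weights:   (N,) float array of hypothesis weights
    pos, vel:  current pedestrian position and velocity, each shape (2,)
    goals:     (G, 2) candidate goal locations
    """
    rng = rng or np.random.default_rng()
    # Likelihood: hypotheses whose goal aligns with the current heading
    # gain weight (cosine-similarity likelihood, an illustrative choice).
    to_goal = goals[particles] - pos
    to_goal /= np.linalg.norm(to_goal, axis=1, keepdims=True) + 1e-9
    heading = vel / (np.linalg.norm(vel) + 1e-9)
    weights = weights * np.exp(to_goal @ heading)
    weights /= weights.sum()
    # Mutation (genetic-algorithm-inspired): a small fraction of particles
    # switch to a random goal, letting the filter recover when the
    # pedestrian's intention changes mid-trajectory.
    mutate = rng.random(len(particles)) < mutation_rate
    particles = particles.copy()
    particles[mutate] = rng.integers(0, len(goals), mutate.sum())
    return particles, weights

def nominal_linear_prediction(pos, goal, horizon, speed):
    """Constant-velocity trajectory toward the estimated goal.

    In MIF-WLSTM, the Warp LSTM would output offsets that are added to
    this full nominal path rather than predicting displacements step
    by step; here the offsets are simply omitted.
    """
    direction = (goal - pos) / (np.linalg.norm(goal - pos) + 1e-9)
    steps = np.arange(1, horizon + 1)[:, None]
    return pos + steps * speed * direction
```

For example, with two candidate goals and a pedestrian walking toward the first one, the filter shifts weight onto hypotheses for that goal, and the nominal model extrapolates a straight path toward it.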

Original language: English (US)
Article number: 9309334
Pages (from-to): 542-549
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 6
Issue number: 2
State: Published - Apr 2021

Keywords

  • Human-centered robotics
  • intention recognition
  • modeling and simulating humans

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Biomedical Engineering
  • Human-Computer Interaction
  • Mechanical Engineering
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Control and Optimization
  • Artificial Intelligence

