Visual memory for robust path following

Ashish Kumar, Saurabh Gupta, David Fouhey, Sergey Levine, Jitendra Malik

Research output: Contribution to journal › Conference article

Abstract

Humans routinely retrace paths in a novel environment both forwards and backwards despite uncertainty in their motion. This paper presents an approach for doing so. Given a demonstration of a path, a first network generates a path abstraction. Equipped with this abstraction, a second network observes the world and decides how to act to retrace the path under noisy actuation and a changing environment. The two networks are optimized end-to-end at training time. We evaluate the method in two realistic simulators, performing path following and homing under actuation noise and environmental changes. Our experiments show that our approach outperforms classical approaches and other learning based baselines.
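The two-network split described in the abstract can be sketched at the level of shapes and dataflow. Everything below is an illustrative assumption, not the paper's trained architecture: the layer sizes, the mean-pooling used as the path abstraction, and the discrete argmax policy are placeholders, and the random weights stand in for parameters that the paper learns end-to-end.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    # Hypothetical helper: a random (untrained) linear layer.
    return rng.standard_normal((in_dim, out_dim)) / np.sqrt(in_dim)

OBS_DIM, ABS_DIM, N_ACTIONS = 16, 8, 4  # illustrative sizes

# First network: compress a demonstrated path (a sequence of observations)
# into a fixed-size path abstraction, here by mean-pooling encoded frames.
W_enc = linear(OBS_DIM, ABS_DIM)

def path_abstraction(demo_obs):
    return np.tanh(demo_obs @ W_enc).mean(axis=0)

# Second network: given the abstraction and the current observation,
# score the discrete actions and act greedily.
W_pol = linear(ABS_DIM + OBS_DIM, N_ACTIONS)

def act(abstraction, obs):
    logits = np.concatenate([abstraction, obs]) @ W_pol
    return int(np.argmax(logits))

demo = rng.standard_normal((20, OBS_DIM))  # a 20-step demonstration
z = path_abstraction(demo)                 # fixed-size path abstraction
a = act(z, rng.standard_normal(OBS_DIM))   # action index in {0, ..., 3}
```

In the paper both networks are optimized jointly at training time, so the abstraction is shaped by what the policy needs to retrace the path; the sketch only shows the inference-time dataflow between the two.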

Original language: English (US)
Pages (from-to): 765-774
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - Jan 1 2018
Externally published: Yes
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Kumar, A., Gupta, S., Fouhey, D., Levine, S., & Malik, J. (2018). Visual memory for robust path following. Advances in Neural Information Processing Systems, 2018-December, 765-774.

@article{9035288c9ddc4a589050e0feb1c4e2ce,
title = "Visual memory for robust path following",
abstract = "Humans routinely retrace paths in a novel environment both forwards and backwards despite uncertainty in their motion. This paper presents an approach for doing so. Given a demonstration of a path, a first network generates a path abstraction. Equipped with this abstraction, a second network observes the world and decides how to act to retrace the path under noisy actuation and a changing environment. The two networks are optimized end-to-end at training time. We evaluate the method in two realistic simulators, performing path following and homing under actuation noise and environmental changes. Our experiments show that our approach outperforms classical approaches and other learning based baselines.",
author = "Ashish Kumar and Saurabh Gupta and David Fouhey and Sergey Levine and Jitendra Malik",
year = "2018",
month = "1",
day = "1",
language = "English (US)",
volume = "2018-December",
pages = "765--774",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
}

TY - JOUR
T1 - Visual memory for robust path following
AU - Kumar, Ashish
AU - Gupta, Saurabh
AU - Fouhey, David
AU - Levine, Sergey
AU - Malik, Jitendra
PY - 2018/1/1
Y1 - 2018/1/1
N2 - Humans routinely retrace paths in a novel environment both forwards and backwards despite uncertainty in their motion. This paper presents an approach for doing so. Given a demonstration of a path, a first network generates a path abstraction. Equipped with this abstraction, a second network observes the world and decides how to act to retrace the path under noisy actuation and a changing environment. The two networks are optimized end-to-end at training time. We evaluate the method in two realistic simulators, performing path following and homing under actuation noise and environmental changes. Our experiments show that our approach outperforms classical approaches and other learning based baselines.
AB - Humans routinely retrace paths in a novel environment both forwards and backwards despite uncertainty in their motion. This paper presents an approach for doing so. Given a demonstration of a path, a first network generates a path abstraction. Equipped with this abstraction, a second network observes the world and decides how to act to retrace the path under noisy actuation and a changing environment. The two networks are optimized end-to-end at training time. We evaluate the method in two realistic simulators, performing path following and homing under actuation noise and environmental changes. Our experiments show that our approach outperforms classical approaches and other learning based baselines.
UR - http://www.scopus.com/inward/record.url?scp=85064804690&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85064804690&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85064804690
VL - 2018-December
SP - 765
EP - 774
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
SN - 1049-5258
ER -