Abstract

Perception of obstacles remains a critical safety concern for autonomous vehicles. Analyses of real-world collisions show that the autonomy faults leading to fatal outcomes originate in obstacle existence detection. Open-source autonomous driving implementations rely on perception pipelines built from complex, interdependent Deep Neural Networks. These networks are not fully verifiable, making them unsuitable for safety-critical tasks. In this work, we present a safety verification of an existing LiDAR-based classical obstacle detection algorithm and establish strict bounds on its detection capabilities. Given safety standards, such bounds allow us to determine LiDAR sensor properties that would reliably satisfy the standards; such analysis has so far been unattainable for neural-network-based perception systems. We provide a rigorous analysis of the obstacle detection system, with empirical results based on real-world sensor data.
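To illustrate the kind of sensor-property derivation the abstract describes, a minimal sketch (not taken from the paper; the function name, obstacle size, range, and return-count requirement are all illustrative assumptions): given a minimum obstacle width that must be detected and a required number of LiDAR returns on that obstacle, one can bound the horizontal angular resolution the sensor needs at a given range.

```python
import math

def max_angular_resolution_deg(obstacle_width_m: float,
                               range_m: float,
                               min_returns: int) -> float:
    """Hypothetical illustration: maximum horizontal beam spacing (degrees)
    such that an obstacle of the given width at the given range is struck
    by at least `min_returns` beams, regardless of beam alignment."""
    # Angle subtended by the obstacle as seen from the sensor.
    subtended_deg = 2.0 * math.degrees(
        math.atan(obstacle_width_m / (2.0 * range_m)))
    # With beam spacing d, at least floor(theta / d) beams fall inside an
    # angular window theta, so d <= theta / min_returns guarantees the count.
    return subtended_deg / min_returns

# Example: a 0.3 m-wide obstacle at 50 m, requiring 3 returns.
required_resolution = max_angular_resolution_deg(0.3, 50.0, 3)
```

A derivation of this shape runs in reverse as well: given a sensor's fixed angular resolution, it yields the maximum range at which a safety standard's minimum obstacle size is still guaranteed to be detected.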

Original language: English (US)
Title of host publication: Proceedings - 2022 IEEE 33rd International Symposium on Software Reliability Engineering, ISSRE 2022
Publisher: IEEE Computer Society
Pages: 61-72
Number of pages: 12
ISBN (Electronic): 9781665451321
DOIs
State: Published - 2022
Event: 33rd IEEE International Symposium on Software Reliability Engineering, ISSRE 2022 - Charlotte, United States
Duration: Oct 31, 2022 - Nov 3, 2022

Publication series

Name: Proceedings - International Symposium on Software Reliability Engineering, ISSRE
Volume: 2022-October
ISSN (Print): 1071-9458

Conference

Conference: 33rd IEEE International Symposium on Software Reliability Engineering, ISSRE 2022
Country/Territory: United States
City: Charlotte
Period: 10/31/22 - 11/3/22

Keywords

  • Autonomous vehicles
  • Object detection
  • Vehicle safety

ASJC Scopus subject areas

  • Software
  • Safety, Risk, Reliability and Quality
