Certified Robust Control under Adversarial Perturbations

Jinghan Yang, Hunmin Kim, Wenbin Wan, Naira Hovakimyan, Yevgeniy Vorobeychik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Autonomous systems increasingly rely on machine learning techniques to transform high-dimensional raw inputs into predictions that are then used for decision-making and control. However, it is often easy to maliciously manipulate such inputs and, as a result, predictions. While effective techniques have been proposed to certify the robustness of predictions to adversarial input perturbations, such techniques have been disembodied from control systems that make downstream use of the predictions. We propose the first approach for composing robustness certification of predictions with respect to raw input perturbations with robust control to obtain certified robustness of control to adversarial input perturbations. We use a case study of adaptive vehicle control to illustrate our approach and show the value of the resulting end-to-end certificates through extensive experiments.
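The abstract describes the composition only at a high level. As a purely illustrative reading, the sketch below shows one way a certified perception bound could be composed with a robust feedback controller: a toy linear "perception" map admits a spectral-norm certificate on how far a bounded input perturbation can move the state estimate, and a feedback law designed to tolerate a fixed estimation-error bound accepts that certificate whenever it is small enough. Every name, model, gain, and bound in this sketch is an assumption for illustration, not the construction or notation from the paper.

```python
"""Hypothetical end-to-end sketch: compose a certified perception error bound
with a robust feedback controller. All models and bounds are illustrative."""
import numpy as np

# --- Perception with a certificate ------------------------------------------
# Toy "perception": a linear map W from a raw input to a 2-D state estimate.
# For a linear map, an L2 input perturbation of radius r moves the output by
# at most ||W||_2 * r, which serves as a (loose) certified estimate-error bound.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 16)) * 0.1

def certified_estimate(raw_input: np.ndarray, input_radius: float):
    x_hat = W @ raw_input
    eps = np.linalg.norm(W, 2) * input_radius  # certified bound on estimate error
    return x_hat, eps

# --- Robust control ----------------------------------------------------------
# Double integrator x_{t+1} = A x_t + B u_t with a stabilizing gain K.
# The controller is assumed (for this sketch) to tolerate any estimation error
# up to EPS_MAX; composing the two certificates requires eps <= EPS_MAX.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
K = np.array([[3.0, 4.0]])   # illustrative stabilizing feedback gain
EPS_MAX = 0.5                # estimation-error bound the controller is designed for

def certified_control(raw_input: np.ndarray, input_radius: float):
    x_hat, eps = certified_estimate(raw_input, input_radius)
    if eps > EPS_MAX:
        raise ValueError("perception certificate exceeds the control-side bound")
    u = -K @ x_hat           # guarantee holds for every perturbation within input_radius
    return u, eps

u, eps = certified_control(rng.normal(size=16), input_radius=0.05)
print(f"control input {u}, certified estimate-error bound {eps:.3f}")
```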

Original language: English (US)
Title of host publication: 2023 American Control Conference, ACC 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4090-4095
Number of pages: 6
ISBN (Electronic): 9798350328066
DOIs
State: Published - 2023
Event: 2023 American Control Conference, ACC 2023 - San Diego, United States
Duration: May 31, 2023 – Jun 2, 2023

Publication series

Name: Proceedings of the American Control Conference
Volume: 2023-May
ISSN (Print): 0743-1619

Conference

Conference: 2023 American Control Conference, ACC 2023
Country/Territory: United States
City: San Diego
Period: 5/31/23 – 6/2/23

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
