A dual number abstraction for static analysis of Clarke Jacobians

Jacob Laurel, Rem Yang, Gagandeep Singh, Sasa Misailovic

Research output: Contribution to journal › Article › peer-review


We present a novel abstraction for bounding the Clarke Jacobian of a Lipschitz continuous, but not necessarily differentiable function over a local input region. To do so, we leverage a novel abstract domain built upon dual numbers, adapted to soundly over-approximate all first derivatives needed to compute the Clarke Jacobian. We formally prove that our novel forward-mode dual interval evaluation produces a sound, interval domain-based over-approximation of the true Clarke Jacobian for a given input region. Due to the generality of our formalism, we can compute and analyze interval Clarke Jacobians for a broader class of functions than previous works supported - specifically, arbitrary compositions of neural networks with Lipschitz, but non-differentiable perturbations. We implement our technique in a tool called DeepJ and evaluate it on multiple deep neural networks and non-differentiable input perturbations to showcase both the generality and scalability of our analysis. Concretely, we can obtain interval Clarke Jacobians to analyze Lipschitz robustness and local optimization landscapes of both fully-connected and convolutional neural networks for rotational, contrast variation, and haze perturbations, as well as their compositions.
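To illustrate the core idea of evaluating a function in a dual-interval domain, below is a minimal, hypothetical sketch in Python. It is not the DeepJ implementation; the class names (`Interval`, `DualInterval`) and the choice of ReLU as the non-differentiable operation are illustrative assumptions. The key point it demonstrates is that at a kink (an input interval straddling zero), the derivative component is multiplied by an interval over-approximating the Clarke generalized derivative, here [0, 1] for ReLU.

```python
# Hypothetical sketch of forward-mode dual-interval evaluation,
# loosely modeled on the paper's description; names and structure
# are illustrative, not taken from DeepJ.

class Interval:
    """A closed real interval [lo, hi] with sound arithmetic."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"


class DualInterval:
    """Pairs an interval value with an interval tangent (derivative)."""
    def __init__(self, val, dot):
        self.val, self.dot = val, dot

    def __add__(self, other):
        return DualInterval(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        # Product rule, evaluated soundly in interval arithmetic.
        return DualInterval(self.val * other.val,
                            self.dot * other.val + self.val * other.dot)


def relu(x):
    """ReLU on a dual interval. When the input interval straddles 0,
    the Clarke generalized derivative is the set [0, 1]; the incoming
    tangent is multiplied by an interval covering that whole set."""
    lo, hi = x.val.lo, x.val.hi
    val = Interval(max(lo, 0.0), max(hi, 0.0))
    if lo >= 0:
        slope = Interval(1.0, 1.0)
    elif hi <= 0:
        slope = Interval(0.0, 0.0)
    else:
        slope = Interval(0.0, 1.0)  # over-approximates the kink at 0
    return DualInterval(val, slope * x.dot)


# Example: bound d/dx relu(x) over the input region x in [-1, 2],
# seeding the tangent with dx/dx = [1, 1].
x = DualInterval(Interval(-1.0, 2.0), Interval(1.0, 1.0))
y = relu(x)
print(y.val, y.dot)  # derivative bound is [0, 1], covering both slopes
```

Composing such operations propagates sound derivative intervals through the whole function, which is how an interval over-approximation of the Clarke Jacobian over a local input region could be obtained.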

Original language: English (US)
Article number: 3498696
Journal: Proceedings of the ACM on Programming Languages
Issue number: POPL
State: Published - Jan 2022


Keywords

  • Abstract Interpretation
  • Differentiable Programming
  • Robustness

ASJC Scopus subject areas

  • Software
  • Safety, Risk, Reliability and Quality


