Charting the Sociotechnical Gap in Explainable AI: A Framework to Address the Gap in XAI

Upol Ehsan, Koustuv Saha, Munmun De Choudhury, Mark O. Riedl

Research output: Contribution to journal › Article › peer-review


Explainable AI (XAI) systems are sociotechnical in nature; thus, they are subject to the sociotechnical gap: the divide between technical affordances and social needs. However, charting this gap is challenging. In the context of XAI, we argue that charting the gap improves our problem understanding, which can reflexively provide actionable insights to improve explainability. Utilizing two case studies in distinct domains, we empirically derive a framework that facilitates systematic charting of the sociotechnical gap by connecting AI guidelines in the context of XAI and elucidating how to use them to address the gap. We apply the framework to a third case in a new domain, showcasing its affordances. Finally, we discuss conceptual implications of the framework, share practical considerations in its operationalization, and offer guidance on transferring it to new contexts. By making conceptual and practical contributions to understanding the sociotechnical gap in XAI, the framework expands the XAI design space.

Original language: English (US)
Article number: 34
Journal: Proceedings of the ACM on Human-Computer Interaction
Issue number: 1 CSCW
State: Published - Apr 16, 2023
Externally published: Yes


Keywords

  • AI ethics
  • AI governance
  • explainable ai
  • fate
  • framework
  • human-AI interaction
  • human-centered explainable ai
  • organizational dynamics
  • participatory design
  • responsible ai
  • sociotechnical gap
  • user study

ASJC Scopus subject areas

  • Social Sciences (miscellaneous)
  • Human-Computer Interaction
  • Computer Networks and Communications


