An exploratory user study of visual causality analysis

Chi Hsien Eric Yen, Aditya Parameswaran, Wai Tat Fu

Research output: Contribution to journal › Article › peer-review

Abstract

Interactive visualization tools are being used by an increasing number of members of the general public; however, little is known about how, and how well, people use visualizations to infer causality. Adapting the mediation causal model, we designed an analytic framework to systematically evaluate human performance, strategies, and pitfalls in a visual causal reasoning task. We recruited 24 participants and asked them to identify the mediators in a fictitious dataset using bar charts and scatter plots within our visualization interface. The results showed that the accuracy of their responses as to whether a variable is a mediator decreased significantly when a confounding variable directly influenced the variable being analyzed. Further analysis demonstrated how individual visualization exploration strategies and interfaces might influence reasoning performance. We also identified common strategies and pitfalls in participants' causal reasoning processes. We discuss design implications for future visual analytics tools to better support causal inference.

Original language: English (US)
Pages (from-to): 173-184
Number of pages: 12
Journal: Computer Graphics Forum
Volume: 38
Issue number: 3
DOIs
State: Published - 2019

Keywords

  • Human-centered computing → Empirical studies in visualization
  • Visualization design and evaluation methods

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
