Expressive querying for accelerating visual analytics

Tarique Siddiqui, Paul Luh, Zesheng Wang, Karrie Karahalios, Aditya G Parameswaran

Research output: Contribution to journal › Article › peer-review

Abstract

Data visualization is the primary means by which data analysts explore patterns, trends, and insights in their data. Unfortunately, existing visual analytics tools offer limited expressiveness and scalability when it comes to searching for visualizations over large datasets, making visual data exploration labor-intensive and time-consuming. In this work, we introduce the problem of visualization search and highlight two underlying challenges: search enumeration and visualization matching. To address them, we first present our work on Zenvisage, which helps enumerate large collections of visualizations and supports simple visualization matching with the help of an interactive interface and an expressive visualization query language. For more fine-grained and flexible visualization matching, including search for underspecified and approximate patterns, we extend Zenvisage to develop ShapeSearch. ShapeSearch supports a novel shape querying algebra that helps express a large class of pattern queries that are hard to specify with existing systems. ShapeSearch exposes multiple specification mechanisms: sketch, natural language, and visual regular expressions that help users easily issue shape queries, while applying query-aware and perceptually-aware optimizations to execute them efficiently within interactive response times. To conclude, we discuss a number of open research problems to further improve the usability and performance of both Zenvisage and ShapeSearch.
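To make the idea of shape matching concrete, the following is a minimal illustrative sketch, not the actual ShapeSearch algebra or implementation: it scores how well a numeric series matches a "rise then fall" pattern by trying every split point and combining the slopes of the two segments with a minimum, which mirrors how a conjunction of sub-patterns might be scored in a shape algebra. All function names here are hypothetical.

```python
def slope(ys):
    """Least-squares slope of ys against its index positions (hypothetical helper)."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def score_up_down(ys):
    """Best score over all split points for the pattern 'rise then fall'.

    A segment scores positively when its slope agrees with the desired
    direction; the two sub-scores are combined with min(), playing the
    role of a conjunction over sub-patterns.
    """
    best = float("-inf")
    for split in range(2, len(ys) - 1):
        up = slope(ys[:split])         # first segment should rise
        down = -slope(ys[split:])      # second segment should fall
        best = max(best, min(up, down))
    return best

rising_falling = [1, 3, 5, 7, 6, 4, 2]
monotone = [1, 2, 3, 4, 5, 6, 7]
print(score_up_down(rising_falling) > score_up_down(monotone))  # → True
```

A real system would additionally handle approximate and underspecified patterns, multiple pattern operators, and pruning to meet interactive latency targets, which is precisely where the abstract's query-aware and perceptually-aware optimizations come in.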

Original language: English (US)
Pages (from-to): 85-94
Number of pages: 10
Journal: Communications of the ACM
Volume: 65
Issue number: 7
State: Published - Jul 2022

ASJC Scopus subject areas

  • Computer Science (all)
